Digital Twin: Human Biomechanics and Squat Analysis for

Rehabilitation using Accelerometer Sensor Data, Unity 3D, and Raspberry Pi

Current Students: Ian Abeyta (PhD Student) and Jason Weinstein

Status: Current (Poster)

The goal is to improve exercise technique and prevent injuries by integrating a human digital twin with a multi-sensor device that monitors human biomechanics. Unity 3D and Mixamo are used to visualize each joint as the rigging set-up for 3D animation. Sensor data is obtained from a prototype hardware device that integrates three accelerometers, attached near the joint centers of the hip, knee, and ankle, designed to capture the complex dynamics of human motion in real time. The collected data is used for machine learning analysis, which in turn enhances the Unity model to provide feedback on proper exercise technique. The objective of this study is to assess the validity and reliability of machine learning models in correctly recognizing squat exercise movements:
1) Analyze the accelerometer data collected from inertial measurement units (IMUs) during human movement.
2) Operationalize this information as features for machine learning models.
3) Compare analysis to single accelerometer recorded movement as well as 3D video data capture of human movement during squat exercises.
4) Evaluate the collected data against full-body pose databases (DIP-IMU).
5) Provide real-time feedback for better squat exercises.
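As an illustration of objective 2, windowed statistics over the raw accelerometer stream are a common way to build features for activity-recognition models. The specific features below (per-axis mean, standard deviation, min, max, plus a signal-magnitude term) are assumptions for the sketch, not the project's chosen feature set:

```python
import numpy as np

def extract_features(window):
    """Compute simple per-axis statistics from one window of
    accelerometer samples (shape: n_samples x 3 for x, y, z).
    The feature choices here are illustrative only."""
    feats = []
    for axis in range(window.shape[1]):
        col = window[:, axis]
        feats.extend([col.mean(), col.std(), col.min(), col.max()])
    # Mean signal magnitude: a common IMU activity-recognition feature
    feats.append(np.abs(window).sum(axis=0).mean())
    return np.array(feats)

# Example: a 2-second window of simulated samples at 120 Hz
rng = np.random.default_rng(0)
window = rng.normal(0.0, 1.0, size=(240, 3))
features = extract_features(window)
print(features.shape)  # (13,) — 4 stats x 3 axes + 1 magnitude term
```

Feature vectors like this, one per labeled squat repetition, would then be fed to a classifier for the recognition step.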


Squat exercise joint movement

Squat exercise joint movement for an avatar using Mixamo


How it works – Post-Processing: Current device configuration: 6 sensors (3 on the right leg, 3 on the left leg); 1,000 readings per second.

Data Analysis

BioInformatics – Machine Learning – Augmented Reality version 1 (BIMLAR 1)
CPU: 1 Raspberry Pi 4 B, 2GB RAM

The accelerometers are located at three positions on the right side of the human body, in close approximation to the joint centers. The hip sensor is placed at the anterior superior iliac spine of the hip girdle. The knee sensor is placed on the lateral epicondyle of the femur. The ankle sensor is placed on the lateral malleolus of the ankle. Each sensor is encased in a 3D-printed plastic housing that protects against minor impacts, and each case is secured to the body with Velcro straps and elastic bands. The central processing unit is a Raspberry Pi 4 B with 2GB RAM [RPi4]; during movement, the RPi4 maintains a connection to an external monitor, draws power from a USB-C cable, and uses a USB dongle for an external mouse and keyboard. A Python 3 program gathers XYZ acceleration data from the sensors at a rate of 120 Hz and, upon termination, produces a timestamped CSV file stored on the RPi4's SD card.
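The acquisition loop just described might be sketched as below. The `read_sensor` stub and the file name are placeholders for this sketch (the real device reads IMUs over I2C), and the sleep-based pacing only approximates 120 Hz; a production loop on the RPi4 would need tighter scheduling:

```python
import csv
import time

SAMPLE_HZ = 120
PERIOD = 1.0 / SAMPLE_HZ

def read_sensor(sensor_id):
    """Placeholder for the real I2C read; returns (x, y, z) in g.
    The actual device reads hardware IMUs, not this stub."""
    return (0.0, 0.0, 1.0)

def log_samples(path, sensor_ids, n_samples):
    """Write timestamped XYZ rows for each sensor to a CSV file,
    pacing the loop to roughly SAMPLE_HZ."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "sensor", "ax", "ay", "az"])
        for _ in range(n_samples):
            t = time.time()
            for sid in sensor_ids:
                writer.writerow([t, sid, *read_sensor(sid)])
            # Sleep off the remainder of this sample period, if any
            time.sleep(max(0.0, PERIOD - (time.time() - t)))

log_samples("squat_run.csv", ["hip", "knee", "ankle"], n_samples=12)
```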

Applications for Use

  • Fitness
    • Personal Training: ‘Proper’ technique can be evaluated and shared
    • Martial Arts: study new skills through an avatar or against an opponent
    • Dance: learn new moves by following a “ghost” avatar or dancing in an ensemble
  • Physical therapy
    • Telemedicine: Patients are not required to be at PT office for each meeting
    • Patient Adherence: PTs can evaluate patient rehabilitation
    • Explicit Knowledge: Greater precision/fidelity from one PT to the next
  • Entertainment
    • Movies: Computer-generated imagery for fictional characters and safer stunts
    • Video Games: multiplayer competitions such as races or battle royales

Device for Motion Capture

An Arduino Mega 2560 microcontroller is programmed in C++ to receive sensor data via a PCA9548A I2C multiplexer.
6 LSM6DSOX (6 DOF) inertial measurement units (IMUs) are wired to rest on the lateral aspect of various segments of each leg.
Python interprets the output from each sensor, which produces XYZ acceleration data and XYZ gyroscope data at a rate of 40 Hz.
The data is then used to calculate XYZ displacement via the Euler method.
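The Euler-method double integration from acceleration to displacement can be sketched as follows. This is the bare forward-Euler scheme at the 40 Hz rate mentioned above; in practice, drift grows quickly without bias correction or filtering, which this sketch omits:

```python
import numpy as np

def euler_displacement(accel, dt):
    """Double-integrate acceleration (n x 3, m/s^2) into displacement
    (n x 3, m) using the forward Euler method."""
    vel = np.zeros_like(accel)
    disp = np.zeros_like(accel)
    for i in range(1, len(accel)):
        vel[i] = vel[i - 1] + accel[i - 1] * dt    # v += a * dt
        disp[i] = disp[i - 1] + vel[i - 1] * dt    # d += v * dt
    return disp

# Constant 1 m/s^2 along x for 1 second, sampled at 40 Hz
dt = 1.0 / 40
a = np.zeros((41, 3))
a[:, 0] = 1.0
d = euler_displacement(a, dt)
print(d[-1, 0])  # close to the exact value 0.5 m, minus Euler error
```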

In conclusion, the aim of this project is to make human biomechanics accessible to a wider audience through the use of Unity for data visualization. Because the system relies on a small set of inexpensive wearable sensors rather than camera arrays, it radically improves utility for populations that cannot dedicate the space and funding required by video motion capture.


DVXR LABORATORY (C) 2022-2024, ALL RIGHTS RESERVED