The goal is to improve exercise technique and prevent injuries by integrating a human
digital twin with a multi-sensor device to monitor human biomechanics. Unity 3D and
Mixamo are used to visualize each joint via the rigging set-up for 3D animation. Sensor
data is obtained from prototype hardware that integrates three accelerometers
attached to the joint centers of the hip, knee, and ankle, designed to capture the complex
dynamics of human motion in real time. The collected data is used for machine learning
analysis, which will enhance the Unity model to provide feedback on proper exercise
technique. The objective of this study is to assess the validity and reliability
of machine learning models in correctly recognizing squat exercise movements:
1) Analyze the accelerometer data collected from inertial measurement units (IMUs)
during human movement.
2) Operationalize this information as features for machine learning models (see the
sketch after this list).
3) Compare the analysis to movement recorded with a single accelerometer as well as 3D
video capture of human movement during squat exercises.
4) Evaluate the collected data against full-body pose databases (DIP-IMU).
5) Provide real-time feedback for better squat exercises.
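Objective 2 can be made concrete with a short sketch. The code below is a minimal illustration, not the study's finalized pipeline: it assumes one-second windows at the 120 Hz rate described later, and the per-axis feature set (mean, standard deviation, RMS) is an assumption chosen for illustration.

    import numpy as np

    def window_features(samples):
        """Summarize one window of XYZ acceleration as a feature vector.

        samples: (n, 3) array of XYZ acceleration readings.
        Returns a 9-element vector: per-axis mean, standard deviation, and RMS.
        """
        mean = samples.mean(axis=0)
        std = samples.std(axis=0)
        rms = np.sqrt((samples ** 2).mean(axis=0))
        return np.concatenate([mean, std, rms])

    # Example: split a recording into one-second windows at 120 Hz and
    # stack the feature vectors as rows of a design matrix for a classifier.
    recording = np.random.randn(1200, 3)  # stand-in for a real CSV recording
    window = 120
    features = np.array([window_features(recording[i:i + window])
                         for i in range(0, len(recording) - window + 1, window)])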
Figure: Squat exercise joint movement.
Figure: Squat exercise joint movement for an avatar using Mixamo.
How it works – Post Processing
Current device configuration: 6 sensors (3 on the right leg, 3 on the left leg), 1,000 readings per second.
Data Analysis
BioInformatics – Machine Learning – Augmented Reality version 1 (BIMLAR 1) |
The accelerometers are located at three positions on the right side of the human body, at close approximations to the joint centers. The hip sensor is placed at the anterior superior iliac spine of the hip girdle. The knee sensor is placed on the lateral aspect of the lateral epicondyle of the femur. The ankle sensor is placed on the lateral malleolus of the ankle. Each sensor is encased in a 3D-printed plastic housing, which protects against minor impacts; each case is secured to the body with Velcro straps and elastic bands. The central processing unit is a Raspberry Pi 4 B with 2GB RAM [RPi4]; during movement, the RPi4 will maintain a connection to an external monitor, will require power from a USB-C cable, and will have a USB receiver connected to an external mouse and keyboard. A Python 3 program will gather XYZ acceleration data from the sensors at a rate of 120 Hz and, upon termination, will produce a timestamped CSV file stored on the RPi4 SD card.
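A minimal sketch of that logging loop follows. Because the text does not name the sensor driver, read_xyz below is a hypothetical placeholder to be replaced with the prototype's actual driver call; the 120 Hz pacing and timestamped CSV output follow the description above.

    import csv
    import time

    RATE_HZ = 120
    PERIOD = 1.0 / RATE_HZ

    def read_xyz(sensor_id):
        """Placeholder for the prototype's sensor driver (not named in the
        text); returns a dummy (x, y, z) tuple so the sketch runs standalone."""
        return (0.0, 0.0, 0.0)

    def log_session(sensor_ids=("hip", "knee", "ankle")):
        # One timestamped CSV per session, written to the RPi4 SD card.
        filename = time.strftime("session_%Y%m%d_%H%M%S.csv")
        with open(filename, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["timestamp"] +
                            [f"{s}_{ax}" for s in sensor_ids for ax in "xyz"])
            try:
                while True:
                    start = time.monotonic()
                    row = [time.time()]
                    for s in sensor_ids:
                        row.extend(read_xyz(s))
                    writer.writerow(row)
                    # Sleep off the remainder of the 120 Hz sampling period.
                    time.sleep(max(0.0, PERIOD - (time.monotonic() - start)))
            except KeyboardInterrupt:
                pass  # Ctrl-C ends the session; the CSV is already on disk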
Applications for Use
Device for Motion Capture
An Arduino Mega 2560 microcontroller is programmed in C++ to receive sensor data
via a PCA9548A I2C multiplexer.
Six LSM6DSOX (6-DOF) inertial measurement units (IMUs) are wired to the leg segments,
resting on the lateral aspect of each leg (see the addressing sketch below).
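The device firmware is C++ on the Arduino, but the multiplexer addressing protocol is the same from any I2C master. For consistency with the project's Python tooling, the sketch below shows that protocol in Python using the smbus2 library on a Raspberry Pi-style I2C bus, which is an assumption, not the device's actual firmware. The multiplexer address 0x70 and the LSM6DSOX addresses and WHO_AM_I value are the parts' documented defaults.

    from smbus2 import SMBus

    MUX_ADDR = 0x70   # PCA9548A default I2C address
    IMU_ADDR = 0x6A   # LSM6DSOX default address (SA0 pin low)
    WHO_AM_I = 0x0F   # identification register; LSM6DSOX reads back 0x6C

    with SMBus(1) as bus:          # bus 1, as on a Raspberry Pi
        for channel in range(6):   # one IMU per multiplexer channel
            # Writing a byte with bit N set routes the bus to channel N.
            bus.write_byte(MUX_ADDR, 1 << channel)
            chip_id = bus.read_byte_data(IMU_ADDR, WHO_AM_I)
            print(f"channel {channel}: WHO_AM_I = {chip_id:#04x}")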
Python interprets the output from each sensor, which produces XYZ acceleration data
and XYZ gyroscope data at a rate of 40 Hz.
The data is then used to calculate XYZ displacement via the Euler method.
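A minimal sketch of that Euler integration, assuming the 40 Hz rate above, a rest-at-start initial condition, and acceleration that has already been gravity-compensated (drift correction is omitted):

    import numpy as np

    def euler_displacement(accel, rate_hz=40.0):
        """Doubly integrate XYZ acceleration into XYZ displacement
        with forward Euler steps.

        accel: (n, 3) array of acceleration samples in m/s^2,
               assumed gravity-compensated.
        Returns an (n, 3) array of displacement in meters.
        """
        dt = 1.0 / rate_hz
        velocity = np.zeros(3)
        position = np.zeros(3)
        positions = np.empty_like(accel, dtype=float)
        for i, a in enumerate(accel):
            position = position + velocity * dt  # x_{k+1} = x_k + v_k * dt
            velocity = velocity + a * dt         # v_{k+1} = v_k + a_k * dt
            positions[i] = position
        return positions

Plain double integration of accelerometer data accumulates drift quickly, which is one motivation for validating against 3D video capture and the DIP-IMU database, as listed in the objectives.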
In conclusion, the aim of this project is to make human biomechanics analysis more accessible to a wider audience through the use of Unity for data visualization. The system uses multiple wearable sensors in place of camera-based capture, which radically improves its utility for populations that cannot dedicate the space and funding that video motion capture demands.
DVXR LABORATORY (C) 2022-2024, ALL RIGHTS RESERVED