
Enhancing Mobility Assistance: A Point Cloud and IMU-Based Terrain-Adaptive Lower Extremity Exoskeleton Design

Booth Id:
ROBO074

Category:
Robotics and Intelligent Machines

Year:
2024

Finalist Names:
Zhang, Jiawen (School: Horace Mann School)

Abstract:
Patients with mobility impairments or lower limb injuries often require caregiver assistance. Lower extremity exoskeletons can ease this burden. However, current models lack terrain-adaptation capability, which limits their usefulness across environments. This study proposes an exoskeleton design that uses Inertial Measurement Units (IMUs) and an RGBD camera to classify terrain and estimate gait. The IMUs acquire lower extremity roll angles, and the RGBD camera captures point cloud data of the terrain. The roll angles are passed into a deep linear neural network trained on roll angles recorded over different terrains, and the model outputs predicted roll angles. The point cloud data from the camera are fed into a PointNet++ model that returns the terrain class. The PointNet++ model is activated when the user switches terrains to help achieve a smooth transition, since transitioning between terrains changes the roll-angle pattern. Pre-recorded angles for the new terrain are used as the first few inputs, and the system then switches back to the linear model's predictions once angles on the new terrain have been recorded. The predicted angles are sent to the servo actuators connected to the leg joints, which align the lower limbs to those angles. The results show that the lower-limb exoskeleton classifies terrains and predicts roll angles accurately. The exoskeleton can thus provide effective assistance for steady movement.
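
For illustration, the following is a minimal Python sketch of the control flow described above, assuming a small fully connected network stands in for the deep roll-angle predictor and a stub stands in for the PointNet++ terrain classifier. The names read_imu_roll_angles, read_point_cloud, classify_terrain, and send_to_servos, as well as the window size, transition length, and terrain templates, are hypothetical placeholders and not part of the actual design.

from collections import deque

import numpy as np
import torch
import torch.nn as nn

WINDOW = 10            # past roll-angle samples fed to the predictor (assumed)
TRANSITION_STEPS = 5   # steps seeded from pre-recorded angles after a switch (assumed)

# Stand-in for the roll-angle prediction network: a small fully connected model
# mapping a window of past (left, right) roll angles to the next pair of angles.
predictor = nn.Sequential(
    nn.Linear(2 * WINDOW, 64),
    nn.ReLU(),
    nn.Linear(64, 64),
    nn.ReLU(),
    nn.Linear(64, 2),
)

# Hypothetical pre-recorded roll-angle templates, one per terrain class.
TEMPLATES = {
    "flat": np.zeros((TRANSITION_STEPS, 2)),
    "stairs": np.full((TRANSITION_STEPS, 2), 15.0),
}

def read_imu_roll_angles() -> np.ndarray:
    """Placeholder: read (left, right) roll angles from the IMUs."""
    return np.zeros(2)

def read_point_cloud() -> np.ndarray:
    """Placeholder: grab a point-cloud frame from the RGBD camera."""
    return np.zeros((1024, 3))

def classify_terrain(points: np.ndarray) -> str:
    """Placeholder for the PointNet++ terrain classifier."""
    return "flat"

def send_to_servos(angles: np.ndarray) -> None:
    """Placeholder: command the servo actuators at the leg joints."""
    pass

def run_control_loop(num_steps: int) -> None:
    history = deque(maxlen=WINDOW)          # recent roll-angle measurements
    terrain, steps_on_terrain = "flat", TRANSITION_STEPS

    for _ in range(num_steps):
        history.append(read_imu_roll_angles())

        # Consult the terrain classifier to detect a terrain switch.
        new_terrain = classify_terrain(read_point_cloud())
        if new_terrain != terrain:
            terrain, steps_on_terrain = new_terrain, 0

        if steps_on_terrain < TRANSITION_STEPS or len(history) < WINDOW:
            # Just after a switch (or before enough history exists), seed the
            # gait with pre-recorded angles for the current terrain.
            target = TEMPLATES[terrain][min(steps_on_terrain, TRANSITION_STEPS - 1)]
        else:
            # Otherwise predict the next roll angles from the recent IMU history.
            x = torch.tensor(np.array(history), dtype=torch.float32).flatten().unsqueeze(0)
            with torch.no_grad():
                target = predictor(x).squeeze(0).numpy()

        send_to_servos(target)
        steps_on_terrain += 1

run_control_loop(100)

The key step is the hand-off: pre-recorded angles bridge the first few steps after a terrain switch, after which control returns to the learned predictor, mirroring the transition strategy described in the abstract.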