
MiniMesh: Real-Time 5,000-Node Anatomical Human Body Mesh Reconstruction for Portable Devices

Booth Id:
ENBM078

Category:
Biomedical Engineering

Year:
2024

Finalist Names:
Mathew, Daniel (School: Poolesville High School)

Abstract:
Human Mesh Recovery is the challenge of predicting thousands of three-dimensional points on the human body surface, supporting critical applications in medicine, augmented reality, animation, and more. MiniMesh is a portable solution to this computationally complex task that accurately reconstructs a 5,000-node human body mesh in real time while maintaining anatomical validity. This is vital for medical applications including gait assessment, robotic surgery, and rapid anthropometric measurement. Whereas previous alternatives run offline or lack medical viability, MiniMesh preserves both efficiency and accuracy by splitting the volumetric pose estimation problem into several efficient kinematic and planar pose estimation tasks. This unorthodox approach maintains accuracy comparable to state-of-the-art solutions at a fraction of the processing time. The kinematic pose estimation models compute 127 2D landmarks and solve the Perspective-n-Point (PnP) problem to calculate 3D orientation. Concurrently, the planar pose estimation algorithm extracts the human silhouette from the image in real time. Both pieces of spatial information are fed into a custom rendering engine that maps every point on a human subject's image to a standard 3D mesh. Without further optimization, MiniMesh processes 20 images per second, with some models running in under 6 milliseconds per image, all on a single CPU core. Each component of the algorithm, and the pipeline as a whole, achieved above 90% accuracy on a laptop while maintaining real-time processing speed. Running entirely on portable devices, MiniMesh is a novel, resource-efficient solution to the complex human mesh reconstruction problem, making this technology available to everyone.
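
For illustration, the two-stage decomposition described above can be sketched as follows, assuming OpenCV's solvePnP for the kinematic step and simple background subtraction for the planar step. The landmark detector, camera calibration, segmentation method, and function names used here are assumptions made for the sketch, not MiniMesh's actual implementation.

import numpy as np
import cv2

def estimate_orientation(landmarks_2d, model_points_3d, camera_matrix):
    # Kinematic step (sketch): solve the Perspective-n-Point problem to recover
    # the subject's 3D rotation and translation from detected 2D landmarks and
    # their canonical 3D positions on a reference body model.
    dist_coeffs = np.zeros((4, 1))          # assume an undistorted camera
    ok, rvec, tvec = cv2.solvePnP(model_points_3d, landmarks_2d,
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP did not converge")
    rotation, _ = cv2.Rodrigues(rvec)       # 3x3 rotation matrix
    return rotation, tvec

def extract_silhouette(frame_bgr, subtractor):
    # Planar step (sketch): isolate the human silhouette as a binary mask via
    # background subtraction (the abstract does not specify MiniMesh's method).
    mask = subtractor.apply(frame_bgr)
    _, mask = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)
    return mask

# Per frame, the recovered orientation and silhouette mask would then drive the
# rendering stage that deforms the standard 5,000-node mesh to match the subject.
subtractor = cv2.createBackgroundSubtractorMOG2()

Solving PnP on a small set of landmarks is far cheaper than regressing a full volumetric pose directly, which is consistent with the single-core, real-time throughput reported in the abstract.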