ISEF | Projects Database | Finalist Abstract

Giving Robot Hands a Feeling

Booth Id:
EBED005

Category:
Embedded Systems

Year:
2024

Finalist Names:
Bao, Edward (School: BASIS San Antonio Shavano Campus)

Abstract:
Grasping is a critical function for human-like robots that automate human labor. Currently, robotic grasping is limited by robotic hands' inability to precisely set grasping conditions before touching the object. One solution is to integrate pre-touch sensors into robot hands for versatile ranging and material detection of target objects. In this project, a new optoacoustic sensor is designed, fabricated, and tested to address this fundamental challenge. It focuses and reflects short laser pulses onto the target to generate wideband ultrasound signals, which are received by a microphone built into the sensor. The sensor's range to the target is estimated from the travel time of the ultrasound signals, and the target's material and structure type are predicted from the frequency spectra by a machine learning classifier. Experimental results show that the ranging accuracy (after calibration with curve fitting) is sufficient for estimating the distance to the target when planning a grasp. The classifier achieves high accuracy in differentiating common household materials, varying thicknesses and inner structures, and levels of fruit firmness. The optoacoustic pre-touch sensor is therefore viable for integration into robotic hands, providing critical target information for more accurate robotic grasping.
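The two estimation steps the abstract describes, time-of-flight ranging and spectrum-based feature extraction, can be sketched as follows. This is a minimal illustration, not the project's actual pipeline: the speed of sound, the sampling rate, and the use of the dominant spectral peak as a classifier feature are all assumptions for the sake of the example.

```python
import numpy as np

# Assumed speed of sound in air at ~20 °C; the real system would calibrate this.
SPEED_OF_SOUND = 343.0  # m/s

def range_from_travel_time(travel_time_s: float) -> float:
    """One-way time of flight: the ultrasound is generated at the target
    by the laser pulse (laser transit time is negligible), so the range
    is speed_of_sound * travel_time, not halved as in pulse-echo sonar."""
    return SPEED_OF_SOUND * travel_time_s

def dominant_frequency(signal: np.ndarray, fs: float) -> float:
    """Toy spectral feature: the peak of the magnitude spectrum.
    A real material classifier would use the full spectrum (or several
    band energies) as input features, as the abstract suggests."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
```

For example, a 1 ms ultrasound travel time corresponds to a range of about 0.34 m under the assumed speed of sound.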