ISEF | Projects Database | Finalist Abstract

Project Mashiro: Synthetic Aperture Radar Imaging with Inertial Sensor Fusion Using Custom FMCW Hardware and Extensible DSP

Booth Id:
EBED010T

Category:
Embedded Systems

Year:
2021

Finalist Names:
Sehgal, Pranav (School: Beaverton Academy of Science and Engineering)
Solberg, Tim (School: Beaverton Academy of Science and Engineering)

Abstract:
Sensors empower the modern world as we know it, from automobiles to weather forecasting to aviation to temperature-controlled homes. Some of these applications require precise geospatial information to function. Most traditional mapping methods use visible-spectrum imaging to build databases of stitched photographs, recreating objects and environments in 3D space. Although such methods are accessible and intuitive, techniques that exploit the broader electromagnetic spectrum can offer distinct benefits. In particular, imaging with radio-frequency signals performs well in conditions that limit conventional optical imaging and provides a dimension of previously inaccessible information. Synthetic aperture radar (SAR) imaging achieves improved spatial resolution by synthesizing a radar system with a greatly enlarged effective aperture through signal-processing algorithms such as Omega-K, while minimizing undesirable focusing phenomena such as Fraunhofer diffraction effects. Advances in VLSI and microelectromechanical systems (MEMS) make compact, cost-effective inertial measurement sensors with on-board processing available as off-the-shelf devices. Our approach improves on traditional SAR techniques by fusing inertial sensor data into the processing chain, increasing the resolution and dependability of the resolved radar image and complementing existing processing methods such as minimum entropy autofocus. We are building original hardware and software from the ground up to fully demonstrate the viability of our solution in real-world environments. This novel application of motion correction stems from our previous projects involving unmanned aerial vehicles and amateur rocketry avionics.
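To illustrate the minimum entropy autofocus idea the abstract refers to, the sketch below searches for a quadratic azimuth phase-error coefficient that minimizes the Shannon entropy of the focused image. This is a minimal, assumed formulation for demonstration only (an azimuth FFT stands in for full Omega-K focusing, and the function names and parameterization are hypothetical), not the team's implementation.

```python
import numpy as np

def image_entropy(img):
    """Shannon entropy of normalized image intensity; lower means sharper focus."""
    p = np.abs(img) ** 2
    p = p / p.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def autofocus_quadratic(signal_history, coeffs=np.linspace(-5.0, 5.0, 101)):
    """Brute-force search for a quadratic azimuth phase-error coefficient.

    signal_history: complex array, shape (azimuth samples, range bins).
    Returns the coefficient whose correction yields the lowest-entropy image.
    """
    n = signal_history.shape[0]
    t = np.linspace(-0.5, 0.5, n)[:, None]            # azimuth slow time
    best_a, best_e = None, np.inf
    for a in coeffs:
        corr = np.exp(-1j * a * 2 * np.pi * t**2)     # candidate phase correction
        img = np.fft.fft(signal_history * corr, axis=0)  # simple azimuth compression
        e = image_entropy(img)
        if e < best_e:
            best_a, best_e = a, e
    return best_a
```

Applied to a simulated point target corrupted by a known quadratic phase error, the search recovers a coefficient close to the true one; in a real pipeline the inertial measurements would supply an initial motion estimate, with entropy minimization refining the residual error.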

Awards Won:
Second Award of $2,000