Ophidia Vision: Inside the Terrarium Lab

Booth Id: ROBO002

Category: Robotics and Intelligent Machines

Year: 2023

Finalist Names: Bostic, Sydney (School: Spring Mills High School)

Abstract:
“Ophidia Vision: Inside the Terrarium Lab” is an experiment that used machine learning to identify an individual reptile (a ball python). The scientific question was: Can a facial recognition machine learning model be modified and trained to detect a reptile’s identity? Identifying animals with machine learning and cameras is less invasive than other current methods of animal identification. The purpose of the experiment was to convert an existing machine learning model for human face recognition into a model that recognizes a baby ball python. Koch, Zemel, & Salakhutdinov (2015) was the main source used to lay the foundation for implementing this type of model for a reptile. The experiment included photographing the snake, implementing a code project, learning to use several Python platforms, training the model, and analyzing the results. I added the OpenCV (computer vision) library to the project to program a laptop’s webcam as my camera. The original and modified snake images were used to create my new facial recognition model for the ball python. I used a Siamese neural network to train the model to learn the bodies of reptiles, and its one-shot approach to analyze the collection of images in sequence. Colab was used to import the image dataset, train the image classifier, and evaluate the model with a few lines of code. The hypothesis was that the model’s precision rate would be greater than 0.5 (on a scale of 0 to 1) at least 70% of the time. My data supported my hypothesis: the precision rate of the experiment was 100%. I believe this experiment was a step toward safer engineering practices for snakes, and it made me feel like both a social justice worker and an engineer.
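
To illustrate the webcam step described in the abstract, the following is a minimal OpenCV sketch of capturing frames from a laptop's built-in camera and saving them as training images. The device index, frame count, and file names are illustrative assumptions, not the finalist's actual code.

    # Minimal sketch: capture images of the snake from a laptop webcam with OpenCV.
    # Assumptions: opencv-python is installed, device index 0 is the built-in
    # webcam, and file names like snake_000.jpg are hypothetical.
    import cv2

    cap = cv2.VideoCapture(0)
    if not cap.isOpened():
        raise RuntimeError("Could not open the webcam")

    for i in range(20):                        # capture 20 example frames
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imwrite(f"snake_{i:03d}.jpg", frame)   # save each frame as a training image

    cap.release()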
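
The Siamese one-shot approach the abstract credits to Koch, Zemel, & Salakhutdinov (2015) can be sketched in Keras roughly as follows. The input shape, layer sizes, and training setup are assumptions chosen for illustration; they are not the project's actual architecture.

    # Minimal sketch of a Siamese network for one-shot identity verification,
    # in the spirit of Koch, Zemel, & Salakhutdinov (2015). All layer choices
    # here are illustrative assumptions.
    import tensorflow as tf
    from tensorflow.keras import layers, Model

    def build_embedding(input_shape=(105, 105, 1)):
        """Shared convolutional tower that maps one image to an embedding vector."""
        inp = layers.Input(shape=input_shape)
        x = layers.Conv2D(64, 10, activation="relu")(inp)
        x = layers.MaxPooling2D()(x)
        x = layers.Conv2D(128, 7, activation="relu")(x)
        x = layers.MaxPooling2D()(x)
        x = layers.Conv2D(128, 4, activation="relu")(x)
        x = layers.Flatten()(x)
        x = layers.Dense(4096, activation="sigmoid")(x)
        return Model(inp, x)

    embedding = build_embedding()
    img_a = layers.Input(shape=(105, 105, 1))
    img_b = layers.Input(shape=(105, 105, 1))

    # L1 distance between the two embeddings, then a sigmoid "same snake?" score.
    l1 = layers.Lambda(lambda t: tf.abs(t[0] - t[1]))([embedding(img_a), embedding(img_b)])
    same_prob = layers.Dense(1, activation="sigmoid")(l1)

    siamese = Model([img_a, img_b], same_prob)
    siamese.compile(optimizer="adam",
                    loss="binary_crossentropy",
                    metrics=[tf.keras.metrics.Precision()])

At inference time, a new webcam frame is paired with a single stored reference photo of the ball python; a similarity score above a chosen threshold counts as a match, which is what makes the approach one-shot. The precision metric in this sketch corresponds to the precision rate referenced in the hypothesis.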