Booth Id:
ROBO076
Category:
Robotics and Intelligent Machines
Year:
2022
Finalist Names:
Shunk, Joshua (School: Perry High School)
Abstract:
To develop complex relationships between their inputs and outputs, deep neural networks train and adjust a large number of parameters. Making these networks perform at high accuracy requires vast amounts of data, but sometimes the quantity of data needed for accurate training is not available or obtainable. Neuron-specific dropout (NSDropout) is a tool to address this problem. NSDropout examines both the training and validation passes of a layer in a model. By comparing the average values produced by each neuron for each class in a data set, the network is able to drop specific neurons. In this way, NSDropout predicts which features, or noise, the model is attending to during training that are not present in samples from validation. Unlike with dropout, the "thinned" networks cannot be "unthinned" for testing. Neuron-specific dropout was tested on three standardized image-classification datasets, trained on various architectures with varying amounts of data, and compared against traditional regularization methods. Salient factors were also analyzed, such as the proportion of neurons dropped, the effect of data set size, and the effect of the mask. Neuron-specific dropout achieved similar, if not better, testing accuracy with far less data than traditional regularization methods, including dropout. Experimentation showed that neuron-specific dropout reduces the chance of a network overfitting and reduces the need for large training samples on supervised image-recognition tasks, all while producing best-in-class results.
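The mechanism described above — comparing each neuron's per-class average activation on the training pass against the validation pass, then dropping the neurons that diverge most — can be sketched as follows. This is a minimal illustrative reconstruction, not the author's implementation: the function name `nsdropout_mask`, the use of absolute mean-difference as the divergence score, and the `drop_fraction` parameter are all assumptions made for the example.

```python
# Hypothetical sketch of the neuron-specific dropout idea: compare each
# neuron's per-class mean activation between the training and validation
# passes, then mask out the neurons whose behavior diverges most (those
# most likely fitting noise absent from validation data).
# The scoring rule (summed absolute mean differences) is an assumption.
import numpy as np

def nsdropout_mask(train_acts, train_labels, val_acts, val_labels,
                   n_classes, drop_fraction=0.1):
    """Return a 0/1 mask over a layer's neurons (1 = keep, 0 = drop)."""
    n_neurons = train_acts.shape[1]
    divergence = np.zeros(n_neurons)
    for c in range(n_classes):
        # Per-class mean activation of every neuron on each pass.
        train_mean = train_acts[train_labels == c].mean(axis=0)
        val_mean = val_acts[val_labels == c].mean(axis=0)
        divergence += np.abs(train_mean - val_mean)
    n_drop = int(drop_fraction * n_neurons)
    mask = np.ones(n_neurons)
    if n_drop > 0:
        # Drop the neurons with the largest train/validation divergence.
        mask[np.argsort(divergence)[-n_drop:]] = 0.0
    return mask

# Toy usage: a layer of 8 neurons, 2 classes, random activations.
rng = np.random.default_rng(0)
train_acts = rng.normal(size=(100, 8))
val_acts = rng.normal(size=(40, 8))
train_labels = rng.integers(0, 2, size=100)
val_labels = rng.integers(0, 2, size=40)
mask = nsdropout_mask(train_acts, train_labels, val_acts, val_labels,
                      n_classes=2, drop_fraction=0.25)
print(int(mask.sum()))  # 6 of the 8 neurons kept
```

Unlike standard dropout, which zeroes neurons uniformly at random, this mask is targeted: it is derived from observed train/validation disagreement, which is why (per the abstract) the thinned network is not simply "unthinned" at test time.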
Awards Won:
First Award of $5,000
Dudley R. Herschbach SIYSS Award
Arizona State University: Arizona State University ISEF Scholarship (valued at up to $52,000 each)
National Security Agency Research Directorate: Second Place Award "Cybersecurity"