Comparison Between Gradient Descent and Newton Method Optimization Algorithms Resulting in the Creation of a Unique Algorithm

Booth Id:
SOFT053

Category:
Systems Software

Year:
2018

Finalist Names:
Fosdick, Matthew (School: Empire High School)

Abstract:
Neural networks are widely used either to determine which group an item belongs to or to predict what value a given set of data should output, using two basic types of derivative-based optimization algorithms: first order and second order. This project seeks both to determine how these algorithms perform relative to each other through a direct comparison and to create a new optimization algorithm using information gained from studying the performance of a first order and a second order optimization algorithm. The experiment found that, for the dataset used, the first order optimization algorithm performed better than the second order optimization algorithm, indicating that simpler algorithms performed better than more complicated ones. A new optimization algorithm was then created, a "zero order" algorithm, which uses only the loss values at different synapse values to lower the loss, rather than calculating any derivatives. The algorithm works by randomly generating many sets of synapses and calculating the loss for each. It then averages all the synapse sets, weighted by their loss values, to create a better set of synapses. This new set is added to the pool of synapses and the averaging repeats, theoretically decreasing the loss with each iteration. The zero order optimization algorithm was initially unsuccessful: the loss oscillated between two values and converged only slowly. This may be due to the way the weights for the average are calculated, and further experimentation on the weighting function is needed. If this zero order algorithm is successful, almost any machine learning process could be sped up, and critical systems that rely on machine learning, such as medical networks or space station stabilization processes, could be trained faster and more reliably.
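
To illustrate the distinction between the two algorithm classes compared above, the following is a minimal sketch contrasting a first order update (gradient descent) with a second order update (Newton's method) on a simple quadratic loss. The loss function, learning rate, and variable names are illustrative assumptions, not the project's actual network or dataset.

```python
# Hypothetical comparison of one first-order and one second-order update step
# on a simple positive-definite quadratic loss (not the project's dataset).
import numpy as np

A = np.array([[3.0, 0.5], [0.5, 1.0]])   # quadratic form (positive definite)
b = np.array([1.0, -2.0])

def loss(w):
    return 0.5 * w @ A @ w - b @ w

def gradient(w):
    return A @ w - b

def hessian(w):
    return A                              # constant for a quadratic loss

w_gd = np.zeros(2)
w_newton = np.zeros(2)
lr = 0.1                                  # assumed learning rate

for step in range(50):
    # first order: step against the gradient, scaled by a learning rate
    w_gd = w_gd - lr * gradient(w_gd)
    # second order: step against the gradient, scaled by the inverse Hessian
    w_newton = w_newton - np.linalg.solve(hessian(w_newton), gradient(w_newton))

print("gradient descent loss:", loss(w_gd))
print("Newton's method loss: ", loss(w_newton))
```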
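
The following is a minimal sketch of the zero order idea described above: sample random synapse (weight) vectors, score them by loss alone, and add a loss-weighted average back into the pool each iteration. The inverse-loss weighting and the stand-in loss function are assumptions; the abstract leaves the exact weighting function open.

```python
# Hypothetical sketch of the described "zero order" algorithm: no derivatives,
# only loss evaluations and a loss-weighted average of candidate synapses.
import numpy as np

rng = np.random.default_rng(0)

def loss(w):
    # stand-in loss; a real use would evaluate the network on training data
    return np.sum((w - np.array([0.5, -1.5, 2.0])) ** 2)

pool = [rng.normal(size=3) for _ in range(20)]    # random initial synapse sets

for iteration in range(30):
    losses = np.array([loss(w) for w in pool])
    weights = 1.0 / (losses + 1e-8)               # lower loss -> larger weight (assumed scheme)
    weights /= weights.sum()
    averaged = sum(wt * w for wt, w in zip(weights, pool))
    pool.append(averaged)                         # grow the pool and repeat

best = min(pool, key=loss)
print("best loss found:", loss(best))
```

Under this assumed weighting, a candidate that overshoots the minimum can pull the average back and forth between iterations, which is one plausible reading of the oscillation reported above.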