Booth Id:
ROBO007
Category:
Robotics and Intelligent Machines
Year:
2018
Finalist Names:
Ivanov, Ivan (School: American College of Sofia)
Abstract:
Over long input sequences, deep recurrent neural networks suffer from the so-called long-term dependency problem caused by their recurrent connection. Although Long Short-Term Memory (LSTM) networks partially mitigate this problem, their long-term memory capacity remains limited, which restricts their applications. We use a recent recurrent neural network model that constrains its recurrent connection to a unitary matrix, handling long-term dependencies without sacrificing memory capacity. The model is capable of strong results, but an inefficient implementation prevents it from reaching its expected performance. We optimize the implementation and architecture of the model, achieving runtime up to 5 times faster than the original implementation. Additionally, we apply our improved model to three common real-life tasks: automatic text understanding, speech recognition, and cryptanalysis, and outperform the widely used LSTM model.
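The key property of a unitary recurrent connection described in the abstract can be illustrated with a minimal sketch (an assumption for illustration, not the finalist's actual code). Because a unitary matrix U satisfies ||U h|| = ||h||, repeatedly applying it to the hidden state neither explodes nor shrinks the state, which is what lets such models carry information across many time steps. The sketch below uses a real orthogonal matrix (the real special case of unitary), obtained via QR decomposition of a random matrix; published unitary RNNs additionally add input terms and a norm-aware nonlinearity, which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8  # hidden size (illustrative)

# QR decomposition of a random Gaussian matrix yields an orthogonal factor U,
# i.e. U.T @ U = I, so U preserves vector norms.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))

h = rng.standard_normal(n)
norm0 = np.linalg.norm(h)

# Apply the recurrent transition for many steps: the hidden-state norm
# stays constant, unlike a generic recurrent matrix whose repeated
# application makes the state vanish or explode.
for _ in range(1000):
    h = U @ h

print(np.isclose(np.linalg.norm(h), norm0))  # True: norm preserved
```

This norm preservation is the property that, per the abstract, lets the unitary model handle long-term dependencies without degrading its memory abilities.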