Master's Thesis Presentation
Department of Statistics
The University of Chicago
“Comparison of Optimization Algorithms and Potential Improvement of ADAM Based on Time-Series”
Wednesday, October 4, 2023, at 10:30 AM
Jones 303, 5747 S. Ellis Avenue
In this study, we provide a comprehensive comparison of optimization methods across multiple model architectures and introduce a novel Adam-derived technique that aims to improve on the conventional Adam optimizer. First, we conduct an extensive evaluation of prominent optimizers, including SGD, ASGD, and Adam, assessing their convergence and stability, particularly when combined with learning rate decay strategies. This comparison spans two primary models: a Convolutional Neural Network (CNN) and a Deep Q-Network (DQN), a model pivotal for game-based machine learning.
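The abstract does not give implementation details, but the kind of comparison it describes can be illustrated with a minimal NumPy sketch: standard SGD and Adam (with a 1/sqrt(t) learning-rate decay, one common decay strategy) minimizing a toy ill-conditioned quadratic. All function names and hyperparameters here are illustrative choices, not taken from the thesis.

```python
import numpy as np

# Toy ill-conditioned quadratic: f(w) = 0.5 * w^T A w, minimum at w = 0.
A = np.diag([1.0, 50.0])

def loss(w):
    return 0.5 * w @ A @ w

def grad(w):
    return A @ w

def sgd(w0, lr=0.01, steps=500):
    """Plain gradient descent (SGD without minibatch noise)."""
    w = w0.copy()
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def adam(w0, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=500):
    """Adam with bias correction and a 1/sqrt(t) learning-rate decay."""
    w = w0.copy()
    m = np.zeros_like(w)  # first-moment (mean) estimate
    v = np.zeros_like(w)  # second-moment (uncentered variance) estimate
    for t in range(1, steps + 1):
        g = grad(w)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)   # bias-corrected moments
        v_hat = v / (1 - beta2 ** t)
        lr_t = lr / np.sqrt(t)         # decayed step size
        w -= lr_t * m_hat / (np.sqrt(v_hat) + eps)
    return w

if __name__ == "__main__":
    w0 = np.array([1.0, 1.0])
    print("initial loss:", loss(w0))
    print("SGD final loss:", loss(sgd(w0)))
    print("Adam final loss:", loss(adam(w0)))
```

On this deterministic problem both methods drive the loss down by orders of magnitude; the contrast in per-coordinate step sizes (Adam adapts to the curvature imbalance between the two coordinates) is the sort of convergence behavior the study compares.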
Building on the Adam method, we present "TAdam", a variant that incorporates a time-series function to achieve exponential decay. This modification to the time-series function makes TAdam more than an incremental iteration of Adam. Preliminary results indicate that TAdam consistently outperforms the standard Adam optimizer across a variety of scenarios, suggesting its potential as a robust optimization choice for future machine learning applications.