On stochastic optimization and the Adam optimizer: Divergence, convergence rates, and acceleration techniques
He was awarded the Frontiers of Science Award in Mathematics (jointly with Jiequn Han and Weinan E) by the International Congress of Basic Science (ICBS) (2024). Details on the activities of his research group can be found at the webpage http://www.ajentzen [...]

References:
[1] S. Dereich & A. Jentzen, Convergence rates for the Adam optimizer, arXiv:2407.21078 (2024), 43 pages.
[2] S. Dereich, R. Graeber, & A. Jentzen, Non-convergence of Adam and other adaptive stochastic gradient descent optimization methods for non-vanishing learning rates, arXiv:2407.08100 (2024), 54 pages.
[3] T. Do, A. Jentzen, & A. Riekert, Non-convergence to the optimal risk for Adam and …
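
For orientation, the optimizer named in the title and studied in [1]-[3] is the Adam scheme of Kingma & Ba (2015). A standard formulation of its recursion is sketched below; the notation and hyperparameter symbols are chosen here for illustration and are not taken from the cited works.

\begin{align*}
g_n &= \nabla_\theta \, \ell(\theta_{n-1}, \xi_n), \\
m_n &= \beta_1 \, m_{n-1} + (1-\beta_1) \, g_n, \qquad m_0 = 0, \\
v_n &= \beta_2 \, v_{n-1} + (1-\beta_2) \, g_n \odot g_n, \qquad v_0 = 0, \\
\theta_n &= \theta_{n-1} - \gamma_n \, \frac{m_n / (1-\beta_1^{\,n})}{\sqrt{v_n / (1-\beta_2^{\,n})} + \varepsilon},
\end{align*}

where the quotient, the square root, and the addition of \varepsilon > 0 in the last line are understood componentwise, \gamma_n > 0 denote the learning rates, \beta_1, \beta_2 \in [0,1) the momentum parameters, and \xi_n the random data samples. In this notation, the non-convergence results announced in [2] concern, in particular, the regime in which the learning rates \gamma_n do not tend to zero.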