The Bregman Lagrangian framework provides a systematic understanding of the matching convergence rates of higher-order gradient methods in discrete and continuous time. The Bregman Lagrangian, defined in terms of a Bregman divergence, generates a family of continuous-time dynamical systems through its Euler-Lagrange equations; these dynamics can be linked to Nesterov's accelerated gradient method and related accelerated methods for gradient-based optimization. The associated Bregman Hamiltonian allows practical numerical discretizations of the dynamics to be constructed. The approach has been generalized to optimization on Riemannian manifolds.
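As a sketch in the notation of Wibisono, Wilson and Jordan (2016), the Bregman Lagrangian can be written as

\[ \mathcal{L}(x, \dot{x}, t) = e^{\alpha_t + \gamma_t} \left( D_h\!\left(x + e^{-\alpha_t} \dot{x},\, x\right) - e^{\beta_t} f(x) \right), \]

where \( f \) is the objective function, \( D_h(y, x) = h(y) - h(x) - \langle \nabla h(x), y - x \rangle \) is the Bregman divergence of a convex distance-generating function \( h \), and \( \alpha_t, \beta_t, \gamma_t \) are smooth time-dependent scaling functions. Under the "ideal scaling" conditions \( \dot{\beta}_t \le e^{\alpha_t} \) and \( \dot{\gamma}_t = e^{\alpha_t} \), the Euler-Lagrange equation takes the form

\[ \ddot{x}_t + \left(e^{\alpha_t} - \dot{\alpha}_t\right) \dot{x}_t + e^{2\alpha_t + \beta_t} \left[ \nabla^2 h\!\left(x_t + e^{-\alpha_t} \dot{x}_t\right) \right]^{-1} \nabla f(x_t) = 0, \]

whose solutions satisfy \( f(x_t) - f(x^\star) \le O\!\left(e^{-\beta_t}\right) \), giving the continuous-time analogues of accelerated convergence rates.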
References
- Wibisono, Andre; Wilson, Ashia C.; Jordan, Michael I. (March 14, 2016). "A variational perspective on accelerated methods in optimization". Proceedings of the National Academy of Sciences. 113 (47): E7351–E7358. arXiv:1603.04245v1. Bibcode:2016PNAS..113E7351W. doi:10.1073/pnas.1614734113. PMC 5127379. PMID 27834219.
- Zhang, Peiyuan; Orvieto, Antonio; Daneshmand, Hadi (2021). "Rethinking the Variational Interpretation of Accelerated Optimization Methods". Advances in Neural Information Processing Systems. 34. Curran Associates, Inc.: 14396–14406.
- Bravetti, Alessandro; Daza-Torres, Maria L.; Flores-Arguedas, Hugo; Betancourt, Michael (June 2023). "Bregman dynamics, contact transformations and convex optimization". Information Geometry. 6 (1): 355–377. arXiv:1912.02928. doi:10.1007/s41884-023-00105-0.
- Duruisseaux, Valentin; Leok, Melvin (June 2022). "A Variational Formulation of Accelerated Optimization on Riemannian Manifolds". SIAM Journal on Mathematics of Data Science. 4 (2): 649–674. arXiv:2101.06552. doi:10.1137/21M1395648.