Iutzeler, Franck, Pauwels, Edouard (ORCID: https://orcid.org/0000-0002-8180-075X) and Vaiter, Samuel (2024)
Derivatives of Stochastic Gradient Descent in parametric optimization.
In: Advances in Neural Information Processing Systems 37.
Globerson, A., Mackey, L., Belgrave, D., Fan, A., Paquet, U., Tomczak, J. and Zhang, C. (eds.)
Neural Information Processing Systems Foundation, Inc. (NeurIPS).
Series "Advances in Neural Information Processing Systems", volume 37.
Vancouver, pp. 118859-118882.
ISBN 9798331314385
Abstract
We consider stochastic optimization problems where the objective depends on some parameter, as commonly found in hyperparameter optimization for instance. We investigate the behavior of the derivatives of the iterates of Stochastic Gradient Descent (SGD) with respect to that parameter and show that they are driven by an inexact SGD recursion on a different objective function, perturbed by the convergence of the original SGD. This enables us to establish that the derivatives of SGD converge to the derivative of the solution mapping in terms of mean squared error whenever the objective is strongly convex. Specifically, we demonstrate that with constant step-sizes, these derivatives stabilize within a noise ball centered at the solution derivative, and that with vanishing step-sizes they exhibit O(log(k)²/k) convergence rates. Additionally, we prove exponential convergence in the interpolation regime. Our theoretical findings are illustrated by numerical experiments on synthetic tasks.
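The recursion described in the abstract can be made concrete on a toy problem. Below is a minimal sketch (not the authors' code) that differentiates SGD iterates with respect to a scalar parameter on a synthetic least-squares objective, for which the derivative of the solution mapping is known in closed form; the objective, step size, and all variable names are illustrative assumptions, not taken from the paper.

```python
# Sketch: derivatives of SGD iterates w.r.t. a parameter theta on
#   f(x, theta) = (1/n) * sum_i 0.5 * (a_i . x - theta * c_i)^2,
# whose solution mapping is x*(theta) = theta * (A^T A)^{-1} A^T c,
# so dx*/dtheta = (A^T A)^{-1} A^T c.
import numpy as np

rng = np.random.default_rng(0)
n, dim = 200, 5
A = rng.standard_normal((n, dim))
c = rng.standard_normal(n)
theta = 1.5                                    # parameter value (illustrative)
sol_deriv = np.linalg.solve(A.T @ A, A.T @ c)  # derivative of the solution mapping

gamma = 0.01          # constant step size (assumed, for illustration)
x = np.zeros(dim)     # SGD iterate x_k
d = np.zeros(dim)     # its derivative d_k = dx_k/dtheta
for k in range(20000):
    i = rng.integers(n)
    a_i, c_i = A[i], c[i]
    grad = a_i * (a_i @ x - theta * c_i)       # stochastic gradient in x
    # Differentiating the update x_{k+1} = x_k - gamma * grad w.r.t. theta
    # gives the companion recursion on the derivative:
    #   d_{k+1} = d_k - gamma * (a_i a_i^T d_k - a_i c_i)
    d = d - gamma * (a_i * (a_i @ d) - a_i * c_i)
    x = x - gamma * grad

# With a constant step size, d_k should settle in a noise ball
# around the solution derivative, as the abstract describes.
print("||d_k - dx*/dtheta|| =", np.linalg.norm(d - sol_deriv))
```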
| Item Type: | Book Section |
|---|---|
| Language: | English |
| Date: | 2024 |
| Place of Publication: | Vancouver |
| Subjects: | B - Economics and Finance |
| Divisions: | TSE-R (Toulouse) |
| Site: | UT1 |
| Date Deposited: | 22 Jan 2026 10:45 |
| Last Modified: | 02 Feb 2026 14:36 |
| OAI Identifier: | oai:tse-fr.eu:131269 |
| URI: | https://publications.ut-capitole.fr/id/eprint/51802 |
