Bolte, Jérôme (ORCID: https://orcid.org/0000-0002-1676-8407), Pauwels, Edouard (ORCID: https://orcid.org/0000-0002-8180-075X) and Vaiter, Samuel (2023) One-step differentiation of iterative algorithms. Advances in Neural Information Processing Systems 36. pp. 77089-77103.

Full text: NeurIPS-2023-one-step-differentiation-of-iterative-algorithms-Paper-Conference.pdf (810kB)

Abstract

In appropriate frameworks, automatic differentiation is transparent to the user, at the cost of a significant computational burden when the number of operations is large. For iterative algorithms, implicit differentiation alleviates this issue but requires a custom implementation of Jacobian evaluation. In this paper, we study one-step differentiation, also known as Jacobian-free backpropagation, a method as easy as automatic differentiation and as performant as implicit differentiation for fast algorithms (e.g., superlinear optimization methods). We provide a complete theoretical approximation analysis with specific examples (Newton's method, gradient descent) along with its consequences in bilevel optimization. Several numerical examples illustrate the soundness of the one-step estimator.
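As a rough illustration of the method the abstract describes, the sketch below implements one-step differentiation in JAX on a toy bilevel problem: the inner solver runs as plain forward iterations, the converged iterate is detached from the autodiff graph, and only the final application of the fixed-point map is differentiated. The inner objective, step size, and all function names are hypothetical choices for illustration, not the paper's code.

```python
import jax
import jax.numpy as jnp

# Illustrative inner problem (not from the paper): gradient descent on
# g(x, theta) = 0.5 * ||x - theta||^2, whose fixed point is x*(theta) = theta.
def F(x, theta, step=0.5):
    return x - step * (x - theta)  # one inner gradient-descent step

def one_step_solve(theta, x0, num_iters=50):
    """Run the solver, then backpropagate through the last iteration only."""
    x = x0
    for _ in range(num_iters):
        x = F(x, theta)            # plain forward iterations
    x = jax.lax.stop_gradient(x)   # discard the unrolled computation graph
    return F(x, theta)             # derivatives flow through this step alone

def outer_loss(theta):
    x_star = one_step_solve(theta, jnp.zeros_like(theta))
    return 0.5 * jnp.sum(x_star ** 2)

theta = jnp.array([0.3, -0.7])
hypergrad = jax.grad(outer_loss)(theta)  # one-step (Jacobian-free) hypergradient
print(hypergrad)
```

In this gradient-descent example the estimator carries a bias governed by the contraction factor of the inner map (here it returns step times the true hypergradient), which is consistent with the abstract's point that one-step differentiation matches implicit differentiation only for fast, e.g. superlinearly convergent, solvers such as Newton's method.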

Item Type: Article
Language: English
Date: 2023
Refereed: Yes
Subjects: B- ECONOMICS AND FINANCE
Divisions: TSE-R (Toulouse)
Site: UT1
Date Deposited: 02 Feb 2026 14:40
Last Modified: 02 Feb 2026 14:40
OAI Identifier: oai:tse-fr.eu:131304
URI: https://publications.ut-capitole.fr/id/eprint/51899
