Number of items: 13.

Bolte, Jérôme, Combettes, Cyrille and Pauwels, Edouard (2024) The iterates of the Frank–Wolfe algorithm may not converge. Mathematics of Operations Research, vol. 49 (n° 4). pp. 2049-2802.

Bolte, Jérôme, Pauwels, Edouard and Silveti-Falls, Antonio (2024) Differentiating nonsmooth solutions to parametric monotone inclusion problems. SIAM Journal on Optimization, vol. 34 (n° 1).

Bolte, Jérôme, Le, Tam, Pauwels, Edouard and Silveti-Falls, Antonio (2022) Nonsmooth implicit differentiation for machine learning and optimization. In: NIPS'21: 35th International Conference on Neural Information Processing Systems, 6-14 December 2021, online.

Bolte, Jérôme, Le, Tam and Pauwels, Edouard (2022) Subgradient sampling for nonsmooth nonconvex minimization. TSE Working Paper, n. 22-1310, Toulouse.

Bolte, Jérôme, Glaudin, Lilian, Pauwels, Edouard and Serrurier, Matthieu (2021) A Hölderian backtracking method for min-max and min-min problems. TSE Working Paper, n. 21-1243, Toulouse.

Bolte, Jérôme, Chen, Zheng and Pauwels, Edouard (2020) The multiproximal linearization method for convex composite problems. Mathematical Programming, vol. 182. pp. 1-36.

Bolte, Jérôme, Pauwels, Edouard and Rios-Zertuche, Rodolfo (2020) Long term dynamics of the subgradient method for Lipschitz path differentiable functions. TSE Working Paper, n. 20-1110, Toulouse.

Bolte, Jérôme and Pauwels, Edouard (2020) Curiosities and counterexamples in smooth convex optimization. TSE Working Paper, n. 20-1080, Toulouse.

Bolte, Jérôme and Pauwels, Edouard (2020) A mathematical model for automatic differentiation in machine learning. In: Advances in Neural Information Processing Systems 33 (NeurIPS 2020), Larochelle, Hugo, Ranzato, M., Hadsell, R., Balcan, M.F. and Lin, H. (eds.), MIT Press. ISBN 9781713829546 (In Press)

Bolte, Jérôme and Pauwels, Edouard (2019) Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning. Mathematical Programming. pp. 1-33. (In Press)

Bolte, Jérôme, Castera, Camille, Pauwels, Edouard and Févotte, Cédric (2019) An Inertial Newton Algorithm for Deep Learning. TSE Working Paper, n. 19-1043, Toulouse.

Bolte, Jérôme, Hochart, Antoine and Pauwels, Edouard (2018) Qualification conditions in semi-algebraic programming. SIAM Journal on Optimization, vol. 28 (n° 2). pp. 1867-1891.

Bolte, Jérôme and Pauwels, Edouard (2016) Majorization-minimization procedures and convergence of SQP methods for semi-algebraic and tame programs. Mathematics of Operations Research, vol. 41 (n° 2). pp. 442-465.

This list was generated on Thu May 8 15:29:47 2025 CEST.