RT Journal Article
SR 00
ID 10.1214/24-EJP1235
A1 Crespo, Marelys
A1 Gadat, Sébastien
A1 Gendre, Xavier
T1 Stochastic gradient Langevin dynamics for (weakly) log-concave posterior distributions
JF Electronic Journal of Probability
YR 2024
FD 2024
VO Vol. 29
SP 1
OP 40
K1 Log-concave models
K1 Stochastic gradient Langevin dynamics
K1 Weak convexity
AB In this paper, we investigate a continuous-time version of the Stochastic Langevin Monte Carlo method, introduced in [39], that incorporates a stochastic sampling step inside the traditional overdamped Langevin diffusion. This method is popular in machine learning for sampling posterior distributions. We pay specific attention to the computational cost in terms of n (the number of observations that produce the posterior distribution) and d (the dimension of the ambient space where the parameter of interest lives). We derive our analysis in the weakly convex framework, which is parameterized with the help of the Kurdyka-Łojasiewicz (KL) inequality; this permits handling vanishing curvature settings, which is far less restrictive than the simple strongly convex case. We establish that the final horizon of simulation to obtain an ε-approximation (in terms of entropy) is of the order (d log(n)²)^{(1+r)²} [log²(ε⁻¹) + n² d^{2(1+r)} log^{4(1+r)}(n)], with a Poissonian subsampling of parameter (n(d log²(n))^{1+r})⁻¹, where the parameter r is involved in the KL inequality and varies between 0 (strongly convex case) and 1 (limiting Laplace situation).
PB Electronic Journal of Probability and Electronic Communications in Probability
SN 1083-6489
LK https://publications.ut-capitole.fr/id/eprint/50330/
UL http://tse-fr.eu/pub/130251