RT Journal Article
SR 00
A1 Gadat, Sébastien
A1 Lalanne, Clément
T1 Privately learning smooth distributions on the hypercube by projections
JF Proceedings of Machine Learning Research
YR 2024
FD 2024
VO Vol. 235
SP 25936
OP 25975
AB Fueled by the ever-increasing need for statistics that guarantee the privacy of their training sets, this article studies the centrally private estimation of Sobolev-smooth probability densities over the hypercube in dimension d. The contributions of this article are twofold. Firstly, it generalizes the one-dimensional results of (Lalanne et al., 2023b) to non-integer levels of smoothness and to a high-dimensional setting, which is important for two reasons: it is better suited to modern learning tasks, and it allows understanding the relations between privacy, dimensionality, and smoothness, a central question in differential privacy. Secondly, this article presents a data-driven private estimation strategy (usually referred to as adaptive in Statistics) that privately chooses an estimator achieving a good bias-variance trade-off among a finite family of private projection estimators, without prior knowledge of the ground-truth smoothness β. This is achieved by adapting the Lepskii method for private selection and adding a new penalization term that makes the estimation privacy-aware.
PB JMLR
SN 2640-3498
LK https://publications.ut-capitole.fr/id/eprint/50353/
UL http://tse-fr.eu/pub/130279
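
Illustrative note: the abstract refers to private projection estimators, i.e. empirical basis coefficients perturbed to satisfy central differential privacy. The Python sketch below is only a minimal, hypothetical illustration of that idea, assuming a tensorized cosine basis on [0, 1]^d, a crude L1 sensitivity bound, and Laplace noise; it is not the paper's implementation, and it omits the Lepskii-type private selection of the truncation level K described in the abstract.

# Minimal sketch (assumptions above): centrally epsilon-DP projection density
# estimator on [0, 1]^d via noisy coefficients in a tensorized cosine basis.
import itertools
import numpy as np

def cosine_basis(x, k):
    """Evaluate the tensor-product cosine basis function indexed by k at points x.

    x: array of shape (n, d), points in [0, 1]^d.
    k: tuple of d non-negative integer frequencies.
    """
    vals = np.ones(x.shape[0])
    for j, kj in enumerate(k):
        if kj == 0:
            continue
        vals *= np.sqrt(2.0) * np.cos(np.pi * kj * x[:, j])
    return vals

def private_projection_density(data, K, epsilon, rng=None):
    """Return an epsilon-DP (central model) density estimator for samples in [0, 1]^d.

    data: (n, d) samples in [0, 1]^d.
    K: per-dimension truncation level (the bias-variance knob that adaptive
       selection would choose privately in the paper's setting).
    epsilon: privacy budget for the Laplace mechanism.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = data.shape
    freqs = list(itertools.product(range(K + 1), repeat=d))

    # Empirical projection coefficients theta_k = (1/n) sum_i phi_k(X_i).
    theta = np.array([cosine_basis(data, k).mean() for k in freqs])

    # Replacing one sample moves each coefficient by at most 2*||phi_k||_inf / n;
    # summing over all (K+1)^d coefficients gives a loose L1 sensitivity bound.
    sup_norms = np.array([np.sqrt(2.0) ** sum(kj > 0 for kj in k) for k in freqs])
    sensitivity = 2.0 * sup_norms.sum() / n

    # Laplace mechanism on the coefficient vector.
    noisy_theta = theta + rng.laplace(scale=sensitivity / epsilon, size=theta.shape)

    def estimator(x):
        """Evaluate the noisy truncated series at points x of shape (m, d)."""
        return sum(c * cosine_basis(x, k) for c, k in zip(noisy_theta, freqs))

    return estimator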