Gadat, Sébastien and Lalanne, Clément (2024)
Privately learning smooth distributions on the hypercube by projections.
Proceedings of Machine Learning Research, Vol. 235, pp. 25936–25975.
Abstract
Fueled by the ever-increasing need for statistics that guarantee the privacy of their training sets, this article studies the centrally-private estimation of Sobolev-smooth probability densities over the hypercube in dimension d. The contributions of this article are twofold. Firstly, it generalizes the one-dimensional results of (Lalanne et al., 2023b) to non-integer levels of smoothness and to a high-dimensional setting, which is important for two reasons: it is better suited to modern learning tasks, and it allows understanding the relations between privacy, dimensionality and smoothness, which is a central question in differential privacy. Secondly, this article presents a private estimation strategy that is data-driven (usually referred to as adaptive in statistics) in order to privately choose an estimator that achieves a good bias-variance trade-off among a finite family of private projection estimators, without prior knowledge of the ground-truth smoothness β. This is achieved by adapting the Lepskii method to private selection and adding a new penalization term that makes the estimation privacy-aware.
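The abstract refers to centrally-private projection estimators whose coefficients are perturbed to satisfy differential privacy. As a rough illustration only, the minimal one-dimensional sketch below builds such an estimator on [0, 1] using a cosine basis and the Laplace mechanism; the function `private_projection_density`, its parameters, and the basis choice are assumptions made for this sketch and do not reproduce the paper's d-dimensional, adaptive construction.

```python
import numpy as np


def private_projection_density(x, K, epsilon, rng=None):
    """Sketch of an epsilon-DP projection density estimator on [0, 1].

    Projects the empirical measure onto the first K cosine basis functions
    and perturbs the estimated coefficients with Laplace noise calibrated to
    their L1 sensitivity.  Illustrative only -- not the paper's construction.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    ks = np.arange(1, K + 1)
    # Empirical coefficients <f, phi_k> with phi_k(t) = sqrt(2) * cos(pi*k*t).
    coeffs = np.sqrt(2) * np.cos(np.pi * np.outer(ks, x)).mean(axis=1)
    # Replacing one sample moves each coefficient by at most 2*sqrt(2)/n, so
    # the L1 sensitivity of the K-dimensional coefficient vector is at most
    # 2*sqrt(2)*K/n; Laplace noise at scale sensitivity/epsilon gives eps-DP.
    scale = 2.0 * np.sqrt(2) * K / (n * epsilon)
    noisy = coeffs + rng.laplace(scale=scale, size=K)

    def density(t):
        t = np.atleast_1d(t)
        # The constant basis function phi_0 = 1 always has coefficient 1
        # for a density on [0, 1], so it needs no noise.
        return 1.0 + np.sqrt(2) * np.cos(np.pi * np.outer(t, ks)) @ noisy

    return density


if __name__ == "__main__":
    # Usage: estimate a Beta(2, 5) density from n = 2000 samples.
    rng = np.random.default_rng(0)
    samples = rng.beta(2, 5, size=2000)
    f_hat = private_projection_density(samples, K=8, epsilon=1.0, rng=rng)
    print(f_hat(np.linspace(0, 1, 5)))
```

In this sketch the truncation level K plays the role of the bias-variance knob that the paper's privacy-aware Lepskii procedure selects adaptively.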
| Item Type: | Article |
|---|---|
| Language: | English |
| Date: | 2024 |
| Refereed: | Yes |
| Place of Publication: | Cambridge |
| Subjects: | B - Economics and Finance |
| Divisions: | TSE-R (Toulouse) |
| Site: | UT1 |
| Date Deposited: | 03 Feb 2025 08:46 |
| Last Modified: | 22 Jan 2026 08:02 |
| OAI Identifier: | oai:tse-fr.eu:130279 |
| URI: | https://publications.ut-capitole.fr/id/eprint/50353 |