Show simple item record

dc.contributor.author: Anderson, Prashanti
dc.contributor.author: Bakshi, Ainesh
dc.contributor.author: Majid, Mahbod
dc.contributor.author: Tiegel, Stefan
dc.date.accessioned: 2025-12-22T21:40:41Z
dc.date.available: 2025-12-22T21:40:41Z
dc.date.issued: 2025-06-15
dc.identifier.isbn: 979-8-4007-1510-5
dc.identifier.uri: https://hdl.handle.net/1721.1/164433
dc.description: STOC ’25, Prague, Czechia [en_US]
dc.description.abstract: We consider the task of privately obtaining prediction error guarantees in ordinary least-squares regression problems with Gaussian covariates (with unknown covariance structure). We provide the first sample-optimal polynomial time algorithm for this task under both pure and approximate differential privacy. We show that any improvement to the sample complexity of our algorithm would violate either statistical-query or information-theoretic lower bounds. Additionally, our algorithm is robust to a small fraction of arbitrary outliers and achieves optimal error rates as a function of the fraction of outliers. In contrast, all prior efficient algorithms either incurred sample complexities with sub-optimal dimension dependence, scaling with the condition number of the covariates, or obtained a polynomially worse dependence on the privacy parameters. Our technical contributions are two-fold: first, we leverage resilience guarantees of Gaussians within the sum-of-squares framework. As a consequence, we obtain efficient sum-of-squares algorithms for regression with optimal robustness rates and sample complexity. Second, we generalize the recent robustness-to-privacy framework of Hopkins, Kamath, Majid, and Narayanan to account for the geometry induced by the covariance of the input samples. This framework crucially relies on the robust estimators to be sum-of-squares algorithms, and combining the two steps yields a sample-optimal private regression algorithm. We believe our techniques are of independent interest, and we demonstrate this by obtaining an efficient algorithm for covariance-aware mean estimation, with an optimal dependence on the privacy parameters. [en_US]
dc.publisher: ACM | Proceedings of the 57th Annual ACM Symposium on Theory of Computing [en_US]
dc.relation.isversionof: https://doi.org/10.1145/3717823.3718218 [en_US]
dc.rights: Creative Commons Attribution [en_US]
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/ [en_US]
dc.source: Association for Computing Machinery [en_US]
dc.title: Sample-Optimal Private Regression in Polynomial Time [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Prashanti Anderson, Ainesh Bakshi, Mahbod Majid, and Stefan Tiegel. 2025. Sample-Optimal Private Regression in Polynomial Time. In Proceedings of the 57th Annual ACM Symposium on Theory of Computing (STOC '25). Association for Computing Machinery, New York, NY, USA, 2341–2349. [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Mathematics [en_US]
dc.identifier.mitlicense: PUBLISHER_POLICY
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/ConferencePaper [en_US]
eprint.status: http://purl.org/eprint/status/NonPeerReviewed [en_US]
dc.date.updated: 2025-08-01T08:41:31Z
dc.language.rfc3066: en
dc.rights.holder: The author(s)
dspace.date.submission: 2025-08-01T08:41:32Z
mit.license: PUBLISHER_CC
mit.metadata.status: Authority Work and Publication Information Needed [en_US]

