Show simple item record

dc.contributor.author    Chia, Nai-Hui
dc.contributor.author    Gilyen, Andras Pal
dc.contributor.author    Li, Tongyang
dc.contributor.author    Lin, Han-Hsuan
dc.contributor.author    Tang, Ewin
dc.contributor.author    Wang, Chunhao
dc.date.accessioned    2025-09-02T18:57:53Z
dc.date.available    2025-09-02T18:57:53Z
dc.date.issued    2022-10-27
dc.identifier.issn    0004-5411
dc.identifier.uri    https://hdl.handle.net/1721.1/162596
dc.description.abstract    We present an algorithmic framework for quantum-inspired classical algorithms on close-to-low-rank matrices, generalizing the series of results started by Tang's breakthrough quantum-inspired algorithm for recommendation systems [STOC'19]. Motivated by quantum linear algebra algorithms and the quantum singular value transformation (SVT) framework of Gilyén et al. [STOC'19], we develop classical algorithms for SVT that run in time independent of input dimension, under suitable quantum-inspired sampling assumptions. Our results give compelling evidence that in the corresponding QRAM data structure input model, quantum SVT does not yield exponential quantum speedups. Since the quantum SVT framework generalizes essentially all known techniques for quantum linear algebra, our results, combined with sampling lemmas from previous work, suffice to generalize all prior results about dequantizing quantum machine learning algorithms. In particular, our classical SVT framework recovers and often improves the dequantization results on recommendation systems, principal component analysis, supervised clustering, support vector machines, low-rank regression, and semidefinite program solving. We also give additional dequantization results on low-rank Hamiltonian simulation and discriminant analysis. Our improvements come from identifying the key feature of the quantum-inspired input model that is at the core of all prior quantum-inspired results: ℓ2-norm sampling can approximate matrix products in time independent of their dimension. We reduce all our main results to this fact, making our exposition concise, self-contained, and intuitive.    en_US
dc.publisher    ACM    en_US
dc.relation.isversionof    http://dx.doi.org/10.1145/3549524    en_US
dc.rights    Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.    en_US
dc.source    Association for Computing Machinery    en_US
dc.title    Sampling-based sublinear low-rank matrix arithmetic framework for dequantizing quantum machine learning    en_US
dc.type    Article    en_US
dc.identifier.citation    Nai-Hui Chia, András Pal Gilyén, Tongyang Li, Han-Hsuan Lin, Ewin Tang, and Chunhao Wang. 2022. Sampling-based Sublinear Low-rank Matrix Arithmetic Framework for Dequantizing Quantum Machine Learning. J. ACM 69, 5, Article 33 (October 2022), 72 pages.    en_US
dc.contributor.department    Massachusetts Institute of Technology. Center for Theoretical Physics    en_US
dc.relation.journal    Journal of the ACM    en_US
dc.identifier.mitlicense    PUBLISHER_POLICY
dc.eprint.version    Final published version    en_US
dc.type.uri    http://purl.org/eprint/type/JournalArticle    en_US
eprint.status    http://purl.org/eprint/status/PeerReviewed    en_US
dc.date.updated    2025-09-01T07:49:19Z
dc.language.rfc3066    en
dc.rights.holder    ACM
dspace.date.submission    2025-09-01T07:49:20Z
mit.journal.volume    69    en_US
mit.journal.issue    5    en_US
mit.license    PUBLISHER_POLICY
mit.metadata.status    Authority Work and Publication Information Needed    en_US
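
The abstract above names the key primitive behind the dimension-independent running times: ℓ2-norm sampling of rows approximates a matrix product with error controlled only by the Frobenius norms and the number of samples, not by the shared dimension. The NumPy sketch below illustrates that primitive under simplifying assumptions; it is not code from the paper, the function name approx_matmul is illustrative, and it computes the row norms explicitly, whereas the paper's quantum-inspired input model assumes sampling-and-query access to the matrices so that this preprocessing need not be done at run time.

import numpy as np

def approx_matmul(A, B, s, rng=None):
    """Approximate A.T @ B by ell_2-norm importance sampling of shared rows.

    Rows are sampled with probability proportional to ||A[i, :]||^2 and the
    rescaled outer products A[i, :]^T B[i, :] are averaged.  The estimator is
    unbiased and its expected squared Frobenius error is at most
    ||A||_F^2 * ||B||_F^2 / s, independent of the number of rows.
    """
    rng = np.random.default_rng() if rng is None else rng
    row_norms_sq = np.einsum("ij,ij->i", A, A)      # squared ell_2 norm of each row of A
    probs = row_norms_sq / row_norms_sq.sum()       # the ell_2-norm sampling distribution
    idx = rng.choice(A.shape[0], size=s, p=probs)   # s row indices drawn i.i.d. from probs
    scale = 1.0 / (s * probs[idx])                  # importance weights keep the estimate unbiased
    return (A[idx].T * scale) @ B[idx]              # sum_t A[i_t, :]^T B[i_t, :] / (s * p_{i_t})

# Demo: the error bound does not depend on the 100,000 shared rows (this demo still
# scans them once to build probs, which the paper's input model avoids).
rng = np.random.default_rng(0)
A = rng.standard_normal((100_000, 20))
B = rng.standard_normal((100_000, 20))
exact = A.T @ B
approx = approx_matmul(A, B, s=5_000, rng=rng)
print(np.linalg.norm(approx - exact) / np.linalg.norm(exact))

Roughly speaking, in the sampling-and-query access model the distribution probs is supplied by the input data structure, so the only per-sample cost is reading one row of A and one row of B; the relative error then shrinks like 1/sqrt(s).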

