Arbitrary Polynomial Separations in Trainable Quantum Machine Learning (Journal Article)

Overview

abstract

  • Recent theoretical results in quantum machine learning have demonstrated a general trade-off between the expressive power of quantum neural networks (QNNs) and their trainability; as a corollary, practical exponential separations in expressive power over classical machine learning models are believed to be infeasible, as such QNNs require training time exponential in the model size. Here, we circumvent these negative results by constructing a hierarchy of efficiently trainable QNNs that exhibit unconditionally provable, polynomial memory separations of arbitrary constant degree over classical neural networks, including state-of-the-art models such as Transformers, on a classical sequence modeling task. The construction is also computationally efficient, as each unit cell of the introduced class of QNNs has only constant gate complexity. We show that contextuality, informally a quantitative notion of semantic ambiguity, is the source of the expressivity separation, suggesting that other learning tasks with this property may be a natural setting for quantum learning algorithms.
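  • The abstract describes a recurrent architecture whose unit cell has constant gate complexity. The following is a minimal, hypothetical sketch of that general idea in Python/NumPy, not the paper's actual construction: a two-qubit register processed by one shared, constant-size unit cell per input token. All names (unit_cell, run_qnn), the gate choices, and the token encoding are invented here for illustration.

```python
import numpy as np

# Identity and CNOT (control = qubit 0, target = qubit 1) on two qubits.
I2 = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def ry(theta):
    """Single-qubit Y-rotation used to encode one input token."""
    c, s = np.cos(theta / 2.0), np.sin(theta / 2.0)
    return np.array([[c, -s], [s, c]], dtype=complex)

def unit_cell(theta):
    """One time step: a data-dependent rotation on the input qubit followed
    by a fixed entangling gate -- O(1) gates per step, as in the abstract."""
    return CNOT @ np.kron(ry(theta), I2)

def run_qnn(angles):
    """Apply the shared unit cell once per encoded token, then return the
    probability of measuring |1> on the memory (second) qubit."""
    state = np.zeros(4, dtype=complex)
    state[0] = 1.0  # start in |00>
    for theta in angles:
        state = unit_cell(theta) @ state
    probs = np.abs(state) ** 2
    return probs[1] + probs[3]  # basis states |01> and |11>

if __name__ == "__main__":
    tokens = [0, 1, 1, 0]                       # toy binary sequence
    angles = [np.pi * t / 2.0 for t in tokens]  # hypothetical angle encoding
    print(f"P(memory = 1) = {run_qnn(angles):.3f}")
```

    A two-qubit register is of course far too small to exhibit any memory separation; the sketch only illustrates the recurrent, constant-cost unit-cell structure the abstract refers to.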

publication date

  • January 20, 2026

Full Author List

  • Anschuetz ER; Gao X

author count

  • 2

Electronic International Standard Serial Number (EISSN)

  • 2521-327X

Additional Document Info

start page

  • 1976

end page

  • 1976

volume

  • 10

number

  • 1976