Abstract
We apply digitized quantum annealing (QA) and the quantum approximate optimization algorithm (QAOA) to a paradigmatic task of supervised learning in artificial neural networks: the optimization of synaptic weights for the binary perceptron. At variance with the usual QAOA applications to MaxCut, or to quantum spin-chain ground-state preparation, here the classical cost function is characterized by highly nonlocal multispin interactions. Yet, we provide evidence for the existence of optimal smooth solutions for the QAOA parameters, which are transferable among typical instances of the same problem, and we demonstrate numerically an enhanced performance of QAOA over traditional QA. We also investigate the role of the classical cost-function landscape geometry in this problem. By artificially breaking this geometrical structure, we show that the detrimental effect of a gap-closing transition, encountered in QA, also negatively affects the performance of our QAOA implementation.
- Received 20 May 2022
- Revised 21 November 2022
- Accepted 15 December 2022
DOI: https://doi.org/10.1103/PhysRevB.107.094202
©2023 American Physical Society