Quantum Kitchen Sinks: An algorithm for machine learning on near-term quantum computers

  1. C. M. Wilson,
  2. J. S. Otterbach,
  3. N. Tezak,
  4. R. S. Smith,
  5. G. E. Crooks,
  6. and M. P. da Silva
Noisy intermediate-scale quantum computing devices are an exciting platform for the exploration of the power of near-term quantum applications. Performing nontrivial tasks in such a framework requires a fundamentally different approach than what would be used on an error-corrected quantum computer. One such approach is to use hybrid algorithms, where problems are reduced to a parameterized quantum circuit that is often optimized in a classical feedback loop. Here we describe one such hybrid algorithm for machine learning tasks, building upon the classical algorithm known as random kitchen sinks. Our technique, called quantum kitchen sinks, uses quantum circuits to nonlinearly transform classical inputs into features that can then be used in a number of machine learning algorithms. We demonstrate the power and flexibility of this proposal by using it to solve binary classification problems for synthetic datasets as well as handwritten digits from the MNIST database. In particular, we show that small quantum circuits provide significant performance lift over standard linear classical algorithms, reducing classification error rates from 50% to <0.1% and from 4.1% to 1.4% in these two examples, respectively.
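The classical algorithm the paper builds on, random kitchen sinks, maps inputs through random projections and a fixed nonlinearity so that a plain linear readout can fit nonlinear decision boundaries (the paper replaces this random feature map with quantum circuits). A minimal classical sketch of that idea, on a synthetic "two rings" dataset where a linear classifier on raw coordinates fails, might look like this. All names, the dataset, and the feature count here are illustrative choices, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "two rings" dataset: not linearly separable in raw coordinates.
n = 400
r = np.where(np.arange(n) % 2 == 0, 0.5, 1.5)
theta = rng.uniform(0, 2 * np.pi, n)
X = np.c_[r * np.cos(theta), r * np.sin(theta)] + rng.normal(0, 0.05, (n, 2))
y = (np.arange(n) % 2 == 0).astype(float)  # inner ring = class 1

def kitchen_sinks(X, n_features=200, scale=2.0, rng=rng):
    """Random kitchen sinks featurization: z(x) = cos(W x + b) with random W, b."""
    W = rng.normal(0, scale, (X.shape[1], n_features))
    b = rng.uniform(0, 2 * np.pi, n_features)
    return np.cos(X @ W + b)

Z = kitchen_sinks(X)

# Linear readout trained on the random features by regularized least squares.
w = np.linalg.solve(Z.T @ Z + 1e-3 * np.eye(Z.shape[1]), Z.T @ (2 * y - 1))
acc = np.mean((Z @ w > 0) == (y == 1))
print(f"train accuracy with random features: {acc:.2f}")
```

The point of the construction is that all the nonlinearity lives in the fixed random feature map, so training reduces to fitting a linear model; the quantum version swaps `kitchen_sinks` for measurements of parameterized quantum circuits.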

Unsupervised Machine Learning on a Hybrid Quantum Computer

  1. J. S. Otterbach,
  2. R. Manenti,
  3. N. Alidoust,
  4. A. Bestwick,
  5. M. Block,
  6. B. Bloom,
  7. S. Caldwell,
  8. N. Didier,
  9. E. Schuyler Fried,
  10. S. Hong,
  11. P. Karalekas,
  12. C. B. Osborn,
  13. A. Papageorge,
  14. E. C. Peterson,
  15. G. Prawiroatmodjo,
  16. N. Rubin,
  17. Colm A. Ryan,
  18. D. Scarabelli,
  19. M. Scheer,
  20. E. A. Sete,
  21. P. Sivarajah,
  22. Robert S. Smith,
  23. A. Staley,
  24. N. Tezak,
  25. W. J. Zeng,
  26. A. Hudson,
  27. Blake R. Johnson,
  28. M. Reagor,
  29. M. P. da Silva,
  30. and C. Rigetti
Machine learning techniques have led to broad adoption of a statistical model of computing. The statistical distributions natively available on quantum processors are a superset of
those available classically. Harnessing this attribute has the potential to accelerate or otherwise improve machine learning relative to purely classical performance. A key challenge toward that goal is learning to hybridize classical computing resources and traditional learning techniques with the emerging capabilities of general purpose quantum processors. Here, we demonstrate such hybridization by training a 19-qubit gate model processor to solve a clustering problem, a foundational challenge in unsupervised learning. We use the quantum approximate optimization algorithm in conjunction with a gradient-free Bayesian optimization to train the quantum machine. This quantum/classical hybrid algorithm shows robustness to realistic noise, and we find evidence that classical optimization can be used to train around both coherent and incoherent imperfections.
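The training loop described above pairs a parameterized quantum circuit (QAOA) with a gradient-free classical optimizer. A tiny self-contained sketch of that structure, simulating a p = 1 QAOA statevector for a 4-node Max-Cut encoding of a two-cluster problem, might look like the following. The graph, the random-search outer loop (standing in for the paper's Bayesian optimizer), and all names are illustrative assumptions, not the paper's 19-qubit experiment:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
n = 4  # one qubit per data point

# Toy clustering instance as Max-Cut: heavy edges join dissimilar points,
# so the maximum cut separates the two clusters {0, 1} and {2, 3}.
edges = {(0, 2): 1.0, (0, 3): 1.0, (1, 2): 1.0, (1, 3): 1.0,
         (0, 1): 0.1, (2, 3): 0.1}

# Diagonal of the cost Hamiltonian: the cut value of each bitstring.
bits = np.array(list(product([0, 1], repeat=n)))
cost = np.zeros(2 ** n)
for (i, j), w in edges.items():
    cost += w * (bits[:, i] != bits[:, j])

def qaoa_expectation(gamma, beta):
    """Expected cut of the p = 1 QAOA state e^{-i beta B} e^{-i gamma C} |+>^n."""
    psi = np.full(2 ** n, 2 ** (-n / 2), dtype=complex)
    psi *= np.exp(-1j * gamma * cost)  # cost layer is diagonal
    rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])
    for q in range(n):  # mixer layer: e^{-i beta X} on every qubit
        psi = np.einsum('ab,ibj->iaj', rx,
                        psi.reshape(2 ** q, 2, 2 ** (n - 1 - q))).reshape(-1)
    return float(np.sum(np.abs(psi) ** 2 * cost))

# Gradient-free outer loop: random search over the two circuit angles.
best_val, best_params = -1.0, None
for _ in range(200):
    gamma, beta = rng.uniform(0, np.pi), rng.uniform(0, np.pi / 2)
    val = qaoa_expectation(gamma, beta)
    if val > best_val:
        best_val, best_params = val, (gamma, beta)

print(f"best expected cut: {best_val:.2f} (uniform guess: {cost.mean():.2f})")
```

The classical optimizer only ever sees the scalar expectation value, which is why gradient-free methods such as Bayesian optimization are a natural fit for noisy hardware: no gradients of the quantum circuit are required.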