Provably Trainable Rotationally Equivariant Quantum Machine Learning
Maxwell T. West, Jamie Heredge, Martin Sevior, Muhammad Usman
{"title":"可证明的可训练旋转等变量子机器学习","authors":"Maxwell T. West, Jamie Heredge, Martin Sevior, Muhammad Usman","doi":"10.1103/prxquantum.5.030320","DOIUrl":null,"url":null,"abstract":"Exploiting the power of quantum computation to realize superior machine learning algorithms has been a major research focus of recent years, but the prospects of quantum machine learning (QML) remain dampened by considerable technical challenges. A particularly significant issue is that generic QML models suffer from so-called barren plateaus in their training landscapes—large regions where cost function gradients vanish exponentially in the number of qubits employed, rendering large models effectively untrainable. A leading strategy for combating this effect is to build problem-specific models that take into account the symmetries of their data in order to focus on a smaller, relevant subset of Hilbert space. In this work, we introduce a family of rotationally equivariant QML models built upon the quantum Fourier transform, and leverage recent insights from the Lie-algebraic study of QML models to prove that (a subset of) our models do not exhibit barren plateaus. In addition to our analytical results we numerically test our rotationally equivariant models on a dataset of simulated scanning tunneling microscope images of phosphorus impurities in silicon, where rotational symmetry naturally arises, and find that they dramatically outperform their generic counterparts in practice.","PeriodicalId":501296,"journal":{"name":"PRX Quantum","volume":"48 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Provably Trainable Rotationally Equivariant Quantum Machine Learning\",\"authors\":\"Maxwell T. West, Jamie Heredge, Martin Sevior, Muhammad Usman\",\"doi\":\"10.1103/prxquantum.5.030320\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Exploiting the power of quantum computation to realize superior machine learning algorithms has been a major research focus of recent years, but the prospects of quantum machine learning (QML) remain dampened by considerable technical challenges. A particularly significant issue is that generic QML models suffer from so-called barren plateaus in their training landscapes—large regions where cost function gradients vanish exponentially in the number of qubits employed, rendering large models effectively untrainable. A leading strategy for combating this effect is to build problem-specific models that take into account the symmetries of their data in order to focus on a smaller, relevant subset of Hilbert space. In this work, we introduce a family of rotationally equivariant QML models built upon the quantum Fourier transform, and leverage recent insights from the Lie-algebraic study of QML models to prove that (a subset of) our models do not exhibit barren plateaus. 
In addition to our analytical results we numerically test our rotationally equivariant models on a dataset of simulated scanning tunneling microscope images of phosphorus impurities in silicon, where rotational symmetry naturally arises, and find that they dramatically outperform their generic counterparts in practice.\",\"PeriodicalId\":501296,\"journal\":{\"name\":\"PRX Quantum\",\"volume\":\"48 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-07-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"PRX Quantum\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1103/prxquantum.5.030320\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"PRX Quantum","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1103/prxquantum.5.030320","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Exploiting the power of quantum computation to realize superior machine learning algorithms has been a major research focus of recent years, but the prospects of quantum machine learning (QML) remain dampened by considerable technical challenges. A particularly significant issue is that generic QML models suffer from so-called barren plateaus in their training landscapes—large regions where cost function gradients vanish exponentially in the number of qubits employed, rendering large models effectively untrainable. A leading strategy for combating this effect is to build problem-specific models that take into account the symmetries of their data in order to focus on a smaller, relevant subset of Hilbert space. In this work, we introduce a family of rotationally equivariant QML models built upon the quantum Fourier transform, and leverage recent insights from the Lie-algebraic study of QML models to prove that (a subset of) our models do not exhibit barren plateaus. In addition to our analytical results, we numerically test our rotationally equivariant models on a dataset of simulated scanning tunneling microscope images of phosphorus impurities in silicon, where rotational symmetry naturally arises, and find that they dramatically outperform their generic counterparts in practice.
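The abstract's notion of a barren plateau has a standard formalization in the QML literature (not restated in the abstract itself): for a parameterized cost function C(θ) on n qubits, the variance of each partial derivative is exponentially suppressed in n,

```latex
\mathrm{Var}_{\boldsymbol{\theta}}\!\left[\partial_{\theta_k} C(\boldsymbol{\theta})\right]
  \le F(n), \qquad F(n) \in O\!\left(b^{-n}\right) \ \text{for some } b > 1,
```

so that, by Chebyshev's inequality, the probability of sampling a gradient component above any fixed threshold vanishes exponentially with the qubit count, and gradient-based training stalls for large models.

The abstract also does not spell out the circuit construction, but the mechanism behind Fourier-based rotational equivariance can be illustrated classically: the discrete Fourier transform (the classical analogue of the quantum Fourier transform) diagonalizes cyclic shifts, so any layer of trainable phases applied in the Fourier basis commutes with a discrete rotation of data sampled on a ring. The following NumPy sketch demonstrates that property under this assumption; it is illustrative only, not the authors' model.

```python
import numpy as np

n = 8  # samples on a ring; rotating the ring by 2*pi/n cyclically shifts them

# Cyclic-shift operator S: (S x)_i = x_{(i-1) mod n}, a discrete rotation of the data
S = np.roll(np.eye(n), 1, axis=0)

# Unitary DFT matrix, the classical analogue of the quantum Fourier transform
F = np.fft.fft(np.eye(n), norm="ortho")

# A layer of "trainable" phases applied in the Fourier basis
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, n)  # stand-in for variational parameters
U = F.conj().T @ np.diag(np.exp(1j * theta)) @ F

# Equivariance check: processing then rotating equals rotating then processing
print(np.allclose(U @ S, S @ U))  # True for any choice of theta
```

Because U and S are diagonalized by the same Fourier matrix, the check prints True for every choice of the phase parameters; this commutation with the symmetry representation is the classical counterpart of the equivariance the paper builds into its quantum models.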