Quantum computer based feature selection in machine learning
Gerhard Hellstern, Vanessa Dehn, Martin Zaefferer
IET Quantum Communication, vol. 5, no. 3, pp. 232–252 (2024). DOI: 10.1049/qtc2.12086 (open access)

The problem of selecting an appropriate number of features in supervised learning problems is investigated. Starting with common methods in machine learning, the feature selection task is treated as a quadratic unconstrained binary optimisation (QUBO) problem, which can be tackled with classical numerical methods as well as within a quantum computing framework. The results on small problem instances are compared. According to the authors' study, whether the QUBO method outperforms other feature selection methods depends on the data set. In an extension to a larger data set with 27 features, the authors compare the convergence behaviour of the QUBO methods on quantum hardware with that of classical stochastic optimisation methods. Due to persisting error rates, the classical stochastic optimisation methods remain superior.
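A common way to cast feature selection as a QUBO, as the abstract describes, is to reward each feature's relevance to the target on the diagonal and penalise pairwise redundancy off the diagonal. The sketch below is a minimal illustration under that assumption, using absolute Pearson correlations and brute-force minimisation; it is not the authors' exact matrix construction, and the weighting parameter `alpha` is hypothetical.

```python
# Minimal sketch: correlation-based QUBO feature selection (an assumed
# formulation, not necessarily the one used in the paper).
import itertools
import numpy as np

def qubo_matrix(X, y, alpha=0.5):
    """Diagonal: negated feature-target relevance (we minimise z.T @ Q @ z).
    Off-diagonal: redundancy penalty between feature pairs."""
    n = X.shape[1]
    Q = np.zeros((n, n))
    for i in range(n):
        Q[i, i] = -alpha * abs(np.corrcoef(X[:, i], y)[0, 1])
        for j in range(i + 1, n):
            penalty = (1 - alpha) * abs(np.corrcoef(X[:, i], X[:, j])[0, 1])
            Q[i, j] = Q[j, i] = penalty / 2  # split symmetrically
    return Q

def brute_force_solve(Q):
    """Exhaustive search over all binary vectors; feasible only for small n,
    which is exactly the regime where the paper compares methods."""
    n = Q.shape[0]
    best, best_val = None, np.inf
    for bits in itertools.product([0, 1], repeat=n):
        z = np.array(bits)
        val = float(z @ Q @ z)
        if val < best_val:
            best, best_val = z, val
    return best, best_val

# Synthetic data: only features 0 and 1 carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200)

z, val = brute_force_solve(qubo_matrix(X, y))
print(z, val)
```

For larger instances the exhaustive loop is replaced by simulated annealing or, as in the paper, a quantum solver; the QUBO matrix itself stays the same.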
Antonio Macaluso, Luca Clissa, Stefano Lodi, Claudio Sartori
Ensemble methods aggregate predictions from multiple models, typically achieving improved accuracy and reduced variance compared to individual classifiers. However, they often come with significant memory and computational time requirements. The authors introduce a novel quantum algorithm that leverages quantum superposition, entanglement, and interference to construct an ensemble of classification models, using bagging as the aggregation strategy. Through the generation of numerous quantum trajectories in superposition, they achieve B transformations of the training set with only