

Deriving Representative Structure from Music Corpora

Shapiro, Ilana, Huang, Ruanqianqian, Novack, Zachary, Wang, Cheng-i, Dong, Hao-Wen, Berg-Kirkpatrick, Taylor, Dubnov, Shlomo, Lerner, Sorin

arXiv.org Artificial Intelligence

Western music is an innately hierarchical system of interacting levels of structure, from fine-grained melody to high-level form. In order to analyze music compositions holistically and at multiple granularities, we propose a unified, hierarchical meta-representation of musical structure called the structural temporal graph (STG). For a single piece, the STG is a data structure that defines a hierarchy of progressively finer structural musical features and the temporal relationships between them. We use the STG to enable a novel approach for deriving a representative structural summary of a music corpus, which we formalize as a dually NP-hard combinatorial optimization problem extending the Generalized Median Graph problem. Our approach first applies simulated annealing to develop a measure of structural distance between two music pieces rooted in graph isomorphism. Our approach then combines the formal guarantees of SMT solvers with nested simulated annealing over structural distances to produce a structurally sound, representative centroid STG for an entire corpus of STGs from individual pieces. To evaluate our approach, we conduct experiments verifying that structural distance accurately differentiates between music pieces, and that derived centroids accurately structurally characterize their corpora.
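The paper develops its structural distance via simulated annealing over graph isomorphisms. As an illustration only, here is a minimal simulated-annealing skeleton applied to a toy label-matching cost, a stand-in for the STG distance (whose exact definition is not given in this abstract):

```python
import math
import random

def simulated_annealing(initial, neighbor, cost, t0=1.0, cooling=0.995, steps=2000, seed=0):
    """Minimize `cost`, accepting uphill moves with probability exp(-delta / T)."""
    rng = random.Random(seed)
    state, best = initial, initial
    t = t0
    for _ in range(steps):
        cand = neighbor(state, rng)
        delta = cost(cand) - cost(state)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            state = cand
            if cost(state) < cost(best):
                best = state
        t *= cooling
    return best

# Toy stand-in for a structural distance: Hamming distance between two label
# sequences, minimized over relabelings of the source (a proxy for matching).
target = [0, 1, 1, 2, 0]
source = [2, 0, 0, 1, 2]

def cost(perm):
    return sum(perm[s] != t for s, t in zip(source, target))

def neighbor(perm, rng):
    i, j = rng.sample(range(len(perm)), 2)
    p = list(perm)
    p[i], p[j] = p[j], p[i]
    return p

best = simulated_annealing([0, 1, 2], neighbor, cost)
```

On this toy instance the annealer recovers a relabeling that makes the two sequences match exactly; the paper's actual search is over graph structures rather than label permutations.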


Transfer of Knowledge through Reverse Annealing: A Preliminary Analysis of the Benefits and What to Share

Osaba, Eneko, Villar-Rodriguez, Esther

arXiv.org Artificial Intelligence

Immersed in the NISQ era, current quantum annealers present limitations for solving optimization problems efficiently. To mitigate these limitations, D-Wave Systems developed a mechanism called Reverse Annealing, a specific type of quantum annealing designed to perform local refinement of good states found elsewhere. Despite the research activity around Reverse Annealing, no prior work has theorized about the possible benefits of transferring knowledge under this paradigm. This work moves in that direction, driven by experimentation focused on answering two key research questions: i) is reverse annealing a paradigm that can benefit from knowledge transfer between similar problems? and ii) can we infer the characteristics that an input solution should meet to help increase the probability of success? To properly guide the tests in this paper, the well-known Knapsack Problem has been chosen for benchmarking purposes, using a total of 34 instances composed of 14 and 16 items.
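For context, the Knapsack Problem is typically posed to an annealer as a QUBO in which binary slack variables encode the capacity constraint as a quadratic penalty. A minimal sketch on a toy instance (the numbers are illustrative, not the paper's 14/16-item benchmarks), brute-forced here in place of an annealer:

```python
from itertools import product

# Toy knapsack instance (hypothetical numbers, not the paper's benchmark).
values = [4, 5, 6, 3]
weights = [3, 4, 5, 2]
capacity = 7
P = 10  # penalty strength; must exceed the largest item value

slack_bits = [1, 2, 4]  # binary-encoded slack covering 0..capacity

def energy(x, s):
    """QUBO energy: -value + P * (total_weight + slack - capacity)^2."""
    value = sum(v * xi for v, xi in zip(values, x))
    weight = sum(w * xi for w, xi in zip(weights, x))
    slack = sum(b * si for b, si in zip(slack_bits, s))
    return -value + P * (weight + slack - capacity) ** 2

# Exhaustive search over all bit assignments stands in for annealing.
best = min(
    (energy(x, s), x)
    for x in product([0, 1], repeat=len(values))
    for s in product([0, 1], repeat=len(slack_bits))
)
```

The ground state of this energy is a feasible item selection of maximum value; reverse annealing would start from such a state (found elsewhere) and refine it locally.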


Quantum Annealing for Robust Principal Component Analysis

Tomeo, Ian, Markopoulos, Panos P., Savakis, Andreas

arXiv.org Machine Learning

Principal component analysis (PCA) is commonly used for dimensionality reduction, feature extraction, denoising, and visualization. The most commonly used PCA method is based on optimization of the L2-norm; however, the L2-norm is known to exaggerate the contribution of errors and outliers. When optimizing over the L1-norm instead, the generated components are known to exhibit robustness, i.e., resistance to outliers in the data. The L1-norm components can be obtained by solving a binary optimization problem; previously, L1-BF has been used to solve this binary optimization for multiple components simultaneously. In this paper we propose QAPCA, a new method for finding principal components that optimizes over the robust L1-norm using quantum annealing hardware. The conditions required for convergence of the annealing problem are discussed. The potential speedup from quantum annealing is demonstrated through complexity analysis and experimental results. To showcase performance against classical PCA techniques, we run experiments on synthetic Gaussian data, a fault detection scenario, and breast cancer diagnostic data. We find that the reconstruction error when using QAPCA is comparable to that of L1-BF.
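For a single component, L1-norm PCA is known to reduce to a binary (+/-1) optimization, which is the kind of problem QAPCA maps onto annealing hardware. A small sketch that brute-forces this binary problem on illustrative data in place of an annealer:

```python
from itertools import product

import numpy as np

# Toy data matrix X (d features x N samples); numbers are illustrative.
X = np.array([[2.0, 1.0, -1.5, 0.5],
              [0.5, 1.5, 1.0, -2.0]])
d, N = X.shape

# One-component L1-PCA reduces to a binary optimization:
#   b* = argmax_{b in {-1,+1}^N} ||X b||_2, then q = X b* / ||X b*||_2.
best_norm, best_b = -1.0, None
for b in product([-1, 1], repeat=N):
    n = np.linalg.norm(X @ np.array(b))
    if n > best_norm:
        best_norm, best_b = n, np.array(b)

q = X @ best_b / best_norm          # robust L1 principal component
l1_objective = np.abs(X.T @ q).sum()  # sum of absolute projections
```

The L1 objective attained by q equals the optimal binary-problem value, which is why solving the binary problem (here exhaustively, on hardware by annealing) yields the robust component.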


Towards Arbitrary QUBO Optimization: Analysis of Classical and Quantum-Activated Feedforward Neural Networks

Lai, Chia-Tso, Blank, Carsten, Schmelcher, Peter, Mukherjee, Rick

arXiv.org Artificial Intelligence

Quadratic Unconstrained Binary Optimization (QUBO) is at the heart of many industries and academic fields, including logistics, supply chains, finance, pharmaceutical science, chemistry, IT, and the energy sector [1]. These problems typically involve optimizing a large number of binary variables, which makes finding exact solutions exponentially difficult; consequently, most QUBO problems are classified as NP-hard [2, 3]. To address this challenge, we developed a powerful feedforward neural network (FNN) optimizer for arbitrary QUBO problems. In this work, we demonstrate that the FNN optimizer can provide high-quality approximate solutions for large problems, including dense 80-variable weighted MaxCut and random QUBOs, achieving an average accuracy of over 99% in less than 1.1 seconds on an 8-core CPU. Additionally, the FNN optimizer outperformed the Gurobi optimizer [4] by 72% on 200-variable random QUBO problems within a 100-second computation time limit, exhibiting strong potential for real-time optimization tasks. Building on this model, we explored the novel approach of integrating FNNs with a quantum annealer-based activation function to create a quantum-classical encoder-decoder (QCED) optimizer, aiming to further enhance the performance of FNNs in QUBO optimization.
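A QUBO instance is just a symmetric matrix Q with energy x^T Q x over binary x; any optimizer, whether an FNN, an annealer, or the simple bit-flip local search below, is judged by the energy it reaches. A minimal classical baseline on a random toy instance (not the paper's benchmarks):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 12
Q = rng.normal(size=(n, n))
Q = (Q + Q.T) / 2  # symmetric QUBO matrix (random toy instance)

def energy(x):
    """QUBO objective x^T Q x for a binary vector x."""
    return float(x @ Q @ x)

# Single-bit-flip local search: flip any bit that lowers the energy,
# repeat until no single flip improves.
x = rng.integers(0, 2, size=n)
e = energy(x)
improved = True
while improved:
    improved = False
    for i in range(n):
        x[i] ^= 1
        e_new = energy(x)
        if e_new < e - 1e-12:
            e, improved = e_new, True
        else:
            x[i] ^= 1  # revert the flip
```

This terminates at a 1-flip local minimum; the paper's FNN optimizer aims to reach much better energies on far larger instances than such greedy descent.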


Quantum Annealing-Based Algorithm for Efficient Coalition Formation Among LEO Satellites

Venkatesh, Supreeth Mysore, Macaluso, Antonio, Nuske, Marlon, Klusch, Matthias, Dengel, Andreas

arXiv.org Artificial Intelligence

The increasing number of Low Earth Orbit (LEO) satellites, driven by lower manufacturing and launch costs, is proving invaluable for Earth observation missions and low-latency internet connectivity. However, as the number of satellites increases, the number of communication links to maintain also rises, making the management of this vast network increasingly challenging and motivating the clustering of satellites into efficient groups. This paper formulates the clustering of LEO satellites as a coalition structure generation (CSG) problem and leverages quantum annealing to solve it. We represent the satellite network as a graph and obtain the optimal partitions using a hybrid quantum-classical algorithm called GCS-Q. The algorithm follows a top-down approach, iteratively splitting the graph at each step using a quadratic unconstrained binary optimization (QUBO) formulation. To evaluate our approach, we utilize real-world three-line element set (TLE/3LE) data for Starlink satellites from Celestrak. Our experiments, conducted using the D-Wave Advantage annealer and the state-of-the-art solver Gurobi, demonstrate that the quantum annealer significantly outperforms classical methods in terms of runtime while maintaining solution quality. The performance achieved with quantum annealers surpasses the capabilities of classical computers, highlighting the transformative potential of quantum computing in optimizing the management of large-scale satellite networks.
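The bipartitioning step of a top-down split like GCS-Q's can be written as a QUBO with one binary variable per node, where the term x_i + x_j - 2 x_i x_j equals 1 exactly when an edge is cut. A toy sketch with hypothetical edge weights, brute-forced in place of the annealer:

```python
from itertools import product

# Toy weighted graph as an edge dict; weights are hypothetical.
# Negative weights model pairs that are better separated.
edges = {(0, 1): 5.0, (1, 2): 4.0, (2, 3): -3.0, (0, 3): -2.0, (1, 3): 1.0}
n = 4

def cut_weight(x):
    """QUBO objective of a bipartition: total weight of crossing edges.
    For binary x_i, x_i + x_j - 2*x_i*x_j is 1 iff i and j are separated."""
    return sum(w * (x[i] + x[j] - 2 * x[i] * x[j]) for (i, j), w in edges.items())

# Brute-force the QUBO over nontrivial bipartitions; an annealer
# would sample low-energy states of this same objective.
best = min((cut_weight(x), x) for x in product([0, 1], repeat=n) if 0 < sum(x) < n)
```

Here the minimum cut isolates the node connected mostly by negative-weight edges; the full algorithm would then recurse on each side of the split.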


Adiabatic Quantum Support Vector Machines

Date, Prasanna, Woun, Dong Jun, Hamilton, Kathleen, Perez, Eduardo A. Coello, Shekhar, Mayanka Chandra, Rios, Francisco, Gounley, John, Suh, In-Saeng, Humble, Travis, Tourassi, Georgia

arXiv.org Artificial Intelligence

Adiabatic quantum computers can solve difficult optimization problems (e.g., the quadratic unconstrained binary optimization problem), and they seem well suited to training machine learning models. In this paper, we describe an adiabatic quantum approach for training support vector machines. We show that the time complexity of our quantum approach is an order of magnitude better than that of the classical approach. Next, we compare the test accuracy of our quantum approach against a classical approach that uses the Scikit-learn library in Python across five benchmark datasets (Iris, Wisconsin Breast Cancer (WBC), Wine, Digits, and Lambeq). We show that our quantum approach obtains accuracies on par with the classical approach. Finally, we perform a scalability study in which we compute the total training times of the quantum and classical approaches as the number of features and the number of data points in the training dataset increase. Our scalability results show that the quantum approach obtains a 3.5--4.5 times speedup over the classical approach on datasets with many (millions of) features.
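Training an SVM adiabatically typically means writing the dual problem as a QUBO: each Lagrange multiplier is encoded with a few binary variables and the equality constraint becomes a quadratic penalty. A tiny illustrative sketch (the 1-D data and 2-level encoding are assumptions, not the paper's setup), brute-forced classically in place of the annealer:

```python
from itertools import product

import numpy as np

# Tiny 1-D training set (hypothetical): negatives at -2, -1; positives at 1, 2.
X = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([-1.0, -1.0, 1.0, 1.0])
xi = 10.0                       # penalty weight enforcing sum(alpha * y) = 0
levels = [0.0, 0.5, 1.0, 1.5]   # 2-bit encoding of each multiplier alpha_n

K = np.outer(X, X)  # linear kernel matrix

def energy(alpha):
    """SVM dual as an unconstrained binary-encoded objective:
    1/2 sum a_n a_m y_n y_m K_nm - sum a_n + xi * (sum a_n y_n)^2."""
    a = np.array(alpha)
    return 0.5 * a @ (np.outer(y, y) * K) @ a - a.sum() + xi * (a @ y) ** 2

# Exhaustive search over the encoded multipliers stands in for annealing.
best_e, best_a = min((energy(a), a) for a in product(levels, repeat=len(X)))
w = sum(a * yi * x for a, yi, x in zip(best_a, y, X))  # recovered weight
```

At the minimum, the equality constraint holds exactly and the recovered weight separates the training points, mirroring how the ground state of the QUBO encodes the trained classifier.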


CaloQVAE : Simulating high-energy particle-calorimeter interactions using hybrid quantum-classical generative models

Hoque, Sehmimul, Jia, Hao, Abhishek, Abhishek, Fadaie, Mojde, Toledo-Marín, J. Quetzalcoatl, Vale, Tiago, Melko, Roger G., Swiatlowski, Maximilian, Fedorko, Wojciech T.

arXiv.org Artificial Intelligence

Department of Physics and Astronomy, University of Waterloo, Ontario N2L 3G1, Canada. The Large Hadron Collider's high-luminosity era presents major computational challenges in the analysis of collision events. Large amounts of Monte Carlo (MC) simulation will be required to constrain the statistical uncertainties of the simulated datasets below those of the experimental data. Modelling of high-energy particles propagating through the calorimeter section of the detector is the most computationally intensive MC simulation task. We introduce a technique combining recent advancements in generative models and quantum annealing for fast and efficient simulation of high-energy particle-calorimeter interactions. The Large Hadron Collider (LHC) is the highest-energy particle accelerator in the world, and currently collides protons at √s = 13.6 TeV. Accurate simulation of particle showers is critical to enable the highest-quality measurements, but simulating each shower from first principles is computationally intensive, and the "High-Luminosity LHC" (HL-LHC) dataset will enable significantly more precise measurements of the Higgs boson and other Standard Model particles. We deploy a restricted Boltzmann machine (RBM) to encode a rich description of particle showers in detectors, and use quantum annealing to sample from it.


An Optimization Case Study for solving a Transport Robot Scheduling Problem on Quantum-Hybrid and Quantum-Inspired Hardware

Leib, Dominik, Seidel, Tobias, Jäger, Sven, Heese, Raoul, Jones, Caitlin Isobel, Awasthi, Abhishek, Niederle, Astrid, Bortz, Michael

arXiv.org Artificial Intelligence

Quantum computing (QC) is a field that has witnessed a rapid increase in interest and development over the past few decades, since it was theoretically shown that quantum computers can provide an exponential speedup for certain tasks (Deutsch, Jozsa 1992; Grover 1996; Shor 1994). Translating this potential into a practically relevant quantum advantage, however, has proven to be a very challenging endeavor. Nevertheless, the emerging field is considered to have highly disruptive potential for many domains, for example machine learning (Schuld, Sinayskiy, Petruccione 2015), chemical simulations (Cao et al. 2019), and optimization (Li et al. 2020), the domain of this work. Because optimization problems are also of utmost importance for industrial applications, we investigated a potential advantage of quantum and quantum-inspired technology for the so-called transport robot scheduling problem (TRSP), a real-world optimization use case derived from an industrial application of an automated robot in a high-throughput laboratory. The optimization task is to plan a time-efficient schedule for the robot's movements as it transports chemical samples between a rack and multiple machines to conduct experiments.


Hybrid quantum-classical machine learning for generative chemistry and drug design

Gircha, A. I., Boev, A. S., Avchaciov, K., Fedichev, P. O., Fedorov, A. K.

arXiv.org Artificial Intelligence

Deep generative chemistry models emerge as powerful tools to expedite drug discovery. However, the immense size and complexity of the structural space of all possible drug-like molecules pose significant obstacles, which could be overcome with hybrid architectures combining quantum computers with deep classical networks. As the first step toward this goal, we built a compact discrete variational autoencoder (DVAE) with a Restricted Boltzmann Machine (RBM) of reduced size in its latent layer. The size of the proposed model was small enough to fit on a state-of-the-art D-Wave quantum annealer and allowed training on a subset of the ChEMBL dataset of biologically active compounds. Finally, we generated 2331 novel chemical structures with medicinal chemistry and synthetic accessibility properties in the ranges typical for molecules from ChEMBL. The presented results demonstrate the feasibility of using already existing or soon-to-be-available quantum computing devices as testbeds for future drug discovery applications.
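The latent layer described here is an RBM, whose joint energy over binary visible and hidden units is E(v, h) = -a·v - b·h - v^T W h; low-energy configurations of this function are what the D-Wave device samples. A minimal sketch with illustrative sizes and parameters (none taken from the paper):

```python
import numpy as np

# Minimal energy function of an RBM latent prior, as used inside a discrete VAE.
# Sizes and parameters below are illustrative placeholders.
n_v, n_h = 6, 4
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(n_v, n_h))  # visible-hidden couplings
a = np.zeros(n_v)                           # visible biases
b = np.zeros(n_h)                           # hidden biases

def rbm_energy(v, h):
    """E(v, h) = -a.v - b.h - v.W.h over binary unit vectors."""
    return -(a @ v + b @ h + v @ W @ h)

v = rng.integers(0, 2, size=n_v)
h = rng.integers(0, 2, size=n_h)
e = rbm_energy(v, h)
```

In a DVAE of this kind, the decoder maps binary latent samples drawn from this energy model to molecular representations, which is how annealer samples become candidate structures.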


A hybrid quantum-classical approach for inference on restricted Boltzmann machines

Kālis, Mārtiņš, Locāns, Andris, Šikovs, Rolands, Naseri, Hassan, Ambainis, Andris

arXiv.org Artificial Intelligence

The Boltzmann machine is a powerful machine learning model with many real-world applications, for example in constructing deep belief networks. Statistical inference on a Boltzmann machine can be carried out by sampling from its posterior distribution; however, exact sampling from such a model is not trivial because the distribution is extremely multi-modal. Quantum computers hold the promise of solving some non-trivial problems efficiently. We explored the application of a D-Wave quantum annealer to generate samples from a restricted Boltzmann machine, with the samples further improved by Markov chains in a hybrid quantum-classical setup. We demonstrated that quantum annealer samples can improve the performance of Gibbs sampling compared to random initialization, and that the hybrid setup is considerably more efficient than pure classical sampling. We also investigated the impact of annealing parameters (temperature) on the quality of samples. As the amount of classical processing (Gibbs updates) increases, the benefit of quantum annealing vanishes, which may be explained by the limited performance of today's quantum computers compared to classical ones.
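A sketch of the hybrid scheme described above: chains are initialized from annealer samples (here a random stand-in) and refined with classical block-Gibbs updates, shown on a toy RBM with hypothetical parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny RBM with placeholder parameters: 4 visible units, 3 hidden units.
W = rng.normal(scale=0.5, size=(4, 3))
b_v = np.zeros(4)  # visible biases
b_h = np.zeros(3)  # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    """One block-Gibbs sweep: sample hidden given visible, then visible given hidden."""
    h = (rng.random(3) < sigmoid(v @ W + b_h)).astype(float)
    v = (rng.random(4) < sigmoid(h @ W.T + b_v)).astype(float)
    return v

# Hybrid idea from the paper: start the chain from a (quantum-)annealer
# sample instead of random bits, then refine with classical Gibbs updates.
v = rng.integers(0, 2, size=4).astype(float)  # stand-in for an annealer sample
for _ in range(100):
    v = gibbs_step(v)
```

With a real annealer, the initial `v` would be a low-energy sample from the device, which is what gives the chain its head start over random initialization.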