Superconducting Qubit Readout Using Next-Generation Reservoir Computing
Kent, Robert, Lienhard, Benjamin, Lafyatis, Gregory, Gauthier, Daniel J.
Quantum processors require rapid and high-fidelity simultaneous measurements of many qubits. While superconducting qubits are among the leading modalities toward a useful quantum processor, their readout remains a bottleneck. Traditional approaches to processing measurement data often struggle to account for crosstalk present in frequency-multiplexed readout, the preferred method to reduce the resource overhead. Recent approaches to address this challenge use neural networks to improve the state-discrimination fidelity. However, they are computationally expensive to train and evaluate, resulting in increased latency and poor scalability as the number of qubits increases. We present an alternative machine learning approach based on next-generation reservoir computing that constructs polynomial features from the measurement signals and maps them to the corresponding qubit states. This method is highly parallelizable, avoids the costly nonlinear activation functions common in neural networks, and supports real-time training, enabling fast evaluation, adaptability, and scalability. Despite its lower computational complexity, our reservoir approach is able to maintain high qubit-state-discrimination fidelity. Relative to traditional methods, our approach achieves error reductions of up to 50% and 11% on single- and five-qubit datasets, respectively, and delivers up to 2.5x crosstalk reduction on the five-qubit dataset. Compared with recent machine-learning methods, evaluating our model requires 100x fewer multiplications for single-qubit and 2.5x fewer for five-qubit models. This work demonstrates that reservoir computing can enhance qubit-state discrimination while maintaining scalability for future quantum processors.
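The core of the approach described above, polynomial features built from the measurement signals followed by a purely linear readout trained by ridge regression, can be sketched in a few lines. The data below are synthetic stand-ins (real inputs would be digitized IQ readout traces), and all names and parameters are illustrative, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

def poly_features(x):
    """Constant, linear terms, and unique quadratic monomials of a trace."""
    quad = np.outer(x, x)[np.triu_indices(len(x))]
    return np.concatenate(([1.0], x, quad))

# Synthetic demo: the two qubit states produce measurement traces with
# different means plus Gaussian noise (a placeholder for real readout data).
n_train, n_len = 200, 8
labels = rng.integers(0, 2, n_train)
traces = rng.normal(0.0, 0.5, (n_train, n_len)) + labels[:, None]

X = np.array([poly_features(t) for t in traces])
y = 2.0 * labels - 1.0  # map states {0, 1} to targets {-1, +1}

# Ridge regression: solve (X^T X + lam I) W = X^T y in closed form.
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Evaluation is one dot product and a sign: no nonlinear activations.
preds = np.sign(X @ W)
acc = np.mean(preds == y)
```

Because evaluation reduces to multiply-accumulate operations per feature, this style of readout parallelizes naturally across qubits.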
Infinite-dimensional next-generation reservoir computing
Grigoryeva, Lyudmila, Ting, Hannah Lim Jing, Ortega, Juan-Pablo
Next-generation reservoir computing (NG-RC) has attracted much attention due to its excellent performance in spatio-temporal forecasting of complex systems and its ease of implementation. This paper shows that NG-RC can be encoded as a kernel ridge regression that makes training efficient and feasible even when the space of chosen polynomial features is very large. Additionally, an extension to an infinite number of covariates is possible, which makes the methodology agnostic with respect to the lags into the past that are considered as explanatory factors, as well as with respect to the number of polynomial covariates, an important hyperparameter in traditional NG-RC. We show that this approach has solid theoretical backing and good behavior based on kernel universality properties previously established in the literature. Various numerical illustrations show that these generalizations of NG-RC outperform the traditional approach in several forecasting applications.
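The kernel-ridge view can be illustrated with a toy regression: a degree-2 polynomial kernel (1 + x.x')^2 implicitly spans the same monomials up to degree 2 that NG-RC would construct explicitly, so the feature matrix never has to be built. A minimal sketch under that assumption, with toy data and illustrative names only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression target: a quadratic function of a 3-dimensional input.
X = rng.uniform(-1, 1, (100, 3))
y = X[:, 0] ** 2 - 0.5 * X[:, 1] * X[:, 2]

# Degree-2 polynomial kernel k(x, x') = (1 + x.x')^2. Its implicit feature
# map contains all monomials up to degree 2, the family NG-RC builds explicitly.
def K(A, B):
    return (1.0 + A @ B.T) ** 2

# Kernel ridge regression: work with the n x n Gram matrix, never the features.
lam = 1e-6
alpha = np.linalg.solve(K(X, X) + lam * np.eye(len(X)), y)

X_test = rng.uniform(-1, 1, (20, 3))
y_true = X_test[:, 0] ** 2 - 0.5 * X_test[:, 1] * X_test[:, 2]
err = np.max(np.abs(K(X_test, X) @ alpha - y_true))
```

The Gram-matrix formulation is what makes training feasible when the explicit polynomial feature space is very large (or, in the paper's extension, infinite).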
Predicting Chaotic System Behavior using Machine Learning Techniques
Rao, Huaiyuan, Zhao, Yichen, Lai, Qiang
Time series data have attracted significant attention across various fields in the natural and social sciences because of their potential applications, and their analysis and prediction have been the focus of extensive research over the past few decades [1]-[5]. Chaotic time series are among the most complex because even a small perturbation in initial values can lead to significant variations in behavior; due to this sensitivity to initial conditions, predicting chaotic behavior is a challenging task. RCs have been developed from the original echo state network (ESN) approach to nonlinear vector autoregression (NVAR), also called NG-RC [18]. An NVAR machine is created where the feature vector is composed of time-delayed observations of the dynamical system, along with nonlinear functions of these observations. It requires no random matrices, has fewer metaparameters, and provides interpretable results that reflect the nature of the nonlinear model. In addition, it is 33-162 times less costly to simulate than a typical reservoir computer.
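The NVAR construction just described, a constant term, time-delayed observations, and their nonlinear (here quadratic) products, feeding a linear least-squares readout, fits in a few lines. The logistic map below is a toy choice for illustration (since the map itself is quadratic, the quadratic features capture it exactly); this is a sketch, not any paper's code:

```python
import numpy as np

# Generate a chaotic series from the logistic map x_{n+1} = r x_n (1 - x_n).
r = 3.9
x = np.empty(300)
x[0] = 0.4
for n in range(299):
    x[n + 1] = r * x[n] * (1 - x[n])

# NVAR feature vector: constant + k time-delayed observations
# + all unique quadratic products of those delays. No random matrices.
k = 2
def nvar_features(window):
    quad = np.outer(window, window)[np.triu_indices(k)]
    return np.concatenate(([1.0], window, quad))

F = np.array([nvar_features(x[n - k + 1 : n + 1]) for n in range(k - 1, 298)])
target = x[k:299]  # one-step-ahead values

W = np.linalg.lstsq(F, target, rcond=None)[0]  # linear readout
err = np.max(np.abs(F @ W - target))
```

Because the target map lies in the span of the quadratic features, the fit error is at machine-precision level, which is the sense in which NVAR results are interpretable: the learned weights mirror the terms of the underlying nonlinear model.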
Tree-based Learning for High-Fidelity Prediction of Chaos
Giammarese, Adam, Rana, Kamal, Bollt, Erik M., Malik, Nishant
Model-free forecasting of the temporal evolution of chaotic systems is crucial but challenging. Existing solutions require hyperparameter tuning, significantly hindering their wider adoption. In this work, we introduce a tree-based approach not requiring hyperparameter tuning: TreeDOX. It uses time delay overembedding as explicit short-term memory and Extra-Trees Regressors to perform feature reduction and forecasting. We demonstrate the state-of-the-art performance of TreeDOX using the Hénon map, Lorenz and Kuramoto-Sivashinsky systems, and the real-world Southern Oscillation Index.
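A stripped-down sketch of the TreeDOX ingredients, time-delay overembedding as short-term memory with an Extra-Trees regressor doing the forecasting, might look as follows. It uses scikit-learn's ExtraTreesRegressor; the data, embedding depth, and tree count are illustrative, and the paper's feature-reduction step is omitted:

```python
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor

# Hénon map data (one of the paper's test systems).
a, b = 1.4, 0.3
x = np.empty(600)
x[0], x[1] = 0.1, 0.1
for n in range(1, 599):
    x[n + 1] = 1 - a * x[n] ** 2 + b * x[n - 1]

# Time-delay overembedding: each sample is a window of d past values.
d = 4
X = np.array([x[n - d : n] for n in range(d, 599)])
y = x[d:599]  # one-step-ahead target

# Extra-Trees need no activation functions or learning-rate tuning;
# feature_importances_ is what TreeDOX uses for its reduction step.
model = ExtraTreesRegressor(n_estimators=100, random_state=0).fit(X[:400], y[:400])
mae = np.mean(np.abs(model.predict(X[400:]) - y[400:]))
```

The appeal is that tree ensembles work well with their defaults, which is the "no hyperparameter tuning" selling point of the abstract.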
Small jet engine reservoir computing digital twin
Wright, C. J., Biederman, N., Gyovai, B., Gauthier, D. J., Wilhelm, J. P.
Machine learning was applied to create a digital twin of a numerical simulation of a single-scroll jet engine. A similar model based on the insights gained from this numerical study was used to create a digital twin of a JetCat P100-RX jet engine using only experimental data. Engine data was collected from a custom sensor system measuring parameters such as thrust, exhaust gas temperature, shaft speed, weather conditions, etc. Data was gathered while the engine was placed under different test conditions by controlling shaft speed. The machine learning model was generated (trained) using a next-generation reservoir computer, a best-in-class machine learning algorithm for dynamical systems. Once the model was trained, it was used to predict behavior it had never seen with an accuracy of better than 1.8% when compared to the testing data.
Quantum Next Generation Reservoir Computing: An Efficient Quantum Algorithm for Forecasting Quantum Dynamics
Sornsaeng, Apimuk, Dangniam, Ninnat, Chotibut, Thiparat
Next Generation Reservoir Computing (NG-RC) is a modern class of model-free machine learning that enables accurate forecasting of time series data generated by dynamical systems. We demonstrate that NG-RC can accurately predict full many-body quantum dynamics in both integrable and chaotic systems. This is in contrast to the conventional application of reservoir computing, which concentrates on predicting the dynamics of observables. In addition, we apply a technique, which we refer to as skipping ahead, to predict far-future states accurately without the need to extract information about the intermediate states. However, adopting a classical NG-RC for many-body quantum dynamics prediction is computationally prohibitive due to the large Hilbert space of sample input data. In this work, we propose an end-to-end quantum algorithm for many-body quantum dynamics forecasting with a quantum computational speedup via the block-encoding technique. This proposal presents an efficient model-free quantum scheme to forecast quantum dynamics coherently, bypassing inductive biases incurred in a model-based approach.
Controlling dynamical systems to complex target states using machine learning: next-generation vs. classical reservoir computing
Haluszczynski, Alexander, Köglmayr, Daniel, Räth, Christoph
Controlling nonlinear dynamical systems using machine learning makes it possible not only to drive systems into simple behavior like periodicity but also into more complex, arbitrary dynamics. For this, it is crucial that a machine learning system can be trained to reproduce the target dynamics sufficiently well. Using the example of forcing a chaotic parametrization of the Lorenz system into intermittent dynamics, we first show that classical reservoir computing excels at this task. We then compare those results, based on different amounts of training data, to an alternative setup in which next-generation reservoir computing is used instead. It turns out that while delivering comparable performance for usual amounts of training data, next-generation RC significantly outperforms classical RC in situations where only very limited data is available. This opens up further practical control applications in real-world problems where data is restricted.
Controlling Chaotic Maps using Next-Generation Reservoir Computing
Kent, Robert M., Barbosa, Wendson A. S., Gauthier, Daniel J.
In this work, we combine nonlinear system control techniques with next-generation reservoir computing, a best-in-class machine learning approach for predicting the behavior of dynamical systems. We demonstrate the performance of the controller in a series of control tasks for the chaotic Hénon map, including controlling the system between unstable fixed points, stabilizing the system to higher-order periodic orbits, and driving it to an arbitrary desired state. We show that our controller succeeds in these tasks, requires only 10 data points for training, can control the system to a desired trajectory in a single iteration, and is robust to noise and modeling error.
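The prediction half of this setup is easy to sketch: quadratic NG-RC features of two delayed observations can represent the Hénon map exactly, so a linear readout fit to only ten input pairs recovers the map essentially to machine precision. This is a hedged illustration of why ten points can suffice, not the authors' controller; the training pairs are drawn at random here rather than from a trajectory:

```python
import numpy as np

rng = np.random.default_rng(2)

def henon(u, v):  # x_{n+1} = 1 - 1.4 x_n^2 + 0.3 x_{n-1}
    return 1 - 1.4 * u ** 2 + 0.3 * v

def feats(u, v):  # constant, delays, and unique quadratic monomials
    return np.array([1.0, u, v, u * u, u * v, v * v])

# Only 10 training pairs, echoing the abstract's 10-point claim.
pts = rng.uniform(-1, 1, (10, 2))
F = np.array([feats(u, v) for u, v in pts])
W = np.linalg.lstsq(F, henon(pts[:, 0], pts[:, 1]), rcond=None)[0]

# The learned one-step model matches the true map; a controller can
# invert such a model to compute the input steering the system to a target.
test = rng.uniform(-1, 1, (100, 2))
pred = np.array([feats(u, v) for u, v in test]) @ W
err = np.max(np.abs(pred - henon(test[:, 0], test[:, 1])))
```

Since the map is exactly in the span of the six features, ten points already overdetermine the six readout weights, which is what makes such tiny training sets viable.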
Learning Spatiotemporal Chaos Using Next-Generation Reservoir Computing
Barbosa, Wendson A. S., Gauthier, Daniel J.
Forecasting the behavior of high-dimensional dynamical systems using machine learning requires efficient methods to learn the underlying physical model. We demonstrate spatiotemporal chaos prediction using a machine learning architecture that, when combined with a next-generation reservoir computer, displays state-of-the-art performance, with training $10^3-10^4$ times faster and a training data set $\sim 10^2$ times smaller than other machine learning algorithms. We also take advantage of the translational symmetry of the model to further reduce the computational cost and training data, each by a factor of $\sim$10.