Beam splitter


Neural surrogates for designing gravitational wave detectors

Ruiz-Gonzalez, Carlos, Arlt, Sören, Lehner, Sebastian, Berzins, Arturs, Drori, Yehonathan, Adhikari, Rana X, Brandstetter, Johannes, Krenn, Mario

arXiv.org Artificial Intelligence

Physics simulators are essential in science and engineering, enabling the analysis, control, and design of complex systems. In experimental sciences, they are increasingly used to automate experimental design, often via combinatorial search and optimization. However, as the setups grow more complex, the computational cost of traditional, CPU-based simulators becomes a major limitation. Here, we show how neural surrogate models can significantly reduce reliance on such slow simulators while preserving accuracy. Taking the design of interferometric gravitational wave detectors as a representative example, we train a neural network as a surrogate for Finesse, the gravitational-wave physics simulator developed by the LIGO community. Even though small changes in physical parameters can change the output by orders of magnitude, the model rapidly predicts the quality and feasibility of candidate designs, allowing an efficient exploration of large design spaces. Our algorithm loops between training the surrogate, inverse-designing new experiments, and verifying their properties with the slow simulator for further training. Assisted by auto-differentiation and GPU parallelism, our method proposes high-quality experiments much faster than direct optimization: solutions that our algorithm finds within hours outperform designs that a direct optimizer takes five days to reach. Though demonstrated for gravitational wave detectors, our framework is broadly applicable to other domains where simulator bottlenecks hinder optimization and discovery.
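The loop the abstract describes — fit a cheap surrogate, inverse-design under it, verify the candidate with the slow simulator, and fold the result back into the training set — can be sketched with a toy stand-in for Finesse. The quadratic objective, the polynomial surrogate, and all names here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def slow_simulator(x):
    """Stand-in for an expensive physics simulator (toy objective:
    design quality peaks at x = 0.7)."""
    return -(x - 0.7) ** 2

# Seed dataset from a few expensive simulator calls.
X = rng.uniform(0, 1, 8)
y = np.array([slow_simulator(x) for x in X])

for _ in range(5):
    # 1. Fit a cheap surrogate (here: quadratic least squares).
    surrogate = np.poly1d(np.polyfit(X, y, 2))

    # 2. "Inverse design": optimize the design variable under the surrogate.
    candidates = np.linspace(0, 1, 1001)
    x_new = candidates[np.argmax(surrogate(candidates))]

    # 3. Verify with the slow simulator and grow the training set.
    X = np.append(X, x_new)
    y = np.append(y, slow_simulator(x_new))

best = X[np.argmax(y)]
```

Only step 3 touches the expensive simulator (one call per iteration); all the candidate screening in step 2 runs against the surrogate, which is where the paper's speedup comes from.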


Modeling and benchmarking quantum optical neurons for efficient neural computation

Andrisani, Andrea, Vessio, Gennaro, Sgobba, Fabrizio, Di Lena, Francesco, Santamaria, Luigi Amato, Castellano, Giovanna

arXiv.org Artificial Intelligence

Quantum optical neurons (QONs) are emerging as promising computational units that leverage photonic interference to perform neural operations in an energy-efficient and physically grounded manner. Building on recent theoretical proposals, we introduce a family of QON architectures based on Hong-Ou-Mandel (HOM) and Mach-Zehnder (MZ) interferometers, incorporating different photon modulation strategies -- phase, amplitude, and intensity. These physical setups yield distinct pre-activation functions, which we implement as fully differentiable modules in software. We evaluate these QONs both in isolation and as building blocks of multilayer networks, training them on binary and multiclass image classification tasks using the MNIST and FashionMNIST datasets. Our experiments show that two configurations -- HOM-based amplitude modulation and MZ-based phase-shifted modulation -- achieve performance comparable to that of classical neurons in several settings, and in some cases exhibit faster or more stable convergence. In contrast, intensity-based encodings display greater sensitivity to distributional shifts and training instabilities. These results highlight the potential of QONs as efficient and scalable components for future quantum-inspired neural architectures and hybrid photonic-electronic systems.
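As a simplified illustration of how two-photon interference can act as a neural pre-activation: for two single photons meeting at a balanced beam splitter, the Hong-Ou-Mandel coincidence probability is (1 − |⟨ψ₁|ψ₂⟩|²)/2, a smooth function of the overlap between the two photons' modes. Encoding the input in one photon's mode and the weights in the other (an illustrative assumption, not the paper's exact modulation scheme) gives a differentiable unit:

```python
import numpy as np

def hom_preactivation(x, w):
    """Illustrative HOM-style pre-activation: coincidence probability
    (1 - |overlap|^2) / 2 at a balanced beam splitter, where the
    overlap is the normalized inner product of the input-encoded
    and weight-encoded photon modes."""
    overlap = np.dot(x, w) / (np.linalg.norm(x) * np.linalg.norm(w))
    return 0.5 * (1.0 - overlap ** 2)

x = np.array([1.0, 0.0])

# Identical modes: the photons bunch, so coincidences vanish.
p_same = hom_preactivation(x, np.array([1.0, 0.0]))

# Orthogonal modes: no interference, coincidence probability 1/2.
p_orth = hom_preactivation(x, np.array([0.0, 1.0]))
```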


Inverse Design of Diffractive Metasurfaces Using Diffusion Models

Hen, Liav, Yosef, Erez, Raviv, Dan, Giryes, Raja, Scheuer, Jacob

arXiv.org Artificial Intelligence

Metasurfaces are ultra-thin optical elements composed of engineered sub-wavelength structures that enable precise control of light. Their inverse design - determining a geometry that yields a desired optical response - is challenging due to the complex, nonlinear relationship between structure and optical properties. This often requires expert tuning, is prone to local minima, and involves significant computational overhead. In this work, we address these challenges by integrating the generative capabilities of diffusion models into computational design workflows. Using an RCWA simulator, we generate training data consisting of metasurface geometries and their corresponding far-field scattering patterns. We then train a conditional diffusion model to predict meta-atom geometry and height from a target spatial power distribution at a specified wavelength, sampled from a continuous band of supported wavelengths. Once trained, the model can generate metasurfaces with low error, either directly using RCWA-guided posterior sampling or by serving as an initializer for traditional optimization methods. We demonstrate our approach on the design of a spatially uniform intensity splitter and a polarization beam splitter, both produced with low error in under 30 minutes. To support further research in data-driven metasurface design, we publicly release our code and datasets.
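Simulator-guided posterior sampling of the kind mentioned above can be caricatured in a few lines: at each reverse-diffusion step, the sample is nudged by the gradient of a differentiable simulator loss, so the trajectory trades off the learned prior against the target response. The denoiser, the loss, and the step sizes below are toy stand-ins (the real model is a trained conditional diffusion network and an RCWA-based loss):

```python
import numpy as np

rng = np.random.default_rng(0)

data_mean = np.zeros(4)                       # "prior" center (toy)
target = np.array([0.3, -0.2, 0.5, 0.1])      # desired design (toy)

def denoise_step(x, t):
    """Stand-in for the trained conditional denoiser: pulls the
    sample toward the learned data distribution."""
    return x + 0.1 * (data_mean - x)

def guidance_grad(x):
    """Gradient of a differentiable simulator loss w.r.t. the design
    (quadratic stand-in for an RCWA-based objective)."""
    return 2.0 * (x - target)

x = rng.normal(size=4)                        # start from pure noise
for t in range(50):                           # reverse diffusion
    x = denoise_step(x, t)
    x = x - 0.05 * guidance_grad(x)           # posterior-sampling correction
```

With these toy step sizes the iterate settles between the prior mean and the target, which mirrors the real trade-off: guidance steers samples toward the desired scattering pattern while the denoiser keeps them on the manifold of fabricable geometries.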


Efficient data transport over multimode light-pipes with Megapixel images using differentiable ray tracing and Machine-learning

Lim, Joowon, Gladrow, Jannes, Kelly, Douglas, O'Shea, Greg, Verkes, Govert, Stefanovici, Ioan, Nowozin, Sebastian, Thomsen, Benn

arXiv.org Artificial Intelligence

Retrieving images transmitted through multi-mode fibers is of growing interest, thanks to their ability to confine and transport light efficiently in a compact system. Here, we demonstrate machine-learning-based decoding of large-scale digital images (pages), maximizing page capacity for optical storage applications. Using a millimeter-sized square cross-section waveguide, we image an 8-bit spatial light modulator, presenting data as a matrix of symbols. Normally, decoders incur a prohibitive O(n^2) computational cost to decode n symbols in spatially scrambled data. However, by combining a digital twin of the setup with a U-Net, we can retrieve up to 66 kB using efficient convolutional operations only. We compare trainable ray-tracing-based twins with eigenmode-based ones and show the former to be superior thanks to their ability to overcome the simulation-to-experiment gap by adjusting to optical imperfections. We train the pipeline end-to-end using a differentiable mutual-information estimator based on the von Mises distribution, generally applicable to phase-coding channels.
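The von Mises distribution mentioned above is the standard Gaussian analogue for circular (phase) variables, with log-density log p(θ) = κ·cos(θ − μ) − log(2π·I₀(κ)); a differentiable estimator for a phase-coded channel would build on this likelihood. A minimal sketch of the density itself (its exact role in the paper's estimator is our assumption):

```python
import numpy as np

def von_mises_logpdf(theta, mu, kappa):
    """Log-density of the von Mises distribution on the circle:
    kappa*cos(theta - mu) - log(2*pi*I0(kappa)), where I0 is the
    modified Bessel function of order zero (numpy.i0)."""
    return kappa * np.cos(theta - mu) - np.log(2 * np.pi * np.i0(kappa))

# Sanity check: the density integrates to 1 over one full period.
grid = np.linspace(-np.pi, np.pi, 20001)
density = np.exp(von_mises_logpdf(grid, mu=0.4, kappa=3.0))
integral = density.sum() * (grid[1] - grid[0])
```

Because the log-density is smooth in both μ and κ, gradients flow through it, which is what makes an estimator built on it usable as an end-to-end training loss.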


Sound-based quantum computers could be built using chip-sized device

New Scientist

A crucial building block for quantum computers based on sound has been shown to work for the first time. One popular way of building quantum computers is to encode information into quantum states of particles of light, then send them through a maze of mirrors and lenses to manipulate that information. Andrew Cleland at the University of Chicago and his colleagues set out to do the same with particles of sound. Sound is created when an object or a substance, like air, vibrates. We hear it as a continuous noise, but it is actually a collection of tiny chunks of vibration, or particles of sound, called phonons.


Breaking the scaling limits of analog computing

#artificialintelligence

As machine-learning models become larger and more complex, they require faster and more energy-efficient hardware to perform computations. Conventional digital computers are struggling to keep up. An analog optical neural network could perform the same tasks as a digital one, such as image classification or speech recognition, but because computations are performed using light instead of electrical signals, optical neural networks can run many times faster while consuming less energy. However, these analog devices are prone to hardware errors that can make computations less precise. Microscopic imperfections in hardware components are one cause of these errors.


Advanced quantum computer made available to the public for first time

New Scientist

A quantum computer that encodes information in pulses of light has solved a task in 36 microseconds that would take the best supercomputer at least 9000 years to complete. The researchers behind the machine have also connected it to the internet, allowing others to program it for their own use – the first time such a powerful quantum computer has been made available to the public. Quantum computers rely on the strange properties of quantum mechanics to theoretically perform certain calculations far more quickly than conventional computers. A long-standing goal in the field, known as quantum advantage or quantum supremacy, has been to demonstrate that quantum computers can actually beat regular machines. Google was the first to do so in 2019 with its Sycamore processor, which can solve a problem involving sampling random numbers that is essentially impossible for classical machines.


Scientists develop a real-life 'cyclops lens' with a laser pointer

Daily Mail - Science & tech

Scientists have taken inspiration from X-Men's Cyclops and created a contact lens that points a red laser at what the wearer is looking at. The regular-size contact lens is fitted with a vertical cavity surface emitting laser (VCSEL) which points in the direction the user is looking. French engineers used off-the-shelf components to create a working prototype which can be used for gaze recognition. Gaze recognition is a budding field of research and could be the next frontier for computer systems. Instead of using a touch screen or a mouse to control a device, gaze recognition would allow users to select options on a display just by looking at them.


A Neural-Net Based on Light Could Best Digital Computers

#artificialintelligence

We now perform mathematical calculations so often and so effortlessly with digital electronic computers that it's easy to forget that there was ever any other way to compute things. In an earlier era, though, engineers had to devise clever strategies to calculate the solutions they needed using various kinds of analog computers. Some of those early computers were electronic, but many were mechanical, relying on gears, balls and disks, hydraulic pumps and reservoirs, or the like. For some applications, like the processing of synthetic-aperture radar data in the 1960s, the analog computations were done optically. That approach gave way to digital computations as electronic technology improved. Curiously, though, some researchers are once again exploring the use of analog optical computers for a modern-day computational challenge: neural-network calculations.


Designing quantum experiments with a genetic algorithm

Nichols, Rosanna, Mineh, Lana, Rubio, Jesús, Matthews, Jonathan C. F., Knott, Paul A.

arXiv.org Artificial Intelligence

We introduce a genetic algorithm that designs quantum optics experiments for engineering quantum states with specific properties. Our algorithm is powerful and flexible, and can easily be modified to find methods of engineering states for a range of applications. Here we focus on quantum metrology. First, we consider the noise-free case, and use the algorithm to find quantum states with a large quantum Fisher information (QFI). We find methods, involving only experimental elements available with current technology, for engineering quantum states with up to a 100-fold improvement over the best classical state, and a 20-fold improvement over the optimal Gaussian state. Such states are a superposition of the vacuum with a large number of photons (around 80), and can hence be seen as Schrödinger-cat-like states. We then apply the two most dominant noise sources in our setting -- photon loss and imperfect heralding -- and use the algorithm to find quantum states that still improve over the optimal Gaussian state with realistic levels of noise. This will open up experimental and technological work in using exotic non-Gaussian states for quantum-enhanced phase measurements. Finally, we use the Bayesian mean square error to look beyond the regime of validity of the QFI, finding quantum states with precision enhancements over the alternatives even when the experiment operates in the regime of limited data.
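Generically, a genetic algorithm of this kind represents each candidate experiment as a sequence of toolbox elements and evolves a population under a fitness such as the QFI of the simulated output state. The sketch below is a bare-bones version in which a toy fitness stands in for the quantum simulation; the toolbox size, genome length, and GA hyperparameters are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup: an "experiment" is a fixed-length sequence of
# toolbox-element choices (e.g. squeezers, displacements, beam splitters).
N_ELEMENTS, LENGTH, POP, GENS = 6, 8, 60, 100
target = rng.integers(0, N_ELEMENTS, LENGTH)  # toy optimum to recover

def fitness(genome):
    """Toy stand-in for the QFI obtained by simulating the setup."""
    return int(np.sum(genome == target))

pop = rng.integers(0, N_ELEMENTS, (POP, LENGTH))
for _ in range(GENS):
    scores = np.array([fitness(g) for g in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]        # keep better half
    children = []
    for _ in range(POP - len(parents)):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, LENGTH)                    # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child[rng.integers(LENGTH)] = rng.integers(N_ELEMENTS)  # mutate one gene
        children.append(child)
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
```

Keeping the parents in the next population (elitism) makes the best fitness monotone non-decreasing, which matters when each fitness evaluation is an expensive quantum-optics simulation.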