Accelerating Search

Communications of the ACM

Workers insert a new CMS beam pipe during maintenance on the Large Hadron Collider.

Everything about the Large Hadron Collider (LHC), the particle accelerator most famous for the Nobel Prize-winning discovery of the elusive Higgs boson, is massive--from its sheer size to the grandeur of its ambition to unlock some of the most fundamental secrets of the universe. At 27 kilometers (17 miles) in circumference, the accelerator is easily the largest machine in the world. This size enables the LHC, housed deep beneath the ground at CERN (the European Organization for Nuclear Research) near Geneva, to accelerate protons to speeds infinitesimally close to the speed of light, thus creating proton-on-proton collisions powerful enough to recreate miniature Big Bangs. The data about the output of these collisions, which is processed and analyzed by a worldwide network of computing centers and thousands of scientists, is measured in petabytes: for example, one of the LHC's main pixel detectors, an ultra-durable, high-precision camera that captures information about these collisions, records an astounding 40 million pictures per second--far too much data to store in its entirety.
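For a rough sense of why the full stream cannot be stored, here is a back-of-envelope calculation; the 40 million snapshots per second comes from the article, while the size per snapshot (about 1 MB) is an illustrative assumption, not a figure from the article.

```python
# Rough, illustrative estimate of the raw data rate implied by
# 40 million detector snapshots per second.
# The bytes-per-snapshot figure is an assumed placeholder, not from the article.

SNAPSHOTS_PER_SECOND = 40_000_000        # stated in the article
ASSUMED_BYTES_PER_SNAPSHOT = 1_000_000   # assumption: ~1 MB per snapshot

bytes_per_second = SNAPSHOTS_PER_SECOND * ASSUMED_BYTES_PER_SNAPSHOT
terabytes_per_second = bytes_per_second / 1e12         # ~40 TB/s
petabytes_per_day = bytes_per_second * 86_400 / 1e15   # ~3,500 PB/day

print(f"~{terabytes_per_second:.0f} TB/s, ~{petabytes_per_day:.0f} PB/day")
```

Under that assumption, even a single detector would produce thousands of petabytes per day, which is why aggressive on-the-fly filtering is unavoidable.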


How Artificial Intelligence Is Addressing Real World Physics Problems

#artificialintelligence

Particle Physics: AI is used in high-energy physics problems. One of the biggest physics discoveries, the Higgs boson particle or "God particle", was discovered with the help of neural networks. Researchers at the Large Hadron Collider (LHC) have to deal with millions of data points each day and go through and analyse them manually, which is a very tedious process. Moreover, particles like the Higgs boson, or any other particle behind a major discovery for that matter, lie hidden in the noise of this data. A quantum computing processor called an annealer helped the LHC detect this particle.
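As a loose illustration of the signal-versus-noise classification described here, the sketch below trains a small neural network to separate a rare synthetic "signal" class from a much larger "background" class. The data, features, and network size are all made up for illustration; this is not the actual LHC analysis pipeline.

```python
# Minimal sketch: a small neural network separating a synthetic "signal"
# class from a dominant "background" class. Purely illustrative data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Background: broad noise. Signal: a small, slightly shifted cluster buried in it.
background = rng.normal(0.0, 1.0, size=(50_000, 10))
signal = rng.normal(0.5, 1.0, size=(2_000, 10))

X = np.vstack([background, signal])
y = np.concatenate([np.zeros(len(background)), np.ones(len(signal))])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=200, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

In practice the signal class is far rarer still, and ranking metrics such as ROC AUC matter more than raw accuracy, but the basic idea is the same: learn a decision boundary that pulls a faint signal out of overwhelming background.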


Job One for Quantum Computers: Boost Artificial Intelligence

#artificialintelligence

In the early '90s, Elizabeth Behrman, a physics professor at Wichita State University, began working to combine quantum physics with artificial intelligence--in particular, the then-maverick technology of neural networks. Most people thought she was mixing oil and water. "I had a heck of a time getting published," she recalled. "The neural-network journals would say, 'What is this quantum mechanics?' and the physics journals would say, 'What is this neural-network garbage?'"


Optimization Search Finds a Heart of Glass

Communications of the ACM

Stanford University visiting researcher Alireza Marandi (right) and post-doctoral scholar Peter McMahon inspect a prototype of a new light-based computer.

A 20th-century theoretical model of the way magnetism develops in cooling solids is driving the development of analog computers that could deliver results with much less electrical power than today's supercomputers. But the work may instead yield improved digital algorithms rather than a mainstream analog architecture. Helmut Katzgraber, associate professor at Texas A&M University in College Station, TX, argues, "There is a deep synergy between classical optimization, statistical physics, high-performance computing, and quantum computing. Those things really go hand in hand."
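The "theoretical model" referred to here is the Ising model of coupled spins, and the optimization problem these analog machines target is finding a low-energy spin configuration. The following minimal sketch, assuming a random coupling matrix, evaluates the Ising energy and lowers it with simulated annealing; it illustrates the underlying optimization problem, not the optical hardware itself.

```python
# Minimal sketch of the Ising optimization problem behind these machines:
# spins s_i in {-1, +1}, couplings J_ij, energy E(s) = -sum_{i<j} J_ij s_i s_j.
# Simulated annealing is used purely as a simple classical baseline;
# the couplings are random and illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 30
J = rng.normal(size=(n, n))
J = np.triu(J, 1)            # keep only i < j couplings (zero diagonal)
J = J + J.T                  # symmetrize for convenient energy evaluation

def energy(s):
    return -0.5 * s @ J @ s  # equals -sum_{i<j} J_ij s_i s_j

s = rng.choice([-1, 1], size=n)
T = 2.0
for step in range(20_000):
    i = rng.integers(n)
    delta = 2 * s[i] * (J[i] @ s)   # energy change from flipping spin i
    if delta < 0 or rng.random() < np.exp(-delta / T):
        s[i] = -s[i]
    T = max(0.01, T * 0.9995)       # slowly "cool" the system

print("final energy:", energy(s))
```

A coherent Ising machine attacks the same energy-minimization problem with pulses of light rather than Monte Carlo updates, which is where the hoped-for power savings come from.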


Estimating the Density of States of Boolean Satisfiability Problems on Classical and Quantum Computing Platforms

arXiv.org Artificial Intelligence

Given a Boolean formula $\phi(x)$ in conjunctive normal form (CNF), the density of states counts the number of variable assignments that violate exactly $e$ clauses, for all values of $e$. Thus, the density of states is a histogram of the number of unsatisfied clauses over all possible assignments. This computation generalizes both the maximum-satisfiability (MAX-SAT) and model counting problems; it not only provides insight into the entire solution space, but also yields a measure of the hardness of the problem instance. Consequently, computing it exactly is typically infeasible in real-world scenarios, even with state-of-the-art algorithms. Since finding an exact answer is computationally intensive, we propose a novel approach for estimating the density of states based on concentration-of-measure inequalities. The methodology results in a quadratic unconstrained binary optimization (QUBO) problem, which is particularly amenable to quantum annealing-based solutions. We present the overall approach and compare results from the D-Wave quantum annealer against the best-known classical algorithms, such as the Hamze-de Freitas-Selby (HFS) algorithm and satisfiability modulo theory (SMT) solvers.
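To make the quantity concrete, the sketch below computes the density of states of a toy CNF formula by exhaustive enumeration; the clause list is made up, and this brute-force approach is precisely what becomes infeasible for the real-world instances the paper targets.

```python
# Brute-force density of states for a tiny CNF formula: for each of the
# 2^n assignments, count how many clauses are violated, then histogram.
# The formula below is a made-up toy example.
from collections import Counter
from itertools import product

# Clauses as tuples of literals: positive i means x_i, negative i means NOT x_i.
clauses = [(1, 2), (-1, 3), (-2, -3), (1, -3), (2, 3)]
n_vars = 3

def violated(assignment, clauses):
    """Number of clauses with no satisfied literal under the assignment."""
    count = 0
    for clause in clauses:
        if not any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause):
            count += 1
    return count

density = Counter(
    violated(assignment, clauses)
    for assignment in product([False, True], repeat=n_vars)
)

for e in sorted(density):
    print(f"{density[e]} assignments violate exactly {e} clauses")
```

The bucket at $e = 0$ is the model count (number of satisfying assignments), and the smallest $e$ with a nonzero bucket is the MAX-SAT optimum, which is why the density of states generalizes both problems.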