The universe could be a neural network -- an interconnected computational system similar in structure to the human brain -- a controversial theory proposes. Artificial neural networks, as built by computer scientists, are made up of many nodes -- the equivalent of biological neurons -- that process and pass on signals. The network can change as it is used -- for instance by increasing the weight given to certain nodes and connections -- allowing it to 'learn' as it goes along. For example, given a set of cat pictures to study, a network can learn to pick out characteristic cat features on its own, and so tell cats apart from other animals. Physicist Vitaly Vanchurin of the University of Minnesota Duluth believes that, on a fundamental level, everything we know may be one of these systems.
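To make the "learning by adjusting weights" idea concrete, here is a minimal sketch of the simplest possible learner, a single-neuron perceptron that nudges its connection weights toward examples it misclassifies. This illustrates weight updates in general; it is a toy, not the networks Vanchurin has in mind, and the "cat-like" features below are invented for illustration.

```python
def train_perceptron(samples, labels, lr=0.1, epochs=20):
    """Train a single neuron. samples: list of feature tuples; labels: 0 or 1."""
    n = len(samples[0])
    weights = [0.0] * n
    bias = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            activation = sum(w * xi for w, xi in zip(weights, x)) + bias
            prediction = 1 if activation > 0 else 0
            error = y - prediction  # -1, 0, or +1
            # The "learning" step: strengthen connections that should have
            # fired, weaken those that fired wrongly.
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

def predict(weights, bias, x):
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0

# Hypothetical toy data: "cat-like" (1) vs "not cat-like" (0) feature pairs.
X = [(1.0, 1.0), (0.9, 0.8), (0.1, 0.2), (0.0, 0.1)]
y = [1, 1, 0, 0]
w, b = train_perceptron(X, y)
```

After a few passes over the data, the learned weights separate the two clusters -- the same weight-adjustment principle, scaled up to billions of connections, underlies the networks described above.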
It's not every day that we come across a paper that attempts to redefine reality. But in a provocative preprint uploaded to arXiv this summer, a physics professor at the University of Minnesota Duluth named Vitaly Vanchurin attempts to reframe reality in a particularly eye-opening way -- suggesting that we're living inside a massive neural network that governs everything around us. In other words, he wrote in the paper, it's a "possibility that the entire universe on its most fundamental level is a neural network." For years, physicists have attempted to reconcile quantum mechanics and general relativity. The former posits that time is universal and absolute, while the latter holds that time is relative, woven into the fabric of space-time.
Everyone knows we're just parasites on a very big onion, right? ;) New research indicates the whole universe could be a giant neural network (TNW). If we're all nodes in a neural network, what's the network's purpose? Is the universe one giant, closed network, or is it a single layer in a grander network? Or perhaps ours is just one of trillions of universes connected to the same network. When we train our own neural networks, we run thousands or millions of cycles until the AI is properly "trained." Are we just one of an innumerable number of training cycles for some larger-than-universal machine's greater purpose?
Physicists have always hoped that once we understood the fundamental laws of physics, they would make unambiguous predictions for physical quantities. We imagined that the underlying physical laws would explain why the mass of the Higgs particle must be 125 gigaelectron-volts, as was recently discovered, and not any other value, and also make predictions for new particles that are yet to be discovered. For example, we would like to predict what kind of particles make up the dark matter. These hopes now appear to have been hopelessly naïve. Our most promising fundamental theory, string theory, does not make unique predictions. It seems to contain a vast landscape of solutions, or "vacua," each with its own values of the observable physical constants. The vacua are all physically realized within an enormous eternally inflating multiverse. Our problems arise because the multiverse is an infinite expanse of space and time. Has the theory lost its mooring to observation?
Since the early days of quantum theory, the concept of wave function collapse has been looked upon as mathematically unquantifiable, observer-dependent, non-local, or simply inelegant. Consequently, modern interpretations of quantum theory often try to avoid or make irrelevant the need for wave collapse. This is ironic, since experimental quantum physics requires some variant of wave collapse wherever quantum phenomena interact with the classical universe of the observer. This paper proposes a pragmatic view in which wave function collapses are treated as real phenomena that occur in pairs. Paired collapses occur when two wave packets exchange real (vs. virtual) momentum-carrying force particles such as photons. To minimize reversibility, such pairs must be separated by a relativistically time-like interval. The resulting model resembles a network of future-predictive simulations (wave packets) linked together by occasional exchanges of data (force particles). Each data exchange “updates” the wave packets by eliminating the need for them to “consider” some range of possible futures. The rest of the paper explores the information processing implications of this idea of networked wave packets. It is postulated that similar networks of simulations in classical computers could provide faster, more efficient ways to process sensor data.