Harnessing noise in optical computing for AI

#artificialintelligence

Artificial intelligence and machine learning are currently affecting our lives in many small but impactful ways. For example, AI and machine learning applications recommend entertainment we might enjoy through streaming services such as Netflix and Spotify. In the near future, it's predicted that these technologies will have an even larger impact on society through activities such as driving fully autonomous vehicles, enabling complex scientific research and facilitating medical discoveries. But the computers used for AI and machine learning demand a lot of energy. Currently, the need for computing power related to these technologies is doubling roughly every three to four months.


Harnessing Noise In Optical Computing For AI - AI Summary

#artificialintelligence

Cloud computing data centers used by AI and machine learning applications worldwide are already devouring more electrical power per year than some small countries. A research team led by the University of Washington has developed new optical computing hardware for AI and machine learning that is faster and much more energy efficient than conventional electronics. Optical computing noise essentially comes from stray light particles, or photons, that originate from the operation of lasers within the device and from background thermal radiation. Of course, the optical computer didn't have a human hand for writing, so its form of "handwriting" was to generate digital images that had a style similar to the samples it had studied, but were not identical to them.
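A hedged illustration of the idea described above (this is not the UW team's actual system, and the names below are invented for the sketch): if noise in a computation is treated as a resource rather than a flaw, a fixed learned "template" plus random photon-like noise yields samples that share a style without being identical, loosely analogous to the optical computer's machine "handwriting".

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_generate(template, noise_scale=0.1):
    """Return a perturbed copy of a learned prototype image.

    The additive Gaussian noise stands in for the stray-photon and
    thermal noise described in the article; each call produces a
    different sample in the same style.
    """
    noise = rng.normal(0.0, noise_scale, size=template.shape)
    return np.clip(template + noise, 0.0, 1.0)

# Stand-in for a learned digit prototype (e.g. an 8x8 grayscale "7").
template = np.full((8, 8), 0.5)

sample_a = noisy_generate(template)
sample_b = noisy_generate(template)
# sample_a and sample_b share the template's structure but differ pixelwise.
```

Each draw is similar to the prototype but never an exact copy, which is the generative behaviour the excerpt describes.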


Light-Powered Computers Brighten AI's Future

#artificialintelligence

The idea of building a computer that uses light rather than electricity goes back more than half a century. "Optical computing" has long promised faster performance while consuming much less energy than conventional electronic computers. The prospect of a practical optical computer has languished, however, as scientists have struggled to make light-based components that can outshine those in existing computers. Despite these setbacks, optical computers might now get a fresh start: researchers are testing a new type of photonic computer chip, which could pave the way for artificially intelligent devices smart enough to drive a car, yet small enough to fit in one's pocket. A conventional computer relies on electronic circuits that switch one another on and off in a dance carefully choreographed to correspond to, say, the multiplication of two numbers.
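That "choreographed switching" can be made concrete with a short sketch: binary multiplication reduces to AND (one switch gating another) plus shift-and-add, which is essentially what a hardware multiplier circuit wires up. The function below is an illustrative software analogue, not a description of any particular chip.

```python
def shift_and_add_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers using only shifts, ANDs, and adds,
    mirroring how on/off switching in a digital multiplier realizes the
    product bit by bit."""
    result = 0
    shift = 0
    while b:
        if b & 1:                  # AND gate: does this bit of b contribute?
            result += a << shift   # shifted partial product, accumulated
        b >>= 1
        shift += 1
    return result

shift_and_add_multiply(6, 7)  # 42
```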


Physical systems perform machine-learning computations

#artificialintelligence

You may not be able to teach an old dog new tricks, but Cornell researchers have found a way to train physical systems, ranging from computer speakers and lasers to simple electronic circuits, to perform machine-learning computations, such as identifying handwritten numbers and spoken vowel sounds. Cornell researchers have successfully trained (from left to right) a computer speaker, a simple electronic circuit and a laser to perform machine-learning computations. The experiment is no mere stunt or parlor trick. By turning these physical systems into the same kind of neural networks that drive services like Google Translate and online searches, the researchers have demonstrated an early but viable alternative to conventional electronic processors – one with the potential to be orders of magnitude faster and more energy efficient than the power-gobbling chips in data centers and server farms that support many artificial-intelligence applications. "Many different physical systems have enough complexity in them that they can perform a large range of computations," said Peter McMahon, assistant professor of applied and engineering physics in the College of Engineering, who led the project. "The systems we performed our demonstrations with look nothing like each other, and they seem to have nothing to do with handwritten-digit recognition or vowel classification, and yet you can train them to do it."
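A minimal sketch of the general idea (the toy "physical_system" and all names below are illustrative assumptions, not the Cornell team's code): treat the hardware as a black box with tunable parameters, observe only its measured outputs, and adjust the parameters, here via finite differences, until the system's response matches a target.

```python
import numpy as np

rng = np.random.default_rng(1)

def physical_system(x, params):
    """Toy stand-in for a speaker, laser, or circuit: a fixed nonlinear
    transform whose behaviour depends on controllable parameters."""
    return np.tanh(params @ x)

def loss(params, x, target):
    """Mean squared error between the measured response and the target."""
    return float(np.mean((physical_system(x, params) - target) ** 2))

x = np.array([0.5, -0.2, 0.1, 0.3])    # fixed input "stimulus"
target = np.array([0.5, -0.5])         # desired measured response
params = rng.normal(scale=0.1, size=(2, 4))

# Finite-difference training loop: the only access to the "hardware"
# is through measured outputs, as with a real physical system.
eps, lr = 1e-4, 0.5
for _ in range(200):
    base = loss(params, x, target)
    grad = np.zeros_like(params)
    for idx in np.ndindex(params.shape):
        p = params.copy()
        p[idx] += eps
        grad[idx] = (loss(p, x, target) - base) / eps
    params -= lr * grad

final = loss(params, x, target)        # close to zero after training
```

The point of the sketch is that nothing in the loop depends on what the system physically is: any sufficiently complex, controllable transform can be trained this way, which is the claim the excerpt attributes to the researchers.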

