Harnessing noise in optical computing for AI

#artificialintelligence

Artificial intelligence and machine learning are currently affecting our lives in many small but impactful ways. For example, AI and machine learning applications recommend entertainment we might enjoy through streaming services such as Netflix and Spotify. In the near future, it's predicted that these technologies will have an even larger impact on society through activities such as driving fully autonomous vehicles, enabling complex scientific research and facilitating medical discoveries. But the computers used for AI and machine learning demand a lot of energy. Currently, the need for computing power related to these technologies is doubling roughly every three to four months.
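
For a sense of scale, the short Python sketch below works out what a doubling time of three to four months implies over a year. The doubling interval quoted above is the only input, and the resulting factors are a back-of-the-envelope estimate rather than a measured figure.

# Rough growth implied by a compute-demand doubling time of 3-4 months.
# The doubling times are the only inputs; they come from the estimate quoted above.

def annual_growth_factor(doubling_months: float) -> float:
    """Return the factor by which demand grows over 12 months."""
    return 2 ** (12.0 / doubling_months)

for months in (3.0, 4.0):
    factor = annual_growth_factor(months)
    print(f"Doubling every {months:.0f} months -> roughly {factor:.0f}x per year")

# Doubling every 3 months -> roughly 16x per year
# Doubling every 4 months -> roughly 8x per year

In other words, a three-to-four-month doubling time corresponds to demand growing by roughly an order of magnitude every year, which is why the energy cost of these systems has become a concern.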



Nano-scale mirror could be a breakthrough for optical computing

Engadget

Using a mere 2,000 atoms of cesium, Professor Julien Laurat and his team at the Pierre and Marie Curie University in Paris have created the world's smallest mirror. According to postdoctoral fellow Neil Corzo, lead author of the team's paper published this week in Physical Review Letters, the nano-mirror achieves the same reflectance as materials that require tens of millions of atoms and could one day lead to new advances in optical computing. The mirror uses a nanoscale optical fiber only 400 nm in diameter to hold the chain of cesium atoms in just the right alignment to reflect the light. Because of the extremely small scale, the atoms had to be spaced precisely at half the wavelength of the light beam, which also means the color of the light had to be chosen to match. As Popular Mechanics notes, the team was able to use the mirror to temporarily trap the light beam, essentially creating a sort of optical diode that can store and retrieve light pulses.
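
As a rough illustration of that half-wavelength condition, the Python sketch below assumes the free-space cesium D2 wavelength of about 852 nm; this value is an assumption for illustration only, since the experiment works with the guided-mode wavelength in the nanofiber, which differs. It computes the implied atom spacing and the overall length of the 2,000-atom chain.

# Illustrative only: atom spacing for a mirror built from atoms spaced at half the optical wavelength.
# The 852 nm figure is the free-space cesium D2 wavelength (an assumption here, not a value from the paper);
# the experiment uses the guided-mode wavelength in the 400 nm fiber, which is somewhat different.

wavelength_nm = 852.0          # assumed free-space cesium D2 wavelength
n_atoms = 2000                 # number of atoms quoted in the article

spacing_nm = wavelength_nm / 2.0              # half-wavelength spacing condition
chain_length_um = n_atoms * spacing_nm / 1000.0

print(f"Atom spacing: {spacing_nm:.0f} nm")                          # ~426 nm
print(f"Chain length for 2,000 atoms: ~{chain_length_um:.0f} um")    # ~852 um, i.e. under a millimetre

The spacing requirement is why the wavelength, and hence the color, of the light cannot be chosen freely: change the wavelength and the atoms are no longer positioned to reflect it.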


Light-based neural network does simple speech recognition

#artificialintelligence

While there are still plenty of things artificial intelligence can't do, science being one of them, neural networks are proving increasingly adept at a huge variety of pattern-recognition tasks. These tasks range from recognizing specific faces in photos to identifying particular patterns of particle decays in physics. Right now, neural networks typically run on conventional computers, which are a poor architectural match: a neuron combines memory and computation in a single unit, while our computers keep those functions separate. For this reason, some companies are exploring dedicated neural-network chips.
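
To see why the fit is poor, it helps to recall what a single neural-network layer computes: a matrix-vector multiply followed by a pointwise nonlinearity, with the weight matrix acting as the layer's memory. The NumPy sketch below illustrates only that mathematical operation, not any particular chip; dedicated hardware, whether electronic or photonic, aims to perform this multiply without shuttling the weights back and forth between separate memory and arithmetic units.

import numpy as np

# One dense layer: y = f(W x + b). This matrix-vector product is the workload
# that dedicated neural-network hardware tries to accelerate by keeping the
# weights co-located with the computation.

rng = np.random.default_rng(0)

n_inputs, n_neurons = 64, 32
W = rng.normal(size=(n_neurons, n_inputs))   # weights: the "memory" of the layer
b = np.zeros(n_neurons)                      # biases
x = rng.normal(size=n_inputs)                # one input sample (e.g. audio features)

y = np.maximum(0.0, W @ x + b)               # ReLU nonlinearity applied pointwise
print(y.shape)                               # (32,)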


Intel Labs Moving Mountains With Neuromorphic Computing And Photonics Technologies

#artificialintelligence

While the industry loves to lump research and development together as "R&D," and we see this on every tech company's P&L, the two are very different. Research is high-risk, market-making investment in discoveries that are not attached to any product. Development is applying that research, along with others' IP, to create end products or services. Very few companies do genuine research, and Intel has had a heritage in research for decades. One of the most exciting aspects of working as a tech analyst is, quite frankly, being among the first to learn of the new, research-driven, cutting-edge technologies coming down the pipeline in the not-so-distant future, from the expected to the truly mind-boggling.