Photonic Computing Company Takes Aim at Artificial Intelligence

#artificialintelligence

Chip startup Lightmatter has received an infusion of $11 million from investors to help bring the world's first silicon photonics processor for AI to market. Using technology originally developed at MIT, the company is promising "orders of magnitude performance improvements over what's feasible using existing technologies."


Photonics startup Lightmatter details its AI optical accelerator chip

#artificialintelligence

Ahead of the Hot Chips conference this week, photonics chip startup Lightmatter revealed the first technical details about its upcoming test chip. Unlike conventional processors and graphics cards, the chip uses light to send signals, promising orders of magnitude higher performance and efficiency. The underlying technology -- photonic integrated circuits -- stems from a 2017 paper coauthored by Lightmatter CEO and MIT alumnus Nicholas Harris that described a novel way to perform machine learning workloads using optical interference. Chips of this kind, including the test chip that is on track for a fall 2021 release, require comparatively little energy because light produces less heat than electricity. They also benefit from reduced latency and are less susceptible to changes in temperature, electromagnetic fields, and noise.
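
As a rough numerical sketch of the idea behind that 2017 line of work (not a description of Lightmatter's actual chip), the key observation is that a weight matrix can be factored, via the singular value decomposition, into unitary matrices and a diagonal scaling; unitaries map naturally onto meshes of interferometers, so the matrix-vector products at the heart of machine learning can be carried out by interference. The matrix size and values below are illustrative only.

```python
import numpy as np

# Sketch of the idea behind photonic matrix multiplication: any real weight
# matrix factors (via SVD) into two unitary matrices and a diagonal scaling,
#   W = U @ diag(s) @ Vh,
# where the unitaries can be realized by meshes of interferometers and the
# diagonal by per-channel attenuation/gain, so the matrix-vector product is
# performed by optical interference rather than by digital MAC units.
rng = np.random.default_rng(42)
W = rng.standard_normal((4, 4))   # toy weight matrix (size chosen for illustration)
x = rng.standard_normal(4)        # toy input vector

U, s, Vh = np.linalg.svd(W)       # W == U @ np.diag(s) @ Vh

y_digital = W @ x                 # conventional multiply-accumulate result
y_optical = U @ (s * (Vh @ x))    # same product via the unitary/diagonal stages

assert np.allclose(y_digital, y_optical)
```

In an actual photonic implementation the unitary and diagonal stages would be realized in hardware rather than computed numerically; the sketch only shows that the factored product reproduces the ordinary multiply-accumulate result.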


Lightmatter aims to reinvent AI-specific chips with photonic computing and $11M in funding

#artificialintelligence

It takes an immense amount of processing power to create and operate the "AI" features we all use so often, from playlist generation to voice recognition. Lightmatter is a startup that is looking to change the way all that computation is done -- and not in a small way. The company makes photonic chips that essentially perform calculations at the speed of light, leaving transistors in the dust. It just closed an $11 million series A.


Developers Turn To Analog For Neural Nets

#artificialintelligence

Machine-learning (ML) solutions are proliferating across a wide variety of industries, but the overwhelming majority of commercial implementations still rely on digital logic. With the exception of in-memory computing, analog approaches have mostly been confined to universities and experiments in neuromorphic computing. However, that's starting to change. "Everyone's looking at the fact that deep neural networks are so energy-intensive when you implement them in digital, because you've got all these multiply-and-accumulates, and they're so deep, that they can suck up enormous amounts of power," said Elias Fallon, software engineering group director for the Custom IC & PCB Group at Cadence. Some suggest we're reaching a limit with digital. "Digital architectural approaches have hit the wall to solve the deep neural network MAC (multiply-accumulate) operations," said Sumit Vishwakarma, product manager at Siemens EDA. "As the size of the DNN increases, weight access operations result in huge energy consumption." The current analog approaches aren't attempting to define an entirely new ML paradigm. "The last 50 years have all been focused on digital processing, and for good reason," said Thomas Doyle, CEO and co-founder of Aspinity.
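
To make the multiply-accumulate argument concrete, here is a minimal, hypothetical sketch in Python/NumPy that counts the MACs (and hence weight fetches) a small fully connected network performs per inference. The layer widths are invented for illustration and do not come from any of the articles above.

```python
import numpy as np

# Hypothetical fully connected network; layer widths are invented for this sketch.
layer_sizes = [784, 1024, 1024, 10]

# A dense layer y = W @ x with W of shape (n_out, n_in) costs n_out * n_in
# multiply-accumulates, and each MAC implies fetching one weight from memory.
macs = sum(n_out * n_in for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]))
print(f"MACs (and weight fetches) per inference: {macs:,}")  # ~1.86 million here

# The same work expressed as explicit matrix-vector products:
rng = np.random.default_rng(0)
x = rng.standard_normal(layer_sizes[0])
for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
    W = rng.standard_normal((n_out, n_in))
    x = np.maximum(W @ x, 0.0)    # every output element is a chain of n_in MACs
```

Even this toy network performs nearly two million MACs per input, each paired with a weight fetch; that per-inference arithmetic and memory traffic is the energy bottleneck the analog and photonic approaches described here aim to sidestep.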


Lightelligence releases prototype of its optical AI accelerator chip

#artificialintelligence

Accelerator chips that use light rather than electrons to carry out computations promise to supercharge AI model training and inference. In theory, they could process algorithms at the speed of light -- dramatically faster than today's speediest logic-gate circuits -- but so far, light's unpredictability has foiled most attempts to emulate transistors optically. Boston-based Lightelligence, though, claims it has achieved a measure of success with its optical AI chip, which debuts today in prototype form. It says latency is improved by up to 10,000 times compared with traditional hardware, and it estimates power consumption at "orders of magnitude" lower. The technology underpinning it has its origins in a 2017 paper coauthored by CEO Yichen Shen.