To Build a Better AI Supercomputer, Let There Be Light
Most artificial intelligence experts seem to agree that taking the next big leap in the field will depend at least partly on building supercomputers on a once unimaginable scale. At an event hosted by the venture capital firm Sequoia last month, the CEO of a startup called Lightmatter pitched a technology that could well enable this hyperscale computing rethink by letting chips talk directly to one another using light.

Data today generally moves around inside computers--and, in the case of training AI algorithms, between chips inside a data center--via electrical signals. Sometimes parts of those interconnections are converted to fiber-optic links for greater bandwidth, but converting signals back and forth between optical and electrical creates a communications bottleneck. Lightmatter instead wants to directly connect hundreds of thousands or even millions of GPUs--the silicon chips crucial to AI training--using optical links.
Apr-4-2024, 16:00:00 GMT