Glaxo's biology research with novel Cerebras machine shows hardware may change how AI is done


Artificial intelligence research experienced a renaissance in the last twenty years, in part as a result of greater computing power, notably the rise of the graphics processing unit, or GPU. Now, novel AI computer systems may be poised to have a similarly large impact. They may change not just the speed of AI work but the kinds of experiments that are done in the field. AI is changing the entire nature of computing, and as part of an inevitable feedback loop, computing will end up changing the nature of AI. An example of that showed up this week in new work being done by GlaxoSmithKline, the big British drug maker.

Cerebras Systems Unveils the Industry's First Trillion Transistor Chip


Cerebras Systems, a startup dedicated to accelerating artificial intelligence (AI) compute, today unveiled the largest chip ever built. Optimized for AI work, the Cerebras Wafer Scale Engine (WSE) is a single chip that contains more than 1.2 trillion transistors and measures 46,225 square millimeters. The WSE is 56.7 times larger than the largest graphics processing unit, which measures 815 square millimeters and contains 21.1 billion transistors. The WSE also contains 3,000 times more high-speed, on-chip memory and has 10,000 times more memory bandwidth. In AI, chip size is profoundly important.

Cerebras Debuts AI Supercomputer-on-a-Wafer – HPCwire


Could a wafer-scale silicon chip from Cerebras Systems be the first "supercomputer on a chip" worthy of the designation? Last week at Hot Chips, held at …

Cerebras teases second-generation wafer-scale AI chip


Cerebras Systems, the Los Altos, California startup that a year ago unveiled the biggest chip ever seen, this afternoon gave a preview of its second-generation chip. The second-gen WSE, or "wafer-scale engine," chip, currently "running in our labs," will offer 850,000 individual compute cores on a chip that takes up almost the entire surface of a standard silicon wafer, according to Cerebras executive Sean Lie. Lie was addressing the audience of Hot Chips, a computer chip conference taking place virtually this year. The processor has 2.6 trillion transistors in total and is manufactured in Taiwan Semiconductor's 7-nanometer fabrication process.

Cerebras unveils the world's chunkiest AI chip


COMPUTER BRAINS are tiny rectangles, becoming tinier with each new generation. Or so it used to be. These days Andrew Feldman, the boss of Cerebras, a startup, pulls a block of Plexiglas out of his backpack. Baked into it is a microprocessor the size of letter paper. "It's the world's biggest," he says proudly, rattling off its technical specs: 400,000 cores (sub-brains), 18 gigabytes of memory and 1.2 trillion transistors.