Cerebras prepares for the era of 120 trillion-parameter neural networks

ZDNet

Cerebras has added to its previously announced CS-2 AI computer with a new switch product, the SwarmX, which performs calculations as well as routing, and a memory computer called MemoryX containing 2.4 petabytes of DRAM and NAND. Artificial intelligence in its deep learning form is producing neural networks that will have trillions upon trillions of neural weights, or parameters, and that increasing scale presents special problems for the hardware and software used to develop such networks. "In two years, models got a thousand times bigger and they required a thousand times more compute," says Andrew Feldman, co-founder and CEO of AI system maker Cerebras Systems, summing up the recent history of neural nets in an interview with ZDNet via Zoom. "That is a tough trajectory," says Feldman. Feldman's company this week is unveiling new computers at Hot Chips, the annual conference on advanced computing chips.


Cerebras Systems Announces World's First Brain-Scale Artificial Intelligence Solution

#artificialintelligence

Cerebras Systems, the pioneer in innovative compute solutions for Artificial Intelligence (AI), today unveiled the world's first brain-scale AI solution. The human brain contains on the order of 100 trillion synapses. The largest AI hardware clusters to date have been on the order of 1% of human brain scale, or about 1 trillion synapse equivalents, called parameters. At only a fraction of full human brain scale, these clusters of graphics processors consume acres of space and megawatts of power, and require dedicated teams to operate. Today, Cerebras announces technology enabling a single CS-2 accelerator, the size of a dorm-room refrigerator, to support models of over 120 trillion parameters.


Cerebras Systems Lays The Foundation For Huge Artificial Intelligence

#artificialintelligence

OK, I thought I was done with preparing for HotChips, having prepared six blogs and three research papers. I was ready to take a few days off. Nonetheless, I decided to take a call from Andrew Feldman, CEO of Cerebras Systems. I have known and respected Andrew for over a decade and he always has exciting things to share. I'm so glad I took the call.


'We can solve this problem in an amount of time that no number of GPUs or CPUs can achieve,' startup Cerebras tells supercomputing conference

ZDNet

For certain classes of problems in high-performance computing, all supercomputers face an unavoidable and fatal bottleneck: memory bandwidth. That is the argument made this week by one startup company at the SC20 supercomputing conference, which usually takes place in San Diego but is being held virtually this year. The company making that argument is Cerebras Systems, the AI computer maker that contends its machine can solve certain problems at speeds no existing system can match. "We can solve this problem in an amount of time that no number of GPUs or CPUs can achieve," Cerebras's CEO, Andrew Feldman, told ZDNet in an interview by Zoom. "This means the CS-1 for this work is the fastest machine ever built, and it's faster than any combination of clustering of other processors," he added.


Glaxo's biology research with novel Cerebras machine shows hardware may change how AI is done

ZDNet

Artificial intelligence research experienced a renaissance over the last twenty years, in part as a result of greater computing power, particularly the rise of the graphics processing unit, or GPU. Now, novel AI computer systems may be poised to have a similarly large impact. They may change not just the speed of AI work but the kinds of experiments that are done in the field. AI is changing the entire nature of computing, and as part of an inevitable feedback loop, computing will end up changing the nature of AI. An example of that showed up this week in new work being done by GlaxoSmithKline, the big British drug maker.