Artificial Intelligence Is Driving A Silicon Renaissance

#artificialintelligence

Bay Area startup Cerebras Systems recently unveiled the largest computer chip in history, ... purpose-built for AI. The semiconductor is the foundational technology of the digital age. It gave Silicon Valley its name. It sits at the heart of the computing revolution that has transformed every facet of society over the past half-century. The pace of improvement in computing capabilities has been breathtaking and relentless since Intel introduced the world's first microprocessor in 1971.


AI Chip Strikes Down the von Neumann Bottleneck With In-Memory Neural Net Processing - News

#artificialintelligence

Computer architecture is a highly dynamic field that has evolved significantly since its inception. Amid all the change and innovation since the 1940s, one concept has remained integral and intact: the von Neumann architecture. Recently, with the growth of artificial intelligence, architects have begun to break the mold and challenge von Neumann's long tenure. Specifically, two companies have teamed up to create an AI chip that performs neural network computations directly in hardware memory. The von Neumann architecture was first introduced by John von Neumann in his 1945 paper, "First Draft of a Report on the EDVAC." In that design, program and data share a single memory, so every operand must be shuttled across a bus between memory and processor before it can be used, a choke point now known as the von Neumann bottleneck.
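The article does not spell out the mechanics of the chip, but a minimal NumPy sketch can illustrate the contrast it describes. This is an illustrative assumption, not the companies' actual design: the "von Neumann" version fetches each weight from memory before computing, while the "in-memory" version models the multiply-accumulate happening where the weights are stored, as a single fused operation.

```python
# Minimal sketch (illustrative only) of the data-movement difference between
# a von Neumann-style matrix-vector multiply and an in-memory one.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((4, 3))   # stored weight matrix
x = rng.standard_normal(3)              # input activations

def von_neumann_mvm(weights, x):
    """Conventional flow: every weight crosses the memory bus to the
    compute unit before it is used -- the traffic behind the bottleneck."""
    y = np.zeros(weights.shape[0])
    for i in range(weights.shape[0]):
        for j in range(weights.shape[1]):
            w = weights[i, j]           # explicit per-weight fetch
            y[i] += w * x[j]            # then compute
    return y

def in_memory_mvm(weights, x):
    """In-memory analogy: the memory array itself produces the whole
    matrix-vector product (e.g., currents summing on crossbar lines),
    so no per-weight fetch occurs. Modeled here as one fused operation."""
    return weights @ x

# Both paths compute the same result; only the data movement differs.
assert np.allclose(von_neumann_mvm(weights, x), in_memory_mvm(weights, x))
```

In real in-memory hardware the second version corresponds to a single analog operation across the memory array; the point of the sketch is the eliminated per-weight traffic, not the arithmetic.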


Big data needs a hardware revolution

#artificialintelligence

Software companies make headlines, but research on computer hardware could bring bigger rewards.


Artificial Intelligence and Moore's law - Technowize

#artificialintelligence

In 1965, Gordon Moore observed that the number of transistors on an integrated circuit had doubled every year since the device's invention in 1958. When Intel, a pioneer of chip development, adopted Moore's law as its standard principle for advancing computing power, the whole semiconductor industry followed suit. Through constant advancement, the electronics industry benefited from this standard cadence of processor design for some 50 years. Today, the focus is shifting toward designing artificial intelligence technology that aims to match the intelligence of the human brain.
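As a quick back-of-the-envelope sketch of what that doubling cadence implies, the snippet below projects transistor counts from the Intel 4004's widely cited ~2,300 transistors (1971), using Moore's 1975 revision of the cadence to doubling every two years. The starting point and period are stated assumptions, not figures from the article.

```python
# Illustrative compounding under Moore's law (assumed parameters noted above).
def transistors(start_count, start_year, year, doubling_period_years=2.0):
    """Project transistor count under an exponential doubling cadence."""
    return start_count * 2 ** ((year - start_year) / doubling_period_years)

for year in (1971, 1981, 1991, 2001):
    print(year, f"{transistors(2300, 1971, year):,.0f}")
# 1971 -> 2,300; 1981 -> 73,600; 1991 -> 2,355,200; 2001 -> 75,366,400
```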