Results


After Moore's Law: Predicting The Future Beyond Silicon Chips

#artificialintelligence

For decades, the principle guiding much of the innovation in computing has been Moore's law: a prediction, made by Intel co-founder Gordon Moore, that the number of transistors on a microprocessor chip would double every two years or so. "Roadmapping" Moore's law has really driven the industry in terms of making faster and smaller transistors.

And we can use (it) for problems that today are very expensive to execute on modern computers: things like image recognition or voice recognition, things that today take an amazing amount of compute power to do well. These domains are, for example, weather prediction, or what we call big data analytics, which is what Google does, or machine learning for recognition of voice or images, ... a lot of high-performance computing simulations, such as thermal process evolution simulations.
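
As an aside, the doubling prediction can be written as a simple rule of thumb. The short sketch below is a hypothetical illustration (not taken from the article) that projects a transistor count forward under the assumption of one doubling every two years; the starting figure of 2,300 transistors corresponds to the Intel 4004.

    # Moore's law as a doubling rule: the transistor count doubles
    # roughly every two years.
    def projected_transistors(initial_count: float, years: float,
                              doubling_period: float = 2.0) -> float:
        """Project a transistor count `years` into the future."""
        return initial_count * 2 ** (years / doubling_period)

    # Example: the Intel 4004 (1971) had about 2,300 transistors.
    # Projected 40 years ahead, the rule gives roughly 2.4 billion,
    # in the same ballpark as chips shipping around 2011.
    print(f"{projected_transistors(2_300, 40):,.0f}")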