Carnegie Mellon Dean Of Computer Science On The Future Of AI

#artificialintelligence

Andrew Moore's career path at Carnegie Mellon has become emblematic of the way the university fosters its star talent. He became a tenured professor at Carnegie Mellon in 2000. In 2006, Moore joined Google, where he was responsible for building a new engineering office. As a vice president of engineering, he oversaw Google Shopping, the company's retail segment. Moore returned to Carnegie Mellon in 2014 as dean of the School of Computer Science.


The chips are down for Moore's law

#artificialintelligence

Next month, the worldwide semiconductor industry will formally acknowledge what has become increasingly obvious to everyone involved: Moore's law, the principle that has powered the information-technology revolution since the 1960s, is nearing its end. A rule of thumb that has come to dominate computing, Moore's law states that the number of transistors on a microprocessor chip will double every two years or so -- which has generally meant that the chip's performance will, too. The exponential improvement that the law describes transformed the first crude home computers of the 1970s into the sophisticated machines of the 1980s and 1990s, and from there gave rise to high-speed Internet, smartphones and the wired-up cars, refrigerators and thermostats that are becoming prevalent today. None of this was inevitable: chipmakers deliberately chose to stay on the Moore's law track. At every stage, software developers came up with applications that strained the capabilities of existing chips; consumers asked more of their devices; and manufacturers rushed to meet that demand with next-generation chips. Since the 1990s, in fact, the semiconductor industry has released a research road map every two years to coordinate what its hundreds of manufacturers and suppliers are doing to stay in step with the law -- a strategy sometimes called More Moore.
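To make the doubling concrete, here is a minimal sketch of the growth that rule of thumb implies. The baseline (the roughly 2,300-transistor Intel 4004 of 1971) and the sampled years are illustrative assumptions, not figures from the article.

```python
# Minimal sketch of Moore's law as stated above: transistor counts double
# roughly every two years. The 1971 baseline (Intel 4004, ~2,300 transistors)
# and the sampled years are illustrative assumptions, not from the article.
def projected_transistors(year, base_year=1971, base_count=2_300, doubling_period=2):
    """Projected transistor count if doubling continues every `doubling_period` years."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Fifty years of doubling every two years works out to a factor of about 33 million, which is the scale of improvement the article credits with turning the crude home computers of the 1970s into today's devices.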


AI is the Next Exascale – Rick Stevens on What that Means and Why It's Important

#artificialintelligence

HPCwire: Walk us through the program and give us a sense of what these AI and science town halls are all about and what they are trying to accomplish.

RS: If you remember back in 2007, we had three town hall meetings – at Argonne, Berkeley and Oak Ridge – that launched the whole DOE Exascale project and so forth. At that time the idea was to get people together and ask them, for exascale, what if we could build these faster machines, what would you do with them. It was a way to get people thinking about the possibility of that, and of course it took a long time to get the exascale computing program going. With these town halls we are kind of asking a variation on that question. Now we're asking what the opportunity is for AI in science, or the application of science, particularly in the context of DOE, but more broadly, because DOE's got a lot of collaborations with NIH and other agencies. So we're really asking the fundamental question of what we have to do in the AI space to make it relevant for science. The point of the town halls – three in the labs and one in Washington in October – is to get people thinking about what opportunities there are in different scientific domains for breakthrough science that can be accomplished by leveraging AI: working AI into simulation, bringing AI into big data, bringing AI to the facility and so forth. So that's the concept; it's really to get the community moving.


After Moore's Law: Predicting The Future Beyond Silicon Chips

NPR Technology

This 2005 silicon wafer with Pentium 4 processors was signed by Gordon Moore for the 40th anniversary of Moore's law. For several decades now, Georgia Tech professor Tom Conte has been studying how to improve computers: "How do we make them faster and more efficient next time around versus what we just made?" And for decades, the principle guiding much of the innovation in computing has been Moore's law -- a prediction, made by Intel co-founder Gordon Moore, that the number of transistors on a microprocessor chip would double every two years or so. What it's come to represent is an expectation, as The New York Times puts it, that "engineers would always find a way to make the components on computer chips smaller, faster and cheaper."


Artificial intelligence positioned to be a game-changer

#artificialintelligence

The search to improve and eventually perfect artificial intelligence is driving the research labs of some of the most advanced and best-known American corporations. They are investing billions of dollars and many of their best scientific minds in pursuit of that goal. All that money and manpower have begun to pay off. In the past few years, artificial intelligence -- or A.I. -- has taken a big leap, making important strides in areas like medicine and military technology. What was once in the realm of science fiction has become day-to-day reality. You'll find A.I. routinely in your smartphone, in your car, in your household appliances, and it is on the verge of changing everything. It was, for decades, primitive technology, but it now has abilities we never expected. It can learn through experience -- much the way humans do -- and it won't be long before machines, like their human creators, begin thinking for themselves, creatively and independently, with judgment -- sometimes better judgment than humans have. As we first reported last fall, the technology is so promising that IBM has staked its 106-year-old reputation on its version of artificial intelligence called Watson -- one of the most sophisticated computing systems ever built.