Results


Inside Microsoft's AI Comeback

#artificialintelligence

But while his peer scientists Yann LeCun and Geoffrey Hinton have signed on to Facebook and Google, respectively, Bengio, 53, has chosen to continue working from his small third-floor office on the hilltop campus of the University of Montreal. Shum, who is in charge of all of AI and research at Microsoft, has just finished a dress rehearsal for next week's Build developers conference, and he wants to show me demos. Shum has spent the past several years helping his boss, CEO Satya Nadella, make good on his promise to remake Microsoft around artificial intelligence. Bill Gates showed off a mapping technology in 1998, for example, but it never came to market; Google launched Maps in 2005.


Moore's Law may be out of steam, but the power of artificial intelligence is accelerating

#artificialintelligence

A paper from Google's researchers says they simultaneously used as many as 800 of the powerful and expensive graphics processors that have been crucial to the recent uptick in the power of machine learning (see "10 Breakthrough Technologies 2013: Deep Learning"). Feeding data into deep learning software to train it for a particular task is much more resource intensive than running the system afterwards, but even that still takes significant oomph. Intel has slowed the pace at which it introduces generations of new chips with smaller, denser transistors (see "Moore's Law Is Dead. Now What?"). That slowdown also motivates the startups--and giants such as Google--creating new chips customized to power machine learning (see "Google Reveals a Powerful New AI Chip and Supercomputer").


Artificial Intelligence and Moore's law - Technowize

#artificialintelligence

Gordon Moore observed that from the invention of the first integrated circuit in 1958 until 1965, the number of components, or transistor density, in an integrated circuit had doubled every year. So when Intel, the pioneer of chip development, adopted Moore's law as the standard principle for advancing computing power, the whole semiconductor industry followed the same outline for its chips. With constant advancement, the electronics industry benefited from Moore's standard method of designing processor chips for some 50 years. Technology today is tending toward artificial intelligence that matches the intelligence of the human brain.
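Moore's original observation is simple doubling arithmetic. As a minimal sketch of the growth rate the blurb describes (the 1958 start year and yearly doubling come from the text; the starting count of one component is an illustrative assumption):

```python
def components(year, base_year=1958, base_count=1):
    """Component count under Moore's original yearly-doubling observation."""
    return base_count * 2 ** (year - base_year)

# Doubling every year from 1958 to 1965 spans 7 doublings: 2**7 = 128.
print(components(1965))  # 128
```

Moore later revised the doubling period to roughly every two years, which is the rate usually quoted today; the same function models that by interpreting the exponent accordingly.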


How will Cognitive Computing Change the World.

#artificialintelligence

According to IBM CEO Ginni Rometty, it isn't going to be long (five years, per her statement at ThinkForum) before every decision made by a business is partly made by a cognitive system. These systems are touted as systems that can learn, can understand, and can help define best practice in business. Last summer, when Rometty was speaking at ThinkForum in Sydney, Australia, it didn't sound as though she considered it fiction on any level; in fact, much of what she's discussing already exists and is working to save us time and money. "Every industry has its Uber or Tesla, and many people say they are going to be a technology company of some kind. An important question is: When everyone is digital, who wins? Digital for all has to be the foundation, but it's not the destination."


How computing will change amid challenges to Moore's Law

#artificialintelligence

Mark Papermaster is the chief technology officer and senior vice president of technology at AMD. We are in the midst of a true inflection point in computing, and the very way we interface with technology daily is changing. The rapid inclusion of embedded sensors and internet connectivity is turning most of the appliances we use into "smart devices" that can respond to our voice commands, while generating masses of data that are in turn analyzed in edge-of-network hub computers or the cloud. Virtual and augmented reality (VR/AR) are just starting to see adoption ramp, and these technologies require significant compute and graphics processing to deliver a more lifelike experience. This is coupled with the phenomenal advancement in machine learning applications that can be trained to sift through masses of data and deliver timely, context-aware information, or take over mundane tasks.


A framework for Industry 4.0 - welcome to the next industrial revolution

#artificialintelligence

We're surrounded by more and more connected devices we're calling the Internet of Things. We can turn our heating on from our phones on the commute home. Pegs can tell us when to bring the washing in so it doesn't get wet. Cars know the hazards ahead and warn us before we get there so that we can avoid them. Many of the 'things' have been manufactured within the 'Industrial Internet of Things', or 'Industry 4.0'.


How AI, Cloud, And Robots Will Revolutionize SMB Accounting - Level 3

#artificialintelligence

Robots, cloud software and artificial intelligence are all things that accountants fear will make them irrelevant. But the truth is that the real future of accounting and technology isn't a technological advancement, it's a trend toward a different model altogether. Technology is evolving rapidly; according to Accenture, 80% of accounting and finance tasks will be delivered with automation in the next few years. This leaves many professionals wondering what the future of their jobs will look like, and where they fit into that future. There's no reason to be afraid.


Let's Take an In-Depth Look at Current Advances in Artificial Intelligence

#artificialintelligence

Artificial intelligence is one of the most prominent technologies currently being advanced. Not only is it a hot topic for researchers, but some of the world's greatest technological minds are fearful of its potential. Bill Gates, Stephen Hawking, Elon Musk, and hundreds of the world's top minds have signed open letters stating their fears about the destructive potential of AI systems. Despite this opposition from top minds, advances in the industry continue. Integrated AI systems today are already helping us get through daily life, according to Wired.