What is driving the 'robot age', and how can businesses leverage the capabilities being produced? Artificial intelligence is one of the 21st century's dominant fields of innovation, so it's no surprise that cutting-edge robots and other advanced smart machines fall under the rapidly expanding Internet of Things, which is projected to reach 25 billion devices by 2020. Every day we read headlines about machines getting 'smarter' and robotics transforming a variety of industries. But what is driving this 'robot age', and how can businesses successfully integrate and leverage this advanced automation? It's clear that artificial intelligence (AI) is fueling a new industrial revolution, one that is driving the rise of robotics.
Just over five years ago, IBM's Watson supercomputer crushed its opponents on the televised quiz show Jeopardy. The original Watson used advanced algorithms and natural-language interfaces to find and narrate answers. It was hard to foresee then, but artificial intelligence now permeates our daily lives. Since that victory, IBM has expanded the Watson brand into a cognitive computing package of hardware and software used to diagnose diseases, explore for oil and gas, run scientific computing models, and enable cars to drive autonomously, and the company has now announced new AI hardware and software packages.
Intel is taking a new direction in chip development as it looks to the future of artificial intelligence, betting that the technology will pervade applications and web services. On Thursday the company said it is developing new chips to handle AI workloads, which it expects to become an increasingly large part of its business. For now, the AI chips will be released as specialized primary chips or co-processors, separate from the major product lines. Over time, though, Intel could adapt and integrate the AI features into its mainstream server, IoT, and perhaps even PC chips. Such features could be useful in servers, drones, robots, and autonomous cars.
We alluded to the combination of deep learning and IoT previously, where we noted that deep-learning algorithms play an important role in IoT analytics because machine data is sparse and/or has a temporal element. Devices may behave differently under different conditions, so capturing every scenario during the pre-processing and training stages of an algorithm is difficult. Deep-learning algorithms help mitigate this risk by enabling models to learn on their own, and the concept of machines learning on their own can be extended to machines teaching other machines.
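A full deep-learning model is beyond a short sketch, but the underlying idea of a device-side model that keeps learning from a temporal sensor stream can be illustrated with a much simpler stand-in: an online anomaly detector that updates its own statistics with every reading (Welford's running-variance algorithm). The class name, threshold, and sample readings below are illustrative, not from any particular IoT product.

```python
import math

class OnlineAnomalyDetector:
    """Toy stand-in for a self-learning IoT model: maintains a running
    mean/variance of a sensor stream (Welford's algorithm) and flags
    readings that deviate by more than `threshold` standard deviations."""

    def __init__(self, threshold=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0          # running sum of squared deviations
        self.threshold = threshold

    def update(self, x):
        """Consume one reading; return True if it looks anomalous."""
        anomalous = False
        if self.n > 1:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) > self.threshold * std:
                anomalous = True
        # The model keeps learning from every reading it sees,
        # so no up-front training set covering all scenarios is needed.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

# A stable temperature stream with one spike:
detector = OnlineAnomalyDetector()
readings = [20.1, 20.3, 19.9, 20.2, 20.0, 35.0, 20.1]
flags = [detector.update(r) for r in readings]
# Only the 35.0 spike is flagged; the model then absorbs it
# and continues adapting to subsequent readings.
```

Because the detector updates itself incrementally, it also hints at the "machines teaching machines" idea: a device could ship its learned statistics to a neighboring device as that device's starting point.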
Drones and robots are gaining computer vision through higher-resolution cameras and artificial intelligence that can recognize objects and images. Many are built on developer boards like Nvidia's Jetson TX1, which provides the smarts for auto-navigation and collision avoidance: the TX1 has the horsepower to process live image feeds, along with software tools to analyze visuals and provide context on the fly. Nvidia's new JetPack 2.3 software tools for the TX1, announced on Tuesday, are a major update that doubles the board's deep-learning performance, leaving the TX1 much faster and better equipped to handle AI and image processing.