FANUC, the world's largest maker of industrial robots, plans to connect 400,000 of its installed systems by the end of this year. The goal is to collect data about their operations and, through deep learning, improve performance: the system proactively detects potential equipment or process problems and reports them before unexpected downtime occurs. Similarly, Kuka is building a deep-learning AI network for its industrial robots.
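FANUC's actual monitoring system is proprietary, but the idea of flagging a robot before it fails can be sketched with a simple fleet-telemetry check. The example below is purely illustrative (the sensor values, threshold, and function name are all invented here): it flags readings that drift several standard deviations from the norm, one of the simplest forms of anomaly detection used in predictive maintenance.

```python
# Illustrative sketch only -- not FANUC's system. Flags sensor readings
# that deviate sharply from the rest of the series (z-score test).
from statistics import mean, stdev

def flag_anomalies(readings, threshold=2.5):
    """Return indices of readings more than `threshold` standard
    deviations from the mean of the series."""
    mu, sigma = mean(readings), stdev(readings)
    if sigma == 0:
        return []  # perfectly flat series: nothing to flag
    return [i for i, x in enumerate(readings)
            if abs(x - mu) / sigma > threshold]

# Hypothetical motor-torque telemetry; the spike at index 5 stands out.
torque = [1.01, 0.99, 1.02, 0.98, 1.00, 1.95, 1.01, 0.97, 1.03, 1.00]
print(flag_anomalies(torque))  # → [5]
```

A production system would look at trends across many sensors and many robots rather than a single series, but the principle is the same: learn what "normal" looks like, then alert on deviations early enough to schedule maintenance.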
Many data scientists are excited about advances in large-scale machine learning, particularly recent successes in computer vision and speech. While the promise of machine learning is correctly identifying meaningful patterns in data sets, AI in the data center is currently the buzz among big tech companies such as Google, Amazon, Facebook AI Research, and Twitter, along with many startups that share the goal of making useful machine learning software. For researchers tackling the scientific and engineering challenges of understanding the brain and building computers, Neural Computation highlights common problems and techniques in modeling the brain and in the design and construction of neurally inspired information processing systems. Much of the tooling is open source, including H2O (deep learning and AI for business) and Spark (the SparkNet batch processing framework), which makes the algorithm development and the value-add easy to see.
IBM (NYSE: IBM) today revealed a series of new servers designed to help propel cognitive workloads and to drive greater data center efficiency. Featuring a new chip, the Linux-based lineup incorporates innovations from the OpenPOWER community that deliver higher performance and greater computing efficiency than is available on any x86-based server. Collaboratively developed with some of the world's leading technology companies, the new Power Systems are designed to propel artificial intelligence, deep learning, high-performance data analytics, and other compute-heavy workloads, which can help businesses and cloud service providers save money on data center costs. The three new systems expand IBM's Linux server portfolio, a specialized line of servers co-developed with fellow members of the OpenPOWER Foundation. They join the Power Systems LC lineup, which is designed to outperform x86-based servers on a variety of data-intensive workloads.
From smartphone assistants to image recognition and translation, machine learning already helps us in our everyday lives. But it can also help us tackle some of the world's most challenging physical problems, such as energy consumption. Large-scale commercial and industrial systems like data centers consume a lot of energy, and while much has been done to stem the growth of energy use, there remains much more to do given the world's increasing need for computing power. Google is taking many steps to reduce energy consumption: compared with five years ago, it now gets around 3.5 times the computing power out of the same amount of energy.
Google's 2014 acquisition of artificial intelligence startup DeepMind has resulted in a 15 percent improvement in the search giant's power usage efficiency, DeepMind's co-founder reportedly said. Google reportedly paid $650 million for the London-based firm, founded by neuroscientist Demis Hassabis along with Shane Legg and Mustafa Suleyman. The company's AI mastered Atari video games, but most of its projects have yet to translate into revenue. It is, however, helping the tech giant tackle the massive energy bills it incurs because of its data centers. Google said it used 4,402,836 MWh of electricity in 2014, equivalent to the average yearly consumption of about 366,903 U.S. family homes, according to Bloomberg.
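To see why a seemingly modest efficiency gain matters at this scale, it helps to recall how data center efficiency is measured. Power usage effectiveness (PUE) is the ratio of total facility energy to the energy delivered to IT equipment; an ideal facility scores 1.0, with everything above that going to cooling, power conversion, and other overhead. The sketch below uses entirely hypothetical figures (the function and numbers are not Google's) to show how trimming the non-IT overhead, as DeepMind's models reportedly helped do, translates into energy saved.

```python
# Illustrative arithmetic only -- hypothetical figures, not Google's data.

def pue(total_facility_mwh, it_equipment_mwh):
    """Power usage effectiveness: total facility energy over IT energy.
    A perfect facility would score 1.0."""
    return total_facility_mwh / it_equipment_mwh

# Hypothetical facility: 1,000,000 MWh total, 833,333 MWh to IT gear.
it_energy = 833_333
total = 1_000_000
baseline = pue(total, it_energy)              # ~1.20

# Cut the overhead (cooling, power conversion, etc.) by 15%:
overhead = total - it_energy
improved_total = it_energy + overhead * 0.85
improved = pue(improved_total, it_energy)     # ~1.17

savings = total - improved_total              # ~25,000 MWh
print(f"baseline PUE {baseline:.2f}, improved PUE {improved:.2f}, "
      f"saved {savings:,.0f} MWh")
```

At the consumption figures Google reported for 2014, even a few percentage points of PUE improvement represents tens of thousands of megawatt-hours per year, which is why machine-learned control of cooling systems is attractive despite its complexity.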