With IBM POWER9, we're all riding the AI wave - IBM Systems Blog: In the Making


There's a big connection between my love for water sports and hardware design -- both involve observing waves and planning several moves ahead. Four years ago, when we started sketching the POWER9 chip from scratch, we saw an upsurge of modern workloads driven by artificial intelligence and massive data sets. We are now ready to ride this new tide of computing with POWER9. It is a transformational architecture and an evolutionary shift from the archaic ways of computing promoted by x86. POWER9 is loaded with industry-leading new technologies designed for AI to thrive.

An Old Technique Could Put Artificial Intelligence in Your Hearing Aid


Dag Spicer is expecting a special package soon, but it's not a Black Friday impulse buy. The fist-sized motor, greened by corrosion, is from a historic room-sized computer intended to ape the human brain. It may also point toward artificial intelligence's future. Spicer is senior curator at the Computer History Museum in Mountain View, California. The motor in the mail is from the Mark I Perceptron, built by Cornell researcher Frank Rosenblatt in 1958.

What being an "AI first" company means for Google


Back at Google I/O, CEO Sundar Pichai outlined the company's vision as an "AI first" company, with a new focus on contextual information, machine learning, and using intelligent technology to improve customer experience. The launch of the Pixel 2 and 2 XL, the latest batch of Google Home products, and Google Clips offer a glimpse into what this long-term strategic shift could mean. We'll get to Google's latest smartphones in a minute, but there's much more to explore about the company's latest strategy. As part of the Google I/O 2017 keynote, Sundar Pichai announced that the company's various machine learning and artificial intelligence efforts and teams are being brought together under a new initiative, Google.ai, which will focus not only on research but also on developing tools such as TensorFlow and its new Cloud TPUs, and on "applied AI".



With the boom in digital technologies, the world is producing over 2.5 exabytes of data every day. To put that into perspective, it is equivalent to the combined storage of 5 million laptops or 150 million phones. This deluge of data is forecast to grow with each passing day, and with it the need for powerful hardware to support it: faster processors and larger storage systems. Companies worldwide are investing in high-performance computing, with R&D teams in a constant race to build improved processors.
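A quick sanity check on the comparison above (the laptop and phone capacities are implied by the article's figures, not stated in it):

```python
# Back-of-the-envelope check: dividing 2.5 exabytes by the device counts
# above implies ~500 GB of storage per laptop and ~17 GB per phone.
daily_data_bytes = 2.5e18                       # 2.5 exabytes

per_laptop = daily_data_bytes / 5_000_000       # bytes per laptop
per_phone = daily_data_bytes / 150_000_000      # bytes per phone

print(f"per laptop: {per_laptop / 1e9:.0f} GB")   # per laptop: 500 GB
print(f"per phone:  {per_phone / 1e9:.1f} GB")    # per phone:  16.7 GB
```

Both implied capacities are plausible for 2017-era consumer devices, so the comparison holds up arithmetically.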

The Role of Hadoop in Digital Transformations and Managing the IoT


The digital transformation underway at Under Armour is erasing any stale stereotypes that athletes and techies don't mix. While hardcore runners sporting the company's latest microthread singlet can't see Hadoop, Apache Hive, Apache Spark, or Presto, these technologies are teaming up to track some serious mileage. Under Armour is working on a "connected fitness" vision that connects body, apparel, activity level, and health. By combining the data from all these sources into an app, consumers will gain a better understanding of their health and fitness, and Under Armour will be able to identify and respond to customer needs more quickly with personalized services and products. The company stores and analyzes data about food and nutrition, recipes, workout activities, music, sleep patterns, purchase histories, and more.

Artificial Intelligence: An Historic Perspective


We've discussed artificial intelligence (AI) quite a bit in this column thus far -- and with good reason. AI is currently THE topic in legal tech (although Blockchain is certainly running a close second), and it's almost impossible to carry on an in-depth discussion on the future of the legal industry without mentioning AI. Legal professionals, librarians, and analysts alike have speculated on the rise of the robo-lawyer, the role that increasingly sophisticated machines will play in the practice of law -- and even whether lawyers will cease to exist at some point in the future. Given the way in which AI has penetrated the conversation around legal technology, I think it makes sense to examine AI's larger history. To quote from one of my favorite musicians, Bob Marley: "In this great future, we can't forget our past."



Outcomes included the development of symbolic information processing, which offered a new paradigm in brain modelling. Rather than pursuing true general intelligence, more companies and researchers are settling for Weak AI programs such as Siri, Alexa, Cortana, and chatbots. Not all researchers are content to settle for the Weak AI compromise, however, and dedicated purists continue to pursue true artificial general intelligence (AGI). Breakthroughs and developments in Weak AI can be rapid, and each receives considerable public attention along with further investment of resources.

Taking Machine Learning to the Edge - insideBIGDATA


In this special guest feature, Matthew C. King, IIoT Solutions Expert at FogHorn Systems, discusses how edge machine learning combines two hot industry trends: moving industrial internet of things (IIoT) compute to the edge of the network, and the ability to model new efficiencies in industrial assets. Edge machine learning allows industrial companies to build complex algorithms that optimize machines, processes, and even entire factories, all while eliminating crippling bandwidth and storage requirements. It mines industrial data to create better outcomes, safer working conditions, and significant cost savings.
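To make the bandwidth argument concrete, here is a toy sketch (a hypothetical illustration, not FogHorn's actual product): an edge node scores each sensor reading locally and would transmit only the anomalies upstream, instead of streaming the raw feed.

```python
from collections import deque

class EdgeAnomalyDetector:
    """Toy edge-side filter: keep a rolling window of sensor readings and
    flag values deviating from the rolling mean by more than `threshold`
    standard deviations. Only flagged readings would be sent upstream,
    cutting bandwidth from the full stream to a handful of alerts."""

    def __init__(self, window=50, threshold=3.0):
        self.readings = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value):
        """Return True if `value` is anomalous relative to the window."""
        anomalous = False
        if len(self.readings) >= 10:  # need some history before scoring
            mean = sum(self.readings) / len(self.readings)
            var = sum((x - mean) ** 2 for x in self.readings) / len(self.readings)
            std = var ** 0.5
            anomalous = std > 0 and abs(value - mean) > self.threshold * std
        self.readings.append(value)
        return anomalous

# A steady signal with one spike: only the spike leaves the edge node.
detector = EdgeAnomalyDetector()
stream = [10.0 + 0.1 * (i % 5) for i in range(100)] + [50.0]
alerts = [v for v in stream if detector.observe(v)]
print(alerts)  # [50.0]
```

One reading crosses the wire instead of 101: that ratio, applied to thousands of high-frequency sensors, is the storage and bandwidth saving the article describes.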

277 Data Science Key Terms, Explained


This post presents a collection of data science key terms with concise, no-nonsense definitions, organized into 12 distinct topics. Deep learning, for example, is enjoying a surge in research and industry, due mainly to its incredible successes in a number of different areas; it is the process of applying deep neural network technologies, that is, neural network architectures with multiple hidden layers, to solve problems. Like data mining, deep learning is a process, one that employs deep neural network architectures, which are particular types of machine learning algorithms. Among its topics, the post also presents 16 key database concepts with correspondingly concise, straightforward definitions.
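To make the "multiple hidden layers" phrasing concrete, here is a minimal forward pass through a network with two hidden layers (a toy NumPy sketch; the layer sizes and random weights are arbitrary illustration, not from the post):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Standard rectified-linear activation for the hidden layers.
    return np.maximum(0, x)

# A "deep" network is simply one with more than one hidden layer.
# Here: 4 inputs -> hidden(8) -> hidden(8) -> 3 outputs.
layer_sizes = [4, 8, 8, 3]
weights = [rng.normal(size=(m, n)) for m, n in zip(layer_sizes, layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x):
    """Forward pass: each hidden layer applies an affine map plus ReLU;
    the final layer is left linear."""
    for i, (W, b) in enumerate(zip(weights, biases)):
        x = x @ W + b
        if i < len(weights) - 1:   # non-linearity on hidden layers only
            x = relu(x)
    return x

out = forward(rng.normal(size=4))
print(out.shape)  # (3,)
```

Training such a network (fitting the weights by backpropagation) is what the "learning" in deep learning refers to; the stack of hidden layers is what makes it "deep".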

Intel unveils AI-focused Movidius VPU chip


Intel on Monday announced its next-generation Movidius vision processing unit, with improved processing capabilities for edge devices such as drones, VR headsets, smart cameras, wearables, and robots. The latest VPU is the Myriad X system-on-chip, equipped with a dedicated Neural Compute Engine to support deep learning inference at the edge. The earlier Movidius Myriad 2 vision processing unit is being used to run deep neural networks for higher-accuracy, local video analytics; Intel's Movidius has also launched an AI accelerator on a $79 USB stick, the Movidius Neural Compute Stick, which compiles, tunes, and accelerates neural networks at the edge.