If you are looking for an answer to the question What is Artificial Intelligence? and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
The latest proprietary Power servers from IBM, armed with the long-awaited IBM Power9 processors, aim to stay relevant for next-generation enterprise workloads, but the company will need some help from its friends to take on its biggest market challenger. IBM emphasizes increased speed and bandwidth with its AC922 Power Systems to better take on high-performance computing tasks, such as building models for AI and machine learning training. The company said it plans to pursue mainstream commercial applications, such as building supply chains and medical diagnostics, but those broader-based opportunities may take longer to materialize. "Most big enterprises are doing research and development on machine learning, with some even deploying such projects in niche areas," said Patrick Moorhead, president and principal analyst at Moor Insights & Strategy. "But it will be 12 to 18 months before enterprises can even start driving serious volume in that space."
Companies running AI applications often need as much computing muscle as researchers who use supercomputers do. IBM's latest system is aimed at both audiences. The company last week introduced its first server powered by the new Power9 processor designed for AI and high-performance computing. The powerful technologies inside have already attracted the likes of Google and the US Department of Energy as customers. The new IBM Power System AC922 is equipped with two Power9 CPUs and from two to six NVIDIA Tesla V100 GPUs.
Tesla is taking its self-driving future into its own hands, which Elon Musk thinks will help usher the company into an era of fully autonomous vehicles in just two years. Musk confirmed the company has a team hard at work developing its own AI chips, which future Teslas will depend on in place of the Nvidia units currently used in the automaker's all-electric vehicles. Musk dropped the news at a private company party in Long Beach, according to CNBC. The team is being headed by Jim Keller, a former AMD and Apple chip architect who helped to design the iPhone maker's A4 and A5 processors. A Keller-led Tesla chip project was rumored back in September, but reports then claimed the automaker was working closely with chipmaker AMD to test the tech, which both companies denied.
Investors beware: there's plenty of buzz around artificial intelligence (AI) as more and more companies say they're using it. In some cases, companies are using older data analytics tools and labeling it as AI for a public relations boost. But identifying companies actually getting material revenue growth from AI can be tricky. AI uses computer algorithms to replicate the human ability to learn and make predictions. AI software needs computing power to find patterns and make inferences from large quantities of data.
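The core idea behind "learning and making predictions" can be shown with a minimal sketch, unrelated to any vendor's stack: fit a line y = w*x + b to observed data by gradient descent, then use the fitted parameters to predict new inputs. The function name and the toy data here are illustrative, not from any product mentioned above.

```python
# Minimal sketch of "learning from data": fit y = w*x + b by gradient
# descent on mean squared error, then predict on unseen inputs.
def fit_line(xs, ys, lr=0.01, steps=2000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of mean squared error with respect to w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Train on points drawn from y = 3x + 1, then inspect the learned parameters.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [3 * x + 1 for x in xs]
w, b = fit_line(xs, ys)
print(round(w, 2), round(b, 2))  # converges near 3.0 and 1.0
```

Real AI workloads apply the same pattern-finding loop to millions of parameters and huge datasets, which is why they need the computing power the surrounding articles describe.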
IBM is ready to start shipping the first commercial server systems built around its recently released Power9 processor. Dubbed the AC922 Power Systems, these servers will ship by the end of December, and are specifically designed for artificial intelligence (AI) workloads, reports Enterprise Cloud News (Banking Technology's sister publication). The AC922 is the commercial version of the same servers that IBM, along with Nvidia and Mellanox Technologies, is using to build two new supercomputers for the US Department of Energy. The "Summit" and "Sierra" supercomputers are expected to go online in 2018, and could reinvigorate the US's standing in the world of high-performance computing. At the heart of the AC922 is IBM's recently released Power9 processor.
IBM launched its first systems based on its Power9 processor and optimized for artificial intelligence workloads. Big Blue's Power Systems Servers can improve training times of deep learning frameworks by 4x, according to IBM. The Power9 processors and systems built on them are partly the product of collaboration in the OpenPower Foundation, which includes IBM, Google, Mellanox, Nvidia and a bevy of other players. Those technologies are designed to boost bandwidth and throughput in data movement; cutting the cost of that movement is what shortens model training time.
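Why interconnect bandwidth translates into training speedups can be seen with a back-of-envelope model: when a training step is bound by data movement rather than arithmetic, step time scales inversely with link bandwidth. The numbers below are illustrative assumptions, not IBM's or Nvidia's published figures.

```python
# Illustrative roofline-style model: per training step, time is bounded by
# whichever is slower, compute or host-accelerator data movement.
def step_time(flops, flops_per_s, bytes_moved, bytes_per_s):
    return max(flops / flops_per_s, bytes_moved / bytes_per_s)

# A bandwidth-bound step: 1 TFLOP of work, 64 GB moved per step (made-up
# workload), on a 100 TFLOP/s accelerator.
slow = step_time(1e12, 100e12, 64e9, 16e9)  # 16 GB/s, PCIe-class link
fast = step_time(1e12, 100e12, 64e9, 64e9)  # 64 GB/s, 4x wider link
print(round(slow / fast, 1))  # 4.0: quadrupling bandwidth cuts step time 4x
```

For such bandwidth-bound steps, a 4x wider link yields roughly a 4x speedup, which is the shape of the claim IBM makes for these systems; compute-bound steps would see far less benefit.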
GE Healthcare is set to speed up the time taken to process medical images, thanks to a pair of partnerships announced on Sunday. The global giant will team up with Nvidia to update its 500,000 medical imaging devices worldwide with Revolution Frontier CT, which is claimed to be two times faster than the previous generation image processor. GE said the speedier Revolution Frontier would be better at liver lesion detection and kidney lesion characterisation, and has the potential to reduce the number of follow-up appointments and the number of non-interpretable scans. GE Healthcare is also making use of Nvidia in its new analytics platform, with sections of it to be placed in the Nvidia GPU Cloud. An average hospital generates 50 petabytes of data annually, GE said, but only 3 percent of that data is analysed, tagged, or made actionable.
Nigel Toon, the cofounder and CEO of Graphcore, a semiconductor startup based in the U.K., recalls that only a couple of years ago many venture capitalists viewed the idea of investing in semiconductor chips as something of a joke. "You'd take an idea to a meeting," he says, "and many of the partners would roll about on the floor laughing." Now some chip entrepreneurs are getting a very different reception. Instead of rolling on the floor, investors are rolling out their checkbooks. Venture capitalists have good reason to be wary of silicon, even though it gave Silicon Valley its name.
Intel is ready to ship its long-awaited computer chip used to power artificial intelligence projects by the end of the year. Intel CEO Brian Krzanich explained the chip maker's foray into the red-hot field of artificial intelligence Tuesday and said that Facebook (fb) has assisted the company in the run-up to its new chip's debut. "We are thrilled to have Facebook in close collaboration sharing its technical insights as we bring this new generation of AI hardware to market," Krzanich wrote. An Intel spokesperson wrote to Fortune in an email that while the two companies are collaborating, they do not have a formal partnership. The genesis of the Intel Nervana Neural Network Processor comes from Intel's acquisition of the chip startup Nervana Systems in 2016.