Results


Artificial Intelligence to Sort Through ISR Data Glut

#artificialintelligence

Inundated with more data than humans can analyze, the U.S. military and intelligence community are banking on machine learning and advanced computing technologies to separate the wheat from the chaff. The Defense Department operates more than 11,000 drones that collect hundreds of thousands of hours of video footage every year. "When it comes to intelligence, surveillance and reconnaissance, or ISR, we have more platforms and sensors than at any time in Department of Defense history," said Air Force Lt. Gen. John N.T. "Jack" Shanahan, director for defense intelligence (warfighter support) in the office of the undersecretary of defense for intelligence. "It's an avalanche of data that we are not capable of fully exploiting," he said at a technology conference in Washington, D.C., hosted by Nvidia, a Santa Clara, California-based artificial intelligence computing company. For example, the Pentagon has deployed a wide-area motion imagery sensor that can look at an entire city.


How to tell if AI or machine learning is real

#artificialintelligence

Machine learning refers specifically to software designed to detect patterns and observe outcomes, then use that analysis to adjust its own behavior or guide people to better results. Machine learning doesn't require the kind of perception and cognition that we associate with intelligence; it simply requires really good, really fast pattern matching and the ability to apply those patterns to its behavior and recommendations. Still, both can play a role in machine learning or AI systems (really, AI precursor systems), so it's not the use of the terms that's a red flag, but their flippant use. This is how Apple's Siri, Microsoft's Cortana, and Google Now work: they send your speech to the cloud, which translates it and figures out a response, then sends it back to your phone.
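As a rough illustration of that round trip, the sketch below shows a device posting recorded speech to a cloud service and reading back a text reply. The endpoint, payload format, and field names are hypothetical stand-ins, not any vendor's actual API.

```python
# Minimal sketch of the cloud round trip described above: the device records
# audio, ships it to a speech service, and gets back a text response.
# The endpoint and JSON fields are hypothetical, not any vendor's API.
import requests

def ask_assistant(audio_path: str) -> str:
    with open(audio_path, "rb") as f:
        audio_bytes = f.read()

    # 1) Upload the raw speech to the (assumed) cloud service.
    resp = requests.post(
        "https://assistant.example.com/api/speech",
        data=audio_bytes,
        headers={"Content-Type": "audio/wav"},
        timeout=10,
    )
    resp.raise_for_status()

    # 2) The service transcribes the audio, decides on an answer,
    #    and returns it as JSON; the phone only renders the result.
    return resp.json()["reply_text"]

if __name__ == "__main__":
    print(ask_assistant("question.wav"))
```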


Google Is Already Late to China's AI Revolution

#artificialintelligence

Delivered amidst the week-long Go match between Chinese grandmaster Ke Jie and AlphaGo, a seminal machine created by Google's DeepMind artificial intelligence lab, Schmidt's words were not hyperbole. But many others have embraced deep learning in big ways, including the largest internet companies in China. "It's easy to fall into the old stereotype--the copy-to-China stereotype, that China is so far behind and they're just importing everything--but that's out of date," says Adam Coates, the American-born AI researcher who now oversees Baidu's Silicon Valley AI lab. And despite what Schmidt implied, Chinese companies like Baidu and Tencent are already starting to offer machine learning tools atop their own cloud computing services.


ARM wants to boost AI performance by 50X over 5 years

#artificialintelligence

ARM is unveiling its first Dynamiq processor designs today, and the company said the family will boost artificial intelligence performance by more than 50 times over the next three to five years. The processors include the ARM Cortex-A75, which delivers massive single-thread compute performance at the high end; the ARM Cortex-A55, a high-efficiency processor; and the ARM Mali-G72 graphics processor, which expands the possibilities for virtual reality, gaming, and machine learning on premium mobile devices with 40 percent more graphics performance. To better handle AI processing, ARM realized it needed to make basic changes to the computing architecture, with faster, more efficient, and distributed intelligence shared between computing at the edge of the network (such as in smartphones and laptops) and in cloud-connected data centers, said Nandan Nayampally, vice president and general manager of the Compute Products Group at ARM, in a blog post. That AI technology also needs to be secure: recent survey data shows 85 percent of global consumers are concerned about securing AI technology, Nayampally said.


How Machine Learning In The Database Can Change Industries And Save Lives - ARC

#artificialintelligence

The coming era will be defined by machine learning, deep learning, and artificial intelligence, built on top of the mobile/cloud model. Computing has moved from massive mainframes accessed by terminals, to databases and personal computers, to the cloud and mobile devices. As Microsoft has shown, machine learning models can be moved to the edge by bringing artificial intelligence capabilities that could previously run only in the cloud down to the device. This is done by building compute into edge devices (CPUs, GPUs, and similar accelerators, as we have seen with the maturation of IoT) and by bringing cloud computing capabilities to the edge through virtual machines and Docker-style containerization.
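As a rough sketch of what "moving the model to the edge" can look like in practice, the example below scores an exported model locally with ONNX Runtime on the device's own CPU, with no round trip to a data center. The model file and input name are placeholders, and the same script could just as well be packaged into a container image and pushed to the device.

```python
# A minimal sketch of edge inference: a model trained elsewhere is exported
# (assumed here to be an ONNX file) and scored locally on the device.
# "sensor_model.onnx" and the input name "input" are placeholders.
import numpy as np
import onnxruntime as ort

# Load the exported model once at startup; ONNX Runtime runs on modest
# edge CPUs as well as GPUs.
session = ort.InferenceSession(
    "sensor_model.onnx",
    providers=["CPUExecutionProvider"],
)

def classify(reading: np.ndarray) -> int:
    # Inference happens entirely on the edge device.
    outputs = session.run(None, {"input": reading.astype(np.float32)})
    return int(np.argmax(outputs[0]))

if __name__ == "__main__":
    fake_reading = np.random.rand(1, 8)   # stand-in for real sensor data
    print(classify(fake_reading))
```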


Machine Brains Advance Towards Human Mimicry

#artificialintelligence

Data intelligence firms like Elastic are building machine learning functions into their software as fast as they can. The software's maker claims it enables AI to function more like a human brain because it integrates multiple brain areas. According to the firm, "Human brains integrate sight, sound and other senses when making a decision, but existing AI systems do not." Claiming that his firm's latest developments may help the future of AI, Versace notes that Neurala has patented its latest computer brain development under U.S. Patent No.


How fog computing pushes IoT intelligence to the edge

#artificialintelligence

Meeting these requirements is somewhat problematic under the current centralized, cloud-based model powering IoT systems, but it can be made possible through fog computing, a decentralized architectural pattern that brings computing resources and application services closer to the edge, the most logical and efficient spot in the continuum between the data source and the cloud. Fog computing reduces the amount of data that must be transferred to the cloud for processing and analysis, while also improving security, a major concern in the IoT industry. IoT nodes are closer to the action, but for the moment they do not have the computing and storage resources to perform analytics and machine learning tasks. An example is Cisco's recent acquisitions of IoT analytics company ParStream and IoT platform provider Jasper, which will enable the networking giant to embed better computing capabilities into its networking gear and grab a bigger share of the enterprise IoT market, where fog computing is most crucial.
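A minimal sketch of that traffic-reduction idea follows, with illustrative thresholds and a stubbed upload function: the fog node summarizes raw readings locally and only sends aggregates (or urgent anomalies) upstream, instead of streaming every sample to the cloud.

```python
# Sketch of a fog node that aggregates sensor readings locally and only
# forwards compact summaries or anomalies upstream. The batch size,
# threshold, and upload function are illustrative placeholders.
from statistics import mean

CLOUD_BATCH = 60          # forward one summary per 60 raw readings
ANOMALY_THRESHOLD = 90.0  # e.g. a temperature needing immediate attention

def send_to_cloud(payload: dict) -> None:
    # Placeholder for the real upstream call (MQTT, HTTPS, etc.).
    print("uploading:", payload)

def fog_node(readings):
    buffer = []
    for value in readings:
        if value > ANOMALY_THRESHOLD:
            # Urgent events still go upstream immediately.
            send_to_cloud({"type": "anomaly", "value": value})
        buffer.append(value)
        if len(buffer) == CLOUD_BATCH:
            # Only the summary leaves the edge, cutting upstream traffic.
            send_to_cloud({
                "type": "summary",
                "mean": mean(buffer),
                "max": max(buffer),
                "count": len(buffer),
            })
            buffer.clear()

if __name__ == "__main__":
    import random
    fog_node(random.uniform(20, 95) for _ in range(300))
```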


The Dummies Guide to Artificial Intelligence

#artificialintelligence

These days, even when I read news about India's state-owned air carrier, such as "New AI connectivity between cities X & Y", the first connection my (poor human) brain makes is to Artificial Intelligence. Only then does it connect to Air India. This is because everyone (in my extended professional circle) is talking of Artificial Intelligence. Though I had done a short elective on AI in my computer engineering days (16 years back), done some basic LISP programming as part of it, presented a paper on 'Genetic Algorithms' in a seminar (again that long ago), and am an avid watcher of sci-fi movies around AI (and claim to have understood The Matrix the first time I watched it - with subtitles though, ha!), I realize that my knowledge of what AI is is no better than a layperson's (or worse, since half knowledge is more dangerous). This article is an endeavor to sort of unpack AI (and the associated terms like ML - machine learning, and DL - deep learning) for myself.

