

The militarization of AI is coming. Can tech companies (and the rest of us) live with it? - SiliconANGLE

#artificialintelligence

Everybody knows that artificial intelligence is an exceptionally weaponizable technology, so it's no mystery why militaries everywhere are racing to exploit AI to its maximum potential. Autonomous vehicles, for example, will become the most formidable weapon systems humanity has ever developed: AI gives them the ability to see, hear, sense, and adjust real-time strategies far better and faster than most humans can. It will almost certainly produce staggering, lopsided casualty counts in future battles, especially when one side is almost entirely composed of AI-powered intelligent weapons systems equipped with phalanxes of 3-D cameras, millimeter-wave radar, biochemical detectors, and other ambient sensors.


The World of Artificial Intelligence: 8 Trends to Watch in 2018

#artificialintelligence

Computationally analyzing Big Data is not a passing trend. As volumes of data continue to grow, so will the techniques for analyzing them. When it comes to applications of Predictive Analytics, we have only seen the tip of the iceberg; it has already helped many organizations. All of these different types of Artificial Intelligence are tied together in a way that has profoundly changed how we perform everyday tasks, and more is yet to come.


An Introduction to Hashing in the Era of Machine Learning

#artificialintelligence

"[…] we believe that the idea of replacing core components of a data management system through learned models has far reaching implications for future systems designs and that this work just provides a glimpse of what might be possible." Indeed, the results presented by the team of Google and MIT researchers include findings that could signal new competition for the most venerable stalwarts in the world of indexing: the B-Tree and the Hash Map. The engineering community is ever abuzz about the future of machine learning; as such, the research paper has made its rounds on Hacker News, Reddit, and through the halls of engineering communities worldwide. New research is an excellent opportunity to reexamine the fundamentals of a field, and it's not often that something as fundamental (and well studied) as indexing experiences a breakthrough. This article serves as an introduction to hash tables, an abbreviated examination of what makes them fast and slow, and an intuitive view of the machine learning concepts being applied to indexing in the paper. In response to the findings of the Google/MIT collaboration, Peter Bailis and a team of Stanford researchers went back to the basics and warned us not to throw out our algorithms books just yet. Bailis and his team recreated the learned index strategy and achieved similar results without any machine learning, using a classic hash table strategy called cuckoo hashing. In a separate response to the Google/MIT collaboration, Thomas Neumann describes another way to achieve performance similar to the learned index strategy without abandoning the well-tested and well-understood B-Tree.

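Cuckoo hashing, the classic strategy the Stanford team reached for, is worth a quick look. Every key has exactly two candidate slots, one per hash function; an insert that finds both occupied evicts the resident entry and "kicks" it to its alternate slot, so a lookup never probes more than two locations. A minimal sketch in Python (the class name, table sizing, and second hash function are illustrative assumptions, not the Stanford implementation):

```python
# Minimal cuckoo hash table: two tables, two hash functions.
# An insert that collides evicts the resident entry and re-places it
# in its alternate table, so lookups probe at most two slots.

class CuckooHashTable:
    def __init__(self, capacity=8):
        self.capacity = capacity
        self.tables = [[None] * capacity, [None] * capacity]

    def _hash(self, key, which):
        # Two independent hash functions, one per table (illustrative).
        h = hash(key) if which == 0 else hash((key, 0x9E3779B9))
        return h % self.capacity

    def get(self, key):
        # Worst case: exactly two probes.
        for which in (0, 1):
            slot = self.tables[which][self._hash(key, which)]
            if slot is not None and slot[0] == key:
                return slot[1]
        return None

    def put(self, key, value, max_kicks=32):
        entry, which = (key, value), 0
        for _ in range(max_kicks):
            idx = self._hash(entry[0], which)
            resident = self.tables[which][idx]
            if resident is None or resident[0] == entry[0]:
                self.tables[which][idx] = entry
                return
            # Evict the resident and try to place it in the other table.
            entry, self.tables[which][idx] = resident, entry
            which = 1 - which
        self._grow()               # kick chain too long: double and reinsert
        self.put(*entry)

    def _grow(self):
        old = [e for t in self.tables for e in t if e is not None]
        self.capacity *= 2
        self.tables = [[None] * self.capacity, [None] * self.capacity]
        for k, v in old:
            self.put(k, v)
```

The appeal is the bounded worst case: a get touches at most two slots, which is the same kind of constant probe guarantee the learned index achieves by predicting where a key lives.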

"Ok, Google -- How do you run Deep Learning Inference on Android Using TensorFlow?"

#artificialintelligence

There are many situations in which running deep learning inference on local devices is preferable for both individuals and companies: imagine traveling with no reliable internet connection available, or dealing with privacy concerns and latency issues when transferring data to cloud-based services. Edge computing addresses these problems by processing and analyzing data at the edge of the network. Take the "Ok Google" feature as an example -- by training "Ok Google" on a user's voice, that user's phone activates when it captures the keywords. This kind of small-footprint keyword-spotting (KWS) inference usually happens on-device, so you don't have to worry that the service provider is listening to you all the time. Cloud-based services are initiated only after you issue a command.
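On an actual Android device this inference runs through TensorFlow's mobile bindings, but the flow is easy to sketch with the TensorFlow Lite interpreter in Python. A minimal sketch, assuming a hypothetical pre-converted model file kws_model.tflite and pre-computed audio features (both are illustrative assumptions, not the actual model behind "Ok Google"):

```python
import numpy as np
import tensorflow as tf

# Load a (hypothetical) pre-trained keyword-spotting model that has
# been converted to the TensorFlow Lite format for on-device use.
interpreter = tf.lite.Interpreter(model_path="kws_model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def detect_keyword(features: np.ndarray) -> int:
    """Run one local inference; `features` is a preprocessed audio
    window (e.g. MFCCs) shaped to match the model's input tensor."""
    interpreter.set_tensor(input_details[0]["index"],
                           features.astype(np.float32))
    interpreter.invoke()
    scores = interpreter.get_tensor(output_details[0]["index"])
    return int(np.argmax(scores))  # most likely keyword class
```

Because the interpreter, the model file, and the audio features all stay on the device, no audio leaves the phone until a keyword fires and the cloud-based service is invoked.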


Deep Learning for Emojis with VS Code Tools for AI

#artificialintelligence

This post is the first in a two-part series and is authored by Erika Menezes, Software Engineer at Microsoft. Visual content has always been a critical part of communication. Emojis play an increasingly crucial role in human dialogue on leading social media and messaging platforms. Concise and fun to use, emojis can improve communication between users and make dialogue systems more anthropomorphic and vivid. We also see increasing investment in chatbots that let users complete task-oriented services, such as purchasing auto insurance or movie tickets or checking in for flights, in a frictionless and personalized way right within messaging apps.


European Scientists Call For AI Institute As US and China Pull Away

#artificialintelligence

The US and China are pulling away in the race to build AI. A group of renowned artificial intelligence (AI) scientists has called for a new multinational AI hub to be built in Europe in a bid to help the continent compete with other parts of the world. AI is set to have a profound impact on Europe and the rest of the world in the coming decades, and many believe the impact will be larger than that of the industrial revolution. In an open letter, the scientists wrote that "Europe is not keeping up" with North America and China and called for a new European Lab for Learning & Intelligent Systems, abbreviated ELLIS. The group argues there are clusters of AI excellence in labs scattered across Europe "that play in the international top league," but "virtually all of the top people in those places are continuously being pursued for recruitment by US companies."


Google's Cloud TPU Matches Volta in Machine Learning at Much Lower Prices - ExtremeTech

#artificialintelligence

Over the past few years, Nvidia has established itself as a major leader in machine learning and artificial intelligence processing. The GPU designer dove into the HPC market over a decade ago when it launched the G80 and its parallel compute platform and API, CUDA. That early leadership has paid off: Nvidia holds 87 spots on the TOP500 list of supercomputers, compared with just 10 for Intel. But as machine learning and artificial intelligence workloads proliferate, rivals are emerging to give Nvidia a run for its money, among them Google with its new Cloud TPU. New benchmarks from RiseML put Nvidia's Volta and Google's TPU head-to-head -- and the cost curve strongly favors Google.
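The "cost curve" comparison boils down to simple arithmetic: dollars per hour divided by examples processed per hour. A sketch of that calculation follows; the hourly prices and throughputs are illustrative placeholders, not RiseML's measurements:

```python
# Price-performance: cost to push a fixed workload through each chip.
# All figures below are illustrative placeholders, NOT RiseML's numbers.

def cost_per_million_examples(usd_per_hour: float,
                              examples_per_second: float) -> float:
    """Dollars to process one million training examples."""
    examples_per_hour = examples_per_second * 3600
    return usd_per_hour / examples_per_hour * 1_000_000

accelerators = {
    "Cloud TPU (hypothetical)": (6.50, 2000.0),   # $/hr, examples/sec
    "Volta GPU (hypothetical)": (3.00, 800.0),
}

for name, (price, throughput) in accelerators.items():
    print(f"{name}: ${cost_per_million_examples(price, throughput):.2f} "
          f"per million examples")
```

On numbers like these, the chip that is cheaper per hour can still lose: what matters is cost per unit of work, not the hourly rate.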


Google is bleeding cash trying to take on Amazon in the smart home

#artificialintelligence

Google parent company Alphabet reported first-quarter 2018 earnings today, beating Wall Street estimates on sales and profit thanks in large part to the mammoth search advertising machine that continues to grow year after year. But one interesting highlight from the announcement was just how much money Nest, the company's smart home division, earns in revenue and reports in losses. Because Nest was rolled back into Google proper earlier this year, Alphabet recast its 2017 quarterly figures, moving Nest's revenues and losses from the "Other Bets" segment into the standard Google revenue line item. Comparing the differences in quarterly revenues and operating income, we can see that Nest made about $726 million in revenue in 2017, yet contributed a $621 million loss to "Other Bets" over the year. In other words, Google spent more than half a billion dollars last year establishing Nest in sectors like security cameras, alarm systems, and video doorbells.
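The Nest figures fall out of simple bookkeeping: subtract the originally reported line items from the recast ones, quarter by quarter, and the residual is Nest. A sketch of that arithmetic; the quarterly splits below are placeholders, and only the roughly $726 million annual total comes from the article:

```python
# Derive Nest's revenue by differencing Alphabet's recast 2017 figures
# (Nest inside Google) against the originals (Nest in "Other Bets").
# Quarterly values are hypothetical; only the ~$726M total is reported.

recast_google_revenue   = [26_100, 26_300, 27_900, 32_500]  # $M, placeholder
original_google_revenue = [25_920, 26_120, 27_710, 32_324]  # $M, placeholder

nest_revenue = [recast - original for recast, original in
                zip(recast_google_revenue, original_google_revenue)]
print(sum(nest_revenue))  # ~726 ($M)
```

The same subtraction applied to operating income yields the roughly $621 million loss attributed to Nest.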


Why more tech companies should put AI visionaries in the executive suite - SiliconANGLE

#artificialintelligence

Do enterprises really need chief artificial intelligence officers? In most industries, the correct answer would probably be no. For most businesses, AI may never rise to a level of strategic importance that requires a dedicated executive reporting directly to the chief executive. Even so, some high-tech companies might want to consider it. Elevating an AI expert to C-level status, though often quite expensive, may become strategically necessary if the business's survival depends on AI.


Make iPhone Great Again: Top fixes we want for America's beloved smartphone

ZDNet

The iPhone is America's favorite smartphone, and in the 10 years it has been out, Apple has been wildly successful with it as a profitable business and a device/application ecosystem. But as with any mature platform, malaise or rot can creep in over time with the advent of new features and more complex code. You know what they say about karma. Here are the things we'd like to see in future versions of iOS and the iPhone hardware to ... Make iPhone Great Again (MIGA)! One of the things about the iOS user experience (UX) that has annoyed us since the very beginning -- and only became worse when app groups were introduced -- is the complexity of organizing launch icons, finding apps, and removing them from the home screen.