A Machine Learning Landscape: Where AMD, Intel, NVIDIA, Qualcomm And Xilinx AI Engines Live

Forbes Technology

Without a doubt, 2016 was an amazing year for Machine Learning (ML) and Artificial Intelligence (AI) awareness in the press. But most people probably can't name three applications for machine learning other than self-driving cars and perhaps the voice-activated assistant hiding in their phone. There's also a lot of confusion about where the artificial intelligence program actually lives. When you ask Siri to play a song or tell you what the weather will be like tomorrow, does "she" live in your phone or in the Apple cloud? And while you ponder those obscure questions, many investors and technology recommenders are trying to determine whether AMD, Intel, NVIDIA, Qualcomm or Xilinx will provide the best underlying hardware chips, for which application and why.


Bayesian Inference of Online Social Network Statistics via Lightweight Random Walk Crawls

arXiv.org Machine Learning

Online social networks (OSNs) contain an extensive amount of information about the underlying society that is yet to be explored. One of the most feasible techniques for fetching information from an OSN, crawling through Application Programming Interface (API) requests, poses serious concerns over the guarantees of the resulting estimates. In this work, we focus on making reliable statistical inference with limited API crawls. Based on the regenerative properties of random walks, we propose an unbiased estimator for the aggregated sum of functions over edges and prove the connection between the variance of the estimator and the spectral gap. In order to facilitate Bayesian inference on the true value of the estimator, we derive the approximate posterior distribution of the estimate. Later, the proposed ideas are validated with numerical experiments on inference problems in real-world networks.
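The abstract describes estimating an aggregated sum of an edge function from regenerative random-walk tours rather than a full crawl. Below is a minimal sketch of that idea, not the paper's exact method: it assumes an undirected graph crawled by a simple random walk from a single seed node (the paper uses a more general setup), uses `networkx` purely for illustration, and the `d_seed / 2` normalization follows from the standard renewal-reward argument for return-time excursions.

```python
import random
import networkx as nx

def tour_sum(G, seed, f):
    """Run one random-walk tour: start at `seed`, step until the walk first
    returns to `seed`, accumulating f(u, v) over every edge traversed."""
    total = 0.0
    u = seed
    while True:
        v = random.choice(list(G.neighbors(u)))
        total += f(u, v)
        if v == seed:
            return total
        u = v

def estimate_edge_sum(G, seed, f, num_tours=1000):
    """Estimate sum_{(u,v) in E} f(u,v) from independent random-walk tours.

    For a simple random walk on an undirected graph with a symmetric f,
    renewal-reward gives E[sum of f over one tour] = (2 / d_seed) * sum_E f,
    so scaling the tour average by d_seed / 2 yields an unbiased estimate.
    (Normalization derived from this standard argument, not quoted from the paper.)
    """
    d_seed = G.degree(seed)
    tours = [tour_sum(G, seed, f) for _ in range(num_tours)]
    return (d_seed / 2.0) * sum(tours) / num_tours

if __name__ == "__main__":
    # Hypothetical example: with f == 1 the target sum is simply |E|.
    G = nx.barabasi_albert_graph(500, 3, seed=42)
    f = lambda u, v: 1.0
    est = estimate_edge_sum(G, seed=0, f=f, num_tours=5000)
    print("true |E| =", G.number_of_edges(), " estimate =", round(est, 1))
```

Each tour is short and independent, which is what makes the crawl "lightweight": the API budget is spent on many cheap excursions from the seed, and the spread of the per-tour sums also gives an empirical handle on the estimator's variance, which the paper ties to the spectral gap.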


AI classroom activity: Facial recognition

#artificialintelligence

Artificial intelligence (AI) is everywhere in our daily lives – search engines, social media, intelligent personal assistants such as Siri – and today's schoolchildren are a generation who will grow up with these AI technologies. I have a one-year-old daughter; it is distinctly possible that she will not need to learn how to drive when she grows up because self-driving vehicles will be the norm. As a computer scientist who works in a medical research institute, I witness firsthand how AI is transforming the way we screen our three-billion-character genome to discover disease-causing mutations, and detect cardiovascular risks by analysing data from wearable fitness trackers. Like it or not, AI will be an integral part of our children's future. The term AI may sound scary, possibly due to its association with killer robots in science fiction.


Apple Is Following Google Into Making A Custom AI Chip

#artificialintelligence

Artificial intelligence has begun seeping its way into every tech product and service. Now, companies are changing the underlying hardware to accommodate this shift. Apple is the latest company creating a dedicated AI processing chip to speed up AI algorithms and save battery life on its devices, according to Bloomberg. The Bloomberg report said the chip is internally known as the Apple Neural Engine and will be used to assist devices with facial- and speech-recognition tasks. The latest iPhone 7 runs some of its AI tasks (mostly related to photography) using the image signal processor and the graphics processing unit integrated on its A10 Fusion chip.


Choose the right AI method for the job

#artificialintelligence

It's hard to remember the days when artificial intelligence seemed like an intangible, futuristic concept. This has been decades in the making, however, and the past 90 years have seen both renaissances and winters for the field of study. At present, AI is launching a persistent infiltration into our personal lives with the rise of self-driving cars and intelligent personal assistants. In the enterprise, we likewise see AI rearing its head in adaptive marketing and cybersecurity. The rise of AI is exciting, but people often throw the term around in an attempt to win buzzword bingo, rather than to accurately reflect technological capabilities.