US Air Force funds Explainable-AI for UAV tech

#artificialintelligence

Z Advanced Computing, Inc. (ZAC) of Potomac, MD announced on August 27 that it has been funded by the US Air Force to use ZAC's detailed 3D image recognition technology, based on Explainable-AI, for aerial image and object recognition on drones (unmanned aerial vehicles, or UAVs). ZAC is the first to demonstrate Explainable-AI in which various attributes and details of 3D (three-dimensional) objects can be recognized from any view or angle. "With our superior approach, complex 3D objects can be recognized from any direction, using only a small number of training samples," said Dr. Saied Tadayon, CTO of ZAC. "For complex tasks, such as drone vision, you need ZAC's superior technology to handle detailed 3D image recognition." "You cannot do this with other techniques, such as Deep Convolutional Neural Networks, even with an extremely large number of training samples. That's basically hitting the limits of the CNNs," continued Dr. Bijan Tadayon, CEO of ZAC.


Internet of Things and Bayesian Networks

@machinelearnbot

As big data becomes more of a cliché with every passing day, do you feel the Internet of Things is the next marketing buzzword to take hold of our lives? So what exactly is the Internet of Things (IoT), and why are we going to hear more about it in the coming days? IoT today denotes advanced connectivity of devices, systems, and services that goes beyond machine-to-machine communication and covers a wide variety of domains and applications, particularly in manufacturing and in the power, oil, and gas utilities. One IoT application is an automobile with built-in sensors that alert the driver when tyre pressure is low, as in the sketch below. Another is built-in sensors on power plant equipment that transmit real-time data, enabling better transmission planning and load balancing.
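As a concrete illustration of the tyre-pressure example above, here is a minimal sketch in Python of the sensor-to-alert pattern; the 30 PSI threshold and the sensor-reading function are hypothetical stand-ins, not part of any particular vehicle's system:

```python
# Minimal sketch of an IoT-style tyre-pressure alert.
# The threshold and the sensor interface below are hypothetical,
# chosen only to illustrate the sensor -> rule -> alert pattern.

LOW_PRESSURE_PSI = 30.0  # hypothetical warning threshold


def read_tyre_pressures() -> dict[str, float]:
    """Stand-in for a real sensor interface; returns PSI per tyre."""
    return {
        "front_left": 32.1,
        "front_right": 29.4,
        "rear_left": 31.8,
        "rear_right": 33.0,
    }


def check_pressures() -> list[str]:
    """Return an alert message for each tyre below the threshold."""
    alerts = []
    for tyre, psi in read_tyre_pressures().items():
        if psi < LOW_PRESSURE_PSI:
            alerts.append(f"Low tyre pressure: {tyre} at {psi:.1f} PSI")
    return alerts


if __name__ == "__main__":
    for alert in check_pressures():
        print(alert)
```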


Should you become a data scientist?

#artificialintelligence

There is no shortage of articles attempting to lay out a step-by-step process for becoming a data scientist. Are you a recent graduate? Do this… Are you changing careers? Do that… And make sure you're focusing on the top skills: coding, statistics, machine learning, storytelling, databases, big data… Need resources? Check out Andrew Ng's Coursera ML course… Although these are important things to consider once you have made up your mind to pursue a career in data science, I hope to answer the question that should come before all of this. It's the question that should be on every aspiring data scientist's mind: "Should I become a data scientist?" This question addresses the why before you try to answer the how. What is it about the field that draws you in, and that will keep you in it and excited for years to come?

In order to answer this question, it's important to understand how we got here and where we are headed, because with a full picture of the data science landscape, you can determine whether data science makes sense for you. Before the convergence of computer science, data technology, visualization, mathematics, and statistics into what we call data science today, these fields existed in silos, independently laying the groundwork for the tools and products we are now able to develop: Oculus, Google Home, Amazon Alexa, self-driving cars, recommendation engines, and so on. The foundational ideas have been around for decades; early scientists dating back to the pre-1800s, coming from a wide range of backgrounds, worked on developing our first computers, calculus, probability theory, and algorithms such as CNNs, reinforcement learning, and least squares regression. With the explosion in data and computational power, we are able to resurrect these decades-old ideas and apply them to real-world problems.

In 2009 and 2012, McKinsey and the Harvard Business Review published articles hyping up the role of the data scientist, showing how data scientists were revolutionizing the way businesses operate and how they would be critical to future business success. They saw not only the advantage of a data-driven approach, but also the importance of predictive analytics for remaining competitive and relevant. Around the same time, in 2011, Andrew Ng came out with a free online course on machine learning, and the curse of AI FOMO (fear of missing out) kicked in. Companies began the search for highly skilled individuals to help them collect, store, visualize, and make sense of all their data. "You want the title and the high pay?


Mind Versus Machine: Human Insight in the Age of Automation

#artificialintelligence

The future is dystopian: a world in which we humble humans will be replaced by fleets of slick automatons – mechanical menials destined not only to solder, weld, and glue us out of jobs, but to account, diagnose, and translate us out, too. Or so goes a certain line of argument. Certainly, there have been some heavyweight concerns voiced about the rise of artificial intelligence. Of course, there are counterarguments too. Just as the Industrial Revolution sparked fears about the supplanting of man by machine (fears which led some as far as destroying the new mechanical marvels: hence today's use of the word 'Luddite' for those opposed to technological progress), all new vistas are likely to provoke both optimism and hesitance.


The Many Faces of Exponential Weights in Online Learning

arXiv.org Machine Learning

A standard introduction to online learning might place Online Gradient Descent at its center and then proceed to develop generalizations and extensions like Online Mirror Descent and second-order methods. Here we explore the alternative approach of putting exponential weights (EW) first. We show that many standard methods and their regret bounds then follow as a special case by plugging in suitable surrogate losses and playing the EW posterior mean. For instance, we easily recover Online Gradient Descent by using EW with a Gaussian prior on linearized losses, and, more generally, all instances of Online Mirror Descent based on regular Bregman divergences also correspond to EW with a prior that depends on the mirror map. Furthermore, appropriate quadratic surrogate losses naturally give rise to Online Gradient Descent for strongly convex losses and to Online Newton Step. We further interpret several recent adaptive methods (iProd, Squint, and a variation of Coin Betting for experts) as a series of closely related reductions to exp-concave surrogate losses that are then handled by Exponential Weights. Finally, a benefit of our EW interpretation is that it opens up the possibility of sampling from the EW posterior distribution instead of playing the mean. As already observed by Bubeck and Eldan, this recovers the best-known rate in Online Bandit Linear Optimization.
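To make the first claim concrete, here is a short worked derivation of how EW with a Gaussian prior on linearized losses yields Online Gradient Descent. This is a sketch under an assumed parameterization (zero-mean prior with variance $\sigma^2$, learning rate $\eta$); the abstract does not fix this exact normalization:

```latex
% Sketch: EW posterior mean under a Gaussian prior on linearized losses.
% The prior variance \sigma^2 and learning rate \eta are assumed
% parameters, not taken from the paper's normalization.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Take linearized losses $f_s(u) = \langle g_s, u \rangle$, where $g_s$ is
the loss gradient at round $s$, and the prior $\pi = \mathcal{N}(0, \sigma^2 I)$.
The EW posterior at round $t$ is then
\[
  P_t(u) \;\propto\; \pi(u)\,\exp\Big(-\eta \sum_{s<t} f_s(u)\Big)
  \;\propto\; \exp\Big(-\tfrac{1}{2\sigma^2}\,
      \big\| u + \eta\sigma^2 \textstyle\sum_{s<t} g_s \big\|^2\Big)
\]
by completing the square, so playing the posterior mean gives
\[
  w_t \;=\; -\eta\sigma^2 \sum_{s<t} g_s
  \;=\; w_{t-1} - \eta\sigma^2\, g_{t-1},
\]
which is exactly Online Gradient Descent (started from the origin) with
step size $\eta\sigma^2$.
\end{document}
```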