AAAI AI-Alert for Mar 21, 2017


Israeli drone crashes in Syria, circumstances unclear

FOX News

JERUSALEM – The Israeli military has confirmed that a drone crashed in Syria earlier this week under unclear circumstances. In a statement, the military said the "Skylark" went down on Sunday and that the incident was being investigated. Tuesday's statement said there is "no risk of a breach of information." Hezbollah's media arm published photographs of what it said was a drone it had shot down after the aircraft infiltrated Syrian airspace over the Golan Heights. Although Israel is not actively fighting in the Syrian civil war, it keeps close tabs on its enemies Iran and Lebanon's Iranian-backed Hezbollah militant group, both of which are backing Syrian government forces.


ImageNet: VGGNet, ResNet, Inception, and Xception with Keras - PyImageSearch

#artificialintelligence

A few months ago I wrote a tutorial on how to classify images using Convolutional Neural Networks (specifically, VGG16) pre-trained on the ImageNet dataset with Python and the Keras deep learning library. The pre-trained networks inside of Keras can recognize, with high accuracy, 1,000 different object categories similar to the objects we encounter in our day-to-day lives. Back then, the pre-trained ImageNet models were separate from the core Keras library, requiring us to clone a free-standing GitHub repo and then manually copy the code into our projects. That solution worked well enough; however, since my original blog post was published, the pre-trained networks (VGG16, VGG19, ResNet50, Inception V3, and Xception) have been fully integrated into the Keras core (no need to clone a separate repo anymore); these implementations can be found inside the applications sub-module. Because of this, I've decided to create a new, updated tutorial that demonstrates how to utilize these state-of-the-art networks in your own classification projects.
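As a sketch of what the post covers, classifying an image with one of the bundled networks takes only a few lines. This assumes Keras with a working backend and the ImageNet weights (downloaded on first use); the image path below is hypothetical.

```python
def top_k(scores, labels, k=3):
    """Pure-Python helper: return the k highest-scoring (label, score) pairs."""
    ranked = sorted(zip(labels, scores), key=lambda pair: pair[1], reverse=True)
    return ranked[:k]

def classify(image_path, k=3):
    # Imports are local so the helper above stays usable without Keras installed.
    import numpy as np
    from keras.applications.vgg16 import VGG16, preprocess_input, decode_predictions
    from keras.preprocessing import image

    model = VGG16(weights="imagenet")                          # pulled from keras.applications
    img = image.load_img(image_path, target_size=(224, 224))   # VGG16's expected input size
    x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
    return decode_predictions(model.predict(x), top=k)[0]      # [(class_id, label, score), ...]

# Usage (requires Keras and the downloaded weights):
#   for _, label, score in classify("my_image.jpg"):
#       print(label, score)
```

Swapping in ResNet50, InceptionV3, or Xception is a one-line change to the import and constructor, which is the point of the unified applications sub-module.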


IBM sets new speech recognition accuracy record

#artificialintelligence

IBM announced an important milestone in conversational speech recognition last year. The company managed to develop a system that achieves a 6.9 percent word error rate. Despite the success, IBM continued to work hard on its speech recognition technology and has recently achieved a new industry record of 5.5 percent. In an official blog post, the company said that the word error rate was measured with the help of recorded conversations between people discussing usual everyday topics like buying a car. These recordings, which are known as the "SWITCHBOARD" corpus, have been used in the industry to benchmark speech recognition systems for more than 20 years.
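For readers unfamiliar with the metric: word error rate is the word-level edit distance (substitutions, insertions, and deletions) between the recognizer's transcript and a reference transcript, divided by the number of reference words. A minimal, stdlib-only sketch:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + insertions + deletions) / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution or match
    return dp[len(ref)][len(hyp)] / len(ref)
```

On this scale, going from 6.9 to 5.5 percent means roughly one fewer error per hundred reference words.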


How Drive.ai Is Mastering Autonomous Driving With Deep Learning

#artificialintelligence

Among all of the self-driving startups working toward Level 4 autonomy (a self-driving system that doesn't require human intervention in most scenarios), Mountain View, Calif.-based Drive.ai sees deep learning as the only viable way to make a truly useful autonomous car in the near term, says Sameep Tandon, cofounder and CEO. "If you look at the long-term possibilities of these algorithms and how people are going to build [self-driving cars] in the future, having a learning system just makes the most sense. There's so much complication in driving, so many things that are nuanced and hard, that if you have to do this in ways that aren't learned, then you're never going to get these cars out there." It's only been about a year since Drive.ai came out of stealth, but already the company has a fleet of four vehicles navigating (mostly) autonomously around the San Francisco Bay Area, even in situations (such as darkness, rain, or hail) that are notoriously difficult for self-driving cars. Last month, we went out to California to take a ride in one of Drive.ai's cars, and to find out how it's using deep learning to master autonomous driving.


Alexa is not coming to an office near you

#artificialintelligence

If you've recently had your first interaction with a voice-based personal assistant like Amazon's Alexa or Apple's Siri, you might get the sense that artificial intelligence is just a few years away from being able to talk and act like a human: soon capable of managing our schedules, troubleshooting technical issues, or even holding a conversation. According to a recent Wall Street Journal piece titled "Alexa and Cortana May Be Heading to the Office," many businesses share that hope. One startup profiled in the piece uses "an Amazon Echo attached to the office ceiling for such tasks as adding events to their calendars," while another is building a virtual assistant to set meetings on behalf of human users. The belief that natural language processing is right around the corner seems to be widespread: about half of the IT professionals in a Spiceworks survey cited in the article said they plan to use intelligent assistants in a corporate setting in the next three years.


Alexa and Cortana May Be Heading to the Office

#artificialintelligence

The next assistant in many offices could be named Alexa or Cortana. In 2016, Silicon Valley obsessed over how text-based bots in apps like Slack could make employees more efficient, turning complicated tasks or forms into conversational texts. Now, following the success of Amazon Inc.'s Alexa and Alphabet Inc.'s Google Home, people in the technology industry are increasingly thinking about how such voice-activated devices can be made useful in the workplace. The workplace offers challenges that experts say intelligent assistants built for home use so far haven't effectively met, mostly in the area of voice recognition. Workers at Goodwinds Inc. in New York City, for example, have used an Amazon Echo attached to the office ceiling for such tasks as adding events to their calendars and setting reminders for meetings, says Vinay Patankar, chief executive of the workflow-management startup.


AI Assistants – The New Productivity Tool

#artificialintelligence

Many discussions around artificial intelligence (AI) have focused on new developments in machine learning, such as assisting with cancer diagnosis and even detecting earthquakes. However, the use of AI to create virtual assistants is starting to gain traction. A recent article from VentureBeat stated that 2017 is the year that "virtual assistants really arrived." In this post, we look at the rise of virtual assistants and how they could change the way we work. It is hard to imagine an office desk these days without a computer.


Ideas on interpreting machine learning

#artificialintelligence

For more on advances in machine learning, prediction, and technology, check out the Data science and advanced analytics sessions at Strata Hadoop World London, May 22-25, 2017. Early price ends April 7. You've probably heard by now that machine learning algorithms can use big data to predict whether a donor will give to a charity, whether an infant in a NICU will develop sepsis, whether a customer will respond to an ad, and on and on. Machine learning can even drive cars and predict elections. I believe it can, but these recent high-profile hiccups should leave everyone who works with data (big or not) and machine learning algorithms asking themselves some very hard questions: do I understand my data? Do I understand the model and answers my machine learning algorithm is giving me? And do I trust these answers? Unfortunately, the complexity that bestows the extraordinary predictive abilities on machine learning algorithms also makes the answers the algorithms produce hard to ...
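One concrete way to start answering "do I trust these answers?" is permutation importance, a common sanity check (not specific to this article): shuffle one feature at a time and measure how much the model's accuracy drops. The features whose shuffling hurts most are the ones the model actually relies on. A stdlib-only sketch with a toy model:

```python
import random

def accuracy(model, X, y):
    """Fraction of rows the model labels correctly."""
    return sum(model(row) == label for row, label in zip(X, y)) / len(y)

def permutation_importance(model, X, y, n_features, seed=0):
    """For each feature, the accuracy drop after shuffling that column."""
    rng = random.Random(seed)
    base = accuracy(model, X, y)
    drops = []
    for f in range(n_features):
        col = [row[f] for row in X]
        rng.shuffle(col)  # break the feature's link to the labels
        X_perm = [row[:f] + [v] + row[f + 1:] for row, v in zip(X, col)]
        drops.append(base - accuracy(model, X_perm, y))
    return drops
```

A feature the model ignores shows a drop near zero; a feature it depends on shows a large drop, which is a first, crude answer to whether the model's reasoning matches your expectations.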


Bosch will sell Nvidia's self-driving system to automakers

#artificialintelligence

Nvidia has announced a new partnership with Bosch to sell its Drive PX 2 driver-assist platform to automakers. In effect, the deal gives Nvidia a go-to-market strategy for its self-driving hardware and software platform. Bosch joins ZF, making them the two so-called tier-one suppliers that will sell Nvidia's technology to automakers. Nvidia's technology uses "deep learning" artificial intelligence, which is a fancy way of saying its computer brain learns the way a human does: instead of needing to be programmed for every possible driving scenario, it learns the appropriate behavior, even for unexpected situations. In theory, a car company looking to make its vehicles capable of autonomous driving will be able to go to Bosch or ZF, buy that technology, integrate it into its cars, and sell those to consumers.
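The learned-versus-programmed distinction can be shown in miniature (a toy sketch, not Nvidia's actual system): instead of hand-coding a rule per scenario, a policy is fit from recorded demonstrations. Here a deliberately simple nearest-neighbor lookup stands in for the learned model, with hypothetical sensor readings and steering commands.

```python
def nearest_neighbor_policy(demonstrations):
    """demonstrations: list of (sensor_reading, steering_command) pairs.

    Returns a policy that, for a new reading, replays the command the
    demonstrator used in the most similar recorded situation.
    """
    def policy(reading):
        def squared_distance(demo):
            sensors, _ = demo
            return sum((a - b) ** 2 for a, b in zip(sensors, reading))
        _, command = min(demonstrations, key=squared_distance)
        return command
    return policy
```

The point of the sketch is that nothing in the policy enumerates scenarios; behavior for unseen inputs falls out of the recorded examples, which is the property the article attributes to deep-learning-based driving stacks at far larger scale.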


Envisioning the future of robotics

Robohub

Robotics is said to be the next technological revolution. Many agree that robots will have a tremendous impact over the coming years, and some are betting heavily on it. Companies are investing billions in buying other companies, and public authorities are discussing legal frameworks to enable the coherent growth of robotics. Understanding where the field of robotics is heading is more than mere guesswork. While much public concern focuses on the potential societal issues that will arise with the advent of robots, in this article we present a review of some of the most relevant milestones in robotics over the past few decades.