Milestone


The Next AI Milestone: Bridging the Semantic Gap – Intuition Machine – Medium

#artificialintelligence

Contextual Adaptation -- Where systems construct contextual explanatory models for classes of real-world phenomena. I wrote about these two in previous articles (see: "The Only Way to Make Deep Learning Interpretable is to have it Explain Itself" and "The Meta Model and Meta Meta Model of Deep Learning"). DARPA's presentation nails it by highlighting what's going on in current state-of-the-art research. Deep Learning systems have flaws analogous to the flaws in our own intuitions. Just to recap, here's the roadmap that I have (explained here): it's a Deep Learning roadmap and does not cover developments in other AI fields.


Exscientia Enters Strategic Drug Discovery Collaboration with GSK

#artificialintelligence

Exscientia, an innovative company at the forefront of Artificial Intelligence (AI)-driven drug discovery, is pleased to announce it has entered into a strategic drug discovery collaboration with GlaxoSmithKline (GSK). Exscientia will receive research payments from GSK to undertake new discovery programmes for nominated targets, with the goal of delivering pre-clinical candidates. As part of this collaboration, Exscientia is incentivised to reduce the number of compounds required for synthesis and assay in order to achieve lead and candidate compound goals. Exscientia is at the forefront of Artificial Intelligence (AI)-driven drug discovery and design.


Amazon's Alexa passes 15,000 skills, up from 10,000 in February

#artificialintelligence

Amazon's Alexa voice platform has now passed 15,000 skills -- the voice-powered apps that run on devices like the Echo speaker, Echo Dot, newer Echo Show and others. In the meantime, Amazon's Alexa is surging ahead, building out an entire voice app ecosystem so quickly that it hasn't even been able to implement the usual safeguards -- like a team that closely inspects apps for terms of service violations, for example, or even tools that allow developers to make money from their creations. In the long run, Amazon's focus on growth over app ecosystem infrastructure could catch up with it. By comparison, Google Home has just 378 voice apps available as of June 30, Voicebot notes.


Socionext Achieves Significant Milestone from Collaboration on Artificial Intelligence

#artificialintelligence

The companies achieved initial results in reading ultrasound images from Socionext's viewphii mobile ultrasound solution with Artificial Brain SOINN. In this initial trial, SOINN learned to read subcutaneous fat thickness from abdominal ultrasound images, and it accurately read fat tissue thickness for 80 percent of the data within a 5 percent margin of error. Based on these findings, the companies believe AI has the potential to assist technicians in reading images and to detect human errors in medical image handling.
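
To make the reported figure concrete, a metric of this form is simply the share of thickness estimates that fall within 5 percent of the ground-truth value. The sketch below is a hypothetical illustration with placeholder numbers; the function name and data are not Socionext's or SOINN's.

```python
# Hypothetical illustration of the reported metric: the fraction of estimates
# that fall within a 5 percent margin of error of the true thickness.
# Values and names are placeholders, not Socionext's or SOINN's data.
def fraction_within_margin(predicted_mm, actual_mm, margin=0.05):
    hits = sum(abs(p - a) <= margin * a for p, a in zip(predicted_mm, actual_mm))
    return hits / len(actual_mm)

# Example: 4 of 5 estimates land within 5% of the true value -> 0.8 (80 percent).
predicted = [10.2, 19.5, 30.8, 8.1, 25.0]
actual    = [10.0, 20.0, 30.0, 10.0, 25.3]
print(fraction_within_margin(predicted, actual))
```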


One Big Question: Why is artificial intelligence still kind of dumb?

#artificialintelligence

The real limitation of artificial intelligence's "intelligence" is our ability to produce it. AI systems produce non-intelligent false positives, are unable to understand contextual information, and are not sufficiently granular. But the fact remains that AI falls short in the most trivial of tasks for humans – interacting with the physical world and perceiving natural signals – indicating that AI systems are simply powerful computing machines with a misleading title. Visual understanding and the ability to navigate the physical world intelligently, therefore, are more appropriate benchmarks for this milestone than playing poker.


Unisys Makes Headway in Advanced Data Analytics

#artificialintelligence

Unisys Corporation has announced new advanced data analytics milestones, including the launch of its new Machine Learning-as-a-Service offering and the proposed launch of its new Artificial Intelligence Center of Excellence. As part of the offering, machine learning algorithms are applied to analyze data and to develop advanced analytics models that can provide insights and predict outcomes. Registered users can access machine learning experts and algorithms for building predictive advanced analytics models. "We want to give users an opportunity to learn about how artificial intelligence can help build those capabilities by giving them access to the unique tools and expertise offered by Unisys to help them remain competitive in the future," says Fontecilla.


IBM inches toward human-like accuracy for speech recognition

Engadget

Microsoft claimed to reach a 5.9 percent word error rate last October using neural language models resembling associative word clouds. "As part of our process in reaching today's milestone, we determined human parity is actually lower than what anyone has yet achieved -- at 5.1 percent," George Saon, IBM principal research scientist, wrote in a blog post this week. IBM reached the 5.5 percent milestone by combining so-called Long Short-Term Memory, an artificial neural network, and WaveNet language models with three strong acoustic models. SWITCHBOARD is not the industry standard for measuring human parity, however, which makes breakthroughs harder to achieve.
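
For reference, "word error rate," the metric cited throughout this entry and the next, is conventionally computed as the word-level edit distance (substitutions, insertions, and deletions) between the system's transcript and a human reference, divided by the number of reference words. The sketch below is a minimal illustration of that convention, not IBM's or Microsoft's evaluation code.

```python
# Minimal illustrative sketch of word error rate (WER): word-level edit
# distance (substitutions + insertions + deletions) divided by the number
# of reference words. Not IBM's or Microsoft's evaluation pipeline.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution in a six-word reference gives roughly a 16.7% WER.
print(word_error_rate("the cat sat on the mat", "the cat sat on a mat"))
```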


IBM hits new AI milestone with new industry record for speech recognition - Computer Business Review

#artificialintelligence

IBM reached a new AI milestone in speech recognition, achieving an industry record of 5.5% word error rate using the Switchboard linguistic corpus. IBM achieved another major AI milestone in conversational speech recognition last year with a computer system that reached a word error rate of 6.9%. IBM cognitive computing vice president Michael Karasick said: "These speech developments build on decades of research, and achieving speech recognition comparable to that of humans is a complex task. We believe it is only a matter of time before we achieve parity on speech recognition with humans."


Shakey is first robot to receive IEEE Milestone award

Robohub

Logical reasoning, autonomous plan creation, robust real-world plan execution, machine learning, computer vision, navigation, and communication in ordinary English were integrated in a physical system for the first time. In more specific technical terms, Shakey is historically significant for three distinct reasons: (1) Its control software was structured--a first for robots--in a layered architecture that became a model for subsequent robots; (2) Its computer vision, planning and navigation methods have been used not only in many subsequent robots but in a wide variety of consumer and industrial applications; and (3) Shakey served as an existence proof that encouraged later developers to build more advanced robots. The session included a discussion by a distinguished panel: Prof. Ruzena Bajcsy (UC Berkeley, Director of CITRIS), Rodney Brooks (former head of the CS/AI Lab at MIT, founder of both iRobot and Rethink Robotics), Peter Hart (Shakey project leader and the most-cited author in the field of Robotics according to Google Scholar), Nils Nilsson (Shakey project leader, former Chair of CS at Stanford), James Kuffner (Director of Robotics Research at Google), Prof. Benjamin Kuipers (University of Michigan), and Prof. Manuela Veloso (endowed Chair in AI and Robotics at CMU). As cast on a bronze plaque at SRI International, the IEEE Milestone's citation reads: "Stanford Research Institute's Artificial Intelligence Center developed the world's first mobile, intelligent robot, SHAKEY."
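
As a purely illustrative aside, a "layered architecture" in this sense separates perception, deliberative planning, and plan execution into distinct software layers that exchange a shared world model. The sketch below shows that separation in miniature; every name and the trivial straight-line planner are hypothetical, not drawn from Shakey's actual software.

```python
# Purely illustrative sketch of a layered sense-plan-act control loop of the
# kind Shakey popularized. All names here are hypothetical, not Shakey's code.
from dataclasses import dataclass, field
from typing import List

@dataclass
class WorldModel:
    """Shared state: the robot's current belief about itself and its surroundings."""
    robot_pos: tuple = (0, 0)
    obstacles: set = field(default_factory=set)

def sense(model: WorldModel, observation: set) -> WorldModel:
    """Perception layer: fold newly observed obstacles into the world model."""
    model.obstacles |= observation
    return model

def plan(model: WorldModel, goal: tuple) -> List[tuple]:
    """Deliberative layer: produce a waypoint plan (trivial straight-line stub)."""
    x, y = model.robot_pos
    gx, gy = goal
    path = []
    while (x, y) != (gx, gy):
        x += (gx > x) - (gx < x)
        y += (gy > y) - (gy < y)
        if (x, y) in model.obstacles:
            return []  # a real system would replan around the obstacle here
        path.append((x, y))
    return path

def act(model: WorldModel, path: List[tuple]) -> None:
    """Execution layer: step along the plan, updating the robot's position."""
    for waypoint in path:
        model.robot_pos = waypoint

if __name__ == "__main__":
    world = WorldModel()
    world = sense(world, {(1, 0)})
    steps = plan(world, goal=(3, 3))
    act(world, steps)
    print(world.robot_pos)  # (3, 3) if a path was found, otherwise unchanged
```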