With more than 40,000 third-party skills for Alexa users to choose from, developers could use some help getting their particular skills in front of those users. Amazon is offering that help in the form of a new interface that lets developers describe which requests their skills can fulfill. Amazon explained earlier that it's using machine learning to help users discover specific Alexa skills that can fulfill their requests, even if they don't know the name of the skill. Now, developers can input information into the new CanFulfillIntentRequest interface to augment that machine learning model. For instance, if a customer asks, "Alexa, where is the best surfing today near Santa Barbara?" Alexa would use CanFulfillIntentRequest to ask surfing skills whether they can understand and fulfill the request.
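In practice, a skill receives a CanFulfillIntentRequest like any other request and answers with a `canFulfillIntent` payload saying whether it can understand and serve the intent and its slots. Below is a minimal Lambda-style sketch; the response shape follows the Alexa Skills Kit conventions, but the intent name (`GetSurfConditions`) and the skill's logic are hypothetical, not Amazon's implementation.

```python
# Sketch of a skill handler answering Alexa's CanFulfillIntentRequest.
# The request/response JSON shapes follow the Alexa Skills Kit docs;
# the intent name "GetSurfConditions" is a hypothetical example.

def handler(event, context=None):
    request = event["request"]
    if request["type"] == "CanFulfillIntentRequest":
        intent = request["intent"]
        if intent["name"] == "GetSurfConditions":
            # Claim every slot in the request as understood and servable.
            slots = {
                name: {"canUnderstand": "YES", "canFulfill": "YES"}
                for name in intent.get("slots", {})
            }
            can_fulfill = {"canFulfill": "YES", "slots": slots}
        else:
            # Decline intents this skill does not handle.
            can_fulfill = {"canFulfill": "NO"}
        return {
            "version": "1.0",
            "response": {"canFulfillIntent": can_fulfill},
        }
    # ...normal LaunchRequest / IntentRequest handling would go here...
```

Answering honestly matters here: a skill that claims requests it cannot actually serve would degrade the ranking Alexa's model assigns it.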
Alexa will soon be able to recall information you've directed her to remember, as well as have more natural conversations that don't require every command to begin with "Alexa." She'll also be able to launch skills in response to questions you ask, without explicit instructions to do so. The features are the first of what Amazon says are many launches this year that will make its virtual assistant more personalized, smarter, and more engaging. The news was announced this morning in a keynote presentation from the head of the Alexa Brain group, Ruhi Sarikaya, speaking at the World Wide Web Conference in Lyon, France. He explained that the Alexa Brain initiative is focused on improving Alexa's ability to track context and memory within and across dialog sessions, as well as on making it easier for users to discover and interact with Alexa's more than 40,000 third-party skills.
Amazon's Alexa voice assistant has so many skills that it can sometimes be hard to know exactly how to activate them. But the company is building a new way for people to discover skills without having to request them specifically by name. Amazon announced today that it's launching a new beta feature called the CanFulfillIntentRequest interface, which developers can use to make their Alexa skills more easily discoverable. The tool uses machine learning to find skills that can help fulfill a person's request when they don't know exactly what they're looking for. For example, if they ask, "Alexa, where are the best surfing spots today near Santa Barbara?" the voice assistant will scour the various surfing skills on the platform to determine whether they can understand and help process the request.
Voice services like Alexa have traditionally supported small numbers of well-separated domains, such as calendar or weather. In an effort to extend Alexa's capabilities, Amazon in 2015 released the Alexa Skills Kit, so third-party developers could add to Alexa's voice-driven capabilities. We refer to these new third-party capabilities as skills, and Alexa currently has more than 40,000. Four out of five Alexa customers with an Echo device have used a third-party skill, but we are always looking for ways to make it easier for customers to find and engage with skills. For example, we recently announced that we are moving toward skill invocation that doesn't require mentioning a skill by name.
Amazon is introducing a feature that will allow Alexa to suggest voice apps when you don't know which app to ask for. Instead of naming a skill, you will be able to simply tell Alexa what you want to accomplish and receive recommendations of skills to help you achieve that goal. Now in beta, the CanFulfillIntentRequest interface can be used by the creators of Alexa skills in the U.S. to alert Amazon to the kinds of questions their skill may be able to answer or queries the skill is able to fulfill. Amazon's Alexa Skill Store now has more than 40,000 voice apps, and getting users' attention can be something of a challenge. The intent request introduced today should help users discover more voice apps and connect developers making skills with a larger audience.
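Before a skill starts receiving these requests, the developer has to opt in by declaring the interface in the skill's manifest. A minimal sketch of the relevant `skill.json` fragment, assuming the standard Alexa Skills Kit manifest layout (surrounding fields omitted):

```json
{
  "manifest": {
    "apis": {
      "custom": {
        "interfaces": [
          { "type": "CAN_FULFILL_INTENT_REQUEST" }
        ]
      }
    }
  }
}
```

With this declared, Alexa can probe the skill with CanFulfillIntentRequest queries during name-free discovery, alongside the intent and slot information the developer has already modeled.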