Language Learning


How AI could help you learn sign language

#artificialintelligence

Sign languages aren't easy to learn and are even harder to teach. They use not just hand gestures but also mouthings, facial expressions and body posture to communicate meaning. This complexity means professional teaching programs are still rare and often expensive. But this could all change soon, with a little help from artificial intelligence (AI). My colleagues and I are working on software for teaching yourself sign languages in an automated, intuitive way.


Spatial-Temporal Graph Convolutional Networks for Sign Language Recognition

arXiv.org Machine Learning

The recognition of sign language is a challenging task with an important role in society: facilitating communication for deaf people. We propose a new Spatial-Temporal Graph Convolutional Network approach to sign language recognition based on human skeletal movements. The method uses graphs to capture the dynamics of signs in two dimensions, spatial and temporal, considering the complex aspects of the language. Additionally, we present a new dataset of human skeletons for sign language, based on ASLLVD, to contribute to future related studies.

Sign language is a visual communication skill that enables individuals with different types of hearing impairment to communicate in society. It is the language used by most deaf people in their daily lives and, moreover, it is the symbol of identification between the members of that community and the main force that unites them. Sign language has a very close relationship with the culture of a country, or even of its regions, and for this reason each nation has its own language [1]. According to the World Health Organization, the number of deaf people is about 466 million, and the organization estimates that by 2050 this number will exceed 900 million, equivalent to a forecast of 1 in 10 individuals around the world [2].
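
The core idea is easy to sketch: a graph convolution mixes features across skeleton joints along the adjacency structure, and a temporal convolution then aggregates each joint's features across frames. Below is a minimal PyTorch illustration of such a spatial-temporal block; the adjacency matrix, channel sizes, and joint count are placeholders, not the configuration from the paper.

```python
# Minimal sketch of one spatial-temporal graph convolution block.
# The adjacency, layer sizes, and joint count are illustrative only.
import torch
import torch.nn as nn

class STGCNBlock(nn.Module):
    def __init__(self, in_channels, out_channels, adjacency, kernel_t=9):
        super().__init__()
        # Normalized skeleton adjacency (joints x joints), stored as a buffer
        # so it moves with the module across devices.
        self.register_buffer("A", adjacency)
        # 1x1 convolution implements the per-joint spatial feature transform.
        self.spatial = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        # Temporal convolution aggregates each joint's features over frames.
        self.temporal = nn.Conv2d(
            out_channels, out_channels,
            kernel_size=(kernel_t, 1), padding=(kernel_t // 2, 0),
        )
        self.relu = nn.ReLU()

    def forward(self, x):
        # x: (batch, channels, frames, joints)
        x = self.spatial(x)
        # Propagate features along skeleton edges by mixing the joint axis.
        x = torch.einsum("nctv,vw->nctw", x, self.A)
        return self.relu(self.temporal(x))

# Toy usage: 2D joint coordinates for 25 joints over 30 frames.
num_joints = 25
A = torch.eye(num_joints)  # placeholder; a real model uses the skeleton graph
block = STGCNBlock(in_channels=2, out_channels=64, adjacency=A)
out = block(torch.randn(1, 2, 30, num_joints))  # -> (1, 64, 30, 25)
```

Stacking several such blocks and pooling over frames and joints yields a clip-level representation that a classifier can map to a sign label.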


BabyAI: First Steps Towards Grounded Language Learning With a Human In the Loop

arXiv.org Artificial Intelligence

Allowing humans to interactively train artificial agents to understand language instructions is desirable for both practical and scientific reasons, but given the poor data efficiency of the current learning methods, this goal may require substantial research efforts. Here, we introduce the BabyAI research platform to support investigations towards including humans in the loop for grounded language learning. The BabyAI platform comprises an extensible suite of 19 levels of increasing difficulty. The levels gradually lead the agent towards acquiring a combinatorially rich synthetic language which is a proper subset of English. The platform also provides a heuristic expert agent for the purpose of simulating a human teacher. We report baseline results and estimate the amount of human involvement that would be required to train a neural network-based agent on some of the BabyAI levels. We put forward strong evidence that current deep learning methods are not yet sufficiently sample efficient when it comes to learning a language with compositional properties.

How can a human train an intelligent agent to understand natural language instructions? We believe that this research question is important from both technological and scientific perspectives. No matter how advanced AI technology becomes, human users may want to customize their intelligent helpers to be able to better understand their desires and needs.
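
For readers who want to poke at the platform, the levels are exposed as gym environments, each yielding a synthetic-language "mission" alongside the visual observation. The loop below is a minimal sketch against the classic gym API the project shipped with; the environment id and observation keys follow the project's README conventions and may differ across versions.

```python
# Minimal sketch of interacting with a BabyAI level through the classic
# gym API. The env id and observation keys may differ across versions.
import gym
import babyai  # importing babyai registers the BabyAI-* environments

env = gym.make("BabyAI-GoToRedBall-v0")
obs = env.reset()
print(obs["mission"])  # the synthetic instruction, e.g. "go to the red ball"

done, episode_return = False, 0.0
while not done:
    action = env.action_space.sample()  # a real agent would condition on obs
    obs, reward, done, info = env.step(action)
    episode_return += reward
print("episode return:", episode_return)
```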


How we used AI to translate sign language in real time

#artificialintelligence

Using artificial intelligence to translate sign language in real time - see how we used Python to train a neural network to 86% accuracy in less than a day. Imagine a world where anyone can communicate using sign language over video. Inspired by this vision, some of our engineering team decided to bring this idea to HealthHack 2018. In less than 48 hours and using the power of artificial intelligence, their team was able to produce a working prototype which translated signs from the Auslan alphabet to English text in real time. People who are hearing impaired are often left behind in video consultations.
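
The article does not spell out the model, but the task it describes, mapping a webcam frame of the hands to one of the 26 letters of the Auslan alphabet, is a standard image-classification setup. The Keras sketch below shows the general shape of such a classifier; the architecture, input size, and training pipeline are illustrative assumptions, not the hackathon team's actual code.

```python
# Minimal sketch of a fingerspelling classifier: a small CNN mapping a
# cropped hand image to one of 26 Auslan alphabet signs. All sizes and
# layers here are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 26  # one per letter of the Auslan alphabet

model = models.Sequential([
    layers.Input(shape=(64, 64, 3)),          # cropped webcam frame
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=10, validation_split=0.1)
```

For real-time use, each incoming frame would be cropped to the hands, passed through the model, and the argmax class rendered as text.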


Large-scale Cloze Test Dataset Created by Teachers

arXiv.org Artificial Intelligence

Cloze tests are widely adopted in language exams to evaluate students' language proficiency. In this paper, we propose the first large-scale human-created cloze test dataset, CLOTH, containing questions used in middle-school and high-school language exams. With missing blanks carefully created by teachers and candidate choices purposely designed to be nuanced, CLOTH requires a deeper language understanding and a wider attention span than previous automatically-generated cloze datasets. We test the performance of purpose-built baseline models, including a language model trained on the One Billion Word Corpus, and show that humans outperform them by a significant margin. We investigate the source of the performance gap, trace model deficiencies to some distinct properties of CLOTH, and identify the models' limited ability to comprehend long-range context as the key bottleneck.
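
The standard language-model baseline for a cloze question is simple: substitute each candidate into the blank and keep the one the model scores as most probable. The sketch below illustrates that procedure with GPT-2 via Hugging Face for convenience; the paper's actual baseline was a language model trained on the One Billion Word Corpus, and the example question is invented, not drawn from CLOTH.

```python
# Minimal sketch of LM-based cloze scoring: fill the blank with each
# candidate and pick the completion with the highest log-probability.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def sentence_log_prob(text):
    # Total log-probability of the token sequence under the LM.
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits
    log_probs = torch.log_softmax(logits[:, :-1], dim=-1)
    token_lp = log_probs.gather(2, ids[:, 1:].unsqueeze(-1)).squeeze(-1)
    return token_lp.sum().item()

# An illustrative CLOTH-style question (not from the dataset).
question = "She opened the window to let in some fresh _."
choices = ["air", "water", "bread", "music"]
scores = {c: sentence_log_prob(question.replace("_", c)) for c in choices}
print(max(scores, key=scores.get))  # expected: "air"
```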


This Amazon Echo mod lets Alexa understand sign language

#artificialintelligence

It seems like voice interfaces are going to be a big part of the future of computing, popping up in phones, smart speakers, and even household appliances. But how useful is this technology for people who don't communicate using speech? Are we creating a system that locks out certain users? These were the questions that inspired software developer Abhishek Singh to create a mod that lets Amazon's Alexa assistant understand some simple sign language commands. In a video, Singh demonstrates how the system works.


Movies, Neural Networks Boost AI Language Skills - insideBIGDATA

#artificialintelligence

When we discuss artificial intelligence (AI), how are machines learning? What kinds of projects feed into greater understanding? For our friends over at IBM, one surprising answer is movies. To build smarter AI systems, IBM researchers are using movie plots and neural networks to explore new ways of enhancing the language understanding capabilities of AI models. IBM will present key findings from two papers on these topics at the Association for Computational Linguistics (ACL) annual meeting this week in Melbourne, Australia.


Memrise raises $15.5M as its AI-based language-learning app passes 35M users

#artificialintelligence

Memrise, a UK startup whose eponymous language-learning app employs machine learning and localised content to adapt to users' needs as they progress through their lessons, has raised another $15.5 million in funding to expand its product. The funding comes after a period of strong growth: Memrise has now passed 35 million users globally across its 20 language courses, and it tipped into profitability in Q1 of this year. Ed Cooke, who co-founded the app with Ben Whately and Greg Detre, told TechCrunch that this places it as the second-most popular language app globally in terms of both users and revenues. This round, a Series B, was led by Octopus Ventures and Korelya Capital, along with participation from existing investors Avalon Ventures and Balderton Capital. Memrise is not disclosing its valuation -- it has raised a relatively modest $22 million to date -- but Cooke (who is also the CEO) said the plan will be to use the funding to expand its AI platform and add in more features for users.