Language Learning


AI-powered language learning promises to fast-track fluency

#artificialintelligence

A linguistics company is using AI to shorten the time it takes to learn a new language. With traditional methods, it takes about 200 hours to gain basic proficiency in a new language. This AI-powered platform claims it can take learners from beginner to fluency in just a few months, through once-daily, 20-minute lessons. Learning a new language is hard: some people seem to pick up new languages with ease, but for the rest of us it's a trudge through rote memorization.


Machines Are Developing Language Skills Inside Virtual Worlds

MIT Technology Review

Both the DeepMind and CMU approaches use deep reinforcement learning, popularized by DeepMind's Atari-playing AI. A neural network is fed raw pixel data from a virtual environment and uses rewards, like points in a computer game, to learn by trial and error (see "10 Breakthrough Technologies 2017: Reinforcement Learning"). By running through millions of training scenarios at accelerated speeds, both AI programs learned to associate words with particular objects and characteristics, which let them follow the commands. The millions of training runs required are why Domingos is not convinced pure deep reinforcement learning will ever crack the real world.
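
To make the trial-and-error idea concrete, here is a minimal sketch of reward-driven learning: tabular Q-learning on a toy one-dimensional world in which an agent, given a command word, learns over repeated episodes which direction leads to the named object. The world, rewards and hyperparameters are illustrative assumptions, not DeepMind's or CMU's setup; the real systems learn from raw pixels with deep networks rather than a lookup table.

```python
# Toy sketch of reward-driven trial and error (tabular Q-learning), standing in for
# the deep reinforcement learning described above. The agent "hears" a command word,
# moves left or right on a 5-cell line, and is rewarded only when it reaches the
# object that word names.
import random

WORLD = {"ball": 0, "box": 4}      # object name -> position on the line
ACTIONS = [-1, +1]                 # step left, step right
Q = {}                             # Q[(command, position, action)] -> value estimate

def q(cmd, pos, a):
    return Q.get((cmd, pos, a), 0.0)

alpha, gamma, eps = 0.5, 0.9, 0.2  # learning rate, discount, exploration rate
for episode in range(5000):
    cmd = random.choice(list(WORLD))
    pos = 2                        # always start in the middle cell
    for step in range(10):
        # epsilon-greedy: mostly exploit what was learned, sometimes explore
        a = random.choice(ACTIONS) if random.random() < eps else max(ACTIONS, key=lambda x: q(cmd, pos, x))
        new_pos = min(4, max(0, pos + a))
        reward = 1.0 if new_pos == WORLD[cmd] else 0.0
        best_next = max(q(cmd, new_pos, x) for x in ACTIONS)
        Q[(cmd, pos, a)] = q(cmd, pos, a) + alpha * (reward + gamma * best_next - q(cmd, pos, a))
        pos = new_pos
        if reward:                 # reached the named object: end the episode
            break

# After training, the learned policy from the start cell moves toward the named object.
print({c: max(ACTIONS, key=lambda x: q(c, 2, x)) for c in WORLD})   # e.g. {'ball': -1, 'box': 1}
```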


Automatic sign language translators turn signing into text

New Scientist

Machine translation systems that convert sign language into text and back again are helping people who are deaf or have difficulty hearing to communicate with those who cannot sign. A sign language user can approach a bank teller and sign to the KinTrans camera that they'd like assistance, for example. KinTrans's machine learning algorithm translates each sign as it is made and then a separate algorithm turns those signs into a sentence that makes grammatical sense. KinTrans founder Mohamed Elwazer says his system can already recognise thousands of signs in both American and Arabic sign language with 98 per cent accuracy.
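
As a rough illustration of the two-stage design described above (one component labels each sign as it is made, a second turns the resulting gloss sequence into a grammatical sentence), here is a toy sketch. KinTrans's actual models and interfaces are not public, so every function, feature vector and rewrite rule below is a hypothetical stand-in.

```python
# Hypothetical two-stage sign-to-text pipeline: stage 1 labels each sign,
# stage 2 turns the gloss sequence into a grammatical English sentence.
from typing import List

def classify_sign(frame_features: tuple) -> str:
    """Stage 1 (stand-in): map tracked hand features to a sign gloss."""
    toy_model = {(0.1, 0.9): "ME", (0.4, 0.2): "WANT", (0.8, 0.7): "HELP"}
    return toy_model.get(frame_features, "UNKNOWN")

def glosses_to_sentence(glosses: List[str]) -> str:
    """Stage 2 (stand-in): rewrite a gloss sequence into English word order."""
    rules = {("ME", "WANT", "HELP"): "I would like some assistance."}
    return rules.get(tuple(glosses), " ".join(glosses).capitalize() + ".")

tracked_frames = [(0.1, 0.9), (0.4, 0.2), (0.8, 0.7)]      # made-up camera features
glosses = [classify_sign(f) for f in tracked_frames]
print(glosses_to_sentence(glosses))                         # -> "I would like some assistance."
```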


From machine learning to Python language skills: 6 tech skill sets that fetch maximum salary

#artificialintelligence

With both machine learning and data analytics skill sets, one can easily fetch an average pay of Rs 13.94 lakh per annum (LPA). Although knowledge of machine learning algorithms does add to the highest packages, that skill set alone can fetch a handsome Rs 10.43 LPA on average. If the latest Analytics India Industry Report 2017 – Salaries & Trends report is anything to go by, one could make an average of Rs 10.40 LPA with exceptional R language skills. And professionals skilled in Python, one of the most popular programming languages, can make around Rs 10.12 LPA on average.


Investigating Bias In AI Language Learning

#artificialintelligence

We recommend addressing such bias through the explicit characterization of acceptable behavior. One such approach is seen in the nascent field of fairness in machine learning, which specifies and enforces mathematical formulations of nondiscrimination in decision-making. Another approach can be found in modular AI architectures, such as cognitive systems, in which implicit learning of statistical regularities can be compartmentalized and augmented with explicit instruction of rules of appropriate conduct. Certainly, caution must be used in incorporating modules constructed via unsupervised machine learning into decision-making systems.
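
As a concrete example of what a "mathematical formulation of nondiscrimination" can look like, the sketch below checks a set of decisions for demographic parity and flags the decision rule when positive-outcome rates differ across groups by more than a tolerance. The data and the threshold are illustrative assumptions, not taken from the article.

```python
# Demographic parity check: an explicit, enforceable statement of acceptable behavior.
def demographic_parity_gap(decisions, groups):
    """decisions: list of 0/1 outcomes; groups: parallel list of group labels."""
    rates = {}
    for g in set(groups):
        outcomes = [d for d, gg in zip(decisions, groups) if gg == g]
        rates[g] = sum(outcomes) / len(outcomes)
    return max(rates.values()) - min(rates.values()), rates

decisions = [1, 0, 1, 1, 0, 1, 0, 0]            # hypothetical model decisions
groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]
gap, rates = demographic_parity_gap(decisions, groups)
print(rates, "gap =", gap)
if gap > 0.2:                                    # tolerance chosen for illustration only
    print("Decision rule violates the stated nondiscrimination constraint.")
```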


Reimagining Language Learning with NLP and Reinforcement Learning

#artificialintelligence

The way we learn natural languages hasn't really changed for decades. We now have beautiful apps like Duolingo and spaced-repetition software like Anki, but I'm talking about our fundamental approach. We still follow pre-defined curricula and do essentially random exercises. Learning isn't personalized, and learning isn't driven by data. And I think there's a big opportunity to change that.
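
For readers unfamiliar with how spaced-repetition tools like Anki already use data, here is a simplified sketch loosely based on the SM-2 family of scheduling algorithms: review intervals grow or shrink depending on how well a card was recalled. This is only the baseline the post wants to move beyond, not the author's proposed NLP/reinforcement-learning approach.

```python
# Simplified spaced-repetition scheduling, loosely in the style of SM-2 (not exact).
def next_review(interval_days: float, ease: float, quality: int):
    """quality: self-graded recall from 0 (forgot) to 5 (perfect)."""
    if quality < 3:                              # failed recall: relearn from a short interval
        return 1.0, max(1.3, ease - 0.2)
    ease = max(1.3, ease + 0.1 - (5 - quality) * 0.08)
    return interval_days * ease, ease

interval, ease = 1.0, 2.5
for grade in [5, 4, 3, 5]:                       # a learner's recall grades over four reviews
    interval, ease = next_review(interval, ease, grade)
    print(f"next review in {interval:.1f} days (ease {ease:.2f})")
```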


AI computer learns to speak like a four-year-old child

AITopics Original Links

A computer that learns to talk in the same way as a young child, by holding conversations with humans, has been developed by scientists. The machine, which uses cutting-edge artificial neural network technology to mimic the way the human brain works, was given 1,500 sentences from literature about language structure. It was then able to use these to learn how to construct new sentences with nouns, verbs, adjectives and pronouns when having a conversation with a real human. Researchers used connections between two million artificial neurons to mimic some of the processes that take place in the human brain as we learn to speak. While some of the sentences had the functional tone of a computer rather than the finesse of a natural speaker, the results are still impressive.
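
The system described here is a large artificial neural network, but the underlying idea, inferring how sentences are put together from example sentences rather than from hand-written grammar rules, can be hinted at with a far simpler stand-in. The bigram sketch below learns word-to-word transitions from a tiny made-up corpus and samples new sentences from them; it is nothing like the actual two-million-connection network.

```python
# Far simpler stand-in for "learning to construct sentences from examples":
# a bigram model that records which words follow which, then samples new sentences.
import random
from collections import defaultdict

corpus = [
    "the child learns a new word",
    "the machine learns a new sentence",
    "a child speaks a short sentence",
]

follows = defaultdict(list)
for sentence in corpus:
    words = ["<s>"] + sentence.split() + ["</s>"]
    for a, b in zip(words, words[1:]):
        follows[a].append(b)                     # record observed word-to-word transitions

def generate():
    word, out = "<s>", []
    while True:
        word = random.choice(follows[word])
        if word == "</s>":
            return " ".join(out)
        out.append(word)

random.seed(0)
print(generate())                                # e.g. "the machine learns a new word"
```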


Kinect sensor can translate sign language into SPEECH and TEXT

AITopics Original Links

Microsoft's Kinect has already proved its credentials in reading simple hand and body movements in the gaming world. But now a team of Chinese researchers have added sign language to its motion-sensing capabilities. Scientists at Microsoft Research Asia recently demonstrated software that allows Kinect to read sign language using hand tracking. What's impressive is that it can do this in real-time, translating sign language to spoken language and vice versa at conversational speeds. The system, dubbed the Kinect Sign Language Translator, is capable of capturing a conversation from both sides.


IoT - A Support Vector Machine Implementation for Sign Language Recognition on Intel Edison.

#artificialintelligence

This project explores the development of a sign-language-to-speech translation glove by implementing a Support Vector Machine (SVM) on the Intel Edison to recognize various letters signed by sign language users. Support Vector Machines (SVMs) are supervised machine learning models with associated learning algorithms that analyze data used for classification and regression analysis. Given a set of training examples, each marked as belonging to one of two categories, an SVM training algorithm builds a model that assigns new examples to one category or the other, making it a non-probabilistic binary linear classifier. It is important that users set their own preferred interval, since a user just beginning to learn sign language will sign at a slower rate than a sign language expert.
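
As a hedged sketch of the SVM classification step described above, the example below trains scikit-learn's SVC on made-up flex-sensor feature vectors and predicts the letter for a new glove reading. It is not the project's Edison code; the sensor values, labels and kernel choice are illustrative assumptions.

```python
# Illustrative SVM letter classifier for glove readings (not the project's actual code).
from sklearn.svm import SVC

# Each training example: five flex-sensor readings (one per finger), labeled with the letter signed.
X_train = [
    [0.9, 0.9, 0.9, 0.9, 0.1],   # hypothetical 'A': four fingers curled, thumb extended
    [0.1, 0.1, 0.1, 0.1, 0.9],   # hypothetical 'B': fingers extended, thumb curled
    [0.5, 0.5, 0.5, 0.5, 0.5],   # hypothetical 'C': all fingers half curled
]
y_train = ["A", "B", "C"]

clf = SVC(kernel="linear")       # a linear SVM, as in the classic formulation quoted above
clf.fit(X_train, y_train)

reading = [[0.85, 0.92, 0.88, 0.9, 0.15]]        # a new glove sample to classify
print(clf.predict(reading))                       # expected: ['A']
```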