Language Learning


AI-powered language learning promises to fast-track fluency

#artificialintelligence

Learning a new language is hard. Some people seem to pick up new languages with ease, but for the rest of us it's a trudge through rote memorization. A linguistics company is using AI to shorten that process: traditional methods take about 200 hours to reach basic proficiency in a new language, yet this AI-powered platform claims it can take a learner from beginner to fluency in just a few months through once-daily 20-minute lessons.
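To put that claim in perspective, here is a rough back-of-the-envelope comparison. The article says only "a few months", so the three-to-four-month course length below is an assumption, not a figure from the platform.

```python
# Rough comparison of total study time: traditional methods vs. the
# claimed schedule of one 20-minute lesson per day.
# The 3-4 month course length is an assumption; the article only says
# "a few months".

TRADITIONAL_HOURS = 200      # approximate hours to basic proficiency
LESSON_MINUTES = 20          # length of one daily lesson
DAYS_PER_MONTH = 30

for months in (3, 4):        # assumed course lengths
    total_hours = months * DAYS_PER_MONTH * LESSON_MINUTES / 60
    print(f"{months} months of daily lessons ~= {total_hours:.0f} hours "
          f"(vs. ~{TRADITIONAL_HOURS} hours traditionally)")
```

Even on generous assumptions, the claimed schedule adds up to roughly 30 to 40 hours of study, a fraction of the 200-hour benchmark, which is what makes the promise striking.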


Mondly releases language-learning voice chatbot

ZDNet

Brasov, Transylvania-based Mondly has released a voice chatbot to help users learn languages. The chatbot relies on a speech recognition engine that lets users practice speaking a foreign language and receive adaptive audio-visual responses. The app uses speech recognition technology from Nuance, enhanced with an object recognition engine that identifies objects in the text, and it understands millions of phrases in 33 languages.
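The article describes only the high-level pipeline: speech is recognized, then the chatbot picks an adaptive reply from a large phrase bank. As a minimal illustration of that kind of loop, here is a sketch in Python; the phrase bank, the text stand-in for the speech recognizer, and all names are hypothetical and not Mondly's actual API.

```python
import difflib

# Tiny stand-in phrase bank; a production system would cover millions of
# phrases across 33 languages, as the article describes.
PHRASE_BANK = {
    "hola, como estas?": "Muy bien, gracias. Y tu?",
    "quisiera un cafe, por favor": "Claro! Lo quiere con leche o solo?",
    "donde esta la estacion?": "Siga recto dos calles y gire a la izquierda.",
}

def recognize_speech() -> str:
    """Stand-in for a speech recognition engine (e.g. Nuance's, per the
    article). Here we simply read typed text instead of audio."""
    return input("You (speak/type): ").strip().lower()

def respond(utterance: str) -> str:
    """Fuzzy-match the recognized utterance against the phrase bank and
    return an adaptive reply, or a gentle prompt to try again."""
    match = difflib.get_close_matches(utterance, PHRASE_BANK, n=1, cutoff=0.6)
    if match:
        return PHRASE_BANK[match[0]]
    return "Lo siento, no entendi. Puede repetirlo?"

if __name__ == "__main__":
    print("Practice chatbot (type 'quit' to stop)")
    while True:
        said = recognize_speech()
        if said == "quit":
            break
        print("Bot:", respond(said))
```

A real system would replace the typed-text stand-in with streaming speech recognition and would return audio-visual responses rather than plain text.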


What's universal grammar? Evidence rebuts Chomsky's theory of language learning

#artificialintelligence

This article was originally published by Scientific American. The idea that we have brains hardwired with a mental template for learning grammar -- famously espoused by Noam Chomsky of the Massachusetts Institute of Technology -- has dominated linguistics for almost half a century. Recently, though, cognitive scientists and linguists have abandoned Chomsky's "universal grammar" theory in droves because of new research examining many different languages -- and the way young children learn to understand and speak the tongues of their communities. That work fails to support Chomsky's assertions. The research suggests a radically different view, in which learning of a child's first language does not rely on an innate grammar module. Instead the new research shows that young children use various types of thinking that may not be specific to language at all -- such as the ability to classify the world into categories (people or objects, for instance) and to understand the relations among things. These capabilities, coupled with a unique human ability to grasp what others intend to communicate, allow language to happen. The new findings indicate that if researchers truly want to understand how children, and others, learn languages, they need to look outside of Chomsky's theory for guidance.

