Should you still learn a second language if AI can translate for you?

New Scientist

AI translation apps can help you connect with other people – but at what cost? I have long remembered a conversation I had 20 years ago with one of my professors, an expert in what we then called artificial intelligence, which, in many ways, is wildly different to what we now call AI. In this exchange, he confidently told me there was no point learning a second language. Computers would soon erase language barriers, he said.


'Don't ask what AI can do for us, ask what it is doing to us': are ChatGPT and co harming human intelligence?

The Guardian

Imagine for a moment you are a child in 1941, sitting the common entrance exam for public schools with nothing but a pencil and paper. You read the following: "Write, for no more than a quarter of an hour, about a British author." Today, most of us wouldn't need 15 minutes to ponder such a question. We'd get the answer instantly by turning to AI tools such as Google Gemini, ChatGPT or Siri. Offloading cognitive effort to artificial intelligence has become second nature, but with mounting evidence that human intelligence is declining, some experts fear this impulse is driving the trend.


Evaluating Multimodal Generative AI with Korean Educational Standards

Park, Sanghee, Kim, Geewook

arXiv.org Artificial Intelligence

This paper presents the Korean National Educational Test Benchmark (KoNET), a new benchmark designed to evaluate Multimodal Generative AI Systems using Korean national educational tests. KoNET comprises four exams: the Korean Elementary General Educational Development Test (KoEGED), Middle (KoMGED), High (KoHGED), and College Scholastic Ability Test (KoCSAT). These exams are renowned for their rigorous standards and diverse questions, facilitating a comprehensive analysis of AI performance across different educational levels. By focusing on Korean, KoNET provides insights into model performance in less-explored languages. We assess a range of models - open-source, open-access, and closed APIs - by examining difficulties, subject diversity, and human error rates. The code and dataset builder will be made fully open-source at https://github.com/naver-ai/KoNET.


Unknown Word Detection for English as a Second Language (ESL) Learners Using Gaze and Pre-trained Language Models

Ding, Jiexin, Zhao, Bowen, Wang, Yuntao, Liu, Xinyun, Hao, Rui, Chatterjee, Ishan, Shi, Yuanchun

arXiv.org Artificial Intelligence

English as a Second Language (ESL) learners often encounter unknown words that hinder their text comprehension. Automatically detecting these words as users read can enable computing systems to provide just-in-time definitions, synonyms, or contextual explanations, thereby helping users learn vocabulary in a natural and seamless manner. This paper presents EyeLingo, a transformer-based machine learning method that predicts the probability of unknown words based on text content and eye gaze trajectory in real time with high accuracy. A 20-participant user study revealed that our method can achieve an accuracy of 97.6%, and an F1-score of 71.1%. We implemented a real-time reading assistance prototype to show the effectiveness of EyeLingo. The user study shows improvement in willingness to use and usefulness compared to baseline methods.
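The core idea in the abstract - scoring how likely a word is unknown from text content plus gaze behavior - can be illustrated with a toy sketch. This is not the EyeLingo implementation (which uses a transformer over text and gaze trajectories); the features, weights, and bias below are made-up stand-ins to show how gaze signals such as fixation duration and re-reads could feed a probability estimate.

```python
import math

def word_features(word, fixation_ms, regressions):
    """Toy feature vector: word length, fixation duration, number of re-reads."""
    return [len(word) / 10.0, fixation_ms / 1000.0, regressions / 3.0]

def unknown_prob(features, weights, bias=0.0):
    """Logistic score over the combined text and gaze features."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative hand-set weights; a real system learns these from data.
weights = [1.5, 2.0, 1.0]

# A short, common word read quickly vs. a rare word the reader dwelt on.
short_common = unknown_prob(word_features("the", 120, 0), weights, bias=-3.0)
long_dwelled = unknown_prob(word_features("abstruse", 900, 2), weights, bias=-3.0)
```

Longer fixations and repeated regressions push the score up, which is the intuition behind using gaze as a just-in-time signal for vocabulary help.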


The ChatGPT secret: is that text message from your friend, your lover – or a robot?

The Guardian

When Tim first tried ChatGPT, he wasn't very impressed. He had a play around, but ended up cancelling his subscription. Then he started having marriage troubles. Seeking to alleviate his soul-searching and sleepless nights, he took up journalling and found it beneficial. From there, it was a small step to unburdening himself to the chatbot, he says: "ChatGPT is the perfect journal – because it will talk back." Tim started telling the platform about himself, his wife, Jill, and their recurring conflicts.


Do Language Models Have a Critical Period for Language Acquisition?

Constantinescu, Ionut, Pimentel, Tiago, Cotterell, Ryan, Warstadt, Alex

arXiv.org Artificial Intelligence

Humans appear to have a critical period (CP) for language acquisition: Second language (L2) acquisition becomes harder after early childhood, and ceasing exposure to a first language (L1) after this period (but not before) typically does not lead to substantial loss of L1 proficiency. It is unknown whether these CP effects result from innately determined brain maturation or as a stabilization of neural connections naturally induced by experience. In this study, we use language models (LMs) to test the extent to which these phenomena are peculiar to humans, or shared by a broader class of language learners. We vary the age of exposure by training LMs on language pairs in various experimental conditions, and find that LMs, which lack any direct analog to innate maturational stages, do not show CP effects when trained sequentially on L1 and L2. Our results contradict the claim that CP effects are an inevitable result of learning in statistical learners, and they are consistent with an innate mechanism for CP effects. We show that we can reverse-engineer the CP by introducing a regularizer partway through training to simulate a maturational decrease in plasticity. All in all, our results suggest that L1 learning on its own may not be enough to induce a CP, and additional engineering is necessary to make language models more cognitively plausible.
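The abstract's key intervention - a regularizer introduced partway through training to simulate a maturational drop in plasticity - can be sketched in miniature. This is not the paper's implementation; it is a one-dimensional toy in which, after a "maturation" step, an L2 penalty pulls parameters back toward their values at that step, so later "L2 language" learning is constrained by what was learned first.

```python
def train(params, grads_fn, steps, lr=0.1, maturation_step=None, lam=1.0):
    """Gradient descent; after maturation_step, penalize drift from an anchor."""
    anchor = None
    for t in range(steps):
        if maturation_step is not None and t == maturation_step:
            anchor = list(params)  # snapshot: plasticity decreases from here
        g = grads_fn(params)
        for i in range(len(params)):
            reg = lam * (params[i] - anchor[i]) if anchor is not None else 0.0
            params[i] -= lr * (g[i] + reg)
    return params

def sq_grad(target):
    """Gradient of the squared loss (p - target)^2, a stand-in 'language'."""
    return lambda p: [2.0 * (p[0] - target)]

# Phase 1: learn "L1" (target 1.0) from scratch.
p = train([0.0], sq_grad(1.0), steps=50)

# Phase 2: learn "L2" (target -1.0), with vs. without reduced plasticity.
p_cp = train(list(p), sq_grad(-1.0), steps=50, maturation_step=0, lam=4.0)
p_free = train(list(p), sq_grad(-1.0), steps=50)
```

The unregularized learner converges to the new target (analogous to sequential LMs showing no critical period), while the anchored learner settles at a compromise that stays close to its "L1" solution, mimicking a critical-period effect.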


Selective Forgetting Can Help AI Learn Better

WIRED

The original version of this story appeared in Quanta Magazine. A team of computer scientists has created a nimbler, more flexible type of machine learning model. The trick: It must periodically forget what it knows. And while this new approach won't displace the huge models that undergird the biggest apps, it could reveal more about how these programs understand language. The new research marks "a significant advance in the field," said Jea Kwon, an AI engineer at the Institute for Basic Science in South Korea.


GazeReader: Detecting Unknown Word Using Webcam for English as a Second Language (ESL) Learners

Ding, Jiexin, Zhao, Bowen, Huang, Yuqi, Wang, Yuntao, Shi, Yuanchun

arXiv.org Artificial Intelligence

Automatic unknown word detection techniques can enable new applications for assisting English as a Second Language (ESL) learners, thus improving their reading experiences. However, most modern unknown word detection methods require dedicated eye-tracking devices with high precision that are not easily accessible to end-users. In this work, we propose GazeReader, an unknown word detection method only using a webcam. GazeReader tracks the learner's gaze and then applies a transformer-based machine learning model that encodes the text information to locate the unknown word. We applied knowledge enhancement including term frequency, part of speech, and named entity recognition to improve the performance. The user study indicates that the accuracy and F1-score of our method were 98.09% and 75.73%, respectively. Lastly, we explored the design scope for ESL reading and discussed the findings.
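The "knowledge enhancement" the abstract mentions - term frequency, part of speech, and named-entity recognition - amounts to enriching each candidate word with linguistic features before classification. The sketch below is a hypothetical illustration, not GazeReader's pipeline; the frequency table and tags are made-up stand-ins for a real corpus table and tagger.

```python
import math

# Stand-in corpus frequencies (probability of the word in running text).
FREQ = {"the": 5e-2, "reads": 1e-3, "abstruse": 1e-7}

def enrich(word, pos_tag, is_entity):
    """Attach rarity, POS, and entity status to a candidate word."""
    rarity = -math.log10(FREQ.get(word.lower(), 1e-8))  # rarer => higher
    return {"word": word, "rarity": rarity, "pos": pos_tag, "entity": is_entity}

features = [enrich("the", "DET", False), enrich("abstruse", "ADJ", False)]
```

Rarity is the feature doing most of the work here: a rare content word is a far stronger unknown-word candidate than a frequent function word, and POS and entity flags let a classifier discount proper names the reader can skip.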


Should I Learn Coding as a Second Language?

WIRED

"I can't code, and this bums me out because--with so many books and courses and camps--there are so many opportunities to learn these days. I suspect I'll understand the machine revolution a lot better if I speak their language. Should I at least try?" Dear Decoder, Your desire to speak the "language" of machines reminds me of Ted Chiang's short story "The Evolution of Human Science." The story imagines a future in which nearly all academic disciplines have become dominated by superintelligent "metahumans" whose understanding of the world vastly surpasses that of human experts. Reports of new metahuman discoveries--although ostensibly written in English and published in scientific journals that anyone is welcome to read--are so complex and technically abstruse that human scientists have been relegated to a role akin to theologians, trying to interpret texts that are as obscure to them as the will of God was to medieval Scholastics.


Knowledge of foreign languages lasts a lifetime, new research shows

Daily Mail - Science & tech

While French is one of the most popular GCSEs in the UK, many Brits are nervous when it comes to using their language skills later in life. But a new study suggests there's nothing to fear - even if it has been decades since you last studied a foreign language. Researchers from the University of York have shown that people tested on foreign languages 50 years after they last sat any exam perform just as well as recent students. 'We often say if you don't use a language, you will lose it, but this doesn't seem to be the case,' said Professor Monika Schmid, Head of the University of York's Department of Language and Linguistics. During recent tests, experts from Abertay University in Dundee found that speaking more than one language didn't have any cognitive benefit.