mPMR: A Multilingual Pre-trained Machine Reader at Scale

Xu, Weiwen, Li, Xin, Lam, Wai, Bing, Lidong

arXiv.org Artificial Intelligence

We present the multilingual Pre-trained Machine Reader (mPMR), a novel method for multilingual machine reading comprehension (MRC)-style pre-training. mPMR guides multilingual pre-trained language models (mPLMs) to perform natural language understanding (NLU), covering both sequence classification and span extraction, in multiple languages. To achieve cross-lingual generalization when only source-language fine-tuning data is available, existing mPLMs solely transfer NLU capability from a source language to target languages. In contrast, mPMR allows the direct inheritance of multilingual NLU capability from MRC-style pre-training to downstream tasks, and therefore acquires better NLU capability for target languages. mPMR also provides a unified solver for cross-lingual span extraction and sequence classification, enabling the extraction of rationales that explain the sentence-pair classification process.
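To make the "unified solver" idea concrete, here is a minimal, hypothetical sketch of how a single MRC-style decoding step can serve both tasks. The function names, the score matrix, and the convention that extracting the [CLS] position signals a whole-sequence label are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch: one span-extraction decoder serving both
# sequence classification and span extraction (not the mPMR codebase).

def best_span(scores, max_len=10):
    """Return the (start, end) pair maximizing scores[start][end],
    subject to a maximum span length."""
    best, best_pair = float("-inf"), (0, 0)
    n = len(scores)
    for s in range(n):
        for e in range(s, min(s + max_len, n)):
            if scores[s][e] > best:
                best, best_pair = scores[s][e], (s, e)
    return best_pair

def mrc_solve(tokens, scores):
    """Unified decoding convention (assumed for illustration):
    position 0 is [CLS]; extracting the span (0, 0) means the label
    applies to the whole sequence (classification), while any other
    span is an extracted answer, which doubles as a rationale."""
    s, e = best_span(scores)
    if (s, e) == (0, 0):
        return ("sequence_label", None)
    return ("span", tokens[s:e + 1])
```

Under this convention, a sentence-pair classification decision and its supporting rationale come from the same decoder: when the model extracts a non-[CLS] span, that span is the evidence for the prediction.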


All we are is words...

#artificialintelligence

Welcome to our first newsletter! This weekly newsletter tracks the progress of natural-language-understanding-first (NLU-first) companies and the impacts and opportunities they are creating, both commercially and socially. How do we define NLU? A quick step back for an overview: for some time now we've been trying to create a non-human entity (a box, a thing, a machine) that can hear us, understand us, and speak back to us in a manner indistinguishable from a human. This was the original framing of the Turing test: no humanoid figure to go along with the hearing, understanding, and talking.