A Survey on Lexical Simplification

Journal of Artificial Intelligence Research

Lexical Simplification is the process of replacing complex words in a given sentence with simpler alternatives of equivalent meaning. This task has wide applicability both as an assistive technology for readers with cognitive impairments or disabilities, such as Dyslexia and Aphasia, and as a pre-processing tool for other Natural Language Processing tasks, such as machine translation and summarisation. The problem is commonly framed as a pipeline of four steps: the identification of complex words, the generation of substitution candidates, the selection of those candidates that fit the context, and the ranking of the selected substitutes according to their simplicity. In this survey we review the literature for each step in this typical Lexical Simplification pipeline and provide a benchmarking of existing approaches for these steps on publicly available datasets. We also provide pointers for datasets and resources available for the task.
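The four-step pipeline described above can be sketched in a few lines. This is a toy illustration only: the frequency list and synonym table are hypothetical stand-ins for real resources (corpus counts, WordNet, etc.), and real systems use far richer models at each step.

```python
# Toy resources (hypothetical): corpus frequencies and a synonym table.
WORD_FREQ = {"use": 900, "utilise": 5, "help": 800, "assist": 40,
             "big": 700, "colossal": 3}
SYNONYMS = {"utilise": ["use", "employ"], "colossal": ["big", "huge"]}

def identify_complex(tokens, threshold=50):
    # Step 1: treat low-frequency words as "complex".
    return [t for t in tokens if WORD_FREQ.get(t, 0) < threshold]

def generate_candidates(word):
    # Step 2: propose substitution candidates from a synonym resource.
    return SYNONYMS.get(word, [])

def select_candidates(candidates):
    # Step 3: keep only candidates that fit; here a trivial filter
    # (the substitute must be attested in our frequency list).
    return [c for c in candidates if c in WORD_FREQ]

def rank_by_simplicity(candidates):
    # Step 4: rank surviving candidates by frequency (higher = simpler).
    return sorted(candidates, key=lambda c: -WORD_FREQ.get(c, 0))

def simplify(sentence):
    tokens = sentence.lower().split()
    complex_words = identify_complex(tokens)
    out = []
    for t in tokens:
        if t in complex_words:
            ranked = rank_by_simplicity(
                select_candidates(generate_candidates(t)))
            out.append(ranked[0] if ranked else t)
        else:
            out.append(t)
    return " ".join(out)

print(simplify("utilise the colossal lever"))  # -> "use the big lever"
```

Each stage is deliberately independent, which mirrors how the survey benchmarks approaches per step rather than end to end.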

Rapidly-Exploring Quotient-Space Trees: Motion Planning using Sequential Simplifications

Motion planning problems can be simplified by admissible projections of the configuration space onto sequences of lower-dimensional quotient-spaces, called sequential simplifications. To exploit sequential simplifications, we present the Quotient-space Rapidly-exploring Random Trees (QRRT) algorithm. QRRT takes as input a start configuration, a goal configuration, and a sequence of quotient-spaces. The algorithm grows trees on the quotient-spaces both sequentially and simultaneously to guarantee dense coverage. QRRT is shown (1) to be probabilistically complete, and (2) to reduce runtime by at least one order of magnitude. However, we show in experiments that the runtime varies substantially between different quotient-space sequences. To find out why, we perform an additional experiment showing that the narrower an environment, the more a quotient-space sequence can reduce runtime.
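The core idea of sequential simplification can be sketched with plain RRTs: grow a tree on a low-dimensional quotient space first, then use its vertices to bias sampling in the full space. This is a hypothetical illustration of the principle, not the authors' implementation; the projection (2D configurations onto the x-axis) and all parameters are invented for the example.

```python
import random

random.seed(0)

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def steer(a, b, eps):
    # Move from a toward b by at most eps.
    d = dist(a, b)
    if d <= eps:
        return b
    return tuple(x + eps * (y - x) / d for x, y in zip(a, b))

def rrt(start, goal, sample, eps=0.5, iters=2000):
    # Plain RRT: extend the nearest tree vertex toward each sample.
    tree = {start: None}
    for _ in range(iters):
        q = sample()
        near = min(tree, key=lambda v: dist(v, q))
        new = steer(near, q, eps)
        tree[new] = near
        if dist(new, goal) < eps:
            tree[goal] = new
            return tree
    return tree

# Quotient space: project 2D configurations onto the x-axis.
project = lambda q: (q[0],)
start2d, goal2d = (0.0, 0.0), (9.0, 9.0)

# 1) Solve the simplified 1D problem on the quotient space.
tree1d = rrt(project(start2d), project(goal2d),
             sample=lambda: (random.uniform(0, 10),))

# 2) Lift: bias full-space sampling toward the 1D tree's vertices.
verts = list(tree1d)
def biased_sample():
    x = random.choice(verts)[0]
    return (x, random.uniform(0, 10))

tree2d = rrt(start2d, goal2d, sample=biased_sample)
print(goal2d in tree2d)
```

In an open environment the bias buys little, but in a narrow environment the quotient-space solution concentrates samples in the feasible corridor, which is consistent with the paper's observation about narrow environments.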

Facebook's AI streamlines sentences while preserving meaning


Simplifying a text's grammar and structure is a skill most of us acquire in school, but AI typically has a tougher time of it, owing to a lack of linguistic knowledge. Scientists at Facebook AI Research and Inria are making progress with a simplification model dubbed ACCESS (AudienCe-CEntric Sentence Simplification), which they claim enables customization of text length, amount of paraphrasing, lexical complexity, syntactic complexity, and other parameters while preserving coherence. "Text simplification can be beneficial for people with cognitive disabilities, such as aphasia, dyslexia, and autism, but also for second language learners and people with low literacy," the researchers wrote in a preprint paper detailing their work. "The type of simplification needed for each of these audiences is different … Yet, research in text simplification has been mostly focused on developing models that generate a single generic simplification for a given source text with no possibility to adapt outputs for the needs of various target populations." To this end, the team tapped seq2seq, a general-purpose encoder-decoder framework that takes data and its context as inputs.
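Conditioning a seq2seq model on audience-specific parameters is commonly done by prepending control tokens to the source sentence; a sketch of that mechanism is below. The token names and ratio values here are illustrative, not the exact vocabulary used by ACCESS.

```python
# Hedged sketch: encode desired output attributes as special tokens
# prepended to the source before it enters a standard encoder-decoder.
# Token names (<NbChars_...>, <WordRank_...>) are assumptions for this example.

def add_control_tokens(source, length_ratio=0.8, lexical_complexity=0.7):
    controls = [
        f"<NbChars_{length_ratio}>",        # target character-length ratio
        f"<WordRank_{lexical_complexity}>", # target lexical-complexity ratio
    ]
    return " ".join(controls + [source])

print(add_control_tokens("The cat perambulated across the thoroughfare."))
```

At training time the tokens are computed from each source/target pair, so at inference time a user can dial them to fit a given audience without retraining the model.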

Unsupervised Lexical Simplification for Non-Native Speakers

AAAI Conferences

Lexical Simplification is the task of replacing complex words with simpler alternatives. We propose a novel, unsupervised approach to the task. It relies on two resources: a corpus of subtitles and a new type of word embeddings model that accounts for the ambiguity of words. We compare the performance of our approach with that of many others over a new evaluation dataset, which accounts for the simplification needs of 400 non-native English speakers. The experiments show that our approach outperforms state-of-the-art work in Lexical Simplification.
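The role ambiguity-aware embeddings play can be illustrated with a toy scorer: candidates are ranked by vector similarity to both the target word and its context, so substitutes belonging to the wrong sense score lower. The 3-dimensional vectors below are fabricated for the example; real systems learn embeddings from large corpora such as subtitles.

```python
import math

# Hypothetical toy vectors; real embeddings are learned, not hand-set.
VEC = {
    "perched": [0.9, 0.1, 0.0],
    "sat":     [0.8, 0.2, 0.1],
    "rested":  [0.7, 0.3, 0.2],
    "bank":    [0.1, 0.9, 0.3],
    "bird":    [0.85, 0.15, 0.05],
}

def cos(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def score(candidate, target, context):
    # Average similarity to the target word and to each context word,
    # so context disambiguates between competing candidates.
    words = [target] + context
    return sum(cos(VEC[candidate], VEC[w]) for w in words) / len(words)

candidates = ["sat", "rested", "bank"]
best = max(candidates, key=lambda c: score(c, "perched", ["bird"]))
print(best)  # -> "sat"
```

Here "bank" scores poorly because its vector is far from both "perched" and the context word "bird", which is the kind of sense filtering the abstract's ambiguity-aware model performs at scale.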

IRS boss Koskinen backs tax reform, calls system 'a mess'

FOX News

IRS Commissioner John Koskinen said Wednesday that the agency fully supports tax reform efforts, acknowledging the existing U.S. tax code is such a "mess" that even he struggles to fill out his federal returns. "It is difficult for the IRS commissioner to fill out his tax returns," Koskinen, who has a Yale law degree, told Fox News at a Washington, D.C., conference on improving government services. Koskinen, who became commissioner in 2013 under then-President Barack Obama, made clear the agency doesn't meddle in policy but said he started talking with the new administration about tax reform just weeks after then-candidate Donald Trump was elected, and the IRS continues to offer help. "One of the questions was, 'If major tax reform simplification was on the table, would there be anybody in the IRS opposed to that?' I was quick to respond, after having talked to employees, that nobody is more supportive of tax simplification than the IRS," he said. Koskinen said he started soliciting suggestions from employees last year about ways to simplify and streamline the code and called such reform "one of our highest priorities."