ginnie: New Artificial Intelligence Software Set to Help eCommerce Sellers


The launch comes after years of research and improvement in artificial intelligence (AI) and natural language generation (NLG). According to Statista, 2.14 billion people worldwide will buy goods and services online by 2021. In Canada, PayPal reports that businesses selling online are growing 28 times faster than those that are not. As more consumers turn to the web to research and purchase items, product content becomes increasingly important in driving this level of growth. According to Salsify, a product listing with more bullet points converts at a higher rate and outranks its top competitor 51% of the time.

Unifying Human and Statistical Evaluation for Natural Language Generation Artificial Intelligence

How can we measure whether a natural language generation system produces both high quality and diverse outputs? Human evaluation captures quality but not diversity, as it does not catch models that simply plagiarize from the training set. On the other hand, statistical evaluation (i.e., perplexity) captures diversity but not quality, as models that occasionally emit low quality samples would be insufficiently penalized. In this paper, we propose a unified framework which evaluates both diversity and quality, based on the optimal error rate of predicting whether a sentence is human- or machine-generated. We demonstrate that this error rate can be efficiently estimated by combining human and statistical evaluation, using an evaluation metric which we call HUSE. On summarization and chit-chat dialogue, we show that (i) HUSE detects diversity defects which fool pure human evaluation and that (ii) techniques such as annealing for improving quality actually decrease HUSE due to decreased diversity.
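The error-rate idea above lends itself to a compact sketch: represent each sentence as a 2-D point (human quality judgment, model log-probability) and estimate the optimal human-vs-machine classification error with a leave-one-out nearest-neighbour classifier, roughly in the spirit of the HUSE paper. The tiny pure-Python k-NN below is an illustrative stand-in, not the paper's exact estimator:

```python
import math

def leave_one_out_knn_error(points, labels, k=1):
    """Leave-one-out k-NN error: a proxy for how separable human and
    machine samples are in the (human score, log-prob) plane."""
    n = len(points)
    errors = 0
    for i in range(n):
        # Distances from point i to every other point, nearest first.
        dists = sorted(
            (math.dist(points[i], points[j]), labels[j])
            for j in range(n) if j != i
        )
        votes = [label for _, label in dists[:k]]
        pred = max(set(votes), key=votes.count)
        errors += (pred != labels[i])
    return errors / n

def huse_score(human_feats, model_feats, k=1):
    """HUSE-style score: twice the estimated error of telling human text
    from model text. Near 2*0.5 = 1 means indistinguishable; near 0 means
    trivially separable (low quality and/or low diversity)."""
    points = human_feats + model_feats
    labels = ["human"] * len(human_feats) + ["model"] * len(model_feats)
    return 2 * leave_one_out_knn_error(points, labels, k)
```

A generator whose samples sit far from human text in this plane is easy to classify, so its score collapses toward zero.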

Pretraining-Based Natural Language Generation for Text Summarization Artificial Intelligence

In this paper, we propose a novel pretraining-based encoder-decoder framework that generates the output sequence from the input sequence in two stages. For the encoder of our model, we encode the input sequence into context representations using BERT. The decoder operates in two stages: in the first stage, a Transformer-based decoder generates a draft output sequence; in the second stage, we mask each word of the draft sequence and feed it to BERT, then, combining the input sequence with the draft representation produced by BERT, a Transformer-based decoder predicts the refined word for each masked position. To the best of our knowledge, ours is the first method to apply BERT to text generation tasks. As a first step in this direction, we evaluate the proposed method on text summarization. Experimental results show that our model achieves new state-of-the-art results on both the CNN/Daily Mail and New York Times datasets.
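The draft-then-refine control flow described above can be sketched independently of any particular model. The `encode_fn`, `decode_step`, and `refine_step` callables below are hypothetical placeholders for BERT and the Transformer decoders; this shows only the two-stage loop, not the actual networks:

```python
MASK = "[MASK]"

def draft_decode(encode_fn, decode_step, input_tokens, max_len):
    """Stage 1: autoregressively generate a draft output sequence,
    one token at a time, conditioned on the encoded input."""
    ctx = encode_fn(input_tokens)
    draft = []
    for _ in range(max_len):
        token = decode_step(ctx, draft)
        if token == "</s>":  # end-of-sequence
            break
        draft.append(token)
    return draft

def refine(encode_fn, refine_step, input_tokens, draft):
    """Stage 2: mask each draft position in turn and re-predict it with
    bidirectional context, as the paper does with BERT."""
    ctx = encode_fn(input_tokens)
    refined = list(draft)
    for i in range(len(draft)):
        masked = refined[:i] + [MASK] + refined[i + 1:]
        refined[i] = refine_step(ctx, masked, i)
    return refined
```

The key point the sketch makes explicit: the stage-2 predictor sees tokens on *both* sides of the masked position, which the left-to-right stage-1 decoder cannot.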

r/MachineLearning - [D] What is the SOTA for Natural Language Generation in 2019?


Hierarchical Neural Story Generation had the most realistic paragraphs of any text generation paper I came across last summer when I was doing research on this topic. I also find it fun that they used a subreddit as their training set (r/WritingPrompts). The architecture is a mixture of self-attention layers and convolutional layers: generate a prompt, then generate a full story from it. I tried to extend their work to prompt → outline → story, but the results were meh (similar quality to not bothering with the outline step).

Is Augmented Analytics a Threat to Business Intelligence? Analytics Insight


Augmented analytics, a term coined by Gartner, describes the integration of natural language generation, text mining, natural language processing, and automated data processing capabilities into Business Intelligence (BI) systems.

Variational Cross-domain Natural Language Generation for Spoken Dialogue Systems Artificial Intelligence

Cross-domain natural language generation (NLG) is still a difficult task within spoken dialogue modelling. Given a semantic representation provided by the dialogue manager, the language generator should produce sentences that convey the desired information. Traditional template-based generators can produce sentences containing all the necessary information, but these sentences are not sufficiently diverse. With RNN-based models, the diversity of the generated sentences can be high; however, some information is lost in the process. In this work, we improve an RNN-based generator by modelling sentence-level latent information during generation using a conditional variational autoencoder architecture. We demonstrate that our model outperforms the original RNN-based generator while yielding highly diverse sentences. In addition, our model performs better when the training data is limited.
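The conditional-variational-autoencoder machinery the abstract leans on reduces to two small pieces: sampling the latent variable via the reparameterization trick, and a KL regularizer pulling the approximate posterior toward a standard normal. A minimal, framework-free sketch of just those two pieces (not the paper's full model):

```python
import math
import random

def reparameterize(mu, logvar, rng=random):
    """Sample z = mu + sigma * eps with eps ~ N(0, 1). Writing the sample
    this way keeps the stochastic step differentiable w.r.t. mu/logvar
    in a real implementation."""
    return [m + math.exp(0.5 * lv) * rng.gauss(0.0, 1.0)
            for m, lv in zip(mu, logvar)]

def kl_to_standard_normal(mu, logvar):
    """KL( N(mu, diag(sigma^2)) || N(0, I) ) summed over dimensions:
    the regularizer in the (C)VAE training objective,
    0.5 * sum(sigma^2 + mu^2 - 1 - log sigma^2)."""
    return sum(0.5 * (math.exp(lv) + m * m - 1.0 - lv)
               for m, lv in zip(mu, logvar))
```

When the posterior already equals the prior (`mu = 0`, `logvar = 0`), the KL term is exactly zero, which is the sanity check usually run first.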

Opinion Artificial Intelligence and its impact on human evolution


Humans have a tendency to make things smarter and smarter: the telephone became a smartphone, the wristwatch became a smartwatch. Another example is enabling a computer to ingest data, process it, produce an outcome, then learn from additional data and produce an improved outcome. In layman's terms, this is cognition, and the technologies that enable it are cognitive technologies such as Machine Learning, Natural Language Processing, and Natural Language Generation. This, in my view, is one of the most important changes that will affect the human race. I believe it will complement humans, not replace them.

Google pulls gender pronouns from Gmail Smart Compose to reduce bias


Gmail's Smart Compose can save you valuable time when you're firing off a quick message, but don't expect it to refer to people as "him" or "her" -- Google is playing it safe on that front. Product leaders have revealed to Reuters that Google removed gender pronouns from Smart Compose's phrase suggestions after realizing that the AI-guided feature could be biased. When a scientist talked about meeting an investor in January, for example, Gmail offered the follow-up "do you want to meet him" -- not considering the possibility that the investor could be a woman. The problem is a typical one with natural language generation systems like Google's: they're trained on huge volumes of historical data. As certain fields tend to be dominated by people of one gender, the AI can sometimes assume that a person belongs to that gender.
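Google's fix, as described, was to drop gendered pronouns from suggestions entirely rather than try to guess the right one. A minimal sketch of that kind of conservative filter (the function and word list are illustrative, not Google's implementation):

```python
GENDERED_PRONOUNS = {"he", "him", "his", "she", "her", "hers"}

def drop_gendered_suggestions(suggestions):
    """Discard any suggested completion containing a gendered pronoun,
    rather than risking a wrong assumption about the referent."""
    def is_safe(text):
        words = {w.strip(".,?!").lower() for w in text.split()}
        return words.isdisjoint(GENDERED_PRONOUNS)
    return [s for s in suggestions if is_safe(s)]
```

The trade-off is recall for safety: some perfectly fine suggestions are suppressed, but the system can no longer mis-gender anyone.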

Narrative Science Employs Natural Language Generation - Nanalyze


If you've spent much time on Nanalyze, you know that we're passionate about technology and believe we're living in the most exciting times in history. Our job is to keep you up to date on these changes in a variety of fields, so you can make informed financial decisions about where to invest or not -- and learn some pretty cool stuff along the way. We talk about the good, the bad and the ugly, no matter what. Then we came across Narrative Science and its natural language generation (NLG) platform Quill, which uses artificial intelligence technology to write everything from financial reports to sports news. We knew that English degree would be obsolete someday.

Workday plots People Analytics, efforts to gauge workforce performance


Workday outlined People Analytics, an application that will filter through human resources data and give executives an at-a-glance view of the workforce trends they need to act on. Using artificial intelligence, machine learning and analytics, Workday People Analytics will look for employee patterns within Workday Human Capital Management. People Analytics will aim to find connections, predict the most important issues to surface and explain workplace trends in a narrative powered by natural language generation. People Analytics, which was highlighted at the Workday Rising conference in Las Vegas, falls under a category Workday calls augmented analytics. The gist is that People Analytics will dynamically generate HR analytics to surface issues such as organizational composition, diversity, hiring, retention and attrition, talent gaps and performance.
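The "trends explained in a narrative" step is classic template-style NLG: take a metric, compare it to a baseline, and render a sentence. A toy sketch of what such a step might look like (the function, metric names and wording are hypothetical, not Workday's):

```python
def attrition_narrative(dept, rate, prior_rate):
    """Turn two HR metrics into a one-sentence narrative, the way a
    template-based NLG layer surfaces a trend for an executive."""
    if rate == prior_rate:
        return f"Attrition in {dept} is flat at {rate:.1f}%."
    direction = "up" if rate > prior_rate else "down"
    delta = abs(rate - prior_rate)
    return (f"Attrition in {dept} is {direction} "
            f"{delta:.1f} points to {rate:.1f}%.")
```

Production systems layer variation, aggregation and significance tests on top, but the metric-to-sentence mapping is the core move.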