

AI in Analytics: Powering the Future of Data Analytics - Dataconomy

#artificialintelligence

Augmented analytics, the combination of AI and analytics, is the latest innovation in data analytics. Thanks to AI, data analysis in organizations has evolved from hiring "unicorn" data scientists to using smart applications that provide actionable insights for decision-making in just a few clicks. To augment, by definition, means to make something greater in strength or value. Augmented analytics, also known as AI-driven analytics, helps identify hidden patterns in large data sets and uncovers trends and actionable insights. It leverages technologies such as analytics, machine learning, and Natural Language Generation to automate data management processes and assist with the hard parts of analytics. The capabilities of AI are poised to augment analytics activities and enable companies to internalize data-driven decision-making, while enabling everyone in the organization to work with data easily.


Women in Robotics Update: Ecem Tuglan, Tuong Anh Ens, Sravanthi Kanchi, Kajal Gada, Dimitra Gkatzia

Robohub

Welcome to the first of our Women in Robotics Spotlights, where we share stories from women who haven't yet been featured in our Annual Showcase but who are working on all sorts of interesting projects. We hope these stories inspire everyone to join us working in the field of robotics. And if you're a woman working in robotics, why not contribute your story too? "Making robots communicate with humans in natural language is a fascinating challenge. There is a lot going on during interactions between robots and humans. Humans make gestures, observe or interact with visible objects in the environment, and display emotions. What motivates me is equipping social robots with the ability to interact seamlessly, by recognizing a given situation and talking about it," says Dimitra Gkatzia, who specializes in Natural Language Generation for Human-Robot Interaction.


The Zenith of Natural Language Technologies: Conversational AI - insideBIGDATA

#artificialintelligence

Conversational AI is the layman's term for Natural Language Interaction (NLI), a subset of Natural Language Processing (NLP) that involves almost all natural language technologies. NLI synthesizes aspects of NLP, Natural Language Understanding (NLU), Natural Language Generation (NLG), and Natural Language Querying (NLQ) to facilitate the rapid, conversational exchanges of popular platforms such as Amazon Alexa. Although each of these natural language technologies is based on NLP, one can argue that the most vital to conversational AI is NLG, which produces linguistic summaries or explanations of what is often quantified data. When coupled with NLQ, this capacity enables users to swiftly ask (and receive answers to) questions, which forms the bulk of conversational AI. NLP's role is to accurately convert data (the questions asked) into text according to conventions for parts of speech and grammar.
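The NLU-NLQ-NLG composition described above can be sketched as a toy pipeline: NLU parses a question into an intent, NLQ turns that intent into a data lookup, and NLG verbalizes the quantified result. All function names, the pattern matching, and the sample data here are illustrative assumptions, not any vendor's actual API.

```python
# Toy sketch of an NLI pipeline (NLU -> NLQ -> NLG). Data is hypothetical.
SALES = {"2023": 1.2, "2024": 1.5}  # revenue in $M, made-up figures

def understand(utterance):
    # NLU: extract an intent and a year entity via a crude pattern
    if "revenue" in utterance.lower():
        year = next((t.strip("?.,") for t in utterance.split()
                     if t.strip("?.,").isdigit()), None)
        return {"intent": "get_revenue", "year": year}
    return {"intent": "unknown"}

def query(parsed):
    # NLQ: translate the parsed intent into a data lookup
    if parsed["intent"] == "get_revenue":
        return SALES.get(parsed["year"])
    return None

def generate(parsed, value):
    # NLG: verbalize the quantified answer as natural language
    if value is None:
        return "Sorry, I could not find that."
    return f"Revenue in {parsed['year']} was ${value}M."

def answer(utterance):
    parsed = understand(utterance)
    return generate(parsed, query(parsed))

print(answer("What was revenue in 2024?"))  # Revenue in 2024 was $1.5M.
```

A real system would replace each stage with a learned model; the point is only the division of labor among the components.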


A Checklist For Artificial Intelligence On Workstations - insideHPC

#artificialintelligence

Firms of all sizes are leveraging workstations as part of their artificial intelligence (AI) workflows. In the past, many firms relied on highly scaled servers in data centers or private/public cloud infrastructure to run their AI applications. However, the results of a recent survey commissioned by Dell and executed by Forrester, summarized in the white paper "A Checklist For Artificial Intelligence On Workstations," indicate that a quarter of firms are actually using workstations today to run core AI business applications and are experiencing the benefits that workstations can offer. Workstations can handle core AI functionality like pre-trained vertical solutions, image and video analysis, data science, and natural language generation. They do so for applications where data security is a priority but timelines are more flexible, or where cost is a greater consideration compared with running these workloads in data centers or in the public cloud. The survey asked artificial intelligence (AI) and workstation decision makers in the US to evaluate their organizations' current and future plans for using workstations to execute AI-related projects (mobile workstations excluded).


Python Tutorial For Beginners

#artificialintelligence

Get 80 free Python tutorials and start learning the Python programming language from basic to advanced. Python is an object-oriented programming language created by Guido van Rossum in 1989. It is ideally suited to rapid prototyping of complex applications. It has interfaces to many OS system calls and libraries and is extensible in C or C++. Large companies that use the Python programming language include NASA, Google, YouTube, BitTorrent, and others. Python programming is widely used in Artificial Intelligence, Natural Language Generation, neural networks, and other advanced fields of computer science. Python has a deep focus on code readability, and this class will teach you Python from the basics.
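As a taste of the object-oriented style and readability the tutorial emphasizes, here is a minimal class example; the class and names are purely illustrative.

```python
# A tiny class demonstrating Python's readable object-oriented syntax.
class Greeter:
    """Stores a name and produces a greeting."""

    def __init__(self, name):
        self.name = name

    def greet(self):
        return f"Hello, {self.name}!"

g = Greeter("world")
print(g.greet())  # Hello, world!
```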


Adapting a Language Model for Controlled Affective Text Generation

arXiv.org Artificial Intelligence

Humans use language not just to convey information but also to express their inner feelings and mental states. In this work, we adapt state-of-the-art language generation models to generate affective (emotional) text. We posit a model capable of generating affect-driven and topic-focused sentences without losing grammatical correctness as the affect intensity increases. We propose to incorporate emotion as a prior for a state-of-the-art probabilistic text generation model such as GPT-2. The model gives a user the flexibility to control the category and intensity of emotion as well as the topic of the generated text. Previous attempts at modelling fine-grained emotions lose grammatical correctness at extreme intensities, but our model is resilient to this and delivers robust results at all intensities. We conduct automated evaluations and human studies to test the performance of our model, and provide a detailed comparison of the results with other models. In all evaluations, our model outperforms existing affective text generation models.
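One simple way to realize the "emotion as a prior with controllable intensity" idea is to bias the decoder's token logits toward an emotion lexicon, scaled by an intensity knob. The sketch below is a toy illustration of that general technique, not the authors' actual model; the vocabulary, lexicon, logits, and scores are all made-up assumptions.

```python
import math

# Toy affect-conditioned decoding: shift each token's logit by
# beta * emotion_score(token), then renormalize with softmax.
VOCAB = ["the", "day", "was", "wonderful", "terrible", "fine"]
JOY_LEXICON = {"wonderful": 1.0, "fine": 0.4}  # emotion scores in [0, 1]

def affective_probs(logits, lexicon, beta):
    """Apply an affect prior of strength beta, return a distribution."""
    shifted = [z + beta * lexicon.get(tok, 0.0)
               for tok, z in zip(VOCAB, logits)]
    m = max(shifted)                       # subtract max for stability
    exps = [math.exp(z - m) for z in shifted]
    total = sum(exps)
    return [e / total for e in exps]

base_logits = [0.5, 0.2, 0.3, 1.0, 1.1, 0.8]  # hypothetical model outputs
for beta in (0.0, 2.0, 5.0):
    probs = affective_probs(base_logits, JOY_LEXICON, beta)
    print(beta, VOCAB[probs.index(max(probs))])
```

At beta = 0 the base model's preference ("terrible") wins; as beta grows, joy-bearing tokens dominate. The paper's contribution is precisely that such control need not degrade grammaticality at high intensity.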



Microsoft is granted exclusive rights to use OpenAI's GPT-3

#artificialintelligence

Microsoft and OpenAI's close relationship has taken another leap forward, with the former gaining exclusive GPT-3 access. GPT-3 has been the talk of the AI town in recent months. OpenAI's innovation can help to create convincing articles, and the company once deemed it too dangerous to release in a world where misinformation and fake news are already problematic. OpenAI never made GPT-3 publicly available, instead providing access to a limited number of trusted researchers. Microsoft announced today that it now has the exclusive rights to leverage GPT-3's "technical innovations to develop and deliver advanced AI solutions for our customers, as well as create new solutions that harness the amazing power of advanced natural language generation."


Explaining Creative Artifacts

arXiv.org Artificial Intelligence

Human creativity is often described as the mental process of combining associative elements into a new form, but emerging computational creativity algorithms may not operate in this manner. Here we develop an inverse problem formulation to deconstruct the products of combinatorial and compositional creativity into associative chains as a form of post-hoc interpretation that matches the human creative process. In particular, our formulation is structured as solving a traveling salesman problem through a knowledge graph of associative elements. We demonstrate our approach using an example in explaining culinary computational creativity where there is an explicit semantic structure, and two examples in language generation where we either extract explicit concepts that map to a knowledge graph or we consider distances in a word embedding space. We close by casting the length of an optimal traveling salesman path as a measure of novelty in creativity.
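The formulation above, finding the shortest associative chain through the elements of a creative artifact, can be illustrated with a brute-force open-path traveling salesman search over a tiny association graph. The ingredients and the pairwise distances below are hypothetical stand-ins for a real knowledge-graph metric.

```python
import itertools

# Made-up association distances between elements of a culinary artifact.
DIST = {
    frozenset({"chocolate", "chili"}): 4.0,
    frozenset({"chocolate", "vanilla"}): 1.0,
    frozenset({"chili", "vanilla"}): 3.0,
}

def path_length(path):
    """Total associative distance along an ordered chain of elements."""
    return sum(DIST[frozenset({a, b})] for a, b in zip(path, path[1:]))

def best_chain(elements):
    """Brute-force open-path TSP: the shortest chain visiting all elements."""
    return min(itertools.permutations(elements), key=path_length)

elements = ["chocolate", "chili", "vanilla"]
chain = best_chain(elements)
print(chain, path_length(chain))
```

In the paper's terms, the optimal chain is the post-hoc associative explanation of the artifact, and its length serves as a novelty score: artifacts whose elements are far apart in the knowledge graph require longer chains.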


Mark-Evaluate: Assessing Language Generation using Population Estimation Methods

arXiv.org Artificial Intelligence

We propose a family of metrics to assess language generation derived from population estimation methods widely used in ecology. More specifically, we use mark-recapture and maximum-likelihood methods that have been applied over the past several decades to estimate the size of closed populations in the wild. We propose three novel metrics: ME$_\text{Petersen}$ and ME$_\text{CAPTURE}$, which return a single-valued assessment, and ME$_\text{Schnabel}$, which returns a two-valued metric that assesses the evaluation set in terms of quality and diversity separately. In synthetic experiments, our family of methods is sensitive to drops in quality and diversity. Moreover, our methods show a higher correlation with human evaluation than existing metrics on several challenging tasks, namely unconditional language generation, machine translation, and text summarization.
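The mark-recapture idea underlying the Petersen-style metric can be sketched as follows: treat one sample of model outputs as the "marked" population, a second sample as the "recaptured" one, and estimate the size of the model's output support from the overlap via the classic Lincoln-Petersen estimator N = n1 * n2 / m. The sentence sets below are toy data, and this sketch is only the ecological estimator itself, not the paper's full ME$_\text{Petersen}$ procedure.

```python
# Lincoln-Petersen population estimate applied to two output samples.
def petersen(sample1, sample2):
    marked, recaptured = set(sample1), set(sample2)
    m = len(marked & recaptured)   # "recaptures": outputs seen in both samples
    if m == 0:
        return float("inf")        # no overlap: support appears unbounded
    return len(marked) * len(recaptured) / m

gen_a = {"the cat sat", "a dog ran", "rain fell", "sun rose"}
gen_b = {"the cat sat", "rain fell", "birds sang", "wind blew"}
print(petersen(gen_a, gen_b))  # 4 * 4 / 2 = 8.0
```

Intuitively, a large estimate signals a diverse generator (little overlap between independent samples), while a small one signals mode collapse; the paper refines this with maximum-likelihood variants such as the Schnabel method.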