Closing the Gender Data Gap in AI

#artificialintelligence

The first computer algorithm is said to have been written in the early 1840s, for the prototype of the Analytical Engine, by Ada Lovelace, a mathematician dubbed a "female genius" in her posthumous biography. As the field of computing developed over the century following Lovelace's death, the typing work involved in creating computer programs was seen as "women's work," a role viewed as akin to switchboard operator or secretary. Women wrote the software, while men made the hardware -- the latter seen, at the time, as the more prestigious of the two tasks. And, during the Space Race of the 1950s and '60s, three Black women, known as "human computers," broke gender and racial barriers to help NASA send its first astronauts into orbit.


Text Generation using GPT-J with Hugging Face 🤗 and Segmind

#artificialintelligence

Text generation is the task of automatically producing text with a machine learning system. A good text generation system can make it genuinely hard to distinguish machine-written text from text written by a human.
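
As a quick illustration, here is a minimal text-generation sketch using the Hugging Face transformers pipeline; the model id and generation settings are assumptions for illustration, and the Segmind deployment side of the workflow is not shown.

```python
# Minimal sketch: text generation with GPT-J via Hugging Face transformers.
# GPT-J 6B is large (~24 GB in fp32); on modest hardware, a smaller model
# such as "EleutherAI/gpt-neo-1.3B" can stand in for a quick test.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-j-6B")

prompt = "Machine-written text is getting harder to spot because"
outputs = generator(prompt, max_new_tokens=50, do_sample=True, temperature=0.8)
print(outputs[0]["generated_text"])
```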


What's new in Microsoft Azure's NLP AI services

#artificialintelligence

If you want to begin using machine learning in your applications, Microsoft offers several ways to jumpstart development. One key technology, Microsoft's Azure Cognitive Services, offers a set of managed machine learning services with pretrained models and REST API endpoints. These models cover most of the common use cases, from working with text and language to recognizing speech and images. Machine learning is still evolving, with new models being released and new hardware arriving to speed up inferencing, so Microsoft regularly updates its Cognitive Services. The latest major update, announced at Build 2022, brings a lot of changes to its tools for working with text, uniting three different services under one umbrella.
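
For a sense of what calling these services looks like from code, here is a minimal sketch using the azure-ai-textanalytics Python SDK for sentiment analysis; the endpoint and key are placeholders for your own Language resource, and exact service names may differ after the Build 2022 reshuffle.

```python
# Minimal sketch: sentiment analysis against an Azure Cognitive Services
# Language resource (pip install azure-ai-textanalytics). The endpoint and
# key below are placeholders, not real values.
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"
key = "<your-key>"

client = TextAnalyticsClient(endpoint=endpoint, credential=AzureKeyCredential(key))

documents = ["The new NLP features announced at Build look promising."]
for doc in client.analyze_sentiment(documents):
    if not doc.is_error:
        print(doc.sentiment, doc.confidence_scores)
```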


Machine Translation Evaluation with Cometinho

#artificialintelligence

The European Association for Machine Translation (EAMT) conference is a venue where MT researchers, users and translators gather to discuss the latest advances in the industry. It is really interesting to go there and see what is going on across Europe in terms of MT development and adoption. In this article, I want to share some ideas from this year's Best Paper Award winner. Its title is "Searching for COMETINHO: The Little Metric That Could", from the research lab of Unbabel, a company based in Lisbon, Portugal, that offers translation services using MT and human translators. You can find the online version of the paper in the ACL Anthology.
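
To give a flavour of how COMET-style metrics are used in practice, here is a minimal scoring sketch with the open-source unbabel-comet package; the checkpoint name below is an assumption for the lightweight COMETINHO model, so check the COMET documentation for the exact identifier.

```python
# Minimal sketch: scoring a translation with a COMET-style metric
# (pip install unbabel-comet). The checkpoint name is an assumption for the
# lightweight COMETINHO model described in the paper.
from comet import download_model, load_from_checkpoint

model_path = download_model("eamt22-cometinho-da")  # assumed checkpoint id
model = load_from_checkpoint(model_path)

data = [{
    "src": "O gato está em cima da mesa.",
    "mt": "The cat is on top of the table.",
    "ref": "The cat is on the table.",
}]
# Returns segment-level scores and a system-level score (format varies by version).
print(model.predict(data, batch_size=8, gpus=0))
```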


AI: The emerging Artificial General Intelligence debate

#artificialintelligence

Since Google's artificial intelligence (AI) subsidiary DeepMind published a paper a few weeks ago describing a generalist agent they call Gato (which can perform various tasks using the same trained model) and claimed that artificial general intelligence (AGI) can be achieved just via sheer scaling, a heated debate has ensued within the AI community. While it may seem somewhat academic, the reality is that if AGI is just around the corner, our society--including our laws, regulations, and economic models--is not ready for it. Indeed, thanks to the same trained model, generalist agent Gato is capable of playing Atari, captioning images, chatting, or stacking blocks with a real robot arm. It can also decide, based on its context, whether to output text, joint torques, button presses, or other tokens. As such, it does seem a much more versatile AI model than the popular GPT-3, DALL-E 2, PaLM, or Flamingo, which are becoming extremely good at very narrow, specific tasks, such as natural language writing, language understanding, or creating images from descriptions.


Three ideas from linguistics that everyone in AI should know

#artificialintelligence

Everybody knows that large language models like GPT-3 and LaMDA have made tremendous strides, at least in some respects, and powered past many benchmarks, and Cosmo recently described DALL-E, but most in the field also agree that something is still missing. "A growing body of evidence shows that state-of-the-art models learn to exploit spurious statistical patterns in datasets... instead of learning meaning in the flexible and generalizable way that humans do." Since then, the results on benchmarks have gotten better, but there's still something missing. Reference: Words and sentences don't exist in isolation. Language is about a connection between words (or sentences) and the world; the sequences of words that large language models utter lack connection to the external world.


Expressive Querying for Accelerating Visual Analytics

Communications of the ACM

In this work, we introduce the problem of visualization search and highlight two underlying challenges of search enumeration and visualization matching.


Language Models

Communications of the ACM

A transformer has strong language representation ability; a very large corpus contains rich language expressions (and such unlabeled data can be easily obtained); and training large-scale deep learning models has become more efficient. Therefore, pre-trained language models can effectively represent a language's lexical, syntactic, and semantic features. Pre-trained language models, such as BERT and the GPTs (GPT-1, GPT-2, and GPT-3), have become the core technologies of current NLP, and their applications have brought great success to the field. "Fine-tuned" BERT has outperformed humans in terms of accuracy on language-understanding tasks, such as reading comprehension.8,17 "Fine-tuned" GPT-3 has also reached an astonishing level of fluency in text-generation tasks.3
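
As a rough illustration of the "fine-tuning" step, here is a minimal sketch using the Hugging Face transformers and datasets libraries on a sentiment task; the dataset (SST-2) and hyperparameters are illustrative assumptions, not the setup behind the cited results.

```python
# Minimal sketch: fine-tuning BERT for sentence classification with Hugging Face.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

dataset = load_dataset("glue", "sst2")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True,
                     padding="max_length", max_length=128)

encoded = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

args = TrainingArguments(output_dir="bert-sst2",
                         per_device_train_batch_size=16,
                         num_train_epochs=1,
                         learning_rate=2e-5)

trainer = Trainer(model=model, args=args,
                  train_dataset=encoded["train"],
                  eval_dataset=encoded["validation"])
trainer.train()
```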


How to get started with machine learning and AI

#artificialintelligence

Back in the 1950s, in the earliest days of what we now call artificial intelligence, there was a debate over what to name the field. Herbert Simon, co-developer of both the Logic Theory Machine and the General Problem Solver, argued that the field should have the much more anodyne name of "complex information processing." This certainly doesn't inspire the awe that "artificial intelligence" does, nor does it convey the idea that machines can think like humans. However, "complex information processing" is a much better description of what artificial intelligence actually is: parsing complicated data sets and attempting to make inferences from the pile. Some modern examples of AI include speech recognition (in the form of virtual assistants like Siri or Alexa) and systems that determine what's in a photograph or recommend what to buy or watch next.
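
As a generic, minimal illustration of "making inferences from data" (not taken from the article), here is a tiny scikit-learn example that fits a classifier on a built-in dataset and checks it on held-out samples.

```python
# Minimal sketch: fit a classifier on scikit-learn's built-in iris dataset
# and evaluate it on a held-out split.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```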


Amazon demos Alexa reading a bedtime story in the voice of a boy's deceased grandma

ZDNet

Amazon's intelligent, voice-enabled assistant Alexa has become an integral part of everyday experiences. Alexa gets more than 1 billion requests per week, Amazon said Wednesday, and customers have access to more than 100,000 Alexa skills. Now, the technology giant is developing a new capability for Alexa so she can help you remember loved ones who have passed away: the ability to speak in other people's voices. On Wednesday at the re:MARS conference (Amazon's event for Machine Learning, Automation, Robotics, and Space), Amazon's Rohit Prasad briefly described the skill. He showed a short video of a boy speaking to an Amazon Echo speaker.