
Jury: Evaluating performance of NLG models

#artificialintelligence

Jury is an evaluation package for NLG systems. It allows using many metrics in one go, implements concurrency among evaluation metrics, and supports evaluating with multiple predictions. Jury uses the datasets package for metrics, and thus supports any metric that the datasets package offers. The default evaluation metrics are BLEU, METEOR, and ROUGE-L. As of today, 28 metrics are available in the datasets package; to see all supported metrics, see datasets/metrics.
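To illustrate the "many metrics in one go" idea, here is a minimal stdlib-only sketch, not Jury's actual API: the toy metrics below (unigram precision and recall) stand in for real metrics like BLEU or ROUGE-L, and the function names are hypothetical.

```python
# Minimal sketch: run several simple NLG metrics over a corpus in one pass.
# The toy metrics here are stand-ins for BLEU, METEOR, ROUGE-L, etc.

def unigram_precision(prediction, reference):
    # Fraction of predicted tokens that appear in the reference.
    pred_tokens = prediction.split()
    ref_tokens = set(reference.split())
    if not pred_tokens:
        return 0.0
    return sum(t in ref_tokens for t in pred_tokens) / len(pred_tokens)

def unigram_recall(prediction, reference):
    # Fraction of reference tokens that appear in the prediction.
    ref_tokens = reference.split()
    pred_tokens = set(prediction.split())
    if not ref_tokens:
        return 0.0
    return sum(t in pred_tokens for t in ref_tokens) / len(ref_tokens)

METRICS = {"precision": unigram_precision, "recall": unigram_recall}

def evaluate(predictions, references):
    # Average every registered metric over the corpus, all in one call.
    scores = {}
    for name, fn in METRICS.items():
        vals = [fn(p, r) for p, r in zip(predictions, references)]
        scores[name] = sum(vals) / len(vals)
    return scores

scores = evaluate(["the cat sat"], ["the cat sat down"])
```

A package like Jury adds, on top of this pattern, concurrency across metrics and support for multiple predictions per reference.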


Semantic Similarity Using Transformers

#artificialintelligence

Semantic Similarity, or Semantic Textual Similarity, is a task in the area of Natural Language Processing (NLP) that scores the relationship between texts or documents using a defined metric. Semantic Similarity has various applications, such as information retrieval, text summarization, and sentiment analysis. There have been many approaches to Semantic Similarity. The most straightforward and effective method now is to use a powerful model (e.g., a pretrained Transformer) to embed the texts and compare the resulting vectors. The similarity score indicates whether two texts have similar or different meanings.
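The comparison step is typically cosine similarity between embedding vectors. A minimal stdlib sketch follows; the toy vectors here are placeholders, since in practice they would come from a Transformer encoder (an assumption, not something this snippet specifies):

```python
import math

def cosine_similarity(u, v):
    # Score in [-1, 1]; values near 1 mean the vectors point the same way,
    # which for sentence embeddings is read as "similar meaning".
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    if norm_u == 0.0 or norm_v == 0.0:
        return 0.0
    return dot / (norm_u * norm_v)

# Toy vectors standing in for sentence embeddings.
score = cosine_similarity([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
```

Identical vectors score 1.0; orthogonal vectors score 0.0.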


As AI Becomes More Ever Capable, Will It End Up Helping, Or Hindering, The Hackers?

#artificialintelligence

Hacking events have increasingly been in the news this year, as a range of serious ransomware and supply chain hacks have wreaked havoc on businesses and infrastructure. The latest (as of July 2021) is a supply-chain ransomware attack against Miami-based software firm Kaseya, affecting 1,500 of its customers, with the hackers (threat actors) demanding $70 million in cryptocurrency to release the data. According to the World Economic Forum, cyber-attacks now stand side by side with climate change and natural disasters as one of the most pressing threats to humanity. No doubt ways will eventually be found to detect and pre-empt these latest styles of attack. The cybersecurity industry is defined by continual, if largely gradual, innovation: as new threats emerge, technology that protects against, detects, and responds to the attacks also emerges. This cat-and-mouse dynamic has been a fundamental trait of the industry to date: a permanently iterating relationship that supercharges the development of new technologies on both sides, where even a small edge over adversaries can pay dividends (or ransoms).


Machine learning applications need less data than has been assumed

#artificialintelligence

A combined team of researchers from the University of British Columbia and the University of Alberta has found that at least some machine learning applications can learn from far fewer examples than has been assumed. In their paper published in the journal Nature Machine Intelligence, the group describes testing they carried out with machine learning applications created to predict certain types of molecular structures. Machine learning can be used in a wide variety of applications--one of the most well-known is learning to spot people or objects in photographs. Such applications typically require huge amounts of data for training. In this new effort, the researchers have found that in some instances, machine learning applications do not need such huge amounts of data to be useful.


StreetLight Data Partnership Aims to Help Expand EV Chargers

#artificialintelligence

With electric vehicles slowly gaining momentum toward becoming the dominant form of transportation in the U.S., two startups have struck up a partnership to help cities and utilities figure out where to put more car chargers. StreetLight Data, which sells transportation data to local governments, will offer Volta Charging's PredictEV tool to its customers. The tool uses AI to generate suggestions about where electric charging infrastructure would be most useful -- an urban planning consideration that is becoming more important as more electric vehicles hit the streets. Today, electric vehicles make up only around 2 percent of new vehicles sold in the U.S., but that number is rising rapidly. In 2020, Pew Research found that the number of EVs sold in the country had more than tripled since 2016.


The Rise of the Transformers: Explaining the Tech Underlying GPT-3

#artificialintelligence

The capabilities of GPT-3 have led to a debate between those who believe that GPT-3 and its underlying architecture will enable Artificial General Intelligence (AGI) in the future and those (many from the school of logic and symbolic AI) who believe that without some form of logic there can be no AGI. The truth of the matter is that we don't know, as we don't really fully understand the human brain. With science and engineering we work on the basis of observation and testing. This section also addresses points raised by Esaú Flores. Gary Grossman, in an article entitled "Are we entering the AI Twilight Zone between AI and AGI?", observed that in February 2020, Geoffrey Hinton, the University of Toronto professor who is a pioneer of deep learning, noted: "There are one trillion synapses in a cubic centimeter of the brain. If there is such a thing as general AI, [the system] would probably require one trillion synapses." The human brain has a huge number of synapses. Each of its 10^11 (one hundred billion) neurons has on average 7,000 synaptic connections (synapses) to other neurons. It has been estimated that the brain of a three-year-old child has about 10^15 synapses (1 quadrillion).
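The adult synapse count implied by those figures can be checked with quick arithmetic (10^11 neurons times 7,000 synapses each):

```python
neurons = 10**11              # ~one hundred billion neurons in the human brain
synapses_per_neuron = 7_000   # average synaptic connections per neuron

total_synapses = neurons * synapses_per_neuron  # 7 * 10^14, i.e. 700 trillion

# The three-year-old estimate quoted in the text, for comparison.
child_synapses = 10**15
```

So the adult estimate (7 × 10^14) is roughly 70% of the quoted three-year-old figure, consistent with the known pruning of synapses during development.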


OpenAI releases Triton, a programming language for AI workload optimization

#artificialintelligence

All the sessions from Transform 2021 are available on-demand now. OpenAI today released Triton, an open source, Python-like programming language that enables researchers to write highly efficient GPU code for AI workloads. Triton makes it possible to reach peak hardware performance with relatively little effort, OpenAI claims, producing code on par with what an expert could achieve in as few as 25 lines. Deep neural networks have emerged as an important type of AI model, capable of achieving state-of-the-art performance across natural language processing, computer vision, and other domains. The strength of these models lies in their hierarchical structure, which generates a large amount of highly parallelizable work well-suited for multicore hardware like GPUs.
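To give a flavor of the language, here is a minimal vector-addition kernel in Triton's Python-like syntax, adapted from the style of Triton's introductory examples. This is a hedged sketch, not runnable as shown here: it requires the triton package and a CUDA-capable GPU, plus host-side launch code that allocates the tensors.

```python
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one BLOCK_SIZE-wide chunk of the vectors.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements  # guard lanes past the end of the data
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)
```

Blocking, masking, and memory coalescing are expressed at this level; Triton's compiler handles the lower-level GPU scheduling details that a hand-written CUDA kernel would have to manage explicitly.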


Artificial Intelligence in HR is Balancing Tech and Touch

#artificialintelligence

Recently, the pandemic has pushed digital transformation to the front of the line. While collaborative tools allowed us to work from home and maintain close contact with our co-workers, the next step is just around the corner, thanks to artificial intelligence and machine learning. In every element of the company, the pandemic is driving a move towards a hybrid work paradigm, changing people management and the way we work. Enterprises are on the verge of digital transformation, and the use of artificial intelligence in HR departments will accelerate this process. Digital transformation improves the customer experience while also unlocking new value.


Microsoft buys 'spend intelligence' vendor Suplari to bolster Dynamics 365

ZDNet

Microsoft has acquired Suplari, a Seattle-based vendor that provides "spend intelligence" information for managing supplier spending, for an undisclosed amount. Microsoft announced the deal on July 28. Microsoft plans to bring together the Suplari Spend Intelligence Cloud with Microsoft Dynamics 365, its ERP/CRM offering, which already includes a number of "insights" modules. Microsoft officials said Suplari helps companies transform data from sources like contracts, purchase orders, invoices, and expenses into actionable insights. From Microsoft's blog post on the Suplari acquisition: "Together with Dynamics 365, the Suplari Spend Intelligence Cloud will help customers maximize financial visibility by using AI to automate the analysis of current data and historical patterns from multiple data sources. It will also help customers enhance financial decision-making by predicting the best spend management actions moving forward."


Synthetic Data May Be The Solution to AI Privacy Concerns

#artificialintelligence

AI is hungry for data. Training and testing machine-learning tools to perform desired tasks consumes huge lakes of data. More data often means better AI. Yet gathering this data, especially data concerning people's behavior and transactions, can be risky. For example, in January of this year, the US FTC reached a consent order with a company called Everalbum, a developer of photography apps.