
The Next Generation Of Artificial Intelligence

#artificialintelligence

It has only been 8 years since the modern era of deep learning began at the 2012 ImageNet competition. Progress in the field since then has been breathtaking and relentless. If anything, this breakneck pace is only accelerating. Five years from now, the field of AI will look very different than it does today. Methods that are currently considered cutting-edge will have become outdated; methods that today are nascent or on the fringes will be mainstream.


Drug discovery with explainable artificial intelligence

#artificialintelligence

Deep learning bears promise for drug discovery, including advanced image analysis, prediction of molecular structure and function, and automated generation of innovative chemical entities with bespoke properties. Despite the growing number of successful prospective applications, the underlying mathematical models often remain elusive to interpretation by the human mind. There is a demand for ‘explainable’ deep learning methods to address the need for a new narrative of the machine language of the molecular sciences. This Review summarizes the most prominent algorithmic concepts of explainable artificial intelligence and forecasts future opportunities and potential applications, as well as several remaining challenges. We also hope it encourages additional efforts towards the development and acceptance of explainable artificial intelligence techniques. Drug discovery has recently profited greatly from the use of deep learning models. However, these models can be notoriously hard to interpret. In this Review, Jiménez-Luna and colleagues summarize recent approaches to using explainable artificial intelligence techniques in drug discovery.
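As a rough illustration of one family of methods such a review covers, the sketch below shows gradient-based feature attribution on a hypothetical PyTorch property-prediction model; the architecture, descriptor length and feature indices are illustrative assumptions, not taken from the Review.

```python
# A minimal sketch of gradient-based attribution, one common explainable-AI
# technique. The model is a hypothetical stand-in that maps a fixed-length
# molecular descriptor vector to an activity score.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 1))

x = torch.randn(1, 128, requires_grad=True)  # one molecule's descriptor vector
score = model(x).sum()                       # scalar prediction for this molecule
score.backward()                             # gradient of the score w.r.t. the input

# A large |gradient| suggests the prediction is sensitive to that descriptor,
# which is the crude "explanation" this family of methods provides.
attribution = x.grad.abs().squeeze()
print("Most influential descriptor indices:", attribution.topk(5).indices.tolist())
```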


Learning AI If You Suck at Math - Part Eight - The Musician in the Machine

#artificialintelligence

"Attention takes two sentences, turns them into a matrix where the words of one sentence form the columns, and the words of another sentence form the rows, and then it makes matches, identifying relevant context." Check out the graphic from the Attention is All You Need paper below. It's two sentences, in different languages (French and English), translated by a professional human translator. The attention mechanism can generate a heat map, showing what French words the model focused on to generate the translated English words in the output.


GPT-3: Intelligent A.I. or Vacant Programming?

#artificialintelligence

A recent article published in the Guardian caught the attention of internet users worldwide. Unlike ordinary works of journalism that go viral, however, this particular piece was not written by a human. In a style that is evocative and attention-grabbing, The Guardian aptly titled it: "A robot wrote this entire article. Are you scared yet, human?" The "robot" in question is GPT-3, or "Generative Pre-trained Transformer 3", OpenAI's third iteration of an autoregressive language model that uses deep learning to produce human-like text.
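For readers unfamiliar with the term, "autoregressive" simply means the model generates one token at a time, each conditioned on everything generated so far. The sketch below illustrates that loop with a stand-in model that returns random logits; it is in no way GPT-3 itself.

```python
# Minimal autoregressive sampling loop over a toy vocabulary.
import numpy as np

vocab = ["a", "robot", "wrote", "this", "article", "."]
rng = np.random.default_rng(42)

def next_token_logits(context):
    # Placeholder for a trained language model's forward pass.
    return rng.normal(size=len(vocab))

tokens = ["a", "robot"]
for _ in range(4):
    logits = next_token_logits(tokens)
    probs = np.exp(logits) / np.exp(logits).sum()  # softmax over the vocabulary
    tokens.append(rng.choice(vocab, p=probs))      # sample and append the next token

print(" ".join(tokens))
```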


Ten Research Challenge Areas in Data Science · Harvard Data Science Review

#artificialintelligence

To drive progress in the field of data science, we propose 10 challenge areas for the research community to pursue. Since data science is broad, with methods drawing from computer science, statistics, and other disciplines, and with applications appearing in all sectors, these challenge areas speak to the breadth of issues spanning science, technology, and society. We preface our enumeration with meta-questions about whether data science is a discipline. We then describe each of the 10 challenge areas. The goal of this article is to start a discussion on what could constitute a basis for a research agenda in data science, while recognizing that the field of data science is still evolving. Although data science builds on knowledge from computer science, engineering, mathematics, statistics, and other disciplines, data science is a unique field with many mysteries to unlock: fundamental scientific questions and pressing problems of societal importance.


Artificial Intelligence -- The Rise of Technological Era

#artificialintelligence

The dawn of the Artificial Intelligence (AI) era is upon us. The buzzwords Artificial Intelligence, Machine Learning (ML) and Deep Learning (DL) are used quite frequently these days. Let us examine each of these individually to really appreciate the concepts. For each, I will first list the more boring formal definition and then explain it more intuitively with analogies, so we can understand the concepts better. For today, let's start off with the big daddy of them all -- Artificial Intelligence (AI).


GPT-3 Artificial Intelligence Model for Mobile Applications - OpenXcell

#artificialintelligence

OpenAI's GPT-3 has been in the news since its launch last month, owing to its exciting features and its standing as the largest language model trained to date. OpenAI announced this deep-learning model for natural language processing with over 175 billion parameters, setting a new benchmark by surpassing previous high-performance results on NLP tasks. Generative Pre-trained Transformer 3 is the third generation of OpenAI's machine learning language models; it can interpret voice and text and answer a wide range of questions by analyzing data and returning accurate output. With exceptional language abilities, GPT-3 is pre-trained on a vast corpus of 45 TB of text containing more than 499 billion tokens, and has 175 billion parameters. GPT-3 is also expected to have a major future in mobile application development.
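As a sketch of how a mobile backend might query GPT-3 for question answering, assuming the OpenAI Python client roughly as documented around the beta launch (the engine name, parameters and response fields are assumptions, not details from the article):

```python
# Hypothetical GPT-3 completion request from the server-side component of a
# mobile app; requires beta API access and the `openai` Python package.
import openai

openai.api_key = "YOUR_API_KEY"  # issued through OpenAI's API programme

response = openai.Completion.create(
    engine="davinci",                  # GPT-3 base engine (illustrative choice)
    prompt="Q: What is GPT-3?\nA:",
    max_tokens=64,
    temperature=0.3,                   # lower temperature for more focused answers
)
print(response["choices"][0]["text"].strip())
```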


Getting Started in AI Research - KDnuggets

#artificialintelligence

Research in Artificial Intelligence (AI) is growing year after year, particularly in fields such as Deep Learning, Reinforcement Learning and Natural Language Processing (Figure 1). State-of-the-art AI research is usually carried out in top university research groups and research-focused companies such as DeepMind or OpenAI, but what if you would like to make your own contribution in your spare time? In this article, we are going to explore the different approaches you can take to stay up to date with the latest research and to contribute your own work. One of the main problems affecting the AI research field is the difficulty of efficiently reproducing the models and results claimed in some publications (the Reproducibility Challenge). In fact, many research articles published every year contain only an explanation of the topic and the model developed, with no source code to reproduce the results.
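One small, concrete step toward the reproducibility the article calls for is publishing code that pins its random seeds. A minimal sketch, assuming NumPy and PyTorch as the stack (that library choice is an assumption for illustration):

```python
# Fix the common random number generators so reruns produce identical numbers.
import random

import numpy as np
import torch

def set_seed(seed: int = 0) -> None:
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)

set_seed(0)
print(torch.rand(2), np.random.rand(2))  # identical output on every run
```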


Understanding Transformers, the Data Science Way

#artificialintelligence

Transformers have become the de facto standard for NLP tasks nowadays. While the Transformer architecture was introduced for NLP, Transformers are now being used in computer vision and to generate music as well. I am sure you have all heard about the GPT-3 Transformer and its applications. But all of that aside, they are as hard to understand as ever. It has taken me multiple readings of the Google research paper that first introduced Transformers, along with a great many blog posts, to really understand how a Transformer works. So, I thought of putting the whole idea down in as simple words as possible, with some very basic math and some puns, as I am a proponent of having some fun while learning. I will try to keep both the jargon and the technicality to a minimum, yet it is such a topic that I could only do so much. My goal is for the reader to understand even the goriest details of the Transformer by the end of this post. Also, this is officially my longest post, both in terms of the time taken to write it and its length. So, here goes -- this post will be a highly conversational one, and it is about "Decoding The Transformer".
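For readers who want a head start on those gory details, here is a minimal NumPy sketch of scaled dot-product attention, the core operation of the Transformer introduced in that Google research paper; the shapes and toy inputs are illustrative.

```python
# Scaled dot-product attention: weight the values V by how well each query in Q
# matches each key in K.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)  # softmax
    return weights @ V                             # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query positions, model dimension 8
K = rng.normal(size=(6, 8))   # 6 key/value positions
V = rng.normal(size=(6, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # -> (4, 8)
```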


Facebook PyText is an Open Source Framework for Rapid NLP Experimentation

#artificialintelligence

I recently started a new newsletter focused on AI education. TheSequence is a no-BS (meaning no hype, no news, etc.) AI-focused newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers and concepts. Natural language processing (NLP) has become the best-known discipline in the deep learning space in recent years. That popularity has brought with it an explosion of NLP tools and frameworks such as Google Cloud, Azure LUIS, AWS Lex or Watson Assistant, which enable the implementation of simple NLP applications without requiring any deep learning knowledge.