AstraZeneca is using PyTorch-powered algorithms to discover new drugs

#artificialintelligence

Since it launched in 2017, Facebook's machine-learning framework PyTorch has been put to good use, with applications ranging from powering Elon Musk's autonomous cars to driving robot-farming projects. Now pharmaceutical firm AstraZeneca has revealed how its in-house team of engineers is tapping PyTorch too, for equally important endeavors: to simplify and speed up drug discovery. Combining PyTorch with Microsoft Azure Machine Learning, AstraZeneca's technology can comb through massive amounts of data to gain new insights about the complex links between drugs, diseases, genes, proteins, and molecules. Those insights feed an algorithm that can, in turn, recommend a number of drug targets for a given disease for scientists to test in the lab. The method could allow for huge strides in a sector like drug discovery, which so far has relied on costly and time-consuming trial-and-error methods.
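The article does not publish AstraZeneca's model, but systems of this kind are commonly built as knowledge-graph link predictors. As a heavily simplified sketch, assuming a DistMult-style scoring function and an invented toy graph (all entity names, triples, and hyperparameters below are illustrative, not AstraZeneca's), the idea looks like this in PyTorch:

```python
# Hypothetical sketch: score candidate (drug, relation, disease) links in a
# tiny invented biomedical knowledge graph with a DistMult-style model.
import torch
import torch.nn as nn

entities = ["aspirin", "ibuprofen", "inflammation", "fever", "COX1"]
relations = ["treats", "targets"]
e_idx = {e: i for i, e in enumerate(entities)}
r_idx = {r: i for i, r in enumerate(relations)}

# Known facts as (head, relation, tail) triples -- purely illustrative.
triples = [("aspirin", "treats", "fever"),
           ("ibuprofen", "treats", "inflammation"),
           ("aspirin", "targets", "COX1")]

dim = 16
ent_emb = nn.Embedding(len(entities), dim)
rel_emb = nn.Embedding(len(relations), dim)

def score(h, r, t):
    """DistMult score: higher means the link is judged more plausible."""
    return (ent_emb(h) * rel_emb(r) * ent_emb(t)).sum(-1)

h = torch.tensor([e_idx[tr[0]] for tr in triples])
r = torch.tensor([r_idx[tr[1]] for tr in triples])
t = torch.tensor([e_idx[tr[2]] for tr in triples])

# One gradient step pushing known triples toward high scores
# (a real pipeline would also sample negative triples).
opt = torch.optim.Adam(list(ent_emb.parameters()) + list(rel_emb.parameters()), lr=0.1)
loss = -score(h, r, t).mean()
opt.zero_grad(); loss.backward(); opt.step()

# Rank candidate links for one drug -- the "recommended targets" step.
drug = torch.tensor([e_idx["aspirin"], e_idx["aspirin"]])
rel = torch.tensor([r_idx["treats"], r_idx["treats"]])
cands = torch.tensor([e_idx["inflammation"], e_idx["fever"]])
scores = score(drug, rel, cands)
```

At production scale the graph would hold millions of triples mined from literature and experimental data, but the ranking principle is the same: embed entities and relations, then score unseen links.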




Report: State of Artificial Intelligence in India - 2020

#artificialintelligence

Artificial Intelligence or AI is a field of Data Science that trains computers to learn from experience, adjust to inputs, and perform tasks at certain cognitive levels. Over the last few years, AI has emerged as a significant data science function and, by utilizing advanced algorithms and computing power, AI is transforming the functional, operational, and strategic landscape of various business domains. AI algorithms are designed to make decisions, often using real-time data. Using sensors, digital data, and even remote inputs, AI algorithms combine information from a variety of different sources, analyze the data instantly, and act on the insights derived from the data. Most AI technologies – from advanced recommendation engines to self-driving cars – rely on diverse deep learning models. By utilizing these complex models, AI professionals are able to train computers to accomplish specific tasks by recognizing patterns in the data. Analytics India Magazine (AIM), in association with Jigsaw Academy, has developed this study on the Artificial Intelligence market to understand the development of the AI market in India, covering the market in terms of Industry and Company Type. Moreover, the study delves into the market size of the different categories of AI and Analytics startups/boutique firms. As a part of the broad Data Science domain, the Artificial Intelligence technology function has so far been classified as an emerging technology segment. Moreover, the AI market in India has, till now, been dominated by the MNC Technology and the GIC or Captive firms. Domestic firms, Indian startups, and even international technology startups across various sectors have, so far, not made a significant investment, in terms of operations and scale, in the Indian AI market. Additionally, IT services and boutique AI & Analytics firms had not, till a couple of years ago, developed full-fledged AI offerings in India for their clients.


Deep learning on cell signaling networks establishes AI for single-cell biology

#artificialintelligence

Computer systems that emulate key aspects of human problem solving are commonly referred to as artificial intelligence (AI). This field has seen massive progress over recent years. Most notably, deep learning has enabled groundbreaking progress in areas such as self-driving cars, computers beating the best human players in strategy games (Go, chess), computer games, and poker, and initial applications in diagnostic medicine. Deep learning is based on artificial neural networks--networks of mathematical functions that are iteratively reorganized until they accurately map the data describing a given problem to its solution. In biology, deep learning has established itself as a powerful method to predict phenotypes (i.e., observable characteristics of cells or individuals) from genome data (for example, gene expression profiles).
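The phenotype-prediction setup described above can be sketched in a few lines. The following is a minimal illustration on synthetic data (the expression values, the phenotype rule, and all dimensions are invented), showing the "iteratively reorganized" training loop that maps gene-expression-like vectors to a phenotype label:

```python
# Illustrative sketch on synthetic data: a small neural network mapping
# fake gene-expression profiles to a fake binary phenotype.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_cells, n_genes = 64, 50
x = torch.randn(n_cells, n_genes)       # invented expression profiles
y = (x[:, 0] + x[:, 1] > 0).long()      # invented phenotype rule

model = nn.Sequential(nn.Linear(n_genes, 32), nn.ReLU(), nn.Linear(32, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# "Iteratively reorganized": repeated gradient updates to the network's
# weights until inputs map accurately to their labels.
for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

accuracy = (model(x).argmax(1) == y).float().mean().item()
```

Real single-cell studies use far larger networks, tens of thousands of genes, and held-out test data, but the learning principle is the same.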




GPT-3 Creative Fiction

#artificialintelligence

What if I told a story here, how would that story start?" Thus, the summarization prompt: "My second grader asked me what this passage means: …" When a given prompt isn't working and GPT-3 keeps pivoting into other modes of completion, that may mean that one hasn't constrained it enough by imitating a correct output, and one needs to go further; writing the first few words or sentence of the target output may be necessary.
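The tactic of writing the first few words of the target output can be sketched as plain prompt construction. The framing sentence and helper below are illustrative (the essay's actual prompts vary); only the "My second grader asked me..." opener comes from the text above:

```python
# Sketch of the prompting tactic: constrain the model's completion by
# prepending the opening words of the desired output. The rephrasing
# framing and helper name are illustrative, not a specific API.
def build_prompt(passage: str, opening_words: str) -> str:
    """Summarization prompt in the 'second grader' framing, ending with
    the start of the target output so the continuation is anchored."""
    return (
        'My second grader asked me what this passage means:\n\n'
        f'"{passage}"\n\n'
        'I rephrased it for him, in plain language a second grader '
        'can understand:\n\n'
        f'"{opening_words}'
    )

prompt = build_prompt(
    "Photosynthesis converts light energy into chemical energy.",
    "Plants use sunlight to",
)
```

The deliberately unclosed quote and partial sentence at the end leave the model no natural move except to finish the summary in the requested voice, which is exactly the constraint the passage describes.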


Explainable Artificial Intelligence: a Systematic Review

arXiv.org Artificial Intelligence

This has led to the development of a plethora of domain-dependent and context-specific methods for dealing with the interpretation of machine learning (ML) models and the formation of explanations for humans. Unfortunately, this trend is far from over, with an abundance of knowledge in the field that is scattered and needs organisation. The goal of this article is to systematically review research works in the field of XAI and to try to define some boundaries in the field. From several hundred research articles focused on the concept of explainability, about 350 have been considered for review by using the following search methodology. In the first phase, Google Scholar was queried to find papers related to "explainable artificial intelligence", "explainable machine learning" and "interpretable machine learning". Subsequently, the bibliographic section of these articles was thoroughly examined to retrieve further relevant scientific studies. The first noticeable thing, as shown in figure 2 (a), is the distribution of the publication dates of the selected research articles: sporadic in the 70s and 80s, receiving preliminary attention in the 90s, showing rising interest in the 2000s, and becoming a recognised body of knowledge after 2010. The first research concerned the development of an explanation-based system and its integration in a computer program designed to help doctors make diagnoses [3]. Some of the more recent papers focus on work devoted to the clustering of methods for explainability, motivating the need for organising the XAI literature [4, 5, 6].


Why the AI revolution now? Because of 6 key factors.

#artificialintelligence

About: Data-Driven Science (DDS) provides training for people building a career in Artificial Intelligence (AI). In recent years, AI has taken off and become a topic that frequently makes it into the news. But why is that, actually? AI research started in the mid-twentieth century, when mathematician Alan Turing asked the question "Can Machines Think?" in a famous 1950 paper. However, it was not until the 21st century that Artificial Intelligence shaped real-world applications that impact billions of people and most industries across the globe.


Top 100 Artificial Intelligence Companies 2020

#artificialintelligence

As artificial intelligence has become a growing force in business, today's top AI companies are leaders in this emerging technology. Often leveraging cloud computing, AI companies mix and match myriad technologies. Foremost among these is machine learning, but today's leading AI firms deploy technologies ranging from predictive analytics to business intelligence to data warehouse tools to deep learning. Entire industries are being reshaped by AI. RPA companies have completely shifted their platforms. AI in healthcare is changing patient care in numerous – and major – ways. AI companies are attracting massive investment from venture capital firms and giants like Microsoft and Google. Academic AI research is growing, as are AI job openings across a multitude of industries. All of this is documented in the AI Index, produced by Stanford University's Human-Centered AI Institute. Consulting giant Accenture believes AI has the potential to boost rates of profitability by an average of 38 percentage points and could lead to an economic boost of $14 trillion in additional gross value added (GVA) by 2035. In truth, artificial intelligence holds not just possibilities, but a plethora of risks. "It will have a huge economic impact but also change society, and it's hard to make strong predictions, but clearly job markets will be affected," said Yoshua Bengio, a professor at the University of Montreal and head of the Montreal Institute for Learning Algorithms. To keep up with the AI market, we have updated our list of top AI companies playing a key role in shaping the future of AI. We feature artificial intelligence companies that are commercially successful as well as those that have invested significantly in artificial intelligence. AI companies in the years ahead are forecast to see exponential growth in deep learning, machine learning, and natural language processing.


Julia Language in Machine Learning: Algorithms, Applications, and Open Issues

arXiv.org Machine Learning

Machine learning is driving development across many fields in science and engineering. A simple and efficient programming language could accelerate applications of machine learning in various fields. Currently, the programming languages most commonly used to develop machine learning algorithms include Python, MATLAB, and C/C++. However, none of these languages balances efficiency and simplicity well. The Julia language is a fast, easy-to-use, and open-source programming language that was originally designed for high-performance computing and balances efficiency and simplicity well. This paper summarizes the related research work and developments in the application of the Julia language in machine learning. It first surveys the popular machine learning algorithms that are developed in the Julia language. Then, it investigates applications of machine learning algorithms implemented in Julia. Finally, it discusses the open issues and potential future directions that arise in the use of the Julia language in machine learning.