Artificial intelligence and machine learning techniques are already proving effective in pharmaceutical research. Drug discovery, the process of identifying new candidate medications, is central to medicine, biotechnology, and pharmacology. According to the U.S. FDA, developing a new drug involves five steps: discovery and development, preclinical research, clinical research, FDA review, and FDA post-market safety monitoring. Because drug discovery demands enormous amounts of data and research, many pharmaceutical companies are embracing AI and machine learning to accelerate the process.
Online media is now awash with mentions of "big data", "machine learning", and "artificial intelligence", all said to be destined to revolutionize the pharmaceutical and biotech industries and the way drugs are discovered. These new technologies are believed to make drug discovery cheaper, faster, and more productive. First, let's briefly review some of the basic concepts at the heart of these technologies. The term "big data" is itself largely a marketing label. It describes the abstract notion of holding large volumes of data, obtained from various channels in multiple formats, that must be organized so it can be quickly accessed, searched, updated, and analyzed to yield useful information.
Drugs can work only if they stick to their target proteins in the body, and assessing that stickiness is a key hurdle in the drug discovery and screening process. A technique dubbed DeepBAR quickly calculates the binding affinities between drug candidates and their targets, yielding precise results in a fraction of the time required by previous state-of-the-art methods. Its developers say DeepBAR could one day quicken the pace of drug discovery and protein engineering.
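DeepBAR takes its name from the Bennett acceptance ratio (BAR), a classical statistical estimator of free-energy differences such as binding affinities. As a hedged illustration of the idea underlying that estimator (not of DeepBAR itself), the sketch below solves the BAR self-consistency condition on synthetic Gaussian "work" samples whose true free-energy difference is known; the sample sizes, noise level, and the value 2.0 kT are all illustrative assumptions.

```python
import numpy as np

def fermi(x):
    """Fermi function 1/(1+e^x), with the argument clipped to avoid overflow."""
    return 1.0 / (1.0 + np.exp(np.clip(x, -50.0, 50.0)))

def bar_delta_f(w_f, w_r, beta=1.0, tol=1e-9):
    """Solve the BAR self-consistency condition for the free-energy
    difference dF, assuming equal numbers of forward and reverse work
    samples:
        sum_i fermi(beta*(w_f[i] - dF)) == sum_j fermi(beta*(w_r[j] + dF))
    The left-hand sum grows and the right-hand sum shrinks as dF
    increases, so the root can be found by bisection."""
    def imbalance(df):
        return np.sum(fermi(beta * (w_f - df))) - np.sum(fermi(beta * (w_r + df)))
    lo, hi = -100.0, 100.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if imbalance(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Synthetic Gaussian work distributions consistent with the Crooks
# fluctuation theorem for a known free-energy difference of 2.0 kT.
rng = np.random.default_rng(0)
true_df, sigma, beta = 2.0, 1.0, 1.0
w_f = rng.normal(true_df + beta * sigma**2 / 2, sigma, 50_000)
w_r = rng.normal(-true_df + beta * sigma**2 / 2, sigma, 50_000)
estimate = bar_delta_f(w_f, w_r, beta=beta)  # close to 2.0
```

In practice the expensive part is generating the work samples from molecular simulations; DeepBAR's contribution, per the researchers, is shortcutting that sampling, not the estimator arithmetic shown here.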
There are many reasons that promising drugs wash out during pharmaceutical development, and one of them is cytochrome P450. A set of enzymes mostly produced in the liver, CYP450, as it is commonly called, is involved in breaking down chemicals and preventing them from building up to dangerous levels in the bloodstream. Many experimental drugs, it turns out, inhibit the production of CYP450, a vexing side effect that can render such a drug toxic in humans. Drug companies have long relied on conventional tools to try to predict whether a drug candidate will inhibit CYP450 in patients, such as by conducting chemical analyses in test tubes, looking at CYP450 interactions with better-understood drugs that have chemical similarities, and running tests on mice. But their predictions are wrong about a third of the time.
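Machine-learning approaches to this problem typically train a classifier on numeric molecular descriptors to predict whether a compound inhibits CYP450. The sketch below is a minimal, assumption-laden illustration of that setup: the "descriptors", the labeling rule, and the noise level are all invented synthetic stand-ins, and a plain logistic regression stands in for whatever model a real pipeline would use.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins for molecular descriptors (real pipelines would
# compute fingerprints or physicochemical properties per compound).
n = 1000
X = rng.normal(size=(n, 3))

# Invented ground truth: "inhibitors" score high on a weighted
# combination of descriptors, with some label noise near the boundary.
logits_true = 1.5 * X[:, 0] - 2.0 * X[:, 1] + 0.5 * X[:, 2]
y = (logits_true + rng.normal(scale=0.5, size=n) > 0).astype(float)

# Logistic regression fit by batch gradient descent.
w = np.zeros(3)
b = 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted inhibition probability
    w -= lr * (X.T @ (p - y) / n)
    b -= lr * np.mean(p - y)

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
accuracy = np.mean(pred == y)
```

On real compound data the accuracy depends entirely on how informative the descriptors are; the point of the sketch is only the shape of the task, a binary classifier mapping per-compound features to an inhibition label.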
Among the many applications of AI in the pharmaceutical industry, some are viewed as especially important and worth exploring in more depth. The first step in drug development is to understand the biological origin and mechanism of a disease, then to determine suitable targets through high-throughput technologies such as shRNA screening and deep sequencing, and finally to find relevant patterns across a large number of diverse data sources. This is an enormous undertaking and often a serious challenge for traditional methods. Unlike those methods, AI can systematically analyze existing literature and data in just a few seconds. This real-time "omics" database analysis can yield a more accurate understanding of pathological cells and molecular mechanisms, and it can be applied to complex conditions such as neurodegenerative diseases.