If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Drug interactions, including drug-drug interactions (DDIs) and drug-food constituent interactions (DFIs), can trigger unexpected pharmacological effects, including adverse drug events (ADEs), whose causal mechanisms are often unknown. Current prediction methods, however, either provide little detail beyond the likelihood of a DDI occurring or require detailed drug information that is often unavailable. To tackle this problem, Dr. Jae Yong Ryu, Assistant Professor Hyun Uk Kim, and Distinguished Professor Sang Yup Lee, all from the Department of Chemical and Biomolecular Engineering at Korea Advanced Institute of Science and Technology (KAIST), developed a computational framework, named DeepDDI, that accurately predicts 86 DDI types for a given drug pair. The research results were published online in Proceedings of the National Academy of Sciences of the United States of America (PNAS) on April 16, 2018, in a paper entitled "Deep learning improves prediction of drug-drug and drug-food interactions." DeepDDI takes the structural information and names of two drugs as input and predicts the relevant DDI types for that drug pair.
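The article does not describe DeepDDI's internals, but the input-output shape it reports (two drugs in, a choice among 86 DDI types out) can be sketched as a toy pair classifier. Everything below is an illustrative assumption rather than the authors' method: the fingerprint length, the hash-based stand-in for structure-derived features, and the untrained single-hidden-layer network.

```python
import numpy as np

rng = np.random.default_rng(0)

N_BITS = 256       # toy fingerprint length (real structural features are richer)
N_DDI_TYPES = 86   # number of DDI types DeepDDI predicts

def toy_fingerprint(drug_name: str) -> np.ndarray:
    """Stand-in for a structure-derived binary fingerprint.

    A real pipeline would compute this from the molecule's structure;
    here we just derive a reproducible bit vector from the drug name.
    """
    seed = abs(hash(drug_name)) % (2 ** 32)
    local = np.random.default_rng(seed)
    return (local.random(N_BITS) > 0.5).astype(float)

# Randomly initialised network weights (untrained -- illustration only).
W1 = rng.normal(scale=0.1, size=(2 * N_BITS, 128))
b1 = np.zeros(128)
W2 = rng.normal(scale=0.1, size=(128, N_DDI_TYPES))
b2 = np.zeros(N_DDI_TYPES)

def predict_ddi_type(drug_a: str, drug_b: str) -> np.ndarray:
    """Return a probability distribution over the 86 DDI types for a drug pair."""
    x = np.concatenate([toy_fingerprint(drug_a), toy_fingerprint(drug_b)])
    h = np.tanh(x @ W1 + b1)          # hidden layer
    logits = h @ W2 + b2
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()            # softmax over DDI types

probs = predict_ddi_type("aspirin", "warfarin")
print(probs.shape)  # (86,)
```

The key design point the sketch captures is that the drug pair is encoded jointly (here by concatenation) so a single network can score all interaction types at once.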
It's important not to overstate the security risks of the Amazon Echo and other so-called smart speakers. They're useful, fun, and generally have well thought-out privacy protections. Then again, putting a mic in your home naturally invites questions over whether it can be used for eavesdropping--which is why researchers at the security firm Checkmarx started fiddling with Alexa, to see if they could turn it into a spy device. They did, with no intensive meddling required. The attack, which Amazon has since fixed, follows the intended flow of using and programming an Echo.
The artificial intelligence (AI) industry will be worth $1.2 trillion in 2018, with customer experience solutions creating the most business value. On Wednesday, Gartner released estimates on the projected value of AI over the course of this year. According to the research firm, the global enterprise value derived from AI will total $1.2 trillion this year, a 70 percent increase from 2017. AI-derived business value is projected to reach up to $3.9 trillion by 2022. "AI promises to be the most disruptive class of technologies during the next 10 years due to advances in computational power, volume, velocity and variety of data, as well as advances in deep neural networks (DNNs)," said John-David Lovelock, research vice president at Gartner.
AI has evolved significantly since the days of Siri's languid chats with John Malkovich. It not only eases the burden of compiling and parsing information but is beginning to offer new and unique insights. Marketers, in particular, are finding that AI is more applicable to their business challenges, with user-friendly, pragmatic products seeing successful adoption. And, as AI and machine learning marketing capabilities are giving B2B marketers more lift and scale across their programs, this exciting tech is poised to become standard best practice as fast as it becomes available. Here are four key ways that AI platforms and technologies are changing the game for B2B marketers right now.
Imagine a drone pilot remotely flying a quadrotor, using an onboard camera to navigate and land. Unfamiliar flight dynamics, terrain, and network latency can make this system challenging for a human to control. One approach to this problem is to train an autonomous agent to perform tasks like patrolling and mapping without human intervention. This strategy works well when the task is clearly specified and the agent can observe all the information it needs to succeed. Unfortunately, many real-world applications that involve human users do not satisfy these conditions: the user's intent is often private information that the agent cannot directly access, and the task may be too complicated for the user to precisely define.
In 1938, when there were about one-tenth as many cars on U.S. roadways as there are today, a brilliant psychologist and a pragmatic engineer joined forces to write one of the most influential works ever published on driving. The recent killing of a pedestrian by a self-driving car in Arizona highlights how their work is still relevant today – especially regarding the safety of automated and autonomous vehicles. James Gibson, the psychologist in question, and Laurence Crooks, his engineering partner, evaluated a driver's control of a vehicle in two ways. The first was to measure what they called the "minimum stopping zone," the distance it would take the car to stop after the driver slammed on the brakes. The second was to examine the driver's psychological perception of the possible hazards around the vehicle, which they called the "field of safe travel."
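At its core, the "minimum stopping zone" is simple kinematics: the distance covered during the driver's reaction time plus the braking distance v²/(2μg). A rough sketch follows; the default reaction time and friction coefficient are illustrative assumptions, not Gibson and Crooks' measured values.

```python
G = 9.81  # gravitational acceleration, m/s^2

def minimum_stopping_distance(speed_mps: float,
                              reaction_time_s: float = 1.5,
                              friction: float = 0.7) -> float:
    """Reaction distance plus braking distance v^2 / (2*mu*g), in metres."""
    reaction_dist = speed_mps * reaction_time_s
    braking_dist = speed_mps ** 2 / (2 * friction * G)
    return reaction_dist + braking_dist

# At 100 km/h (~27.8 m/s) the zone is roughly 98 m. Note the braking term
# grows with the square of speed: halving the speed quarters it.
print(round(minimum_stopping_distance(27.8), 1))
```

The quadratic braking term is why the safe-travel field shrinks so sharply at highway speeds, for human and automated drivers alike.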
There has been a fair amount of hype surrounding conversational interfaces and conversational UI lately -- and for good reason. Conversational interfaces allow users to interact with machines using natural language, whether spoken or written, akin to human-to-human conversation. With every passing day, we get closer to bridging the gap of natural communication between human and machine. The process of filling out a form or sharing your information on most websites has become less tedious and more informal thanks to the wide adoption of conversational UI. Before that adoption took hold, however, we had to trudge through a barrage of web forms in order to access content or seek out information.
So, what kind of things can this 'smart' tech do? Just a few months ago, an AI machine managed to complete a university-level math exam 12 times faster than it normally takes the average human. How? Through machine learning, where computers learn and adapt through experience without being explicitly programmed. Furthermore, Facebook made headlines in 2017 when their chatbots created their own language. Some fake news stories claimed the engineers pulled the plug in a panic after the bots were getting too smart.
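"Learning through experience without being explicitly programmed" can be made concrete with a tiny example: instead of hard-coding the rule y = 2x, a single parameter is fitted from example pairs by gradient descent. The data, learning rate, and loop count below are arbitrary choices for illustration.

```python
# Training examples (the "experience"): pairs generated by y = 2x.
data = [(x, 2.0 * x) for x in range(1, 6)]

w = 0.0    # the parameter to be learned; nowhere do we write "2" into the model
lr = 0.01  # learning rate

for _ in range(200):
    for x, y in data:
        pred = w * x
        w -= lr * (pred - y) * x  # gradient of squared error with respect to w

print(round(w, 3))  # converges to 2.0 -- recovered from the data alone
```

The point is that the program encodes only the *procedure* for adjusting `w` from errors; the rule itself emerges from the examples.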
There is no denying that we are entering a new phase in how technology helps to connect brands with consumers. We are moving from visual interfaces to text and voice, speaking to our devices the way we speak to our friends, family, and colleagues. As messenger apps secure the lion's share of our connected time, customers will increasingly expect and want to interact with brands in these channels. Facebook, Viber, and Kik all want to connect you with brands on their platforms, much as (most of) China does on WeChat, which has 800 million users and millions of services available within the app.
Despite recent blows to the footwear industry, there's ample reason to be hopeful about cutting-edge technology. "We're living in amazing times, where new innovations coming out of research in these fields are capturing people's imaginations. And those innovations will be realities in the not-too-distant future," said Acharya. "Today, we're creating machine-learning algorithms to help retailers incorporate all these sources of data and solve new business challenges." He pointed to returns forecasting -- and how data science can help -- as one example.