interaction


How to create the 'perfect' AI-driven bot - Content Loop

#artificialintelligence

Our experience creating our travel assistant app, Mezi, illustrates key principles of AI regarding the ongoing role of human involvement and how to draw the dividing line between valued assistance and unwelcome intrusion. Remember, the goal isn't for your AI to be perfect; it's for your AI to be part of a perfect service. Both to surface the full value proposition of your service and to deepen that sense of trust, it's important for your AI to anticipate a customer's needs and offer useful help without having to be asked. Snehal Shinde is the CTO and cofounder of Mezi, the personal shopping and travel service.


Do You Have AI Trust Issues?

#artificialintelligence

The truth is that increasing numbers of consumers each year are putting their trust in machines that gather data about them and then use it to understand their behavior and preferences far more quickly, easily, and often more accurately than human beings ever could. Indeed, a recent study conducted by my employer among 6,000 global consumers found that 88 percent of respondents wanted to be told whether they were interacting with a real person or a machine when they received customer service help. Instead, the process will involve humans working alongside AI to generate insights, predict behavior, and make recommendations that will have a significant impact on the way that we as consumers interact with organizations. Only when that question has been answered and that obstacle overcome will organizations be able to harness the power of humans and machines working together to provide truly seamless customer experiences.


The Reality of the Artificial Intelligence Revolution - Talend

#artificialintelligence

The capability to teach machines to interpret data is the key underpinning technology that will enable more complex forms of AI that can respond autonomously to input. There have been obvious failings of this technology (the unfiltered Microsoft chatbot "Tay" being a prime example), but the application of properly developed and managed artificial systems for interaction is an important step along the route to full AI. There are so many repetitive tasks involved in any scientific or research project that using robotic intelligence engines to manage and perfect the more complex and repetitive tasks would greatly increase the speed at which new breakthroughs could be uncovered. Learning from repetition, improving patterns, and developing new processes are well within reach of current AI models, and will strengthen in the coming years as advances in artificial intelligence – specifically machine learning and neural networks – continue.


AI in Contact Centers

Communications of the ACM

Indeed, rather than simply being used to replace contact center workers, artificial intelligence (AI)-based technologies, including machine learning, natural language processing, and even sentiment analysis, are being strategically deployed to improve the overall customer experience by providing functionality that would be too time-consuming or expensive to deliver manually. The company uses AI "bots" to handle routine tasks, utilizing natural language processing to interpret what customers are asking, search the business knowledge base for an answer, and then translate that raw data into an intelligent, human-friendly response. Burgess highlights the power of machine learning and natural language processing to quickly process front-end requests, which often make up a significant share of call volume and labor costs. The sheer number of possible phrases, words, and interactions does make it more challenging to automate the customer service experience, though with machine learning technology that can review thousands or millions of interactions, organizations can tailor responses based on what the system learns.
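The interpret, search, and respond loop the article describes can be sketched in a few lines. Below is a minimal, illustrative retrieval bot that scores a customer's question against a toy knowledge base with TF-IDF similarity and escalates to a human when confidence is low; the knowledge base entries, phrasing, and threshold are assumptions for illustration, not any vendor's production system.

```python
# Minimal sketch of a contact-center bot pipeline: interpret the question,
# search a small knowledge base, and return a human-friendly reply.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical knowledge base entries, used only for illustration.
KNOWLEDGE_BASE = {
    "How do I reset my password?": "You can reset your password from Settings, then Security.",
    "What are your support hours?": "Our support team is available 24/7 via chat.",
    "How do I cancel my subscription?": "Subscriptions can be cancelled under Billing, then Plans.",
}

questions = list(KNOWLEDGE_BASE.keys())
vectorizer = TfidfVectorizer().fit(questions)
question_vectors = vectorizer.transform(questions)

def answer(customer_message: str, min_confidence: float = 0.2) -> str:
    """Route a routine request to the knowledge base, or escalate to a human agent."""
    scores = cosine_similarity(vectorizer.transform([customer_message]), question_vectors)[0]
    best = scores.argmax()
    if scores[best] < min_confidence:
        return "Let me connect you with an agent who can help with that."
    return f"Happy to help! {KNOWLEDGE_BASE[questions[best]]}"

print(answer("I forgot my password, what do I do?"))
```

A production system would replace the TF-IDF matcher with trained intent models and a much larger knowledge base, but the overall routing logic stays the same.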


More human than human: banking's AI future is all in the voice » Banking Technology

#artificialintelligence

Aditya Challa of IMImobile takes a look at some current and future applications of artificial intelligence (AI) for customer experience in the financial and banking sector, as well as the challenges that still need to be overcome. That might be a little optimistic, but it's clear that with the continued advancements in machine learning, the capacity of AI-powered technologies to understand complex problems, gather relevant information, decide outcomes, and intelligently interact with customers is growing rapidly. Not only will banks be able to quite literally develop their own "tone of voice", but they will also be able to use AI to gather customer data and give highly personalised advice on products such as loans and mortgages, reducing costly human interaction while delivering an improved customer experience. IMImobile's Alexa skill for one of the high-street banks uses machine learning and natural language processing, which means it can maintain context and add personality.
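To give a feel for how an Alexa skill keeps context across turns, here is a minimal, hypothetical handler that remembers which account the customer asked about by storing it in session attributes. The intent and slot names ("CheckBalanceIntent", "Account") are invented for illustration; IMImobile's actual skill is not public.

```python
# Minimal sketch of an Alexa skill backend (AWS Lambda style) that keeps
# conversational context via session attributes between turns.
def lambda_handler(event, context):
    session_attrs = event.get("session", {}).get("attributes", {}) or {}
    request = event["request"]

    if request["type"] == "IntentRequest" and request["intent"]["name"] == "CheckBalanceIntent":
        slots = request["intent"].get("slots", {})
        account = (slots.get("Account", {}) or {}).get("value") or session_attrs.get("account")
        if account is None:
            speech = "Which account would you like to check, current or savings?"
        else:
            # Remember the account so follow-up questions don't need to repeat it.
            session_attrs["account"] = account
            speech = f"Here is the latest balance on your {account} account."
    else:
        speech = "Welcome back. You can ask me about your balance or recent payments."

    return {
        "version": "1.0",
        "sessionAttributes": session_attrs,  # echoed back by Alexa on the next turn
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": False,
        },
    }
```

The "personality" the article mentions lives mostly in the speech strings and prompts; the context lives in the attributes Alexa hands back on every subsequent request in the session.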


What AI-enhanced health care could look like in 5 years

#artificialintelligence

Meeker's analysis highlighted the opportunities surrounding digital innovation in patient empowerment and health management, improvements to clinical pathways and protocols, and preventative health. These technologies can be leveraged to capture the massive volume of data that describes a patient's past and present state, project potential future states, analyze that data in real time, assist in reasoning about the best way to achieve patient and physician goals, and provide both patient and physician constant real-time support. But new technologies, including computer vision, natural language understanding, and machine learning, offer interface capabilities that let individuals easily "show" or "talk to" their AI virtual assistant about what they're doing. With the algorithmic advances of deep learning, symbolic AI, computer vision, natural language processing, and machine learning, combined with a smartphone -- which puts the power of a supercomputer in everyone's pocket and is always with you, always on, and always connected -- we are at the beginning of the AI era.


How artificial intelligence is defining the future of brick-and-mortar shopping

#artificialintelligence

An example is Macy's On Call, a mobile web application that uses Watson's cognitive computing power and location-based software to help shoppers get information while they're navigating the company's stores. As with all machine learning–based platforms, every customer interaction makes On Call smarter. Everseen, a software company founded in Cork, Ireland, uses computer vision and AI algorithms to analyze video feeds from retailers' staffed registers and self-checkout stations and automatically detect when a product is left unscanned. Amazon Go uses computer vision, machine learning algorithms, and IoT sensors to understand customers' interactions across the store.
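As a rough illustration of the Watson side of such an integration, the sketch below calls the IBM Watson Assistant V1 message API through the ibm-watson Python SDK. The API key, workspace ID, service URL, and sample question are placeholders; Macy's actual On Call plumbing is not public and certainly involves far more than this.

```python
# Minimal sketch: send one shopper question to a Watson Assistant V1 workspace
# and print the text responses it returns.
from ibm_watson import AssistantV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

authenticator = IAMAuthenticator("YOUR_API_KEY")  # placeholder credential
assistant = AssistantV1(version="2019-02-28", authenticator=authenticator)
assistant.set_service_url("https://api.us-south.assistant.watson.cloud.ibm.com")  # placeholder region URL

response = assistant.message(
    workspace_id="YOUR_WORKSPACE_ID",              # placeholder workspace
    input={"text": "Where can I find women's shoes?"},
).get_result()

# V1 returns the assistant's replies as a list of strings.
for line in response["output"]["text"]:
    print(line)
```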


Why chatbots need a big push from deep learning

#artificialintelligence

At the forefront of this resurgence are the fields of conversational interaction (personal assistants or chatbots), computer vision, and autonomous navigation, which, thanks to advances in hardware, data availability, and revolutionary machine learning techniques, have enjoyed tremendous progress within the span of just a few years. Despite being the foundation of remarkably powerful systems, this approach lacks the flexibility needed to handle the kind of information required to carry a realistic conversation. This level of automatic and extremely broad reasoning still eludes AI researchers and is perhaps one of the last frontiers in the way of truly intelligent and autonomous AI agents, conversational bots included. To get there, we must merge multiple disciplines, including deep learning, statistics, and others, building technology that blends consumer preferences, environment, and language into one piece of intelligent, flexible software.


Robots Learn to Speak Body Language

IEEE Spectrum Robotics Channel

Called OpenPose, the system can track body movement, including hands and face, in real time. It uses computer vision and machine learning to process video frames, and can even keep track of multiple people simultaneously. It no longer requires the camera-lined dome previously used to determine body poses, making the technology mobile and accessible. By perceiving and interpreting physical gestures, a robot may even learn to read emotions from body language.
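For readers who want to experiment, a stripped-down version of this kind of pose estimation can be run with OpenCV's DNN module and the publicly released OpenPose body model. The sketch below assumes the Caffe model files and a test image are available locally, and it omits the hand and face keypoints, multi-person parsing, and real-time tracking that OpenPose itself provides.

```python
# Minimal sketch: locate body keypoints in one image using OpenCV's DNN module
# and the published OpenPose COCO body model.
import cv2

PROTO = "pose_deploy_linevec.prototxt"    # assumed local copy of the model definition
WEIGHTS = "pose_iter_440000.caffemodel"   # assumed local copy of the trained weights
N_KEYPOINTS = 18                          # COCO body model keypoint count

net = cv2.dnn.readNetFromCaffe(PROTO, WEIGHTS)
frame = cv2.imread("person.jpg")          # assumed test image
h, w = frame.shape[:2]

blob = cv2.dnn.blobFromImage(frame, 1.0 / 255, (368, 368), (0, 0, 0), swapRB=False, crop=False)
net.setInput(blob)
heatmaps = net.forward()                  # one confidence heatmap per keypoint

points = []
for i in range(N_KEYPOINTS):
    heatmap = heatmaps[0, i, :, :]
    _, confidence, _, (x, y) = cv2.minMaxLoc(heatmap)
    # Scale the heatmap peak back to image coordinates; drop low-confidence joints.
    if confidence > 0.1:
        points.append((int(w * x / heatmap.shape[1]), int(h * y / heatmap.shape[0])))

print(f"Detected {len(points)} body keypoints")
```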


Trust is the New Clarity - BCGDV Pollen

#artificialintelligence

Mario Gamper, Vice President of Strategic Design, discusses why AI assistants might force Experience Design into a rhetorical turn. Here's a thing I'm curious about: what impact will the advent of AI assistants have on the design teams that create digital user experiences and interfaces? In order to create user experiences that are simple, meaningful, and joyful, designers have developed an expansive toolkit for guiding users through complexity and towards good decisions. As a result, voice-based UI will fundamentally shift the gravitational core of interaction from logic to emotion and credibility.