Artificial intelligence and cognitive computing: the what, why and where
Instead of talking about artificial intelligence (AI), many describe the current wave of AI innovation and acceleration with admittedly somewhat differently positioned terms and concepts such as cognitive computing, or they focus on real-life applications of artificial intelligence that often carry labels such as "smart", "intelligent", "predictive" and, indeed, "cognitive", depending on the exact application and vendor.

Despite these terminology issues, artificial intelligence is essential in, among other areas, information management, medicine and healthcare, data analysis, digital transformation, security (cybersecurity and beyond), various consumer applications, scientific advances, FinTech, predictive systems and much more.

There are many reasons why vendors hesitate to use the term artificial intelligence for their AI solutions and innovations and often package them under another label (trust us, we've been there). Artificial intelligence carries a somewhat negative connotation, both in general perception and in the perception of technology leaders and firms. One major issue is that artificial intelligence, which is really a broad concept covering many technologies and realities, has become something we all talk about and seem obliged to have an opinion or feeling about, thanks in part to popular culture. Hollywood loves AI (or rather superintelligence, which is not the same thing).