Artificial intelligence and cognitive computing: the what, why and where


Although artificial intelligence (as a set of technologies, not in the sense of mimicking human intelligence) has been around for a long time in many forms, it is a term that many people, certainly IT vendors, prefer not to use much anymore – yet artificial intelligence is real, for your business too.

Instead of talking about artificial intelligence (AI), many describe the current wave of AI innovation and acceleration with – admittedly somewhat differently positioned – terms and concepts such as cognitive computing, or they focus on real-life applications of artificial intelligence that often carry labels such as "smart" (omnipresent in anything Internet of Things), "intelligent", "predictive" and, indeed, "cognitive", depending on the exact application and vendor.

Despite these terminology issues, artificial intelligence is essential for and in, among others, information management, medicine/healthcare, data analysis, digital transformation, security (cybersecurity and beyond), various consumer applications, scientific advances, FinTech, predictive systems and much more.

There are many reasons why vendors hesitate to use the term artificial intelligence for AI solutions and innovations and often package them under another name (trust us, we've been there). Artificial intelligence (AI) carries a somewhat negative connotation, both in general perception and in the perception of technology leaders and firms. One major issue is that artificial intelligence – really a broad concept and reality covering many technologies – has become something we all talk about and seem obliged to have an opinion or feeling about, thanks to, among others, popular culture.
