
Cognitive collaboration

#artificialintelligence

Although artificial intelligence (AI) has experienced a number of "springs" and "winters" in its roughly 60-year history, it is safe to expect the current AI spring to be both lasting and fertile. Applications that seemed like science fiction a decade ago are becoming science fact at a pace that has surprised even many experts. The stage for the current AI revival was set in 2011 with the televised triumph of the IBM Watson computer system over former Jeopardy! champions. This watershed moment has been followed by a rapid-fire succession of striking breakthroughs, many involving the machine learning technique known as deep learning. Computer algorithms now beat humans at games of skill, master video games with no prior instruction, 3D-print original paintings in the style of Rembrandt, grade student papers, cook meals, vacuum floors, and drive cars.1 All of this has created considerable uncertainty about our future relationship with machines, the prospect of technological unemployment, and even the very fate of humanity. Regarding the latter topic, Elon Musk has described AI as "our biggest existential threat." Stephen Hawking warned that "The development of full artificial intelligence could spell the end of the human race." In his widely discussed book Superintelligence, the philosopher Nick Bostrom discusses the possibility of a kind of technological "singularity" at which point the general cognitive abilities of computers exceed those of humans.2 Discussions of these issues are often muddied by the tacit assumption that, because computers outperform humans at various circumscribed tasks, they will soon be able to "outthink" us more generally. Continual rapid growth in computing power and AI breakthroughs notwithstanding, this premise is far from obvious.


Technology Becomes Us: The Age of Human-Computer Interaction

#artificialintelligence

The science and application of HCI continues to evolve as more practitioners, scientists, researchers, and developers seek to understand what it means for human society, how it can be leveraged to address social and economic issues, and how it can help people think and work smarter. It has become such a relevant area of study that universities now offer programs and degrees in HCI, deepening the understanding and application of this segment of computer science. At the same time, new jobs are emerging for these graduates, whether advancing research areas like those being studied at IBM or building applications for artificial intelligence and connected devices that bring us further into the world of computers.


Transforming the User Experience through Artificial Intelligence

#artificialintelligence

How do we design AI systems that augment and empower people? This course connects human-computer interaction (HCI), the multidisciplinary field that focuses on designing interactions between humans and technology, to the transformative effects of AI so that you can better serve your customers and drive your company forward. You'll learn to make informed decisions about how and when your company should design smart, AI-based products that change how you work, learn, and communicate. Note: This course was previously titled "Enhance User Experience with Human-Computer Interaction." All of the content has remained the same.


Keep Your Thinking Machines, I'll Take Human-Computer Interaction Any Day

#artificialintelligence

It's hard to discuss the role of Artificial Intelligence (AI) in the workplace until you decide what AI is. Some academics tell us -- using lots of words -- that AI means computers that think, learn, and ultimately act like humans, while others hold that maximizing the interaction between computers and their human users -- as in Human-Computer Interaction, or HCI -- is the closest thing to AI we are likely to see. Until you decide which side of that dichotomy you fall on, it's difficult to understand how, or whether, AI contributes to business, and if it does, how to improve its contributions. Our fascination with the idea of machines that think like humans goes back millennia, but only recently has it appeared to be within reach. And while AI research has uncovered some amazing technological capabilities, it has also run into a quagmire in its attempts to 1) agree on just what human intelligence is, and 2) determine the extent to which technology might be capable of replicating it.