"Interacting with a computer requires adopting some metaphor to guide our actions and expectations. Most human-computer interfaces can be classified according to two dominant metaphors: (1) agent and (2) environment. Interactions based on an agent metaphor treat the computer as an intermediary that responds to user requests. In the environment metaphor, a model of the task domain is presented for the user to interact with directly. The term agent has come to refer to the automation of aspects of human-computer interaction (HCI), such as anticipating commands or autonomously performing actions. Norman's 1984 model of HCI is introduced as reference to organize and evaluate research in human-agent interaction (HAI). A wide variety of heterogeneous research involving HAI is shown to reflect automation of one of the stages of action or evaluation within Norman's model. Improvements in HAI are expected to result from a more heterogeneous use of methods that target multiple stages simultaneously."
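As a concrete illustration (not from the paper itself), Norman's stages of action can be sketched as a simple enumeration, with hypothetical agent systems tagged by the single stage each one automates; the gaps in coverage motivate the abstract's call for methods that target multiple stages at once. The stage names follow Norman's action/evaluation cycle; the example agents are invented for illustration.

```python
from enum import Enum, auto

class Stage(Enum):
    """Norman's stages of action, from goal formation through evaluation."""
    FORM_GOAL = auto()
    FORM_INTENTION = auto()
    SPECIFY_ACTION = auto()
    EXECUTE_ACTION = auto()
    PERCEIVE_STATE = auto()
    INTERPRET_STATE = auto()
    EVALUATE_OUTCOME = auto()

# Hypothetical examples: each agent automates exactly one stage.
agents = {
    "command auto-completion": Stage.SPECIFY_ACTION,
    "macro recorder/replayer": Stage.EXECUTE_ACTION,
    "anomaly-alerting monitor": Stage.INTERPRET_STATE,
}

def stages_covered(catalog):
    """Return the set of stages a catalog of agents automates.
    Uncovered stages suggest where multi-stage HAI designs could help."""
    return set(catalog.values())

print(sorted(s.name for s in stages_covered(agents)))
```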
Technology is too often viewed through a utopian or alarmist lens, and it's worth noting that Licklider's work spanned the sublime and sobering alike. He presaged much of the Internet revolution, and his research led to such breakthroughs as the modern graphical user interface. ... It's easy to argue that life and death decisions should never be left to machines, but Licklider's vision was much broader, recognizing technology as an enabler for many human capacities.
See text of Licklider's article Man-Computer Symbiosis.
Man-computer symbiosis is an expected development in cooperative interaction between men and electronic computers. It will involve very close coupling between the human and the electronic members of the partnership. The main aims are 1) to let computers facilitate formulative thinking as they now facilitate the solution of formulated problems, and 2) to enable men and computers to cooperate in making decisions and controlling complex situations without inflexible dependence on predetermined programs. In the anticipated symbiotic partnership, men will set the goals, formulate the hypotheses, determine the criteria, and perform the evaluations. Computing machines will do the routinizable work that must be done to prepare the way for insights and decisions in technical and scientific thinking. Preliminary analyses indicate that the symbiotic partnership will perform intellectual operations much more effectively than man alone can perform them. Prerequisites for the achievement of the effective, cooperative association include developments in computer time sharing, in memory components, in memory organization, in programming languages, and in input and output equipment.
See also: ACM Digital Library citation
Visit the project page at the Intelligent Information Laboratory (InfoLab) at Northwestern University and read their entire collection of related papers.
The HEART approach (Happiness, Engagement, Adoption, Retention, and Task success) was put into practice several years ago. The guidelines make a great deal of sense as a way of staying on top of user preferences.
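As a hedged sketch of how the five HEART categories are typically operationalized, each category can be mapped to a goal, an observable signal, and a concrete metric. The feature, goals, and metric names below are invented for illustration, not from the source.

```python
from dataclasses import dataclass

@dataclass
class HeartMetric:
    category: str   # one of the five HEART categories
    goal: str       # what the team wants for users
    signal: str     # observable behavior or attitude
    metric: str     # how the signal is quantified

CATEGORIES = ("Happiness", "Engagement", "Adoption", "Retention", "Task success")

# Hypothetical example for a photo-sharing feature.
metrics = [
    HeartMetric("Engagement", "Users share photos regularly",
                "photo uploads per user", "mean uploads per user per week"),
    HeartMetric("Task success", "Uploading is easy",
                "completed vs. abandoned uploads", "upload completion rate"),
]

# Sanity check: every metric belongs to a valid HEART category.
assert all(m.category in CATEGORIES for m in metrics)
```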
I expected "gesture control" to be immediately intuitive. But as I slip on the MYO--a flexible band that fits around my forearm--a cursor on a laptop in front of me begins somersaulting wildly across the screen, tracking my erratic arm movements.
Meet Zoe: a digital talking head which can express human emotions on demand with "unprecedented realism" and could herald a new era of human-computer interaction.
Ketan Banjara’s living room isn't cluttered with remote controls. To shush the music, he simply holds a finger up to his lips. And when he gets up from the couch and leaves the room, his TV screen pauses automatically. Banjara is a cofounder of PredictGaze, a startup that combines gaze detection, gesture recognition, and facial-feature recognition to create more natural ways to control everything from your TV to your car.
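The combination of detectors described above can be sketched as a small rule engine that fuses sensor observations into media-control commands; every detector output and rule here is a hypothetical illustration, not PredictGaze's actual design.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    viewer_present: bool   # from face/presence detection
    finger_on_lips: bool   # from gesture recognition
    gaze_on_screen: bool   # from gaze detection

def decide(obs, playing=True):
    """Map fused sensor observations to a media-control command."""
    if obs.finger_on_lips:
        return "mute"                          # shush gesture silences the music
    if not obs.viewer_present and playing:
        return "pause"                         # viewer left the room
    if obs.viewer_present and not playing:
        return "resume"                        # viewer came back
    return "no-op"

# Viewer leaves the couch while the TV is playing:
print(decide(Observation(viewer_present=False, finger_on_lips=False,
                         gaze_on_screen=False)))  # prints "pause"
```

A real system would debounce these transitions over time (a viewer glancing away for a second should not pause playback), but the rule structure stays the same.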