Modeling Design Process

AI Magazine

This article discusses building a computable design process model, which is a prerequisite for realizing intelligent computer-aided design systems. First, we introduce general design theory, from which a descriptive model of design processes is derived. In this model, the concept of metamodels plays a crucial role in describing the evolutionary nature of design. Second, we show a cognitive design process model obtained by observing design processes using a protocol analysis method. We then discuss a computable model that can explain most parts of the cognitive model and also interpret the descriptive model. In the computable model, a design process is regarded as an iterative logical process realized by abduction, deduction, and circumscription. We implemented a design simulator that can trace design processes in which design specifications and design solutions are gradually revised as the design proceeds.
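As a rough illustration of how such an iterative abduction-deduction-circumscription loop might be organized, here is a minimal, hypothetical sketch in Python; the toy knowledge base, candidate names, and revision rules are invented for the example and are not taken from the paper or its design simulator.

```python
# Hypothetical sketch of an iterative design loop in the spirit of the
# abduction / deduction / circumscription cycle described above.
# All rules, candidates, and property names are invented for illustration.

def abduce(spec, knowledge):
    """Abduction: propose candidate solutions that could satisfy the current spec."""
    return [cand for cand, provides in knowledge.items() if spec <= provides]

def deduce(candidate, knowledge):
    """Deduction: derive the properties that follow from choosing this candidate."""
    return knowledge[candidate]

def circumscribe(derived, assumptions):
    """Circumscription: keep default assumptions only while nothing derived contradicts them."""
    return {a for a in assumptions if ("not " + a) not in derived}

def design(spec, knowledge, assumptions, max_iters=10):
    for _ in range(max_iters):
        candidates = abduce(spec, knowledge)
        if not candidates:
            # No solution for the current spec: relax it and try again.
            spec = set(list(spec)[:-1])
            continue
        solution = candidates[0]
        derived = deduce(solution, knowledge)
        surviving = circumscribe(derived, assumptions)
        if surviving != assumptions:
            # A default assumption was defeated: revise the spec and iterate.
            assumptions = surviving
            spec = spec | {p for p in derived if p.startswith("not ")}
            continue
        return solution, spec
    return None, spec

# Toy knowledge base: each candidate maps to the properties it provides.
knowledge = {
    "bolted_joint": {"joins_parts", "detachable"},
    "welded_joint": {"joins_parts", "rigid", "not detachable"},
}
print(design({"joins_parts"}, knowledge, {"detachable"}))
```

In this sketch, both the specification and the working set of default assumptions can change between iterations, which mirrors the abstract's point that specifications and solutions are revised gradually as the design proceeds.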


Multimodal Cognitive Architecture: Making Perception More Central to Intelligent Behavior

AAAI Conferences

I propose that the notion of cognitive state be broadened from the current predicate-symbolic, Language-of-Thought framework to a multimodal one, in which perceptual and kinesthetic modalities participate in thinking. In contrast to the roles assigned to perception and motor activities as modules external to central cognition in the currently dominant theories in AI and cognitive science, in the proposed approach central cognition incorporates parts of the perceptual machinery. I motivate and describe the proposal schematically, and describe the implementation of a bimodal version in which a diagrammatic representation component is added to the cognitive state. The proposal explains our rich multimodal internal experience and can be a key step in the realization of embodied agents. The proposed multimodal cognitive state can significantly enhance the agent's problem solving. (Note: memory, as well as the information retrieved from memory and from perception, is represented in predicate-symbolic form.)
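To make the bimodal idea concrete, the following is a speculative sketch of what such a cognitive state might look like as a data structure: a predicate-symbolic component alongside a diagrammatic component that answers spatial queries by inspection rather than deduction. The class and field names are illustrative only and are not from the paper.

```python
from dataclasses import dataclass, field

# Speculative sketch of a bimodal cognitive state: predicate-symbolic facts
# plus a diagrammatic component, as suggested in the abstract above.
# All names here are illustrative, not taken from the paper.

@dataclass
class DiagramComponent:
    """Diagrammatic part of the state: objects placed on a 2-D plane."""
    objects: dict = field(default_factory=dict)  # name -> (x, y)

    def left_of(self, a: str, b: str) -> bool:
        # The spatial relation is read off the diagram rather than deduced
        # from predicate-symbolic axioms.
        return self.objects[a][0] < self.objects[b][0]

@dataclass
class CognitiveState:
    """Bimodal state: predicate-symbolic facts plus a diagram."""
    symbolic: set = field(default_factory=set)   # e.g. {("on", "block_a", "table")}
    diagram: DiagramComponent = field(default_factory=DiagramComponent)

# Example: part of an answer comes from symbols, part from perception-like
# inspection of the diagram.
state = CognitiveState()
state.symbolic.add(("on", "block_a", "table"))
state.diagram.objects.update({"block_a": (0.0, 1.0), "block_b": (3.0, 1.0)})
print(("on", "block_a", "table") in state.symbolic)   # symbolic lookup
print(state.diagram.left_of("block_a", "block_b"))    # diagrammatic inference
```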


Building emotionally aware cars on the path to full autonomy

#artificialintelligence

Recent innovations around the autonomous car have shaken up the automotive industry. Manufacturers and their suppliers are all accelerating their work on the cars of the future, both conventional human-operated cars and driverless or semi-autonomous vehicles. Beyond questions of autonomy, these cars of the future are also undergoing a fundamental shift in human-machine interaction. Consumers today crave more relational and conversational interactions with devices, as evidenced by the popularity of chatbots and virtual assistants like Siri and Alexa, and the automotive industry has taken notice. As such, next-generation cars are emerging as advanced artificial intelligence (AI) systems that will power an entirely new automotive experience, one in which the car becomes a conversational interface between the driver, the passengers, the vehicle itself and its controls, all connected to the IoT and to the mobile devices we use.


Meet the Woman Pioneering Work To Make AI Emotionally Intelligent

#artificialintelligence

Humans are already forming relationships with their artificial intelligence (AI) assistants, so we should make that technology as emotionally aware as possible by teaching it to respond to our feelings. That is the premise of Rana el Kaliouby, cofounder and CEO of Affectiva, an MIT spinout company that sells emotion recognition technology based on her computer science PhD, which she spent building the first ever computer that could recognise emotions. The machine-learning-based software uses a camera or webcam to identify parts of the human face (the eyebrows, the corners of the eyes, and so on), classify expressions and map them onto emotions such as joy, disgust, surprise and anger in real time. "We are getting lots of interest around chatbots, self-driving cars, anything with a conversational interface. If it's interfacing with a human, it needs social and emotional skills."
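A highly simplified sketch of that kind of pipeline (facial regions to expression features to an emotion label) might look like the following; the feature names, thresholds, and mapping rules are invented placeholders and do not reflect Affectiva's actual models.

```python
# Illustrative sketch of an expression-to-emotion pipeline:
# facial landmarks -> coarse expression features -> emotion label.
# Features, thresholds, and rules are invented placeholders and do not
# reflect Affectiva's actual models.

def extract_features(face_landmarks):
    """Turn facial landmark positions into coarse expression features."""
    return {
        "mouth_corners_up": face_landmarks["mouth_corner_y"] < face_landmarks["mouth_center_y"],
        "brows_raised": face_landmarks["brow_y"] < face_landmarks["brow_rest_y"],
        "brows_lowered": face_landmarks["brow_y"] > face_landmarks["brow_rest_y"],
    }

def classify_emotion(features):
    """Map expression features to a coarse emotion label with simple rules."""
    if features["mouth_corners_up"]:
        return "joy"
    if features["brows_raised"]:
        return "surprise"
    if features["brows_lowered"]:
        return "anger"
    return "neutral"

# Toy frame: landmark y-coordinates (smaller values are higher in the image).
frame = {"mouth_corner_y": 0.62, "mouth_center_y": 0.65,
         "brow_y": 0.30, "brow_rest_y": 0.30}
print(classify_emotion(extract_features(frame)))  # -> "joy"
```

A production system would of course replace these hand-written rules with learned classifiers over video frames, but the staged structure (detect facial regions, measure expressions, map to emotions) matches the description given in the article.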