This article discusses building a computable design process model, which is a prerequisite for realizing intelligent computer-aided design systems. First, we introduce general design theory, from which a descriptive model of design processes is derived. In this model, the concept of metamodels plays a crucial role in describing the evolutionary nature of design. Second, we show a cognitive design process model obtained by observing design processes using a protocol analysis method. We then discuss a computable model that can explain most parts of the cognitive model and also interpret the descriptive model. In the computable model, a design process is regarded as an iterative logical process realized by abduction, deduction, and circumscription. We implemented a design simulator that can trace design processes in which design specifications and design solutions are gradually revised as the design proceeds.
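The iterative loop described above can be sketched in code. The following is a minimal toy illustration (all names and data are hypothetical, not from the paper): abduction proposes a candidate solution whose known properties cover the specification, deduction derives the candidate's consequences, and a circumscription-style revision retracts the assumption that led to a contradiction before the loop tries again.

```python
# Toy sketch of a design loop driven by abduction, deduction, and
# circumscription-style revision. All names and data are illustrative.

def abduce(spec, knowledge):
    """Abduction: propose a candidate whose known properties cover the spec."""
    for candidate, properties in knowledge.items():
        if spec <= properties:
            return candidate
    return None

def deduce(candidate, knowledge):
    """Deduction: derive the properties that follow from a candidate."""
    return knowledge.get(candidate, set())

def design(spec, knowledge, forbidden):
    """Iterate until a candidate's derived properties conflict with nothing.

    On a conflict, retract the assumption (knowledge entry) that produced
    the conflicting candidate -- a crude stand-in for circumscription's
    nonmonotonic revision -- and propose again.
    """
    spec = set(spec)
    knowledge = dict(knowledge)
    while True:
        candidate = abduce(spec, knowledge)
        if candidate is None:
            return None, spec  # no consistent solution under current knowledge
        derived = deduce(candidate, knowledge)
        if not (derived & forbidden):
            return candidate, spec
        del knowledge[candidate]  # retract and re-propose
```

For example, with `knowledge = {"steel_frame": {"rigid", "heavy"}, "carbon_frame": {"rigid", "light"}}`, a specification of `{"rigid"}`, and `"heavy"` forbidden, the loop first proposes `steel_frame`, retracts it after deduction exposes the conflict, and settles on `carbon_frame`.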
Humans are already forming relationships with their artificial intelligence (AI) assistants, so we should make that technology as emotionally aware as possible by teaching it to respond to our feelings. That is the premise of Rana el Kaliouby, cofounder and CEO of Affectiva, an MIT spinout company that sells emotion recognition technology based on her computer science PhD, during which she built the first computer able to recognise emotions. The machine learning-based software uses a camera or webcam to identify parts of human faces (eyebrows, the corners of the eyes, and so on), classify expressions, and map them onto emotions such as joy, disgust, surprise, and anger, in real time. "We are getting lots of interest around chatbots, self-driving cars, anything with a conversational interface. If it's interfacing with a human, it needs social and emotional skills."
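The detect-classify-map pipeline described above can be illustrated with a toy sketch. None of this is Affectiva's actual code or API; the landmark measurements, expression units, and thresholds below are all hypothetical stand-ins for what a real facial-coding system would compute from camera frames.

```python
# Toy sketch of an expression-to-emotion pipeline: score a few expression
# units from crude facial-landmark measurements, then map the dominant
# expression onto an emotion label. All names and thresholds are illustrative.

EXPRESSION_TO_EMOTION = {
    "smile": "joy",
    "nose_wrinkle": "disgust",
    "brow_raise": "surprise",
    "brow_furrow": "anger",
}

def classify_expressions(landmarks):
    """Turn landmark measurements (0..1) into expression-unit scores."""
    return {
        "smile": landmarks.get("mouth_corner_lift", 0.0),
        "nose_wrinkle": landmarks.get("nose_scrunch", 0.0),
        "brow_raise": landmarks.get("brow_height", 0.0),
        "brow_furrow": landmarks.get("brow_pinch", 0.0),
    }

def map_to_emotion(scores, threshold=0.5):
    """Pick the strongest expression; below threshold, report neutral."""
    expression, score = max(scores.items(), key=lambda kv: kv[1])
    if score < threshold:
        return "neutral"
    return EXPRESSION_TO_EMOTION[expression]
```

In a real system the landmark measurements would come from a face-tracking model running on each video frame, and the classifier would be a trained network rather than a lookup, but the overall shape of the pipeline is the same.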
Recent innovations around the autonomous car have shaken up the automotive industry. Manufacturers and their suppliers are accelerating their work on the cars of the future, from regular human-operated cars to driverless and semi-autonomous vehicles. But beyond questions of autonomy, these cars of the future are undergoing a fundamental shift in human-machine interaction. Consumers today crave more relational, conversational interactions with devices, as evidenced by the popularity of chatbots and virtual assistants like Siri and Alexa, and the automotive industry has taken notice. Next-generation cars are therefore emerging as advanced artificial intelligence (AI) systems that will power an entirely new automotive experience, one in which cars become conversational interfaces between the driver, the passengers, the vehicle itself, and its controls, all connected to the IoT and the mobile devices we use.
A robot has just set a new record for the fastest-solved Rubik's Cube, according to its makers. The Sub1 Reloaded robot took just 0.637 seconds to analyse the puzzle and make the 21 moves needed to leave each of the cube's sides showing a single colour. That beats the previous record of 0.887 seconds, achieved by an earlier version of the same machine using a different processor. Chipmaker Infineon provided the processor to highlight advances in self-driving car technology, although one expert has questioned the point of the stunt.
In the evolution to humanize technology, Affectiva is carving a niche. Its software development kit (SDK) and cloud-based API allow developers to enrich digital experiences by adding "emotion awareness" to apps from games to medical devices. And that means that machines can collect data and respond to users' emotions in real time, mostly based on facial recognition techniques. It's what the company calls, Emotion AI. As noted in a recent Forbes article: "Affectiva's technology has proven transformative for industries like automotive, market research, robotics, education, and gaming, but also for use cases like teaching autistic children emotion recognition and nonverbal social cues."