nViso
BrainChip and NVISO Partner on Human Behavioral Analytics in Automotive and Edge AI Devices
The initial effort will include implementing NVISO's AI solutions for Social Robots and In-cabin Monitoring Systems on BrainChip's Akida processors. Developers of automotive and consumer technologies are striving for devices that respond better to human behavior, which requires tools and applications that can interpret human behavior captured by on-device cameras and sensors. However, these environments can be constrained by limited compute performance, power budgets, and lapses in cloud connectivity. Running NVISO's models on Akida allows inference to happen entirely on the device; since information is not sent off-device, user privacy and security are also protected. NVISO's technology is uniquely able to analyze signals of human behavior such as facial expressions, emotions, identity, head poses, gaze, gestures, activities, and the objects with which users interact.
7 Startups Giving Artificial Intelligence (AI) Emotions - Nanalyze
Founded in 2012, Israeli startup Beyond Verbal has taken in $10.1 million across four rounds of funding to develop a technology that "analyzes emotions from vocal intonations". Like CrowdEmotion, nViso's technology tracks the movement of 43 facial muscles using a simple webcam and then uses AI to interpret your emotions. Another of the startups profiled uses a branch of artificial intelligence called Natural Language Processing (NLP) to capture people's emotions, social concerns, thinking styles, psychology, and even their use of parts of speech. Yet another developed a technique to "read" human emotional state, called Transdermal Optical Imaging (TOI), which uses a conventional video camera to extract information from the blood flow beneath the skin of the face.
7 Startups Giving Artificial Intelligence (AI) Emotions - Nanalyze
We've written a lot about artificial intelligence (AI) here at Nanalyze, and just when we feel like there's not much more we can add to the topic, we find loads more interesting companies to write about. There has been a lot of talk lately about how machines just won't be able to capture that "human element" of emotions, or "emotional intelligence" as it is often called. The act of building an emotional quotient, or EQ, as a layer on top of AI is referred to as affective computing, a topic we've covered before. The first step towards AI being able to demonstrate emotional intelligence is for it to see emotions in our behaviour, hear our voices, and feel our anxieties. To do this, AI must be able to extract emotional cues or data from us through conventional means like eye tracking, galvanic skin response, voice and written word analysis, brain activity via EEG, facial mapping, and even gait analysis.
The Big Bang Moment in A.I.
Remember what you were doing back in 2011? Maybe you had heard the term Big Data, but had you ever heard of Artificial Intelligence (AI) or Machine Learning? In the past five years, three things came together that made AI possible at commercial scale and caused a big stir in 2016. Today I'm humbled to welcome Tim Llewellynn to the MakerZone! Tim is CEO and co-founder at nViso (www.nviso.ch). To many people the whole "AI" thing came as something of a surprise in 2016, but not to Tim and his team. nViso is a commercial entity but has its roots in the École Polytechnique Fédérale de Lausanne (EPFL).