This New Tech Puts AI In Touch with Its Emotions--and Yours
A new "empathic voice interface" launched today by Hume AI, a New York–based startup, makes it possible to add a range of emotionally expressive voices, plus an emotionally attuned ear, to large language models from Anthropic, Google, Meta, Mistral, and OpenAI--portending an era when AI helpers may more routinely get all gushy on us.

"We specialize in building empathic personalities that speak in ways people would speak, rather than stereotypes of AI assistants," says Hume AI cofounder Alan Cowen, a psychologist who has coauthored a number of research papers on AI and emotion, and who previously worked on emotional technologies at Google and Facebook.

WIRED tested Hume's latest voice technology, called EVI 2, and found its output to be similar to that developed by OpenAI for ChatGPT. (Later, a real movie star, Scarlett Johansson, claimed OpenAI had ripped off her voice.) Like ChatGPT, Hume's interface is far more emotionally expressive than most conventional voice interfaces. If you tell it that your pet has died, for example, it will adopt a suitably somber and sympathetic tone.