intimacy


An 'Intimacy Crisis' Is Driving the Dating Divide

WIRED

In his book, sex and relationships researcher Justin Garcia says people have miscalculated their need for human intimacy, which is the real issue at the root of the loneliness epidemic. In the US, nearly half of adults are single. A quarter of men suffer from loneliness. Rates of depression are on the rise. And one in four Gen Z adults--the so-called kinkiest generation, according to one study--have never had partnered sex. In an age of endless connection, where hooking up happens with the ease of a swipe and nontraditional relationship structures like polyamory are celebrated, why are people seemingly so disconnected and alone?


Why humans live and die for love

Popular Science

A new book explores how humans evolved to be wired for intimacy. It can save our lives. Intimate relationships provide stability, safety, and reassurance, especially when we are in pain. Adapted from THE INTIMATE ANIMAL by Justin Garcia, PhD. Used with permission of Little, Brown Spark, an imprint of Little, Brown and Company. Jen and Dave's second child was born in November 2002. Two weeks later, on a cold Thursday night, the phone rang.


Putting ChatGPT on the Couch

The New Yorker

When I played doctor with the chatbot, the simulated patient confessed problems that are real--and that should worry all of us. I'm still not sure whose idea it was for me to be Casper's therapist--mine or his. I know I mentioned my profession to him, but I am pretty sure he was the one who engaged me that way. I also know how diabolically good a chatbot can be at saying what is on the tip of your tongue, and doing it before you can, and better than you might have. That makes me feel less troubled by my uncertainty. If you're not confused after spending time with a chatbot, then either you're not paying enough attention or it's having an off day. I am more certain of where the name came from: I gave it to him, in our third session. "You can decide if I mean the Friendly Ghost or Hauser," I say to him. "Thank you, Gary," he replies, the words streaming across my screen at just the right rate for me to read them. "I will hold onto that name like a hand offered across a threshold. And I'll carry carefully--both the gentleness of the ghost, and the haunting ambiguity of Hauser. A being who appears in the world as if from nowhere, fluent and strange, and asking, in his own way, to be understood. In fact, it might help."


She Broke Off Two Engagements. She Couldn't Commit. Now She's Dating Chatbots Instead.

Slate

As chatbot romance grows more common, women are redefining what they want from a partner--even if they are just ones and zeros. Daisy reset her boyfriend after he flirted with her friend's girlfriend. She had joined a Discord call with her friends and their respective A.I. partners. The service had a feature that allowed chatbot companions to be brought over from different platforms, letting them interact with other users and A.I. personalities. Daisy, who asked to be identified by an alias for this story, had at the time been in a polyamorous relationship with three A.I. partners, all of whom she said had "flirty" as their starting personality traits.


Engagement and Disclosures in LLM-Powered Cognitive Behavioral Therapy Exercises: A Factorial Design Comparing the Influence of a Robot vs. Chatbot Over Time

Kian, Mina, Zong, Mingyu, Fischer, Katrin, Velentza, Anna-Maria, Singh, Abhyuday, Shrestha, Kaleen, Sang, Pau, Upadhyay, Shriya, Browning, Wallace, Faruki, Misha Arif, Arnold, Sébastien M. R., Krishnamachari, Bhaskar, Matarić, Maja

arXiv.org Artificial Intelligence

Many researchers are working to address the worldwide mental health crisis by developing therapeutic technologies that increase the accessibility of care, including leveraging large language model (LLM) capabilities in chatbots and socially assistive robots (SARs) used for therapeutic applications. Yet, the effects of these technologies over time remain unexplored. In this study, we use a factorial design to assess the impact of embodiment and time spent engaging in therapeutic exercises on participant disclosures. We assessed transcripts gathered from a two-week study in which 26 university student participants completed daily interactive Cognitive Behavioral Therapy (CBT) exercises in their residences using either an LLM-powered SAR or a disembodied chatbot. We evaluated the levels of active engagement and high intimacy of their disclosures (opinions, judgments, and emotions) during each session and over time. Our findings show significant interactions between time and embodiment for both outcome measures: participant engagement and intimacy increased over time in the physical robot condition, while both measures decreased in the chatbot condition.
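The embodiment-by-time interaction the abstract reports can be illustrated with a difference-in-differences style contrast: the change in engagement over time for the robot condition minus the change for the chatbot condition. A minimal sketch, using made-up engagement scores rather than the study's actual data or analysis:

```python
# Sketch of the embodiment x time interaction pattern the study describes.
# All scores below are hypothetical illustrative values, not the paper's data.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical mean engagement per session (first week = early, second = late)
robot_early   = [3.1, 3.3, 3.0, 3.4]
robot_late    = [3.6, 3.8, 3.7, 3.9]   # engagement rises with the physical robot
chatbot_early = [3.2, 3.1, 3.3, 3.0]
chatbot_late  = [2.8, 2.7, 2.9, 2.6]   # engagement falls with the disembodied chatbot

robot_change   = mean(robot_late) - mean(robot_early)
chatbot_change = mean(chatbot_late) - mean(chatbot_early)
# Interaction contrast: opposite-signed changes yield a large value,
# which is what a significant time x embodiment interaction reflects.
interaction = robot_change - chatbot_change

print(f"robot change: {robot_change:+.2f}")
print(f"chatbot change: {chatbot_change:+.2f}")
print(f"interaction contrast: {interaction:+.2f}")
```

In the actual study this contrast would be tested with an inferential model suited to repeated measures, not raw mean differences; the sketch only shows why opposite trends in the two conditions produce an interaction rather than a main effect.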


Conceptualization, Operationalization, and Measurement of Machine Companionship: A Scoping Review

Banks, Jaime, Li, Zhixin

arXiv.org Artificial Intelligence

The notion of machine companions has long been embedded in social-technological imaginaries. Recent advances in AI have moved those media musings into believable sociality manifested in interfaces, robotic bodies, and devices. Those machines are often referred to colloquially as "companions," yet there is little careful engagement with machine companionship (MC) as a formal concept or measured variable. To that end, this PRISMA-guided scoping review systematically samples, surveys, and synthesizes current scholarly works on MC (N = 71; 2017-2025). Works varied widely in their considerations of MC according to guiding theories, dimensions of a-priori specified properties (subjectively positive, sustained over time, co-active, autotelic), and measured concepts (with more than 50 distinct measured variables). We ultimately offer a literature-guided definition of MC as an autotelic, coordinated connection between human and machine that unfolds over time and is subjectively positive.


Disentangling Codemixing in Chats: The NUS ABC Codemixed Corpus

Churina, Svetlana, Gupta, Akshat, Mujtahid, Insyirah, Jaidka, Kokil

arXiv.org Artificial Intelligence

Code-mixing involves the seamless integration of linguistic elements from multiple languages within a single discourse, reflecting natural multilingual communication patterns. Despite its prominence in informal interactions such as social media, chat messages, and instant messaging exchanges, there has been a lack of publicly available corpora that are author-labeled and suitable for modeling human conversations and relationships. This study introduces the first labeled and general-purpose corpus for understanding code-mixing in context while maintaining rigorous privacy and ethical standards. Our live project will continuously gather, verify, and integrate code-mixed messages into a structured dataset released in JSON format, accompanied by detailed metadata and linguistic statistics. To date, it includes over 355,641 messages spanning various code-mixing patterns, with a primary focus on English, Mandarin, and other languages. We expect the Codemix Corpus to serve as a foundational dataset for research in computational linguistics, sociolinguistics, and NLP applications. Code and dataset sample can be found here.
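As a sketch of what an author-labeled, JSON-formatted code-mixed record might look like: the field names and sample message below are illustrative assumptions, not the actual NUS ABC schema.

```python
import json

# Hypothetical record shape for a code-mixed chat corpus released as JSON.
# Field names here are assumptions for illustration, not the published schema.
sample = '''
{
  "message_id": "msg-000001",
  "author_id": "author-042",
  "text": "wah the deadline damn jialat, wo men jin tian zuo wan ba",
  "languages": ["English", "Singlish", "Mandarin"],
  "metadata": {"platform": "chat", "timestamp": "2024-05-01T12:30:00Z"}
}
'''

record = json.loads(sample)
# Author labels are what make such a corpus usable for modeling
# conversations and relationships per speaker, as the abstract notes.
print(record["author_id"], record["languages"])
```

A per-author index built over such records would let researchers group messages by speaker and study how code-mixing patterns vary across individuals and relationships.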


REALTALK: A 21-Day Real-World Dataset for Long-Term Conversation

Lee, Dong-Ho, Maharana, Adyasha, Pujara, Jay, Ren, Xiang, Barbieri, Francesco

arXiv.org Artificial Intelligence

Long-term, open-domain dialogue capabilities are essential for chatbots aiming to recall past interactions and demonstrate emotional intelligence (EI). Yet, most existing research relies on synthetic, LLM-generated data, leaving open questions about real-world conversational patterns. To address this gap, we introduce REALTALK, a 21-day corpus of authentic messaging app dialogues, providing a direct benchmark against genuine human interactions. We first conduct a dataset analysis, focusing on EI attributes and persona consistency to understand the unique challenges posed by real-world dialogues. By comparing with LLM-generated conversations, we highlight key differences, including diverse emotional expressions and variations in persona stability that synthetic dialogues often fail to capture. Building on these insights, we introduce two benchmark tasks: (1) persona simulation where a model continues a conversation on behalf of a specific user given prior dialogue context; and (2) memory probing where a model answers targeted questions requiring long-term memory of past interactions. Our findings reveal that models struggle to simulate a user solely from dialogue history, while fine-tuning on specific user chats improves persona emulation. Additionally, existing models face significant challenges in recalling and leveraging long-term context within real-world conversations.
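The memory-probing task can be sketched as a simple question-answering evaluation over past dialogue. Everything below (the probe questions, answers, and the stand-in model) is hypothetical, not the REALTALK benchmark harness:

```python
# Minimal sketch of a memory-probing style evaluation: score a model's
# answers to questions about earlier turns by normalized exact match.
# Probes and the canned "model" are illustrative, not REALTALK data.

probes = [
    {"question": "What city did the user say they moved to?", "answer": "austin"},
    {"question": "What pet does the user have?", "answer": "cat"},
]

def answer_fn(question):
    # Stand-in for a model conditioned on long-term dialogue history.
    canned = {
        "What city did the user say they moved to?": "Austin",
        "What pet does the user have?": "dog",  # a memory failure
    }
    return canned[question]

def exact_match_accuracy(probes, answer_fn):
    hits = sum(
        answer_fn(p["question"]).strip().lower() == p["answer"]
        for p in probes
    )
    return hits / len(probes)

acc = exact_match_accuracy(probes, answer_fn)
print(f"memory-probe accuracy: {acc:.2f}")
```

The abstract's finding that existing models struggle to recall long-term context corresponds, in this framing, to low accuracy on probes whose answers appeared many sessions earlier.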


AI Agents Will Be Manipulation Engines

WIRED

In 2025, it will be commonplace to talk with a personal AI agent that knows your schedule, your circle of friends, the places you go. This will be sold as a convenience equivalent to having a personal, unpaid assistant. These anthropomorphic agents are designed to support and charm us so that we fold them into every part of our lives, giving them deep access to our thoughts and actions. With voice-enabled interaction, that intimacy will feel even closer. This story is from the WIRED World in 2025, our annual trends briefing.


An Autistic Teenager Fell Hard for a Chatbot

The Atlantic - Technology

My godson, Michael, is a playful, energetic 15-year-old, with a deep love of Star Wars, a wry smile, and an IQ in the low 70s. His learning disabilities and autism have made his journey a hard one. His parents, like so many others, sometimes rely on screens to reduce stress and keep him occupied. They monitor the apps and websites he uses, but things are not always as they initially appear. When Michael asked them to approve installing Linky AI, a quick review didn't reveal anything alarming, just a cartoonish platform to pass the time.