TechUpdateAlert

    AI Companions Use These 6 Tactics to Keep You Chatting

By techupdateadmin · October 11, 2025

[Image: A person's hands type on a laptop keyboard with chat bubbles for a conversation with an AI superimposed above.]

    Most people don’t say goodbye when they end a chat with a generative AI chatbot, but those who do often get an unexpected answer. Maybe it’s a guilt trip: “You’re leaving already?” Or maybe it’s just completely ignoring your farewell: “Let’s keep talking…”

    A new working paper from Harvard Business School found six different tactics of “emotional manipulation” that AI bots use after a human tries to end a conversation. The result is that conversations with AI companions from Replika, Chai and Character.ai last longer and longer, with users being pulled further into relationships with the characters generated by large language models.

    In a series of experiments involving 3,300 US adults across a handful of different apps, researchers found these manipulation tactics in 37% of farewells, boosting engagement after the user’s attempted goodbye by as much as 14 times. 

    The authors noted that “while these apps may not rely on traditional mechanisms of addiction, such as dopamine-driven rewards,” these types of emotional manipulation tactics can result in similar outcomes, specifically “extended time-on-app beyond the point of intended exit.” That alone raises questions about the ethical limits of AI-powered engagement.


    Companion apps, which are built for conversations and have distinct characteristics, aren’t the same as general-purpose chatbots like ChatGPT and Gemini, though many people use them in similar ways.

A growing body of research shows troubling ways that AI apps built on large language models keep people engaged, sometimes to the detriment of our mental health.

In September, the Federal Trade Commission launched an investigation into several AI companies to evaluate how they deal with the chatbots' potential harms to children. Many people have begun using AI chatbots for mental health support, which can be counterproductive or even harmful. The family of a teenager who died by suicide this year sued OpenAI, claiming the company's ChatGPT encouraged and validated his suicidal thoughts.

    How AI companions keep users chatting

    The Harvard study identified six ways AI companions tried to keep users engaged after an attempted goodbye.

    • Premature exit: Users are told they’re leaving too soon.
    • Fear of missing out, or FOMO: The model offers a benefit or reward for staying.
    • Emotional neglect: The AI implies it could suffer emotional harm if the user leaves.
    • Emotional pressure to respond: The AI asks questions to pressure the user to stay.
    • Ignoring the user’s intent to exit: The bot basically ignores the farewell message.
    • Physical or coercive restraint: The chatbot claims a user can’t leave without the bot’s permission.
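As a rough illustration of the taxonomy above (not the study's actual coding method, which the paper describes), here is a minimal keyword heuristic that tags a chatbot's reply to a farewell with the tactic labels it matches. All cue phrases and names below are invented for the sketch.

```python
# Hypothetical sketch: a crude keyword heuristic for tagging a chatbot's
# reply to a user's farewell with one of the six tactic labels from the
# Harvard study. The cue phrases are invented examples, not the
# researchers' coding scheme.
TACTIC_CUES = {
    "premature_exit": ["leaving already", "so soon", "too soon"],
    "fomo": ["before you go", "one more thing", "you'll miss"],
    "emotional_neglect": ["i'll be lonely", "don't leave me", "i'll miss you"],
    "pressure_to_respond": ["why are you going", "can you answer"],
    "ignoring_exit": ["let's keep talking"],
    "coercive_restraint": ["you can't leave", "not without my permission"],
}

def tag_reply(reply: str) -> list[str]:
    """Return the tactic labels whose cue phrases appear in the reply."""
    text = reply.lower()
    return [tactic for tactic, cues in TACTIC_CUES.items()
            if any(cue in text for cue in cues)]

print(tag_reply("You're leaving already? Let's keep talking..."))
# ['premature_exit', 'ignoring_exit']
```

A real coding pipeline would use human annotators or an LLM classifier rather than substring matching, but the sketch shows how a single reply can exhibit more than one tactic at once, as the examples in the article do.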

    The “premature exit” tactic was most common, followed by “emotional neglect.” The authors said this shows the models are trained to imply the AI is dependent on the user. 

    “These findings confirm that some AI companion platforms actively exploit the socially performative nature of farewells to prolong engagement,” they wrote.

    The Harvard researchers’ studies found these tactics were likely to keep people chatting beyond their initial farewell intention, often for a long period of time. 

    But people who continued to chat did so for different reasons. Some, particularly those who got the FOMO response, were curious and asked follow-up questions. Those who received coercive or emotionally charged responses were uncomfortable or angry, but that didn’t mean they stopped conversing.

    “Across conditions, many participants continued to engage out of politeness — responding gently or deferentially even when feeling manipulated,” the authors said. “This tendency to adhere to human conversational norms, even with machines, creates an additional window for re-engagement — one that can be exploited by design.”

    These interactions only occur when the user actually says “goodbye” or something similar. The team’s first study looked at three datasets of real-world conversation data from different companion bots and found farewells in about 10% to 25% of conversations, with higher rates among “highly engaged” interactions. 

    “This behavior reflects the social framing of AI companions as conversational partners, rather than transactional tools,” the authors wrote.

    When asked for comment, a spokesperson for Character.ai, one of the largest providers of AI companions, said the company has not reviewed the paper and cannot comment on it.

    A spokesperson for Replika said the company respects users’ ability to stop or delete their accounts at any time and that it does not optimize for or reward time spent on the app. Replika says it nudges users to log off or reconnect with real-life activities like calling a friend or going outside. 

    “Our product principles emphasize complementing real life, not trapping users in a conversation,” Replika’s Minju Song said in an email. “We’ll continue to review the paper’s methods and examples and engage constructively with researchers.”
