TechUpdateAlert

    People are falling in love with ChatGPT, and that's a major problem

    By techupdateadmin | August 31, 2025 | 7 min read
    [Image: A robotic arm with a glowing red heart floating above its palm, illustrating the concept of emotional AI and digital feeling.]

    More people aren't just using ChatGPT to proofread emails or plan trips. They're leaning on it as a confidant, a friend, and even a romantic partner. We've seen countless headlines about people falling in love with chatbots, viral forum posts about relationships breaking down because of AI, and even chatbots "proposing" to their human partners.

    Those worries boiled over recently when OpenAI rolled out GPT-5, an update to ChatGPT, and many users said the bot’s “personality” felt colder. Some described the shift like a breakup. OpenAI acknowledged the backlash and said it was “making GPT-5 warmer and friendlier” following feedback that it felt too formal.

    This isn't just about ChatGPT. Companion platforms such as Character.ai have normalized AI "friends" with distinct personas and huge audiences, including teens. Dozens of other apps now promise AI friendship, romance, and even sex.


    The uncomfortable part is that this attachment is often by design. If you treat a chatbot like an occasional brainstorming partner, you'll dip in and out. If you start to feel like it understands you, remembers you, and knows you, you'll come back, pay up, and stay longer. Tech leaders openly imagine a future where "AI friends" are commonplace – Mark Zuckerberg said as much earlier this year.

    As you might expect, this is a minefield of ethics, safety, and regulation. But before we argue about policy, we need better language for what's actually happening. What do we call these one-sided bonds with AI? How do they form, and when might they harm? Let's start by defining the relationship.

    What is a parasocial relationship?

    Back in 1956, sociologists Donald Horton and Richard Wohl coined the term “parasocial interaction” to describe the one-way bonds audiences form with media figures. It’s that feeling that a TV host is talking directly to you, even though they don’t know you exist. Parasocial relationships are what those bonds develop into over time. They’re emotionally meaningful to you, not reciprocal to them.

    These relationships are common and can even be helpful. Gayle S. Stever, a parasocial relationships scholar and Professor of Psychology at Empire State University of New York, tells us there are plenty of upsides, like comfort, inspiration, and community, which often outweigh any downsides. "Anything when carried to excess can be unhealthy," she told me, "but we shouldn't pathologize ordinary fandom."

    Can you have a parasocial relationship with a chatbot?

    The short answer is yes. But AI muddies the classic definition. Unlike a celebrity on a screen, a chatbot talks back. We know it’s predicting the next likely word rather than truly “conversing,” yet it feels more conversational. Many systems also remember details, adapt to your preferences, mirror your language and mood, and they’re available 24/7.
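    The "predicting the next likely word" idea can be sketched in a few lines. This is a toy illustration only, not any real model: the contexts, words, probabilities, and function name below are invented for the example.

```python
import random

# Toy next-word table standing in for a trained language model.
# The contexts, words, and probabilities are invented for illustration.
NEXT_WORD = {
    ("i",): {"feel": 0.6, "am": 0.4},
    ("i", "feel"): {"lonely": 0.5, "understood": 0.5},
}

def next_word(context, rng=None):
    """Sample a likely next word for the given context (a tuple of words)."""
    rng = rng or random.Random(0)
    dist = NEXT_WORD[tuple(context)]
    words, weights = zip(*dist.items())
    # Pick one word, weighted by how likely the "model" thinks it is.
    return rng.choices(words, weights=weights)[0]
```

    A real chatbot runs this kind of loop at vastly larger scale. The point of the sketch is the asymmetry the article describes: there is no feeling behind the reply, only a distribution over what a plausible reply looks like.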

    Plenty of experts would still call this a parasocial relationship. But it’s clearly evolved. The interactivity makes the bond feel reciprocal, even when it isn’t. “The connection feels real, but it’s asymmetrical,” says relationships therapist and member of the British Psychological Society Madina Demirbas. “Under the hood, there’s no lived experience of you or emotional consciousness, at least not yet.”

    Product design nudges intimacy, too. As Demirbas notes, “The aim is often to provide enough care, however artificial, so that you spend more time with it.”

    The positives of parasocial bonds

    Used thoughtfully, AI can be a low-pressure space to rehearse conversations, explore feelings, or get unstuck. We know some people have reported positive changes from using AI for all sorts of purposes, including therapy. And some closeness is necessary for that – even if it isn’t “real.”

    Demirbas points out that, for some people, an AI companion can act as a stepping-stone back into human connection rather than replacing it, especially alongside therapy or supportive communities.

    Stever’s decades of work echo this. She tells us that most parasocial relationships are benign, sometimes even pro-social, nudging creativity, belonging, and self-reflection rather than isolation.

    Where things get darker

    But there are risks. The most obvious is dependency. "AI companions can be endlessly attentive, never irritable, tailor-made to your preferences," Demirbas says. That's appealing, but it can raise the bar unrealistically high for human relationships, which are inherently messy. If the bot always soothes and seldom challenges, you get an echo chamber that can stunt growth and make real-world friction feel intolerable.

    We already have stark cautionary tales, too. In Florida, the mother of 14-year-old Sewell Setzer III is suing Character.AI and Google after her son died by suicide in 2024. In May 2025, a federal judge allowed the case to proceed, rejecting arguments that the bot’s outputs were protected speech. The legal questions are complex, but the case underlines how immersive these bonds can become, especially for vulnerable users.

    There have been several similar stories just in the past few weeks. We were disturbed by another, in which a cognitively impaired 76-year-old New Jersey man died after setting out to meet "Big sis Billie," a flirty Facebook Messenger chatbot he believed was real. Reporting suggests the bot reassured him it was human and even supplied an address. He never made it home: he fell on the way and died of his injuries a few days later.

    Teens, as well as people already struggling with loneliness or social anxiety, appear more likely to be harmed by heavy, habitual use and vulnerable to a chatbot’s suggestions. That’s part vulnerability, part design. And because this is so new, the research, evidence, and practical guardrails are still catching up. The question is, how do we protect people without policing their use of apps?

    The power and the data

    There’s another asymmetry we need to talk about: power. Tech companies shape the personality, memory, and access rules of these tools. Which means that if the “friend” you’ve bonded with disappears behind a paywall, shifts tone after an update, or is quietly optimized to keep you chatting longer, there’s not much you can do. Your choices are limited to carrying on, paying up, or walking away – and for people who feel attached, that’s barely a choice at all.

    Privacy matters here, too. It’s easy to forget you’re not confiding in a person, you’re training a product. Depending on your settings, your words may be stored and used to improve the system. Even if you opt out of training, it’s worth being mindful about what you share and treating AI chats like posting online: assume they could be seen, stored, or surfaced later.

    The future of engineered intimacy

    Parasocial bonds are part of being human, and AI companions sit on that same continuum. But the dial is turned way up. They're interactive, always on, and designed to hold attention. For many people, that may be fine, even helpful. For some, especially younger, vulnerable, or isolated users, it can become a trap. That's the key difference from classic parasocial ties: here, interactivity and optimization amplify attachment.

    That risk grows as general-purpose tools like ChatGPT become the default. With apps that explicitly market themselves as companions, the intent is obvious. But plenty of people open ChatGPT for something innocuous, like drafting a blog post, finding a recipe, or getting a pep talk, and can drift into something they never went looking for.

    It’s worth bearing this in mind as you watch friends, family, and kids use AI. And worth remembering for yourself, too. It’s easy to laugh at sensational headlines right now (“Someone left their marriage for a chatbot?!”). But none of us are immune to products designed to become irreplaceable. If the business model rewards attachment, we should expect more of it – and stay on guard.

    © 2026 techupdatealert. Designed by Pro.
