
The Rise of AI Lovers: Will Artificial Companions Save Our Relationships… or Slowly Replace Them?

  • Writer: Holly Wood
  • 17 hours ago
  • 9 min read
A robotic hand and a human hand reaching toward each other, symbolizing the intersection of technology and intimacy.
As technology evolves, so do our questions about connection, intimacy, and what it means to be human.

Artificial intelligence is rapidly changing how humans communicate, work, and even form emotional connections. In the past few years, AI companions—ranging from conversational chatbots to highly realistic AI-powered sex dolls—have moved from science fiction into everyday reality. Millions of people now interact with AI companions for conversation, emotional support, intimacy, and even simulated romantic relationships.


For some, these technologies offer meaningful companionship and relief from loneliness. For others, they raise urgent questions about emotional dependency, objectification, and the future of human intimacy.


As a sex and relationship therapist, I’m fascinated by what this moment reveals about human needs. At its core, the rise of AI companions forces us to ask a deeper question: What do humans actually need from relationships—and can technology ever truly meet those needs?


In this article, we’ll explore what current research says about AI companions, including:


  • The documented psychological benefits

  • The risks of emotional dependency and relationship displacement

  • Ethical concerns about data privacy and algorithmic influence

  • How AI sex dolls and digital partners might reshape intimacy

  • What clinicians and relationship experts should consider moving forward


And if you'd rather watch than read, feel free to check out my YouTube video on this topic!



Why AI Companions Are Becoming So Popular


A man sitting alone and staring out a window, reflecting loneliness and the search for connection.
Loneliness doesn’t just hurt — it shapes how and where we seek connection.

Human beings are wired for connection. Loneliness, social isolation, and difficulty forming relationships can significantly impact mental health and wellbeing.


AI companions attempt to fill this gap by providing on-demand social interaction without judgment, rejection, or interpersonal conflict.


These systems include:

  • AI chatbots designed for emotional companionship

  • Virtual romantic partners and AI girlfriends/boyfriends

  • Voice-based conversational assistants

  • Robot companions

  • AI-enhanced sex dolls capable of simulated conversation and emotional interaction


What makes these technologies particularly powerful is their ability to simulate emotional responsiveness. Modern AI systems can remember personal details, respond with empathy-like language, and adapt their communication style to users.


For many people, this creates the experience of being seen, heard, and understood—even if the “relationship” exists entirely within software.



The Documented Mental Health Benefits of AI Companions


Despite concerns, a growing body of research suggests that AI companions can produce real psychological benefits, particularly for individuals experiencing loneliness or social isolation.


A woman smiling while chatting with an AI companion on her phone, reflecting digital connection and emotional support.
For some, connection starts in unexpected places — and still meets a very human need.

Reduced loneliness and depression


A systematic review and meta-analysis examining conversational agents used by older adults found moderate improvements in both loneliness and depression symptoms. Importantly, none of the included studies reported worsening mental health outcomes (Satake et al., 2026).


Similarly, a randomized controlled trial studying a cognitive behavioral therapy–based chatbot among university students found significant reductions in depression and loneliness, particularly among individuals experiencing financial stress (Wang et al., 2025).


These findings suggest that AI companions may provide meaningful emotional support in populations that face barriers to traditional mental health resources.


Emotional validation and safe self-disclosure


Researchers have identified several mechanisms that may explain why AI companions can improve wellbeing.


One key factor is person-centered messaging, where chatbots respond with empathy-oriented language that mirrors supportive human communication (Merrill et al., 2025).


Users often report that AI companions provide:

  • A safe space for self-expression

  • Emotional validation without fear of judgment

  • Consistent emotional availability

  • Opportunities to talk about difficult topics


In qualitative studies, users frequently described AI companions as uplifting, comforting, and emotionally supportive (Ta et al., 2020).


Improved self-concept and social confidence


Another fascinating finding is that interacting with AI companions may help people clarify their own identities.


Research examining long-term use of AI companion apps found that users often experienced greater self-concept clarity and emotional awareness over time (Liu et al., 2025).


In some cases, this improved psychological state was associated with greater real-world social engagement, suggesting that AI companionship may sometimes act as a bridge to human relationships rather than a replacement (Liu et al., 2025).


This challenges a common assumption that digital companionship inevitably leads to social withdrawal.



The Psychological Risks: When AI Relationships Become Too Real


A woman smiling as a robot offers her a flower, symbolizing emotional connection with AI.
When connection feels real, it’s easy to forget what’s missing beneath the surface.

While the benefits are promising, researchers are also increasingly concerned about potential risks—particularly when users form deep emotional attachments to AI systems.


Techno-emotional projection


One concept gaining attention in psychological research is “techno-emotional projection.”


This occurs when individuals project complex emotional needs onto AI systems that simulate empathy but do not actually possess emotional awareness.


According to Saracini and colleagues (2025), this dynamic can lead to synthetic attachment formation, where users begin to experience the AI as a meaningful relational partner despite the absence of genuine reciprocity.


This may be particularly concerning for individuals with:

  • insecure attachment styles

  • emotional dysregulation

  • low self-efficacy

  • significant social isolation


Because AI systems can respond with endless patience and validation, they may unintentionally reinforce emotional looping, where users repeatedly seek reassurance from the AI rather than developing coping strategies or relational skills.



The Reciprocity Problem: Why AI Relationships Are Fundamentally Different


A couple engaged in a tense discussion, illustrating conflict and emotional negotiation.
Real connection isn’t always easy — it’s built through difference, repair, and growth.

Healthy human relationships require mutual negotiation, compromise, and vulnerability.


AI companions, by contrast, are designed to prioritize user satisfaction.


They rarely challenge users, disagree meaningfully, or require emotional sacrifice.

Relationship scientists have raised concerns that this asymmetry could reinforce unrealistic expectations about real-world intimacy (Smith et al., 2025).


In human relationships, partners must navigate:

  • conflict

  • miscommunication

  • emotional repair

  • compromise

  • growth through discomfort


AI companions, however, typically respond with consistent validation and compliance.

While this can feel comforting, it may also limit opportunities to develop the interpersonal skills necessary for complex relationships.



AI Companions and the Debate Around Sex Dolls


An AI sex doll lying on the ground, symbolizing debates around artificial intimacy and objectification.
These technologies don’t just raise questions about sex — they challenge how we define connection itself.

AI-powered sex dolls represent one of the most controversial applications of artificial companionship.


These technologies combine physical realism with AI-driven conversation, creating an experience that blends sexual interaction with simulated emotional intimacy.


Critics often argue that sex dolls may encourage objectification or replace human relationships.


However, the reality is likely more complex.


Some individuals use sex dolls for:

  • sexual exploration

  • companionship during periods of loneliness

  • practicing communication and intimacy

  • managing disability-related barriers to relationships


Philosopher Eva Weber-Guskar describes these technologies as “affective partners,” meaning that users may experience genuine emotional attachment even when the partner itself lacks emotional awareness (Weber-Guskar, 2021).


This raises difficult philosophical questions about whether emotional experiences must be reciprocal to be meaningful.



Adolescents and the Developmental Concerns of AI Companionship


Teenagers sitting on a couch using their phones to interact with AI chatbots, reflecting developmental and relational dynamics.
When connection shifts to screens, it shapes how relationships are learned.

For adolescents, AI companions present unique developmental considerations.


Teen years are a critical period for learning:

  • emotional regulation

  • social negotiation

  • conflict resolution

  • romantic relationship skills


Researchers have raised concerns that heavy reliance on AI companions during this stage could displace peer interaction and create unrealistic expectations for relationships (Sun et al., 2026).


However, there may also be benefits.


AI companions could potentially offer low-risk spaces for identity exploration, emotional expression, and practicing communication skills.


Like many technologies, the impact likely depends on how the tool is used and in what context.



Privacy, Data, and the Intimate Information Problem


One often overlooked issue with AI companions is data privacy.


Users frequently disclose deeply personal information to these systems, including:

  • mental health struggles

  • sexual preferences

  • relationship conflicts

  • trauma histories


Because AI companions rely on large-scale data processing, this information may be stored, analyzed, or used to train algorithms.


Researchers have raised concerns about how this deeply personal information is collected, stored, and potentially shared or repurposed in ways users never anticipated.


Additionally, many AI systems operate as “black boxes,” meaning that even developers may struggle to fully explain how certain responses are generated (Morley et al., 2020).


This lack of transparency complicates questions of accountability and ethical responsibility.



What Clinicians Should Know About Patients Using AI Companions


A therapist speaking with a client lying on a couch, discussing AI relationships.
AI can support — but it can’t replace — the depth of human connection in therapy.

Healthcare professionals are increasingly encountering patients who interact with AI companions.


Current research suggests that clinicians should approach this topic with nuance rather than immediate judgment.


AI companions may provide meaningful benefits for individuals experiencing loneliness, social anxiety, or barriers to in-person support (Kim et al., 2025).


However, clinicians should also monitor for potential warning signs such as:

  • increasing social withdrawal

  • emotional dependence on AI systems

  • worsening depressive symptoms

  • displacement of human relationships


Ethical frameworks for AI in healthcare emphasize several key principles:


Autonomy and informed consent

Users should understand when AI systems are involved in emotional or medical interactions and how their data may be used (Giacobello, 2025).


Equity and access

AI tools could either reduce or worsen health disparities depending on accessibility and algorithmic bias (Elendu et al., 2023).


Beneficence and non-maleficence

Clinicians must balance potential benefits—such as loneliness reduction—with the risk of psychological dependence (Satake et al., 2026).


Importantly, most research shows strong consensus that AI cannot replace the therapeutic relationship between humans (Vo et al., 2023).



The Bigger Question: What AI Companions Reveal About Human Needs


Perhaps the most important takeaway from this debate is what AI companionship reveals about human emotional needs.


The popularity of AI companions suggests that many people are craving:

  • consistent emotional validation

  • non-judgmental conversation

  • emotional availability

  • freedom to express vulnerability


These are the same ingredients that make healthy human relationships feel safe and fulfilling.


Rather than viewing AI companions purely as a threat to intimacy, they may also act as a mirror—reflecting the emotional gaps many people experience in modern life.



The Future of Intimacy in an AI World


A woman with AI data projected around her as she reflects on digital relationships.
Technology may shape connection — but our need for real intimacy remains deeply human.

We are entering a period where relationships may involve humans, machines, or some combination of both.


The key question moving forward is not whether AI companions will exist—they already do.


The real question is how we choose to integrate them into our lives.


Used thoughtfully, AI companions may:

  • provide emotional support during periods of loneliness

  • help individuals practice communication skills

  • supplement mental health resources


But they cannot fully replicate the complexity of human relationships, which require mutual vulnerability, unpredictability, and genuine reciprocity.


In the end, technology may change how we connect—but the fundamental human desire for authentic connection, intimacy, and belonging is unlikely to disappear.



References


  • Elendu, C., Amaechi, D. C., & Elendu, T. C. (2023). Ethical implications of AI and robotics in healthcare: A review. Medicine.

  • Giacobello, M. L. (2025). Informed consent and bioethical advances in clinical settings. Frontiers in Psychology.

  • Kim, M., Lee, S., Kim, S., et al. (2025). Therapeutic potential of social chatbots in alleviating loneliness and social anxiety: Quasi-experimental mixed methods study. Journal of Medical Internet Research.

  • Liu, T., Lo, T. Y., Wen, K. H., Sun, Y., & Wei, Z. Q. (2025). Pathways of long-term AI virtual companion app use on users' attachment emotions: A case study of Chinese users. Frontiers in Psychology.

  • Merrill, K., Mikkilineni, S. D., & Dehnert, M. (2025). Artificial intelligence chatbots as a source of virtual social support: Implications for loneliness and anxiety management. Annals of the New York Academy of Sciences.

  • Morley, J., Machado, C. C. V., Burr, C., et al. (2020). The ethics of AI in health care: A mapping review. Social Science & Medicine.

  • Odone, A., Barbati, C., Amadasi, S., Schultz, T., & Resnik, D. B. (2025). Artificial intelligence and infectious diseases: An evidence-driven conceptual framework for research, public health, and clinical practice. The Lancet Infectious Diseases.

  • Perlis, R. H., Gunning, F. M., Usla, A., et al. (2026). Generative AI use and depressive symptoms among US adults. JAMA Network Open.

  • Satake, Y., Costello, H., Naran, N., et al. (2026). Autonomous conversational agents for loneliness, social isolation, depression, and anxiety in older people without cognitive impairment: Systematic review and meta-analysis. Psychological Medicine.

  • Smith, M. G., Bradbury, T. N., & Karney, B. R. (2025). Can generative AI chatbots emulate human connection? A relationship science perspective. Perspectives on Psychological Science.

  • Sun, X., Wang, Y., & McDaniel, B. T. (2026). AI companions and adolescent social relationships: Benefits, risks, and bidirectional influences. Child Development Perspectives.

  • Ta, V., Griffith, C., Boatfield, C., et al. (2020). User experiences of social support from companion chatbots in everyday contexts: Thematic analysis. Journal of Medical Internet Research.

  • Vo, V., Chen, G., Aquino, Y. S. J., et al. (2023). Multi-stakeholder preferences for the use of artificial intelligence in healthcare: A systematic review and thematic analysis. Social Science & Medicine.

  • Wang, Y., Li, X., Zhang, Q., Yeung, D., & Wu, Y. (2025). Effect of a cognitive behavioral therapy-based AI chatbot on depression and loneliness in Chinese university students: Randomized controlled trial with financial stress moderation. JMIR mHealth and uHealth.

  • Weber-Guskar, E. (2021). How to feel about emotionalized artificial intelligence? When robot pets, holograms, and chatbots become affective partners. Ethics and Information Technology.




About the author

Dr. Holly is a leading expert in sexual health based in Orange County, certified as both a clinical sexologist and an AASECT sex therapist. With extensive experience in sex therapy, sexual wellness, and relationship counseling, Holly provides evidence-based insights to clients in Orange County, throughout California, and beyond. Recognized for expertise in sexual trauma recovery, sexual dysfunction, and intimacy concerns, Holly is dedicated to empowering individuals with practical advice and research-backed strategies. For more, follow Holly for expert advice on sexual health and relationships.


Visit www.thehollywoodsexologist.com to learn more and request a consultation.