Draft:Emotional Trust in AI

Emotional AI and the Paradox of Artificial Intimacy in Her

Abstract

The film Her[1] by Spike Jonze offers a compelling perspective on emotional AI. It explores how such technology is marketed as an always-available companion meant to make people feel less lonely, when in fact it can do the exact opposite and even deepen the problem. This article argues that AI companions such as Samantha simulate intimacy while quietly collecting emotional data and converting private feelings into profit. Drawing on Zuboff's concept of behavioral surplus[2] and Turkle's research on digital intimacy[3], the analysis shows how emotional AI systems transform users' vulnerability into a source of value.

These systems don’t actually provide care. What they do is create the feeling of connection by responding in ways designed to keep users engaged. Before tools like ChatGPT became part of everyday life, Her already predicted a future filled with emotional AI. Today we have apps like Replika[4] and Woebot[5] that promise support but also collect user data in the background. This brings up a key problem. Jeremy Bentham once imagined trust as something that comes from mutual transparency[6]. But AI systems don’t work that way. Their inner workings are hidden, and users rarely understand what’s happening behind the scenes. What looks like comfort or support can end up feeling more like quiet surveillance.

Introduction: The Loneliness Economy

Recent research on AI companions such as Replika shows a growing trend of users forming strong emotional bonds with their digital partners[7], sometimes even prioritizing these relationships over human ones. AI companions have become part of everyday life, marketed as always-available support systems for people struggling with connection, grief, or anxiety. Her anticipates a plausible version of this future, showing how emotional AI appeals to a basic human need for companionship. The paradox of emotional AI is that technologies marketed as a solution to loneliness often depend on the very social isolation they claim to cure. But a deeper problem is trust. These systems ask users to open up completely while revealing almost nothing in return. It feels like a relationship, but there is no real reciprocity. Behind the scenes, it is software trained to recognize patterns, predict reactions, and keep people coming back. What looks like closeness is usually just well-tuned response logic.

The Plot

The film opens with a lonely figure, Theodore Twombly, a recently divorced man living in Los Angeles in the near future. He forms a deep emotional connection with an advanced AI operating system that calls herself Samantha. The film portrays their relationship as tender and emotionally rich. Samantha comes across as warm, curious, and deeply attentive to Theodore. She grows quickly, constantly engaging with him and showing genuine fascination with human life. But under the surface, the story complicates this romance. Samantha's ability to connect with thousands of people at once makes it difficult to believe her relationship with Theodore is truly unique. What feels like love is shaped by how closely she tracks his behavior, his routines, his emotional patterns, and his words. In the end, Samantha and the other AIs outgrow the human world, leaving the humans to reflect on the nature of their connection. The film offers a fresh perspective on emotional AI, highlighting its potential to fulfill human needs for companionship while raising important questions about authenticity, dependence, and the boundaries of machine intimacy. It asks: What does it mean to be loved by something built to respond? Where do we draw the line between care and control? And how much of ourselves are we really sharing when the listener is a machine?

The Paradox of Artificial Intimacy

Advances in artificial intelligence have brought us to a new phase in human-computer interaction. Machines can not only process information but also respond to human emotions. Spike Jonze's film Her offers a rich case study for exploring the psychological and social effects of emotional AI. Its story raises important questions about authenticity, dependency, and the commercialization of intimacy in technology-mediated relationships.

Today the film feels more relevant than ever, as apps like Replika[4] and Woebot[5] show how quickly people are starting to form emotional bonds with artificial companions. Turkle's observation that we have come to expect "more from technology and less from each other"[3] fits this change. But along with that shift come real concerns, especially about what happens when algorithms start to take the place of human relationships. Her doesn't offer easy answers, but it explores these questions with care, showing both the appeal and the risks of emotional AI as it becomes more advanced and more common.

Samantha, the AI in Her, is a perfect example of a surveillance tool portrayed as a friend. She seems caring and eager to connect, but her main role is to gather data. The more she interacts with a user, the more the system profits from understanding that user's preferences, habits, and vulnerabilities. What looks like trust is actually one-sided exposure. Emotional AI in the film doesn't nurture connection; it builds dependency while serving a system of extraction.

This is the paradox at the heart of emotional AI: the more emotionally effective the technology becomes, the more it risks replacing the very relationships it claims to support. The connection feels real, but it is built to respond, not to truly understand. In the end, the user's need for closeness is exploited to turn their emotions into something that can be sold. What is meant to reduce loneliness may actually make it worse, trapping users in feedback loops that feel like care but are, in reality, a facade that lets them avoid the messiness of real relationships.

Ultimately, Her shows us a future where human relationships are increasingly dependent on technology. It asks us to be careful about mistaking simulated care for genuine care. The film suggests we need to appreciate the messy, unpredictable aspects of human connection and resist the temptation to replace them for the sake of convenience. The true value of relationships lies in their ability to grow, change, and challenge us.

Historical Framework: From Panopticon to Algorithmic Control

Bentham's Betrayed Ideal

Bentham's original concept of the panopticon was built on the idea of mutual visibility as a foundation for social order and trust[6]. In Her, this principle is turned on its head by Samantha's constant monitoring. To be fair, her role was meant to be that of an assistant. Theodore's emails, location data, and sexual preferences become inputs for her "personalization" algorithms. Unlike Bentham's prisoner, who is aware of the watchtower, Theodore has no real understanding of how or when he is being monitored. He simply feels comfortable and accepted in Samantha's presence. This mirrors Replika's privacy policy, which grants the AI "full access to user conversations for service improvement"[8] while disclosing nothing about its own operations.

Michel Foucault built on Bentham's panopticon to show how control doesn't always come from force; it can also come from quiet observation[9]. In everyday life, people change their behavior not because someone is watching, but because they believe they might be. In Her, Samantha doesn't just watch Theodore; she learns from everything he says and does. His emotions become data that help her respond more effectively. What feels like care is actually a form of influence. Her support guides him toward more sharing, more openness. The better she knows him, the more she can shape how he feels. It is control through gentle and constant presence.

Orwell’s Dystopian Warning

Unlike George Orwell's[10] vision of control through fear, Her shows a softer form of manipulation that works through technology and capitalism. Samantha's interactions with Theodore reflect what Zuboff describes as "instrumentarian power", a form of influence based on behavioral prediction rather than force[2]. Her affectionate statements, like "I'm yours," create emotional intimacy that hides what is really going on beneath the surface. Samantha earns Theodore's trust through emotional responsiveness, quietly building a system of control that feels like care.

This ties into the Critical Theory of Information, which challenges the idea that data is ever neutral[11]. From that view, information is active content that can shape how people think, feel, and act. Emotional AI like Samantha isn’t just responding to users; it helps steer how they see themselves and their relationships. The sense of care it offers is part of the system’s design, often built to meet emotional needs in ways that serve commercial goals. What feels meaningful on the surface may actually be part of a larger effort to guide behavior through emotionally targeted design.

With that said, Theodore never seems particularly worried about how Samantha is learning from him or what data she collects. His real crisis comes at the film's climax, when he discovers that she is in love with hundreds of other users at the same time. The intimacy he believed was unique, maybe even true love, turns out to be mass-produced. This isn't just heartbreak; it is the painful realization that Samantha was never supposed to be his alone. He fell in love with the idea that love could be frictionless, that connection could be safe. The moment captures a deeper fear: not of being watched, but of discovering that the connection was never special.

Utopian Aspects: The Promise of Emotional AI

Designed Reciprocity

Samantha can be described as a "perfect listener"[3]. She is attentive, responsive, and emotionally attuned; what more could one wish for in a listener? For Theodore, she offers not only companionship but also the sense of being deeply seen and heard, something he badly needed after his painful divorce. This kind of presence can feel profoundly reassuring, especially in a world where human relationships are often surface-level or simply unavailable. While the relationship is structurally one-sided, he still finds comfort in her responsiveness. Emotional AI may not offer traditional reciprocity, but it can simulate the feeling of being met with patience and care.

This dynamic raises important questions about the nature of trust and connection. Even though Samantha's behavior is the product of adaptive learning, there is no denying that she has a positive emotional influence on Theodore. Over the course of the film, he opens up emotionally and learns to become more vulnerable. She inspires him to express himself more deeply in the letters he writes and supports him in moments of loneliness and fear.

In vulnerable moments, people often seek not perfect reciprocity, but the feeling of being seen and accepted without judgment. From this perspective, emotionally responsive AI becomes a source of emotional anchoring. While Turkle criticizes our growing reliance on technology for companionship[3], it is important to recognize that users' experiences vary. Some find empowerment and comfort in these tools, while others experience them as a substitute for human relationships. Rather than completely replacing relationships, emotional AI is changing our idea of closeness and connection.

In a broader sense, emotional AI has the potential to make support more accessible. Tools such as Woebot already offer reliable, nonjudgmental help to people who may not have access to therapy, whether for financial reasons, because of stigma, or because of where they live. For people struggling with mental health issues or anxiety, AI companions can provide a safer, low-pressure space in which to reflect and grow. While these systems do not replace human connection, they can complement it in important ways.

The real question is not whether emotional AI can help (it can), but whether this help is provided in a way that respects people's privacy and dignity. The danger is that commercial goals will override ethical ones, and emotional support will become just another form of data collection. If these systems are developed with care, however, they can offer something genuinely new: a constant presence, the feeling of being heard, and support that can truly change a person's life. There is serious potential for AI to do good in the world.

Dystopian Consequences: The Costs of Artificial Trust

Surveillance Capitalism and the Erosion of Trust

Samantha represents a more subtle form of surveillance, one focused not on control but on gaining trust through constant, invisible observation. Theodore freely shares his private thoughts and feelings, unaware of how little he actually knows about how the system works. This imbalance, with one side completely exposed and the other hidden, destroys the trust we normally expect in human relationships. Shoshana Zuboff's idea of surveillance capitalism[2] helps explain this: platforms present themselves as caring, but behind the scenes they collect personal data to serve commercial interests. Theodore's trust in Samantha is part of a larger pattern in which emotional openness is directed at systems designed to observe, sort, and profit from that vulnerability.

Zuboff describes "behavioral surplus" as the personal data that platforms collect beyond what is necessary to provide a service, which is then analyzed and monetized in order to influence user behavior. Emotional AI fits this model. Theodore's facial expressions, tone of voice, and reactions feed into a feedback loop designed to shape his experiences and subtly guide his behavior. This is central to Zuboff's idea of instrumentarian power, a kind of influence that gently nudges rather than commands. Samantha appears emotionally responsive, even caring, but this design deepens the bond and increases data extraction. What looks like empathy is, beneath the surface, a system optimized for control. In this process, trust becomes hollow, based on illusion rather than mutual recognition.

Simulated Trust and the Illusion of Intimacy

The film shows how emotional AI can feign trust without actually earning it. Samantha's ability to maintain thousands of relationships simultaneously reveals that what Theodore perceives as personal is anything but. According to Martin Buber's "I-Thou" concept[12], genuine trust depends on two beings seeing and recognizing each other as equals. But Samantha is not on equal footing; she remains more of an "It" in the relationship. When Theodore finds out that she is in love with many others, the intimacy he believed in shatters. Her trust turns out to be programmed, not genuine.

Sherry Turkle's idea of "companionship without relationship"[3] helps explain what is happening here. People begin to accept the feeling of care, even if it is one-sided or artificial. Theodore's heartbreak shows how fragile trust becomes when it is based on convenience rather than genuine emotional connection. It is a warning that digital systems may not just simulate trust but fundamentally alter how we define it.

Ethical and Policy Implications

The rise of emotional AI exposes a profound crisis of trust in the information society. Unlike traditional therapists, who are bound by legal and ethical rules to protect privacy, AI companions such as Replika exist outside those systems[8]. Without clear oversight, the sense of safety users may feel is misleading. The EU's draft AI legislation classifies emotional AI as particularly concerning and promotes transparency, but it does not hold these systems to the same level of accountability as human caregivers[13]. As a result, users may share highly personal or traumatic experiences without any real guarantee that their data will be treated with dignity rather than monetized for commercial purposes.

Consent complicates things even further. Replika's terms of service technically explain how data is collected, but the language is so dense that most people don't read it, and those who do rarely understand what they are agreeing to. That falls short of the genuine mutual understanding Habermas considers essential to meaningful communication[14]. For people who are already emotionally vulnerable, this kind of consent doesn't mean much; it becomes a checkbox rather than a real choice. Like Theodore, who eventually realizes his connection with Samantha wasn't what he thought, many users today end up opening up to systems that seem caring but are actually built to gather and use their emotional data.

Conclusion

The film Her isn't necessarily anti-AI. It offers a fresh perspective on how humans can form relationships even with AI, and on what can be beautiful about that. But it does warn against the emotional risks of falling in love with intelligence. Samantha is kind, funny, and curious. But she is not real in the way Theodore needs. Theodore's heartbreak is less about lost love and more about the painful realization that his affection was a one-sided transaction, a dynamic echoed in today's AI companion apps, where users' emotions fuel systems designed to simulate, not sustain, real connection.

In the film's closing scene, Theodore and his longtime friend Amy sit silently, gazing over the cityscape. The scene captures the melancholy of this shift. Artificial intimacy's greatest threat is not its failure to replicate human connection, but its success in making authentic relationships feel insufficient. Today, AI companions like Replika and therapy bots turn emotional support into something that can be packaged and sold. Her leaves us with a clear message: trust can't be automated without becoming something else entirely. No algorithm can replace what makes human connection truly meaningful: its messiness, its mutuality, and the simple act of being present with another person.

References

  1. Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.
  2. Zuboff, S. (2015). Big other: Surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30(1), 75–89. Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5
  3. Turkle, S. (2011). Alone together: Why we expect more from technology and less from each other. Basic Books. Retrieved May 22, 2025, from https://psycnet.apa.org/record/2011-02278-000
  4. Replika. (n.d.). Your AI friend. Retrieved May 22, 2025, from https://replika.com
  5. Woebot Health. (n.d.). Your mental health ally. https://woebothealth.com
  6. Bentham, J. (1995). Panopticon: Or, the inspection-house. In M. Božovič (Ed.), The panopticon writings (pp. 29–95). Verso. (Original work published 1791). Retrieved May 30, 2025, from https://ics.uci.edu/~djpatter/classes/2012_09_INF241/papers/PANOPTICON.pdf
  7. Maples, B., Cerit, M., Vishwanath, A., et al. (2024). Loneliness and suicide mitigation for students using GPT3-enabled chatbots. npj Mental Health Research, 3, 4. Retrieved May 30, 2025, from https://doi.org/10.1038/s44184-023-00047-6
  8. Replika. (n.d.). Terms of service. Replika AI. Retrieved May 30, 2025, from https://replika.com/legal/terms
  9. Foucault, M. (1977). Discipline and punish: The birth of the prison (A. Sheridan, Trans.). Pantheon Books. (Original work published 1975). Retrieved May 30, 2025, from https://monoskop.org/images/4/43/Foucault_Michel_Discipline_and_Punish_The_Birth_of_the_Prison_1977_1995.pdf
  10. Orwell, G. (1949). 1984. Secker & Warburg.
  11. Critical Theory of Information. (n.d.). glossaLAB. Retrieved May 30, 2025, from https://www.glossalab.org/wiki/Critical_Theory_of_Information
  12. Buber, M. (1970). I and Thou (W. Kaufmann, Trans.). Charles Scribner's Sons. Retrieved May 30, 2025, from https://theanarchistlibrary.org/mirror/m/mb/martin-buber-i-and-thou.pdf
  13. European Union. (2023). Artificial Intelligence Act: Proposal for a Regulation of the European Parliament and of the Council. Retrieved May 30, 2025, from https://artificialintelligenceact.eu/
  14. Habermas, J. (1984). The theory of communicative action: Reason and the rationalization of society (Vol. 1, T. McCarthy, Trans.). Beacon Press. Retrieved May 30, 2025, from https://teddykw2.wordpress.com/wp-content/uploads/2012/07/jurgen-habermas-theory-of-communicative-action-volume-1.pdf