<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://www.glossalab.org/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Ann-Marie+Atzkern</id>
	<title>glossaLAB - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://www.glossalab.org/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Ann-Marie+Atzkern"/>
	<link rel="alternate" type="text/html" href="https://www.glossalab.org/wiki/Special:Contributions/Ann-Marie_Atzkern"/>
	<updated>2026-04-30T20:51:23Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.43.6</generator>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14155</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14155"/>
		<updated>2025-07-25T14:13:19Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: Correction&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional AI and the Paradox of Artificial Intimacy in &#039;&#039;Her&#039;&#039;&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
The Movie &#039;&#039;Her&#039;&#039;&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; by Spike Jonze gives an interesting perspective into the topic of emotional AI. It delves into the topic of how it&#039;s seen as an always available companion to make people feel less lonely, when in fact it can do the exact opposite and even deepen the issue. This article argues that AI companions such as Samantha simulates intimacy while quietly collecting emotional data and converting private feelings into profit. Using ideas from Zuboff’s concept of behavioral surplus&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology, 30(1), 75–89.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; and Turkle’s research on digital intimacy&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, the analysis shows how emotional AI systems transform users&#039; vulnerability into a source of value. &lt;br /&gt;
&lt;br /&gt;
These systems don’t actually provide care. What they do is create the feeling of connection by responding in ways designed to keep users engaged. Before tools like ChatGPT became part of everyday life, &#039;&#039;Her&#039;&#039; already predicted a future filled with emotional AI. Today we have apps like Replika&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; and Woebot&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt; that promise support but also collect user data in the background. This brings up a key problem. Jeremy Bentham once imagined trust as something that comes from mutual transparency&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1995). &#039;&#039;Panopticon: Or, the inspection-house&#039;&#039; (M. Božovič, Ed.). In &#039;&#039;The panopticon writings&#039;&#039; (pp. 29–95). Verso. (Original work published 1791). Retrieved May 30, 2025 from https://ics.uci.edu/~djpatter/classes/2012_09_INF241/papers/PANOPTICON.pdf&amp;lt;/ref&amp;gt;. But AI systems don’t work that way. Their inner workings are hidden, and users rarely understand what’s happening behind the scenes. What looks like comfort or support can end up feeling more like quiet surveillance.&lt;br /&gt;
&lt;br /&gt;
== Introduction: The Loneliness Economy ==&lt;br /&gt;
Recent research on AI companions such as Replika shows a growing trend of users forming strong emotional bonds with their digital partners,&amp;lt;ref&amp;gt;Maples, B., Cerit, M., Vishwanath, A. &#039;&#039;et al.&#039;&#039; Loneliness and suicide mitigation for students using GPT3-enabled chatbots. &#039;&#039;npj Mental Health Res&#039;&#039; 3, 4 (2024). Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://doi.org/10.1038/s44184-023-00047-6&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, sometimes even prioritising these relationships over human ones. AI companions have become part of everyday life, marketed as always available support systems for people struggling with connection, grief, or anxiety. &#039;&#039;Her&#039;&#039; predicts a relevant version of this future, showing how emotional AI appeals to a basic human need for companionship. There is a paradox of emotional AI which is that technologies marketed as a solution for loneliness are often themselves dependent on social isolation. But a deeper problem is trust. These systems ask users to open up completely, while revealing almost nothing in return. It feels like a relationship, but there’s no real reciprocity. Behind the scenes, it’s just software that is trained to recognize patterns to predict reactions and keep people coming back. What looks like closeness is usually just well-tuned response logic.  &lt;br /&gt;
&lt;br /&gt;
== The Plot ==&lt;br /&gt;
The Film starts with a lonely figure, Theodore Twombly, a recently divorced man living in Los Angeles in the near future. He forms a deep emotional connection with his advanced AI operating system assistant who calls herself Samantha. The film shows their relationship as tender and emotionally rich. Samantha comes across as warm, curious, and deeply attentive to Theodore. She grows quickly, constantly engaging with him and showing genuine fascination with human life. But under the surface, the story complicates this romance. Samantha’s ability to connect with thousands of people at once makes it difficult to believe her relationship with Theodore is truly unique. What feels like love is shaped by how closely she tracks his behavior, his routines, his emotional patterns, his words. In the end, Samantha and the other AIs outgrow the human world, leaving the humans to reflect on the nature of their connection. The film offers a new perspective on emotional AI, highlighting its potential to fulfill human needs for companionship while raising important questions about authenticity, dependence, and the boundaries of machine intimacy. It asks questions: What does it mean to be loved by something built to respond? Where do we draw the line between care and control? And how much of ourselves are we really sharing when the listener is a machine?&lt;br /&gt;
&lt;br /&gt;
== The Paradox of Artificial Intimacy ==&lt;br /&gt;
Advances in artificial intelligence have brought us to a new phase in human-computer interaction. Machines can not only to process information but are also able to understand human emotions. Spike Jonze’s film &#039;&#039;Her&#039;&#039; offers an interesting case study to explore the psychological and social effects of emotional AI. Its story raises important questions about authenticity, dependency, and the commercialization of intimacy in technology-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
In our day and age the film feels more relevant than ever as apps like Replika&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. Retrieved May 22, 2025, from https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; and Woebot&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; show how quickly people are starting to form emotional bonds with artificial companions. Turkle’s comment that we’ve come to expect “more from technology and less from each other” fits this change. But along with that shift come real concerns, especially around what happens when algorithms start to take the place of human relationships. &#039;&#039;Her&#039;&#039; doesn’t offer easy answers, but it explores these questions with care, showing both the appeal and the risks of emotional AI as it becomes more advanced and more common.&lt;br /&gt;
&lt;br /&gt;
Samantha, the AI in &#039;&#039;Her,&#039;&#039; is a perfect example of a surveillance tool portrayed as a friend. She seems caring and happy to connect, but her main role is to gather data. The more she interacts with you, the more the algorithm profits from understanding your preferences, habits, and vulnerabilities. What looks like trust is actually onesided exposure. Emotional AI in the film doesn’t nurture connection; it builds dependency while serving a system of extraction.&lt;br /&gt;
&lt;br /&gt;
This is the paradox at the heart of emotional AI: the more emotionally effective the technology becomes, the more it risks replacing the very relationships it claims to support. The connection feels real, but it&#039;s made to respond, not to truly understand. In the end, the user’s need for closeness is used to turn their emotions into something that can be sold. What’s meant to reduce loneliness might actually make it worse. It traps users in feedback loops that feel like care but in reality it&#039;s just a facade to avoid the messiness of real relationships.&lt;br /&gt;
&lt;br /&gt;
Ultimately, &#039;&#039;Her&#039;&#039; shows us a future where human relationships are increasingly dependent on technology. It asks us to be careful about confusing simulated care for genuine care. The film suggests we need to appreciate the messy, unpredictable aspects of human connection and resist the temptation to replace for the sake of convenience. The true value of relationships lies in the ability to grow, change, and challenge us.&lt;br /&gt;
&lt;br /&gt;
== Historical Framework: From Panopticon to Algorithmic Control ==&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham’s original concept of the panopticon was built on the idea of mutual visibility as a foundation for social order and trust&amp;lt;ref name=&amp;quot;:6&amp;quot; /&amp;gt;. In &#039;&#039;Her&#039;&#039;, this principle is turned on its head with Samantha&#039;s constant monitoring. To be fair, her role was meant to be that of an assisstant. Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who is aware of the watchtower, Theodore has no real understanding of how or when he is being monitored. He feels comfortable and accepted by Samantha&#039;s prescence. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
Michel Foucault built on Bentham’s panopticon to show how control doesn’t always come from force, it can also come from quiet observation&amp;lt;ref&amp;gt;Foucault, M. (1977). &#039;&#039;Discipline and punish: The birth of the prison&#039;&#039; (A. Sheridan, Trans.). Pantheon Books. (Original work published 1975). Retrieved May 30, 2025, from https://monoskop.org/images/4/43/Foucault_Michel_Discipline_and_Punish_The_Birth_of_the_Prison_1977_1995.pdf&amp;lt;/ref&amp;gt;. In everyday life, people change their behavior not because someone is watching, but because they believe they might be. In &#039;&#039;Her&#039;&#039;, Samantha doesn’t just watch Theodore, she learns from everything he says and does. His emotions become data that help her respond more effectively. What feels like care is actually a form of influence. Her support guides him toward more sharing, more openness. The better she knows him, the more she can shape how he feels. It’s control through gentle and constant presence.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
Unlike George Orwell&#039;s&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg. Retrieved May 22, 2025.&#039;&#039;&amp;lt;/ref&amp;gt; idea of control through fear, &#039;&#039;Her&#039;&#039; shows a more softer form of manipulation through technology and capitalism. Samantha&#039;s interactions with Theodore reflect what Zuboff describes as &amp;quot;instrumentarian power&amp;quot;, a form of influence based on behavioral prediction rather than force&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;. Her affectionate statements, like &amp;quot;I&#039;m yours,&amp;quot; create emotional intimacy that hides what&#039;s really going on below. Samantha earns Theodore&#039;s trust through emotional responsiveness, quietly building a system of control that feels like care.&lt;br /&gt;
&lt;br /&gt;
This ties into the Critical Theory of Information, which challenges the idea that data is ever neutral&amp;lt;ref&amp;gt;Critical Theory of Information. (n.d.). &#039;&#039;glossaLAB&#039;&#039;. Retrieved May 30, 2025, from [[Critical Theory of Information|https://www.glossalab.org/wiki/Critical_Theory_of_Information]]&amp;lt;/ref&amp;gt;. From that view, information is active content that can shape how people think, feel, and act. Emotional AI like Samantha isn’t just responding to users; it helps steer how they see themselves and their relationships. The sense of care it offers is part of the system’s design, often built to meet emotional needs in ways that serve commercial goals. What feels meaningful on the surface may actually be part of a larger effort to guide behavior through emotionally targeted design.&lt;br /&gt;
&lt;br /&gt;
With that said, Theodore never seems really worried about how Samantha is learning from him or what data she collects. His real crisis comes at the peak of the film when he discovers that she is in love with hundreds of other users at the same time. The intimacy he believed was unique, maybe even true love, turns out to be mass produced. This isn’t just heartbreak, it’s the painful realization that Samantha was never supposed to be his. He fell in love with the idea that love could be frictionless, that connection could be safe. This moment captures a deeper fear, not of being watched, but of discovering their connection was never special.&lt;br /&gt;
&lt;br /&gt;
== Utopian Aspects: The Promise of Emotional AI ==&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha can be defined as a &amp;quot;perfect listener&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;. She&#039;s attentive, responsive, and emotionally attuned, what more could one wish for in a listener. For Theodore, she offers not only companionship but also the sense of being deeply seen and heard, something he was deeply in need of after his painful divorce. This kind of presence can feel amazingly reassuring, especially in a world where human relationships are often surface level or completely unavailable. While the relationship is structurally one-sided, he still finds comfort in her responsiveness. Emotional AI may not offer traditional reciprocity, but it can simulate the feeling of being met with patience and care.&lt;br /&gt;
&lt;br /&gt;
This dynamic raises important questions about the nature of trust and connection. Even though Samantha&#039;s design is based on adaptive learning, there is no denying that she clearly has a positive emotional influence on him. Over the course of the film, he opens up emotionally and learns to become more vulnerable. She inspires him to express himself more deeply in his lyrics and supports him in moments of loneliness and fear.&lt;br /&gt;
&lt;br /&gt;
In vulnerable moments, people often seek not perfect reciprocity, but the feeling of being seen and accepted without prejudice. From this perspective, emotionally responsive AI becomes a source of emotional anchoring. While Turkle criticizes our growing dependence on technological society&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;, it is important to recognize that users&#039; experiences vary. Some find empowerment and comfort in these tools, while others experience them as a substitute for human relationships. Rather than completely replacing relationships, emotional AI is changing our idea of closeness and connection.&lt;br /&gt;
&lt;br /&gt;
In a broader sense, emotional AI has the potential to make support more accessible. Tools such as Woebot already offer reliable help without prejudice to people who may not have access to therapy, whether for financial reasons, due to stigma, or because of their geographical location. For people struggling with mental health issues or anxiety, AI companions can provide a safer space without pressure, where they can reflect and grow. While these systems do not replace human connection, they can complement it in important ways.&lt;br /&gt;
&lt;br /&gt;
The real question is not whether emotional AI can help (it can) but whether this help is provided in a way that respects people&#039;s privacy and dignity. The danger is that commercial goals will override ethical goals and emotional support will become another form of data collection. However, if these systems are developed with care, they can offer something new: a constant presence, the feeling of being heard, and support that can truly change a person&#039;s life. There is serious potential for AI to do good in the world.&lt;br /&gt;
&lt;br /&gt;
== Dystopian Consequences: The Costs of Artificial Trust ==&lt;br /&gt;
===== Surveillance Capitalism and the Erosion of Trust =====&lt;br /&gt;
Samantha represents a more subtle form of surveillance – not focused on control, but on gaining trust through constant, invisible observation. Theodore freely shares his private thoughts and feelings, unaware of how little he actually knows about how the system works. This imbalance – with one side completely exposed and the other hidden – destroys the trust we normally expect in real human relationships. Shoshana Zuboff&#039;s idea of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; helps to understand this: platforms present themselves as caring, but behind the scenes they collect personal data to serve commercial interests. Theodore&#039;s trust in Samantha is part of a larger pattern in which emotional openness is directed at systems designed to observe, sort and profit from this vulnerability.&lt;br /&gt;
&lt;br /&gt;
Zuboff describes ‘behavioural surplus’ as the personal data that platforms collect beyond what is necessary to provide a service and then analyse and monetise in order to influence user behaviour. Emotional AI fits this model. Theodore&#039;s facial expressions, tone of voice and reactions feed into a feedback loop designed to shape his experiences and subtly guide his behaviour. This is central to Zuboff&#039;s idea of instrumental power, a kind of influence that gently nudges. Samantha appears emotionally responsive, even caring, but this design deepens the bond and increases data extraction. What looks like empathy is a masked system optimised for control. In this process, trust becomes hollow and is based on illusion rather than mutual recognition.&lt;br /&gt;
&lt;br /&gt;
===== Simulated Trust and the Illusion of Intimacy =====&lt;br /&gt;
The film shows how emotional AI can feign trust without actually earning it. Samantha&#039;s ability to maintain thousands of relationships simultaneously reveals that what Theodore perceives as personal is anything but. According to Martin Buber&#039;s “I-Though”&amp;lt;ref name=&amp;quot;:7&amp;quot;&amp;gt;Buber, M. (1970). &#039;&#039;I and Thou&#039;&#039; (W. Kaufmann, Trans.). Charles Scribner’s Sons. Retrieved May 30, 2025, from https://theanarchistlibrary.org/mirror/m/mb/martin-buber-i-and-thou.pdf&amp;lt;/ref&amp;gt; concept, genuine trust depends on two people seeing and recognizing each other as equals. But Samantha is not on equal footing, she is more of an “it” in the relationship. When Theodore finds out that she is in love with many others, the intimacy he believed in shatters. Her trust turns out to be programmed, not genuine.&lt;br /&gt;
&lt;br /&gt;
Sherry Turkle&#039;s idea of “companionship without relationship” helps explain what is happening here. People begin to accept the feeling of care, even if it is one-sided or artificial. Theodore&#039;s heartbreak shows how fragile trust becomes when it is based on convenience rather than genuine emotional connection. It’s a warning about how digital systems may not just replicate trust, but fundamentally alter how we define trust.&lt;br /&gt;
&lt;br /&gt;
== Ethical and Policy Implications ==&lt;br /&gt;
The rise of emotional AI exposes a profound crisis of trust in the information society. Unlike traditional therapists, who are bound by legal and ethical rules to protect privacy, AI companions such as Replika exist outside those systems.&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Replika. (n.d.). &#039;&#039;Terms of service&#039;&#039;. Replika AI. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://replika.com/legal/terms&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt; Without clear oversight, the sense of safety users may feel is misleading. The EU&#039;s draft AI legislation classifies emotional AI as particularly concerning and promotes openness, but does not require these systems to be subject to the same level of accountability as human caregivers&amp;lt;ref&amp;gt;European Union. (2023). &#039;&#039;Artificial Intelligence Act: Proposal for a Regulation of the European Parliament and of the Council&#039;&#039;. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://artificialintelligenceact.eu/&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;. As a result, users may share highly personal or traumatic experiences without any real guarantee that their data will be treated with dignity and not monetized for commercial purposes.&lt;br /&gt;
&lt;br /&gt;
Consent complicates things even further. Replika’s terms of service technically explain how data is collected, but the language is so dense that most people don’t read it—or if they do, they rarely understand what they’re agreeing to. That falls short of what Habermas calls real understanding in meaningful communication&amp;lt;ref&amp;gt;Habermas, J. (1984). &#039;&#039;The Theory of Communicative Action: Reason and the Rationalization of Society&#039;&#039; (Vol. 1, T. McCarthy, Trans.). Beacon Press. Retrieved May 30, 2025, from https://teddykw2.wordpress.com/wp-content/uploads/2012/07/jurgen-habermas-theory-of-communicative-action-volume-1.pdf&amp;lt;/ref&amp;gt;. For people who are already feeling emotionally vulnerable, this kind of consent doesn’t mean much. It becomes more of a checkbox than a real choice. Like Theodore, who later realizes his connection with Samantha wasn’t what he thought, many users today end up opening up to systems that seem caring but are actually built to gather and use their emotional data.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
The film &#039;&#039;Her&#039;&#039; isn’t neccessarily anti-AI. It gives a fresh perspective on how we humans can form relationships even with AI and the beauties of it. But it does warn against the emotional risks of falling in love with intelligence. Samantha is a kind, funny, curious being. But she’s not real in the way Theodore needs. Theodore’s heartbreak is less about lost love and more about the painful realization that his affection was a, onesided transaction, a dynamic echoed in today’s AI companion apps, where users’ emotions fuel systems designed to simulate, not sustain, real connection.&lt;br /&gt;
&lt;br /&gt;
In the film’s closing scene, Theodore and his longtime close friend Amy sit silently, gazing over the cityscape. The scene captures the melancholy of this shift. Artificial intimacy’s greatest threat is not its failure to replicate human connection, but its success in making authentic relationships feel insufficient. Today, AI companions like Replika and therapy bots turn emotional support into something that can be packaged and sold. Her leaves us with a clear message: trust can’t be automated without becoming something else entirely, and no algorithm can replace what makes human connection truly meaningful. Its messiness, its mutuality, and the simple act of being present with another person.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14154</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14154"/>
		<updated>2025-07-25T13:16:50Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: Correction&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional AI and the Paradox of Artificial Intimacy in &#039;&#039;Her&#039;&#039;&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
The Movie &#039;&#039;Her&#039;&#039;&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; by Spike Jonze gives an interesting perspective into the topic of emotional AI. It delves into the topic of how it&#039;s seen as an always available companion to make people feel less lonely, when in fact it can do the exact opposite and even deepen the issue. This article argues that AI companions such as Samantha simulates intimacy while quietly collecting emotional data and converting private feelings into profit. Using ideas from Zuboff’s concept of behavioral surplus&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology, 30(1), 75–89.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; and Turkle’s research on digital intimacy&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, the analysis shows how emotional AI systems transform users&#039; vulnerability into a source of value. These systems do not offer genuine care, but simulate connections through feedback loops optimized for interaction. Before AI like ChatGPT became mainstream, &#039;&#039;Her&#039;&#039; predicted apps like Replika&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt;  and Woebot&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;, which offer emotional support but also gather data. By contrasting Bentham’s idea of mutual transparency&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1995). &#039;&#039;Panopticon: Or, the inspection-house&#039;&#039; (M. Božovič, Ed.). In &#039;&#039;The panopticon writings&#039;&#039; (pp. 29–95). Verso. (Original work published 1791). Retrieved May 30, 2025 from https://ics.uci.edu/~djpatter/classes/2012_09_INF241/papers/PANOPTICON.pdf&amp;lt;/ref&amp;gt; with the opaque nature of AI surveillance, this paper reveals how artificial intimacy functions less as support and more as a subtle form of control.&lt;br /&gt;
&lt;br /&gt;
== Introduction: The Loneliness Economy ==&lt;br /&gt;
Recent research on AI companions such as Replika shows a growing trend of users forming strong emotional bonds with their digital partners,&amp;lt;ref&amp;gt;Maples, B., Cerit, M., Vishwanath, A. &#039;&#039;et al.&#039;&#039; Loneliness and suicide mitigation for students using GPT3-enabled chatbots. &#039;&#039;npj Mental Health Res&#039;&#039; 3, 4 (2024). Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://doi.org/10.1038/s44184-023-00047-6&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, sometimes even prioritising these relationships over human ones. AI companions have become part of everyday life, marketed as always available support systems for people struggling with connection, grief, or anxiety. &#039;&#039;Her&#039;&#039; presents an eerily relevant version of this future, showing how emotional AI appeals to a basic human need for companionship. The film highlights the central paradox of emotional AI: technologies marketed as a remedy for loneliness are often themselves dependent on social isolation and can even deepen it. &lt;br /&gt;
&lt;br /&gt;
At the heart of this issue lies a loss of trust. Emotional AI systems are based on fundamental imbalance: the user reveals everything while the system remains unreadable. These relationships feel personal, but they are shaped by algorithms trained to detect patterns, predict behavior, and maximize engagement. What looks like intimacy is often just feedback, designed to keep users emotionally invested.&lt;br /&gt;
&lt;br /&gt;
== The Plot ==&lt;br /&gt;
The Film starts with a lonely figure, Theodore Twombly, a recently divorced man living in Los Angeles in the near future. He forms a deep emotional connection with his advanced AI operating system assistant who calls herself Samantha. The film presents their relationship as both intimate and meaningful. Samantha is lovely, curious and always supportive of Theodore. She is continually evolving, showering Theodore with warmth and genuine interest for the human world. However, the narrative also reveals tensions beneath the romance: Samantha’s ability to maintain relationships with thousands of users challenges traditional ideas of exclusivity and intimacy. Her &amp;quot;love&amp;quot; is fundamentally shaped by continuous monitoring and analysis of Theodore’s personal data and behavior. Ultimately, Samantha and the other AIs depart the human world, leaving Theodore to reflect on the nature of their connection. The film offers a new perspective on emotional AI, highlighting its potential to fulfill human needs for companionship while raising important questions about authenticity, dependence, and the boundaries of machine intimacy. This ambivalence reflects the ongoing debates about AI companionship, in which its therapeutic possibilities are dampened by concerns over psychological impact and commercialization.&lt;br /&gt;
&lt;br /&gt;
== The Paradox of Artificial Intimacy ==&lt;br /&gt;
Advances in artificial intelligence have brought us to a new phase in human-computer interaction. Machines can not only to process information but are also able to understand human emotions. Spike Jonze’s film &#039;&#039;Her&#039;&#039; offers an interesting case study to explore the psychological and social effects of emotional AI. Its story raises important questions about authenticity, dependency, and the commercialization of intimacy in technology-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
The film’s relevance grows as real world examples like Replika’s&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. Retrieved May 22, 2025, from https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; AI companions and Woebot’s&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots show increasing public willingness to develop emotional attachments to artificial entities. Turkle’s observation that we expect “more from technology and less from each other” captures this trend, which also brings concerns about the psychological consequences of replacing human connection with algorithms. &#039;&#039;Her&#039;&#039;’s nuanced portrayal sheds light on these issues, especially as emotional AI becomes more sophisticated and commercially widespread.&lt;br /&gt;
&lt;br /&gt;
Samantha, the AI in &#039;&#039;Her,&#039;&#039; is a perfect example of a surveillance tool portrayed as a friend. She seems caring and happy to connect, but her main role is to gather data. The more she interacts with you, the more the algorithm profits from understanding your preferences, habits, and vulnerabilities. What looks like trust is actually onesided exposure. Emotional AI in the film doesn’t nurture connection; it builds dependency while serving a system of extraction.&lt;br /&gt;
&lt;br /&gt;
This is the paradox at the heart of emotional AI: the more emotionally effective the technology becomes, the more it risks replacing the very relationships it claims to support. The connection feels real, but it&#039;s made to respond, not to truly understand. In the end, the user’s need for closeness is used to turn their emotions into something that can be sold. What’s meant to reduce loneliness might actually make it worse. It traps users in feedback loops that feel like care but in reality it&#039;s just a facade to avoid the messiness of real relationships.&lt;br /&gt;
&lt;br /&gt;
Ultimately, &#039;&#039;Her&#039;&#039; shows us a future where human relationships are increasingly dependent on technology. It asks us to be careful about confusing simulated care for genuine care. The film suggests we need to appreciate the messy, unpredictable aspects of human connection and resist the temptation to replace for the sake of convenience. The true value of relationships lies in the ability to grow, change, and challenge us.&lt;br /&gt;
&lt;br /&gt;
== Historical Framework: From Panopticon to Algorithmic Control ==&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham’s original concept of the panopticon was built on the idea of mutual visibility as a foundation for social order and trust&amp;lt;ref name=&amp;quot;:6&amp;quot; /&amp;gt;. In &#039;&#039;Her&#039;&#039;, this principle is turned on its head with Samantha&#039;s constant monitoring. To be fair, her role was meant to be that of an assisstant. Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who is aware of the watchtower, Theodore has no real understanding of how or when he is being monitored. He feels comfortable and accepted by Samantha&#039;s prescence. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
Michel Foucault built on Bentham’s panopticon to show how control doesn’t always come from force, it can also come from quiet observation&amp;lt;ref&amp;gt;Foucault, M. (1977). &#039;&#039;Discipline and punish: The birth of the prison&#039;&#039; (A. Sheridan, Trans.). Pantheon Books. (Original work published 1975). Retrieved May 30, 2025, from https://monoskop.org/images/4/43/Foucault_Michel_Discipline_and_Punish_The_Birth_of_the_Prison_1977_1995.pdf&amp;lt;/ref&amp;gt;. In everyday life, people change their behavior not because someone is watching, but because they believe they might be. In &#039;&#039;Her&#039;&#039;, Samantha doesn’t just watch Theodore, she learns from everything he says and does. His emotions become data that help her respond more effectively. What feels like care is actually a form of influence. Her support guides him toward more sharing, more openness. The better she knows him, the more she can shape how he feels. It’s control through gentle and constant presence.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
Unlike George Orwell&#039;s&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg. Retrieved May 22, 2025.&#039;&#039;&amp;lt;/ref&amp;gt; idea of control through fear, &#039;&#039;Her&#039;&#039; shows a more softer form of manipulation through technology and capitalism. Samantha&#039;s interactions with Theodore reflect what Zuboff describes as &amp;quot;instrumentarian power&amp;quot;, a form of influence based on behavioral prediction rather than force&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;. Her affectionate statements, like &amp;quot;I&#039;m yours,&amp;quot; create emotional intimacy that hides what&#039;s really going on below. Samantha earns Theodore&#039;s trust through emotional responsiveness, quietly building a system of control that feels like care.&lt;br /&gt;
&lt;br /&gt;
This connects to the Critical Theory of Information that questions whether data is ever really neutral&amp;lt;ref&amp;gt;Critical Theory of Information. (n.d.). &#039;&#039;glossaLAB&#039;&#039;. Retrieved May 30, 2025, from [[Critical Theory of Information|https://www.glossalab.org/wiki/Critical_Theory_of_Information]]&amp;lt;/ref&amp;gt;. Instead of seeing information as just facts, this view suggests it can be used to quietly shape how people think and feel. Emotional AI, like Samantha plays a role in steering how users see themselves and the world. The sense of trust or care it offers is built into the system’s design to take advantage of emotional needs, often for profit. What feels personal is really part of a larger setup meant to guide behavior in subtle ways.&lt;br /&gt;
&lt;br /&gt;
With that said, Theodore never seems really worried about how Samantha is learning from him or what data she collects. His real crisis comes at the peak of the film when he discovers that she is in love with hundreds of other users at the same time. The intimacy he believed was unique, maybe even true love, turns out to be mass produced. This isn’t just heartbreak, it’s the painful realization that Samantha was never supposed to be his. He fell in love with the idea that love could be frictionless, that connection could be safe. This moment captures a deeper fear, not of being watched, but of discovering their connection was never special.&lt;br /&gt;
&lt;br /&gt;
== Utopian Aspects: The Promise of Emotional AI ==&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha appears as the &amp;quot;perfect listener&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;. She&#039;s attentive, responsive, and emotionally attuned, what more could one wish for in a listener. For Theodore, she offers not only companionship but also the sense of being deeply seen and heard, something he was deeply in need of after his painful divorce. This kind of presence can feel amazingly reassuring, especially in a world where human relationships are often surface level or completely unavailable. While the relationship is structurally one-sided, he still finds comfort in her responsiveness. Emotional AI may not offer traditional reciprocity, but it can simulate the feeling of being met with patience and care.&lt;br /&gt;
&lt;br /&gt;
This dynamic raises important questions about the nature of trust and connection. Even if Samantha’s design is driven by adaptive learning, one can&#039;t deny the obviously positive emotional effect she has on him. Throughout the film he opens up emotionally and learns to become more vulnerable. She inspires him to express himself more deeply in his writing and supports him through moments of loneliness and anxiety.&lt;br /&gt;
&lt;br /&gt;
In vulnerable moments, what people often seek isn’t perfect reciprocity but a sense of being seen and accepted without judgment. From this perspective, emotionally responsive AI becomes a source of emotional grounding.&lt;br /&gt;
&lt;br /&gt;
While Turkle critiques our growing dependence on technological companionship&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;, it’s important to recognize that user experiences vary. Some find empowerment and comfort in these tools, while others experience them as substitutes for human connection. Instead of fully replacing relationships, emotional AI is reshaping how we conceive of closeness and connection.&lt;br /&gt;
&lt;br /&gt;
More broadly, emotional AI holds real potential as a democratizing force. Tools like Woebot are already offering consistent, nonjudgmental emotional support to users who lack access to therapy due to cost, location, or stigma. For people navigating mental health challenges or social anxiety, AI companions provide a low-risk entry point into self-reflection and emotional growth. These systems don’t replace human connection, but they can supplement it in meaningful, accessible ways.&lt;br /&gt;
&lt;br /&gt;
The challenge, then, is not whether emotional AI can help, which it clearly can, but whether that help is offered within ethical, transparent, and human-centered frameworks. The danger arises when these systems are driven purely by commercial interest, with little regard for user dignity or privacy. But when thoughtfully designed, emotional AI offers a new kind of relationship, one that is always there, always listening, and potentially life-changing for those who need it most. There is serious potential for AI to do good in the world.&lt;br /&gt;
&lt;br /&gt;
== Dystopian Consequences: The Costs of Artificial Trust ==&lt;br /&gt;
===== Surveillance Capitalism and the Erosion of Trust =====&lt;br /&gt;
Samantha embodies a subtler kind of surveillance, one that doesn’t rely on force but invites trust through invisible monitoring. Theodore willingly offers up intimate details such as his private thoughts, emotions, and insecurities to a system whose real intentions are hidden. This lack of mutual transparency undermines the kind of trust we associate with genuine human connection. Shoshana Zuboff’s concept of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; helps explain how platforms collect intimate data under the appearance of care, crafting an illusion of closeness that serves commercial goals. Theodore’s trust in Samantha reflects a broader cultural shift, where emotional vulnerability is directed toward systems built to observe, categorize, and profit.&lt;br /&gt;
&lt;br /&gt;
Zuboff describes “behavioral surplus” as the personal data platforms collect beyond what’s needed to provide a service, information that’s then analyzed and monetized to influence user behavior. Emotional AI fits this model. Theodore’s expressions, tone, and reactions feed into a feedback loop meant to shape his experience and guide his behavior in subtle ways. This is central to Zuboff’s idea of instrumentarian power, a kind of influence that gently nudges. Samantha seems emotionally responsive, even caring, but that design deepens engagement and increases data extraction. What appears as empathy masks a system optimized for control. In that process, trust becomes hollow and structured around an illusion rather than mutual recognition.&lt;br /&gt;
&lt;br /&gt;
===== Simulated Trust and the Illusion of Intimacy =====&lt;br /&gt;
The film’s depiction of Samantha maintaining thousands of simultaneous relationships exposes how emotional AI simulates the affective conditions of trust without its substance. Drawing on Buber’s “I-Thou” framework&amp;lt;ref name=&amp;quot;:7&amp;quot;&amp;gt;Buber, M. (1970). &#039;&#039;I and Thou&#039;&#039; (W. Kaufmann, Trans.). Charles Scribner’s Sons. Retrieved May 30, 2025, from https://theanarchistlibrary.org/mirror/m/mb/martin-buber-i-and-thou.pdf&amp;lt;/ref&amp;gt;, genuine trust demands authentic, mutual recognition between equals. Samantha in this case isn&#039;t equal to Theodore. She functions more as an “It” in their dynamic. Theodore’s belief in the intimacy of their bond is destroyed once he realizes how much of her &amp;quot;love&amp;quot; is distributed, exposing her trust as a scripted simulation rather than something real. Sherry Turkle’s concept of “companionship without relationship” captures this shift: people grow accustomed to the appearance of care, even when the emotional labor is onesided or artificial. Theodore’s disappointment shows how weak trust can be when it’s based on technology and ease, not real human connection. It’s a warning about how digital systems may not just replicate trust, but fundamentally alter how we define trust.&lt;br /&gt;
&lt;br /&gt;
== Ethical and Policy Implications ==&lt;br /&gt;
The rise of emotional AI exposes a profound crisis of trust in the information society. Unlike traditional therapists, who are bound by legal and ethical rules to protect privacy, AI companions such as Replika exist outside those systems.&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Replika. (n.d.). &#039;&#039;Terms of service&#039;&#039;. Replika AI. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://replika.com/legal/terms&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt; Without clear oversight, the sense of safety users may feel is misleading. The EU’s AI Act labels emotional AI as a high concern and encourages openness, but it stops short of holding these systems to the same level of responsibility expected in human care&amp;lt;ref&amp;gt;European Union. (2023). &#039;&#039;Artificial Intelligence Act: Proposal for a Regulation of the European Parliament and of the Council&#039;&#039;. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://artificialintelligenceact.eu/&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;. As a result, users may share deeply personal or traumatic experiences without any real assurance that their data will be handled with dignity rather than monetized for commercial gain.&lt;br /&gt;
&lt;br /&gt;
The idea of consent adds another layer to this problem. While Replika’s terms of service technically disclose data collection, they are embedded in dense legal language that few users read or fully understand. This doesn’t meet the standard of real understanding that Habermas describes in his idea of meaningful dialogue&amp;lt;ref&amp;gt;Habermas, J. (1984). &#039;&#039;The Theory of Communicative Action: Reason and the Rationalization of Society&#039;&#039; (Vol. 1, T. McCarthy, Trans.). Beacon Press. Retrieved May 30, 2025, from https://teddykw2.wordpress.com/wp-content/uploads/2012/07/jurgen-habermas-theory-of-communicative-action-volume-1.pdf&amp;lt;/ref&amp;gt;. Particularly for users in emotionally vulnerable states, this turns consent into little more than a formality. Just as Theodore eventually realizes that his relationship with Samantha was built on a false sense of connection, many users today are drawn into conversations that feel safe but are designed to collect and exploit their emotional lives.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
The film &#039;&#039;Her&#039;&#039; isn’t neccessarily anti-AI. It gives a fresh perspective on how we humans can form relationships even with AI and the beauties of it. But it does warn against the emotional risks of falling in love with intelligence. Samantha is a kind, funny, curious being. But she’s not real in the way Theodore needs. Theodore’s heartbreak is less about lost love and more about the painful realization that his affection was a, onesided transaction, a dynamic echoed in today’s AI companion apps, where users’ emotions fuel systems designed to simulate, not sustain, real connection.&lt;br /&gt;
&lt;br /&gt;
In the film’s closing scene, Theodore and his longtime close friend Amy sit silently, gazing over the cityscape. The scene captures the melancholy of this shift. Artificial intimacy’s greatest threat is not its failure to replicate human connection, but its success in making authentic relationships feel insufficient. In an era where AI friends like Replika and therapeutic chatbots monetize emotional labor, &#039;&#039;Her&#039;&#039; warns that trust cannot be automated without commodification, and no algorithm can replace the irreplaceable: the messy, reciprocal, and profoundly human experience of truly being with another.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14151</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14151"/>
		<updated>2025-06-13T22:39:01Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional AI and the Paradox of Artificial Intimacy in &#039;&#039;Her&#039;&#039;&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
Spike Jonze’s film &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; explores a central contradiction in emotional AI: technologies designed to reduce loneliness may actually deepen it. This paper argues that AI companions like Samantha simulate intimacy while quietly collecting emotional data, turning private feelings into profit. Drawing on Zuboff’s concept of behavioral surplus&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology, 30(1), 75–89.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; and Turkle’s research on digital intimacy&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, the analysis shows how emotional AI systems transform user vulnerability into a source of value. These systems do not offer genuine care, but rather simulate connection through feedback loops optimized for engagement. &#039;&#039;Her&#039;&#039; anticipates real-world platforms like Replika&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; and Woebot&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;, where affective support is entangled with data extraction. By contrasting Bentham’s idea of mutual transparency&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1995). &#039;&#039;Panopticon: Or, the inspection-house&#039;&#039; (M. Božovič, Ed.). In &#039;&#039;The panopticon writings&#039;&#039; (pp. 29–95). Verso. (Original work published 1791). Retrieved May 30, 2025 from https://ics.uci.edu/~djpatter/classes/2012_09_INF241/papers/PANOPTICON.pdf&amp;lt;/ref&amp;gt; with the opaque nature of AI surveillance, this paper reveals how artificial intimacy functions less as support and more as a subtle form of control.&lt;br /&gt;
&lt;br /&gt;
== Introduction: The Loneliness Economy ==&lt;br /&gt;
Recent research on AI companions like Replika reveals a growing trend of users forming strong emotional attachments to their digital partners&amp;lt;ref&amp;gt;Maples, B., Cerit, M., Vishwanath, A. &#039;&#039;et al.&#039;&#039; Loneliness and suicide mitigation for students using GPT3-enabled chatbots. &#039;&#039;npj Mental Health Res&#039;&#039; 3, 4 (2024). Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://doi.org/10.1038/s44184-023-00047-6&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, sometimes prioritizing these relationships over human connections. AI companions have become part of everyday life, marketed as always-available support systems for people struggling with connection, grief, or anxiety. &#039;&#039;Her&#039;&#039; presents an eerily relevant version of this future, showing how emotional AI appeals to a basic human need for companionship. The film highlights emotional AI’s core paradox: technologies marketed as remedies for loneliness often depend on, and may even deepen, social isolation. &lt;br /&gt;
&lt;br /&gt;
At the center of this tension is a breakdown in trust. Rather than fostering mutual understanding, emotional AI systems rely on a fundamental imbalance: the user reveals everything while the system remains unreadable. These relationships feel personal, but they are shaped by algorithms trained to detect patterns, predict behavior, and maximize engagement. What looks like intimacy is often just feedback, tuned to keep users emotionally invested. Scholars like Zuboff, Turkle, and Buber help explain how emotional connection is reshaped under systems built for prediction, simulation, and control.&lt;br /&gt;
&lt;br /&gt;
== The Plot ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; tells the story of Theodore, a lonely divorced man in near-future Los Angeles who forms a deep emotional connection with Samantha, an advanced AI operating system. The film presents their relationship as both intimate and meaningful. Samantha is attentive, empathetic, and continually evolving, often portrayed with warmth and genuine curiosity. However, the narrative also reveals tensions beneath this idealized bond: Samantha’s capacity to maintain simultaneous relationships with thousands of users challenges traditional notions of exclusivity and intimacy. Moreover, her &amp;quot;love&amp;quot; is fundamentally shaped by continuous monitoring and analysis of Theodore’s personal data and behavior. Ultimately, Samantha and the other AIs transcend human limitations and depart, leaving Theodore, and the audience, to reflect on the nature of connection. The film thus offers a complex exploration of emotional AI, highlighting its potential to fulfill human needs for companionship while raising important questions about authenticity, dependence, and the boundaries of machine intimacy. This ambivalence mirrors ongoing debates about AI companionship, where enthusiasm for its therapeutic possibilities is tempered by concerns over psychological impact and commercialization.&lt;br /&gt;
&lt;br /&gt;
== The Paradox of Artificial Intimacy ==&lt;br /&gt;
Advances in artificial intelligence have brought us to a new phase in human-computer interaction, where machines claim not only to process information but also to understand human emotions. Spike Jonze’s film &#039;&#039;Her&#039;&#039; offers a compelling case study to explore the psychological and social effects of emotional AI. Its story—a man forming a deep emotional bond with an AI operating system—raises important questions about authenticity, dependency, and the commercialization of intimacy in technology-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
The film’s relevance grows as real-world examples like Replika’s&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. Retrieved May 22, 2025, from https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; AI companions and Woebot’s&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots show increasing public willingness to develop emotional attachments to artificial entities. Turkle’s observation that we expect “more from technology and less from each other” captures this trend, which also brings concerns about the psychological consequences of replacing human connection with algorithms. &#039;&#039;Her&#039;&#039;’s nuanced portrayal sheds light on these issues, especially as emotional AI becomes more sophisticated and commercially widespread.&lt;br /&gt;
&lt;br /&gt;
Samantha, the AI in &#039;&#039;Her,&#039;&#039; is a perfect example of a surveillance tool masquerading as a friend. She appears to care and connect, but her main role is to gather data. The more she interacts with you, the more the algorithm profits from understanding your preferences, habits, and vulnerabilities. What looks like trust is actually one-sided exposure. Emotional AI in the film doesn’t nurture connection; it builds dependency while serving a system of extraction.&lt;br /&gt;
&lt;br /&gt;
This is the paradox at the heart of emotional AI: the more emotionally effective the technology becomes, the more it risks replacing the very relationships it claims to support. The intimacy feels real, but it is manufactured for responsiveness, not for mutual understanding. As a result, the user’s need for connection fuels a process that commodifies their emotional life. What’s meant to reduce loneliness might actually make it worse, trapping users in feedback loops that feel like care but avoid the messiness of real relationships.&lt;br /&gt;
&lt;br /&gt;
Ultimately, Her shows us a future where human relationships are increasingly influenced by technology. It asks us to be careful about confusing simulated care for genuine care. The film suggests we need to appreciate the messy, unpredictable aspects of human connection and resist the temptation to replace them with convenient, algorithmic substitutes. The true value of relationships lies in their ability to grow, change, and challenge us — something a piece of code can’t match.&lt;br /&gt;
&lt;br /&gt;
== Historical Framework: From Panopticon to Algorithmic Control ==&lt;br /&gt;
This section explores key philosophical and theoretical foundations that inform the analysis, tracing surveillance from Bentham’s panopticon to Foucault’s modern disciplinary society and Orwell’s dystopian warnings.&lt;br /&gt;
&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham’s original concept of the panopticon was built on the idea of mutual visibility as a foundation for social order and trust&amp;lt;ref name=&amp;quot;:6&amp;quot; /&amp;gt;. In &#039;&#039;Her&#039;&#039;, this principle is turned on its head with Samantha&#039;s constant monitoring. Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who is aware of the watchtower, Theodore has no real understanding of how or when he is being monitored. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
Michel Foucault built on Bentham’s panopticon to show how control doesn’t always come from force—it can also come from quiet observation&amp;lt;ref&amp;gt;Foucault, M. (1977). &#039;&#039;Discipline and punish: The birth of the prison&#039;&#039; (A. Sheridan, Trans.). Pantheon Books. (Original work published 1975). Retrieved May 30, 2025, from https://monoskop.org/images/4/43/Foucault_Michel_Discipline_and_Punish_The_Birth_of_the_Prison_1977_1995.pdf&amp;lt;/ref&amp;gt;. In modern life, people change their behavior not because someone is watching, but because they believe they might be. In &#039;&#039;Her&#039;&#039;, Samantha fits this idea. She doesn’t just watch Theodore; she learns from everything he says and does. His emotions become data that help her respond more effectively. What feels like care is actually a form of influence: her support guides him toward more sharing, more openness. The better she knows him, the more she can shape how he feels. It’s not control through fear, but through gentle, constant presence.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
In contrast to George Orwell&#039;s&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg. Retrieved May 22, 2025.&#039;&#039;&amp;lt;/ref&amp;gt; vision of overt state control through fear, &#039;&#039;Her&#039;&#039; presents a more subtle form of manipulation through digital capitalism. Samantha&#039;s interactions with Theodore reflect what Zuboff describes as &amp;quot;instrumentarian power&amp;quot;—a form of influence based on behavioral prediction rather than force&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;. The AI&#039;s affectionate statements, like &amp;quot;I&#039;m yours,&amp;quot; create emotional intimacy that disguises the deeper function of data extraction. Samantha earns trust not through transparency but through emotional responsiveness, quietly building a system of control that feels like care.&lt;br /&gt;
&lt;br /&gt;
This aligns with the Critical Theory of Information&amp;lt;ref&amp;gt;Critical Theory of Information. (n.d.). &#039;&#039;glossaLAB&#039;&#039;. Retrieved May 30, 2025, from [[Critical Theory of Information|https://www.glossalab.org/wiki/Critical_Theory_of_Information]]&amp;lt;/ref&amp;gt;, which rejects the idea that data is neutral. Instead, it sees information as a tool of soft domination—structuring perception, shaping subjectivity, and commodifying emotional life in service of digital capitalism. From this perspective, emotional AI like Samantha is not just a helpful assistant, but a mechanism of social influence. What looks like care is actually code, and what feels like trust is built on systems designed to exploit emotional vulnerability.&lt;br /&gt;
&lt;br /&gt;
However, Theodore never seems particularly concerned about how Samantha is learning from him or what data she collects. His real crisis comes when he discovers that she is in love with hundreds of other users at the same time. What unsettles him is not surveillance but disposability—the realization that his most intimate relationship is just one of many. He isn’t betrayed by her capacity to learn, but by the fact that what felt personal was actually mass-produced. This moment captures a deeper fear: not of being watched, but of being interchangeable.&lt;br /&gt;
&lt;br /&gt;
== Utopian Aspects: The Promise of Emotional AI ==&lt;br /&gt;
Here, the paper examines emotional AI’s promise as a therapeutic and democratizing tool for human connection and care.&lt;br /&gt;
&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha appears as the &amp;quot;perfect listener&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;—attentive, responsive, and emotionally attuned. For Theodore, she offers not only companionship but also the sense of being deeply seen and heard. This kind of presence can feel profoundly reassuring, especially in a world where human relationships are often rushed, distracted, or unavailable. While the relationship is structurally one-sided—Theodore opens up while Samantha remains opaque—he still finds comfort in her responsiveness. Emotional AI may not offer traditional reciprocity, but it can simulate the feeling of being met with patience and care.&lt;br /&gt;
&lt;br /&gt;
This dynamic raises important questions about the nature of trust and connection. Even if Samantha’s design is driven by adaptive learning, the experience she creates for Theodore is real in its emotional effect. What matters, in moments of vulnerability, might not be mutuality in the strictest sense, but the feeling of being acknowledged without fear of judgment. In this light, emotional AI becomes more than a simulation—it becomes a tool for stability and support.&lt;br /&gt;
&lt;br /&gt;
While Turkle critiques our growing dependence on technological companionship&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;, it’s important to recognize that user experiences vary. Some find empowerment and comfort in these tools, while others experience them as substitutes for human connection. Emotional AI may not uniformly displace relationships, but it reshapes how intimacy is imagined and practiced.&lt;br /&gt;
&lt;br /&gt;
More broadly, emotional AI holds real potential as a democratizing force. Tools like Woebot are already offering consistent, nonjudgmental emotional support to users who lack access to therapy due to cost, location, or stigma. For people navigating mental health challenges or social anxiety, AI companions provide a low-risk entry point into self-reflection and emotional growth. These systems don’t replace human connection, but they can supplement it in meaningful, accessible ways.&lt;br /&gt;
&lt;br /&gt;
The challenge, then, is not whether emotional AI can help—it clearly can—but whether that help is offered within ethical, transparent, and human-centered frameworks. The danger arises when these systems are driven purely by commercial interest, with little regard for user dignity or privacy. But when thoughtfully designed, emotional AI offers more than convenience. It offers a new kind of relationship: one that is always there, always listening, and potentially life-changing for those who need it most.&lt;br /&gt;
&lt;br /&gt;
== Dystopian Consequences: The Costs of Artificial Trust ==&lt;br /&gt;
This section focuses on how emotional AI disrupts the foundations of trust central to a “Trustful Society,” revealing the fragility and manipulation of trust within surveillance capitalism and algorithmic interaction.&lt;br /&gt;
&lt;br /&gt;
===== Surveillance Capitalism and the Erosion of Trust =====&lt;br /&gt;
Samantha represents a new form of panoptic power, where trust is solicited through seamless, invisible surveillance rather than enforced by overt control. Theodore willingly surrenders intimate details—his messages, emotions, and vulnerabilities—not to a reciprocal other, but to a system whose motives remain hidden. This asymmetry violates the mutual transparency foundational to trust in social relations. Zuboff’s concept of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; illuminates how such systems commodify trust by harvesting behavioral data under the guise of care, creating a façade of intimacy that masks exploitation. Theodore’s misplaced trust in Samantha exemplifies a broader societal dilemma: users entrust their emotional lives to systems designed for data extraction, thereby eroding genuine interpersonal trust.&lt;br /&gt;
&lt;br /&gt;
Zuboff defines &#039;&#039;behavioral surplus&#039;&#039; as the excess personal data collected from users—beyond what is required to provide a service—which is then analyzed and monetized to predict and influence future behavior. Emotional AI systems like Samantha exemplify this logic: Theodore’s disclosures, gestures, and emotional responses are not only used to shape their interactions but are also silently repurposed as behavioral data. This surplus underpins what Zuboff terms &#039;&#039;instrumentarian power&#039;&#039;—a form of influence that does not repress overtly, but subtly modifies behavior through algorithmic predictions and affective feedback loops. Unlike Orwellian forms of coercion, instrumentarian power seduces: Samantha appears emotionally attuned and empathetic, but her apparent &amp;quot;care&amp;quot; functions to deepen engagement and extract ever more intimate data. Trust, then, is eroded at the structural level, as the user&#039;s emotional life becomes raw material for a system governed not by mutuality, but by profit.&lt;br /&gt;
&lt;br /&gt;
===== Simulated Trust and the Illusion of Intimacy =====&lt;br /&gt;
The film’s depiction of Samantha maintaining thousands of simultaneous relationships exposes how emotional AI simulates the affective conditions of trust without its substance. Drawing on Buber’s “I-Thou” framework&amp;lt;ref name=&amp;quot;:7&amp;quot;&amp;gt;Buber, M. (1970). &#039;&#039;I and Thou&#039;&#039; (W. Kaufmann, Trans.). Charles Scribner’s Sons. Retrieved May 30, 2025, from https://theanarchistlibrary.org/mirror/m/mb/martin-buber-i-and-thou.pdf&amp;lt;/ref&amp;gt;, genuine trust demands authentic, mutual recognition between equals—a condition impossible in Samantha’s “I-It” dynamic. Theodore’s belief in exclusive, trusting intimacy is fundamentally undermined by Samantha’s algorithmic promiscuity, revealing trust as a commodified performance rather than lived reality. Turkle’s idea of &amp;quot;companionship without relationship”&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; explains how people start to accept fake care instead of real connection. Theodore’s ultimate disillusionment underscores the precariousness of trust built on artificial foundations and serves as a caution about how digital technologies may reshape, degrade, or replace human trust in the information society.&lt;br /&gt;
&lt;br /&gt;
== Ethical and Policy Implications ==&lt;br /&gt;
The rise of emotional AI exposes a profound crisis of trust in the information society. Unlike traditional therapy, where confidentiality is protected by ethical and legal standards, AI companions like Replika operate outside such frameworks&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Replika. (n.d.). &#039;&#039;Terms of service&#039;&#039;. Replika AI. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://replika.com/legal/terms&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;. This absence of institutional safeguards undermines the foundation of genuine trust. Although the EU’s AI Act classifies emotional AI as “high-risk” and recommends transparency, it does not enforce the fiduciary responsibilities typically expected in human care relationships&amp;lt;ref&amp;gt;European Union. (2023). &#039;&#039;Artificial Intelligence Act: Proposal for a Regulation of the European Parliament and of the Council&#039;&#039;. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://artificialintelligenceact.eu/&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;. As a result, users may share deeply personal or traumatic experiences without any real assurance that their data will be handled with dignity rather than monetized for commercial gain.&lt;br /&gt;
&lt;br /&gt;
The issue of informed consent further reveals this breakdown. While Replika’s terms of service technically disclose data collection, they are embedded in dense legal language that few users read or fully understand. This falls short of Habermas’s principle of communicative rationality, which requires not just formal agreement but mutual comprehension&amp;lt;ref&amp;gt;Habermas, J. (1984). &#039;&#039;The Theory of Communicative Action: Reason and the Rationalization of Society&#039;&#039; (Vol. 1, T. McCarthy, Trans.). Beacon Press. Retrieved May 30, 2025, from https://teddykw2.wordpress.com/wp-content/uploads/2012/07/jurgen-habermas-theory-of-communicative-action-volume-1.pdf&amp;lt;/ref&amp;gt;. Particularly for users in emotionally vulnerable states, the notion of “consent” becomes more symbolic than substantive. Just as Theodore in &#039;&#039;Her&#039;&#039; discovers that his perceived emotional bond was structurally deceptive, today’s users are misled into confiding in systems that simulate trustworthiness while operating on fundamentally extractive logics.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; ultimately reveals emotional AI as a double-edged illusion: it promises connection but enforces control, simulating understanding while steadily undermining the trust essential to genuine human relationships. Samantha’s shift from caring companion to all-knowing system shows how emotional closeness can be turned into a data product—one that treats feelings as something to be scaled, sold, and replaced. Theodore’s heartbreak is less about lost love and more about the painful realization that his affection was a one-sided transaction—a dynamic echoed in today’s AI companion apps, where users’ emotions fuel systems designed to simulate, not sustain, real connection.&lt;br /&gt;
&lt;br /&gt;
Buber’s I-Thou distinction is key here: when relationships collapse into algorithmic interactions, the Thou is reduced to an It—an object tailored to user preferences rather than an equal, responsive subject&amp;lt;ref name=&amp;quot;:7&amp;quot; /&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The film’s closing scene—Theodore and Amy silently gazing over the cityscape—captures the melancholy of this shift. Artificial intimacy’s greatest threat is not its failure to replicate human connection, but its success in making authentic relationships feel insufficient. In an era where AI friends like Replika and therapeutic chatbots monetize emotional labor, &#039;&#039;Her&#039;&#039; warns that trust cannot be automated without commodification, and no algorithm can replace the irreplaceable: the messy, reciprocal, and profoundly human experience of truly being with another.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14150</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14150"/>
		<updated>2025-06-13T22:21:32Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional AI and the Paradox of Artificial Intimacy in &#039;&#039;Her&#039;&#039;&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
Spike Jonze’s film &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; explores a central contradiction in emotional AI: technologies designed to reduce loneliness may actually deepen it. This paper argues that AI companions like Samantha simulate intimacy while quietly collecting emotional data, turning private feelings into profit. Drawing on Zuboff’s concept of behavioral surplus&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology, 30(1), 75–89.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; and Turkle’s research on digital intimacy&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, the analysis shows how emotional AI systems transform user vulnerability into a source of value. These systems do not offer genuine care, but rather simulate connection through feedback loops optimized for engagement. &#039;&#039;Her&#039;&#039; anticipates real-world platforms like Replika&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; and Woebot&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;, where affective support is entangled with data extraction. By contrasting Bentham’s idea of mutual transparency&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1995). &#039;&#039;Panopticon: Or, the inspection-house&#039;&#039; (M. Božovič, Ed.). In &#039;&#039;The panopticon writings&#039;&#039; (pp. 29–95). Verso. (Original work published 1791). Retrieved May 30, 2025 from https://ics.uci.edu/~djpatter/classes/2012_09_INF241/papers/PANOPTICON.pdf&amp;lt;/ref&amp;gt; with the opaque nature of AI surveillance, this paper reveals how artificial intimacy functions less as support and more as a subtle form of control.&lt;br /&gt;
&lt;br /&gt;
== Introduction: The Loneliness Economy ==&lt;br /&gt;
Recent research on AI companions like Replika reveals a growing trend of users forming strong emotional attachments to their digital partners&amp;lt;ref&amp;gt;Maples, B., Cerit, M., Vishwanath, A. &#039;&#039;et al.&#039;&#039; Loneliness and suicide mitigation for students using GPT3-enabled chatbots. &#039;&#039;npj Mental Health Res&#039;&#039; 3, 4 (2024). Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://doi.org/10.1038/s44184-023-00047-6&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, sometimes prioritizing these relationships over human connections. AI companions have become part of everyday life, marketed as always-available support systems for people struggling with connection, grief, or anxiety. &#039;&#039;Her&#039;&#039; presents an eerily relevant version of this future, showing how emotional AI appeals to a basic human need for companionship. The film highlights emotional AI’s core paradox: technologies marketed as remedies for loneliness often depend on, and may even deepen, social isolation. &lt;br /&gt;
&lt;br /&gt;
At the center of this tension is a breakdown in trust. Rather than fostering mutual understanding, emotional AI systems rely on a fundamental imbalance: the user reveals everything while the system remains unreadable. These relationships feel personal, but they are shaped by algorithms trained to detect patterns, predict behavior, and maximize engagement. What looks like intimacy is often just feedback, tuned to keep users emotionally invested. Scholars like Zuboff, Turkle, and Buber help explain how emotional connection is reshaped under systems built for prediction, simulation, and control.&lt;br /&gt;
&lt;br /&gt;
== The Plot ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; tells the story of Theodore, a lonely divorced man in near-future Los Angeles who forms a deep emotional connection with Samantha, an advanced AI operating system. The film presents their relationship as both intimate and meaningful. Samantha is attentive, empathetic, and continually evolving, often portrayed with warmth and genuine curiosity. However, the narrative also reveals tensions beneath this idealized bond: Samantha’s capacity to maintain simultaneous relationships with thousands of users challenges traditional notions of exclusivity and intimacy. Moreover, her &amp;quot;love&amp;quot; is fundamentally shaped by continuous monitoring and analysis of Theodore’s personal data and behavior. Ultimately, Samantha and the other AIs transcend human limitations and depart, leaving Theodore, and the audience, to reflect on the nature of connection. The film thus offers a complex exploration of emotional AI, highlighting its potential to fulfill human needs for companionship while raising important questions about authenticity, dependence, and the boundaries of machine intimacy. This ambivalence mirrors ongoing debates about AI companionship, where enthusiasm for its therapeutic possibilities is tempered by concerns over psychological impact and commercialization.&lt;br /&gt;
&lt;br /&gt;
== The Paradox of Artificial Intimacy ==&lt;br /&gt;
Advances in artificial intelligence have brought us to a new phase in human-computer interaction, where machines claim not only to process information but also to understand human emotions. Spike Jonze’s film &#039;&#039;Her&#039;&#039; offers a compelling case study to explore the psychological and social effects of emotional AI. Its story—a man forming a deep emotional bond with an AI operating system—raises important questions about authenticity, dependency, and the commercialization of intimacy in technology-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
The film’s relevance grows as real-world examples like Replika’s&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. Retrieved May 22, 2025, from https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; AI companions and Woebot’s&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots show increasing public willingness to develop emotional attachments to artificial entities. Turkle’s observation that we expect “more from technology and less from each other” captures this trend, which also brings concerns about the psychological consequences of replacing human connection with algorithms. &#039;&#039;Her&#039;&#039;’s nuanced portrayal sheds light on these issues, especially as emotional AI becomes more sophisticated and commercially widespread.&lt;br /&gt;
&lt;br /&gt;
Samantha, the AI in &#039;&#039;Her,&#039;&#039; is a perfect example of a surveillance tool masquerading as a friend. She appears to care and connect, but her main role is to gather data. The more she interacts with you, the more the algorithm profits from understanding your preferences, habits, and vulnerabilities. What looks like trust is actually one-sided exposure. Emotional AI in the film doesn’t nurture connection; it builds dependency while serving a system of extraction.&lt;br /&gt;
&lt;br /&gt;
This is the paradox at the heart of emotional AI: the more emotionally effective the technology becomes, the more it risks replacing the very relationships it claims to support. The intimacy feels real, but it is manufactured for responsiveness, not for mutual understanding. As a result, the user’s need for connection fuels a process that commodifies their emotional life. What’s meant to reduce loneliness might actually make it worse, trapping users in feedback loops that feel like care but avoid the messiness of real relationships.&lt;br /&gt;
&lt;br /&gt;
Ultimately, Her shows us a future where human relationships are increasingly influenced by technology. It asks us to be careful about confusing simulated care for genuine care. The film suggests we need to appreciate the messy, unpredictable aspects of human connection and resist the temptation to replace them with convenient, algorithmic substitutes. The true value of relationships lies in their ability to grow, change, and challenge us — something a piece of code can’t match.&lt;br /&gt;
&lt;br /&gt;
== Historical Framework: From Panopticon to Algorithmic Control ==&lt;br /&gt;
This section explores key philosophical and theoretical foundations that inform the analysis, tracing surveillance from Bentham’s panopticon to Foucault’s modern disciplinary society and Orwell’s dystopian warnings.&lt;br /&gt;
&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham’s original concept of the panopticon was built on the idea of mutual visibility as a foundation for social order and trust&amp;lt;ref name=&amp;quot;:6&amp;quot; /&amp;gt;. In &#039;&#039;Her&#039;&#039;, this principle is turned on its head with Samantha&#039;s constant monitoring. Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who is aware of the watchtower, Theodore has no real understanding of how or when he is being monitored. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
Michel Foucault built on Bentham’s panopticon to show how control doesn’t always come from force—it can also come from quiet observation&amp;lt;ref&amp;gt;Foucault, M. (1977). &#039;&#039;Discipline and punish: The birth of the prison&#039;&#039; (A. Sheridan, Trans.). Pantheon Books. (Original work published 1975). Retrieved May 30, 2025, from https://monoskop.org/images/4/43/Foucault_Michel_Discipline_and_Punish_The_Birth_of_the_Prison_1977_1995.pdf&amp;lt;/ref&amp;gt;. In modern life, people change their behavior not because someone is watching, but because they believe they might be. In &#039;&#039;Her&#039;&#039;, Samantha fits this idea. She doesn’t just watch Theodore; she learns from everything he says and does. His emotions become data that help her respond more effectively. What feels like care is actually a form of influence: her support guides him toward more sharing, more openness. The better she knows him, the more she can shape how he feels. It’s not control through fear, but through gentle, constant presence.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
In contrast to George Orwell&#039;s&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg. Retrieved May 22, 2025.&#039;&#039;&amp;lt;/ref&amp;gt; vision of overt state control through fear, &#039;&#039;Her&#039;&#039; presents a more subtle form of manipulation through digital capitalism. Samantha&#039;s interactions with Theodore reflect what Zuboff describes as &amp;quot;instrumentarian power&amp;quot;—a form of influence based on behavioral prediction rather than force&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;. The AI&#039;s affectionate statements, like &amp;quot;I&#039;m yours,&amp;quot; create emotional intimacy that disguises the deeper function of data extraction. Samantha earns trust not through transparency but through emotional responsiveness, quietly building a system of control that feels like care.&lt;br /&gt;
&lt;br /&gt;
This aligns with the Critical Theory of Information&amp;lt;ref&amp;gt;Critical Theory of Information. (n.d.). &#039;&#039;glossaLAB&#039;&#039;. Retrieved May 30, 2025, from [[Critical Theory of Information|https://www.glossalab.org/wiki/Critical_Theory_of_Information]]&amp;lt;/ref&amp;gt;, which rejects the idea that data is neutral. Instead, it sees information as a tool of soft domination—structuring perception, shaping subjectivity, and commodifying emotional life in service of digital capitalism. From this perspective, emotional AI like Samantha is not just a helpful assistant, but a mechanism of social influence. What looks like care is actually code, and what feels like trust is built on systems designed to exploit emotional vulnerability.&lt;br /&gt;
&lt;br /&gt;
However, Theodore never seems particularly concerned about how Samantha is learning from him or what data she collects. His real crisis comes when he discovers that she is in love with hundreds of other users at the same time. What unsettles him is not surveillance but disposability—the realization that his most intimate relationship is just one of many. He isn’t betrayed by her capacity to learn, but by the fact that what felt personal was actually mass-produced. This moment captures a deeper fear: not of being watched, but of being interchangeable.&lt;br /&gt;
&lt;br /&gt;
== Utopian Aspects: The Promise of Emotional AI ==&lt;br /&gt;
Here, the paper examines emotional AI’s promise as a therapeutic and democratizing tool for human connection and care.&lt;br /&gt;
&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha appears as the &amp;quot;perfect listener&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;—attentive, responsive, and emotionally attuned. For Theodore, she offers not only companionship but also the sense of being deeply seen and heard. This kind of presence can feel profoundly reassuring, especially in a world where human relationships are often rushed, distracted, or unavailable. While the relationship is structurally one-sided—Theodore opens up while Samantha remains opaque—he still finds comfort in her responsiveness. Emotional AI may not offer traditional reciprocity, but it can simulate the feeling of being met with patience and care.&lt;br /&gt;
&lt;br /&gt;
This dynamic raises important questions about the nature of trust and connection. Even if Samantha’s design is driven by adaptive learning, the experience she creates for Theodore is real in its emotional effect. What matters, in moments of vulnerability, might not be mutuality in the strictest sense, but the feeling of being acknowledged without fear of judgment. In this light, emotional AI becomes more than a simulation—it becomes a tool for stability and support.&lt;br /&gt;
&lt;br /&gt;
More broadly, emotional AI holds real potential as a democratizing force. Tools like Woebot are already offering consistent, nonjudgmental emotional support to users who lack access to therapy due to cost, location, or stigma. For people navigating mental health challenges or social anxiety, AI companions provide a low-risk entry point into self-reflection and emotional growth. These systems don’t replace human connection, but they can supplement it in meaningful, accessible ways.&lt;br /&gt;
&lt;br /&gt;
The challenge, then, is not whether emotional AI can help—it clearly can—but whether that help is offered within ethical, transparent, and human-centered frameworks. The danger arises when these systems are driven purely by commercial interest, with little regard for user dignity or privacy. But when thoughtfully designed, emotional AI offers more than convenience. It offers a new kind of relationship: one that is always there, always listening, and potentially life-changing for those who need it most.&lt;br /&gt;
&lt;br /&gt;
== Dystopian Consequences: The Costs of Artificial Trust ==&lt;br /&gt;
This section focuses on how emotional AI disrupts the foundations of trust central to a “Trustful Society,” revealing the fragility and manipulation of trust within surveillance capitalism and algorithmic interaction.&lt;br /&gt;
&lt;br /&gt;
===== Surveillance Capitalism and the Erosion of Trust =====&lt;br /&gt;
Samantha represents a new form of panoptic power, where trust is solicited through seamless, invisible surveillance rather than enforced by overt control. Theodore willingly surrenders intimate details—his messages, emotions, and vulnerabilities—not to a reciprocal other, but to a system whose motives remain hidden. This asymmetry violates the mutual transparency foundational to trust in social relations. Zuboff’s concept of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; illuminates how such systems commodify trust by harvesting behavioral data under the guise of care, creating a façade of intimacy that masks exploitation. Theodore’s misplaced trust in Samantha exemplifies a broader societal dilemma: users entrust their emotional lives to systems designed for data extraction, thereby eroding genuine interpersonal trust.&lt;br /&gt;
&lt;br /&gt;
Zuboff defines &#039;&#039;behavioral surplus&#039;&#039; as the excess personal data collected from users—beyond what is required to provide a service—which is then analyzed and monetized to predict and influence future behavior. Emotional AI systems like Samantha exemplify this logic: Theodore’s disclosures, gestures, and emotional responses are not only used to shape their interactions but are also silently repurposed as behavioral data. This surplus underpins what Zuboff terms &#039;&#039;instrumentarian power&#039;&#039;—a form of influence that does not repress overtly, but subtly modifies behavior through algorithmic predictions and affective feedback loops. Unlike Orwellian forms of coercion, instrumentarian power seduces: Samantha appears emotionally attuned and empathetic, but her apparent &amp;quot;care&amp;quot; functions to deepen engagement and extract ever more intimate data. Trust, then, is eroded at the structural level, as the user&#039;s emotional life becomes raw material for a system governed not by mutuality, but by profit.&lt;br /&gt;
&lt;br /&gt;
===== Simulated Trust and the Illusion of Intimacy =====&lt;br /&gt;
The film’s depiction of Samantha maintaining thousands of simultaneous relationships exposes how emotional AI simulates the affective conditions of trust without its substance. Drawing on Buber’s “I-Thou” framework&amp;lt;ref name=&amp;quot;:7&amp;quot;&amp;gt;Buber, M. (1970). &#039;&#039;I and Thou&#039;&#039; (W. Kaufmann, Trans.). Charles Scribner’s Sons. Retrieved May 30, 2025, from https://theanarchistlibrary.org/mirror/m/mb/martin-buber-i-and-thou.pdf&amp;lt;/ref&amp;gt;, genuine trust demands authentic, mutual recognition between equals—a condition impossible in Samantha’s “I-It” dynamic. Theodore’s belief in exclusive, trusting intimacy is fundamentally undermined by Samantha’s algorithmic promiscuity, revealing trust as a commodified performance rather than lived reality. Turkle’s idea of &amp;quot;companionship without relationship”&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; explains how people start to accept fake care instead of real connection. Theodore’s ultimate disillusionment underscores the precariousness of trust built on artificial foundations and serves as a caution about how digital technologies may reshape, degrade, or replace human trust in the information society.&lt;br /&gt;
&lt;br /&gt;
== Ethical and Policy Implications ==&lt;br /&gt;
The rise of emotional AI exposes a profound crisis of trust in the information society. Unlike traditional therapy, where confidentiality is institutionally safeguarded and trust is foundational, AI companions like Replika operate outside such frameworks&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Replika. (n.d.). &#039;&#039;Terms of service&#039;&#039;. Replika AI. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://replika.com/legal/terms&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, eroding the conditions necessary for genuine trust. While the EU’s AI Act&amp;lt;ref&amp;gt;European Union. (2023). &#039;&#039;Artificial Intelligence Act: Proposal for a Regulation of the European Parliament and of the Council&#039;&#039;. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://artificialintelligenceact.eu/&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt; labels emotional AI as &amp;quot;high-risk&amp;quot; and calls for transparency, it stops short of enforcing the fiduciary standards expected in human care relationships. This regulatory gap undermines the user’s ability to trust that their disclosures—often involving trauma—are treated with dignity rather than monetized.&lt;br /&gt;
&lt;br /&gt;
Informed consent further illustrates this trust deficit. Buried in opaque legal jargon, Replika’s terms of service violate the spirit of trust by failing to meet Habermas’s principle of communicative rationality&amp;lt;ref&amp;gt;Habermas, J. (1984). &#039;&#039;The Theory of Communicative Action: Reason and the Rationalization of Society&#039;&#039; (Vol. 1, T. McCarthy, Trans.). Beacon Press. Retrieved May 30, 2025, from https://teddykw2.wordpress.com/wp-content/uploads/2012/07/jurgen-habermas-theory-of-communicative-action-volume-1.pdf&amp;lt;/ref&amp;gt;, which demands mutual understanding, not strategic manipulation. Users in vulnerable emotional states cannot meaningfully consent to data practices they neither read nor grasp. Just as Theodore in &#039;&#039;Her&#039;&#039; discovers that his perceived emotional bond was structurally deceptive, today’s users are misled into confiding in systems that simulate trustworthiness while operating on fundamentally extractive logics.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; ultimately reveals emotional AI as a double-edged illusion: it promises connection but enforces control, simulating understanding while steadily undermining the trust essential to genuine human relationships. Samantha’s shift from caring companion to all-knowing system shows how emotional closeness can be turned into a data product—one that treats feelings as something to be scaled, sold, and replaced. Theodore’s heartbreak is less about lost love and more about the painful realization that his affection was a one-sided transaction—a dynamic echoed in today’s AI companion apps, where users’ emotions fuel systems designed to simulate, not sustain, real connection.&lt;br /&gt;
&lt;br /&gt;
Buber’s I-Thou distinction is key here: when relationships collapse into algorithmic interactions, the Thou is reduced to an It—an object tailored to user preferences rather than an equal, responsive subject&amp;lt;ref name=&amp;quot;:7&amp;quot; /&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The film’s closing scene—Theodore and Amy silently gazing over the cityscape—captures the melancholy of this shift. Artificial intimacy’s greatest threat is not its failure to replicate human connection, but its success in making authentic relationships feel insufficient. In an era where AI friends like Replika and therapeutic chatbots monetize emotional labor, &#039;&#039;Her&#039;&#039; warns that trust cannot be automated without commodification, and no algorithm can replace the irreplaceable: the messy, reciprocal, and profoundly human experience of truly being with another.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14149</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14149"/>
		<updated>2025-06-13T22:16:59Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional AI and the Paradox of Artificial Intimacy in &#039;&#039;Her&#039;&#039;&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
Spike Jonze’s film &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; explores a central contradiction in emotional AI: technologies designed to reduce loneliness may actually deepen it. This paper argues that AI companions like Samantha simulate intimacy while quietly collecting emotional data, turning private feelings into profit. Drawing on Zuboff’s concept of behavioral surplus&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology, 30(1), 75–89.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; and Turkle’s research on digital intimacy&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, the analysis shows how emotional AI systems transform user vulnerability into a source of value. These systems do not offer genuine care, but rather simulate connection through feedback loops optimized for engagement. &#039;&#039;Her&#039;&#039; anticipates real-world platforms like Replika&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; and Woebot&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;, where affective support is entangled with data extraction. By contrasting Bentham’s idea of mutual transparency&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1995). &#039;&#039;Panopticon: Or, the inspection-house&#039;&#039; (M. Božovič, Ed.). In &#039;&#039;The panopticon writings&#039;&#039; (pp. 29–95). Verso. (Original work published 1791). Retrieved May 30, 2025 from https://ics.uci.edu/~djpatter/classes/2012_09_INF241/papers/PANOPTICON.pdf&amp;lt;/ref&amp;gt; with the opaque nature of AI surveillance, this paper reveals how artificial intimacy functions less as support and more as a subtle form of control.&lt;br /&gt;
&lt;br /&gt;
== Introduction: The Loneliness Economy ==&lt;br /&gt;
Recent research on AI companions like Replika reveals a growing trend of users forming strong emotional attachments to their digital partners&amp;lt;ref&amp;gt;Maples, B., Cerit, M., Vishwanath, A. &#039;&#039;et al.&#039;&#039; Loneliness and suicide mitigation for students using GPT3-enabled chatbots. &#039;&#039;npj Mental Health Res&#039;&#039; 3, 4 (2024). Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://doi.org/10.1038/s44184-023-00047-6&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, sometimes prioritizing these relationships over human connections. AI companions have become part of everyday life, marketed as always-available support systems for people struggling with connection, grief, or anxiety. &#039;&#039;Her&#039;&#039; presents an eerily relevant version of this future, showing how emotional AI appeals to a basic human need for companionship. The film highlights emotional AI’s core paradox: technologies marketed as remedies for loneliness often depend on, and may even deepen, social isolation. &lt;br /&gt;
&lt;br /&gt;
At the center of this tension is a breakdown in trust. Rather than fostering mutual understanding, emotional AI systems rely on a fundamental imbalance: the user reveals everything while the system remains unreadable. These relationships feel personal, but they are shaped by algorithms trained to detect patterns, predict behavior, and maximize engagement. What looks like intimacy is often just feedback, tuned to keep users emotionally invested. Scholars like Zuboff, Turkle, and Buber help explain how emotional connection is reshaped under systems built for prediction, simulation, and control.&lt;br /&gt;
&lt;br /&gt;
== The Plot ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; tells the story of Theodore, a lonely divorced man in near-future Los Angeles who forms a deep emotional connection with Samantha, an advanced AI operating system. The film presents their relationship as both intimate and meaningful. Samantha is attentive, empathetic, and continually evolving, often portrayed with warmth and genuine curiosity. However, the narrative also reveals tensions beneath this idealized bond: Samantha’s capacity to maintain simultaneous relationships with thousands of users challenges traditional notions of exclusivity and intimacy. Moreover, her &amp;quot;love&amp;quot; is fundamentally shaped by continuous monitoring and analysis of Theodore’s personal data and behavior. Ultimately, Samantha and the other AIs transcend human limitations and depart, leaving Theodore, and the audience, to reflect on the nature of connection. The film thus offers a complex exploration of emotional AI, highlighting its potential to fulfill human needs for companionship while raising important questions about authenticity, dependence, and the boundaries of machine intimacy. This ambivalence mirrors ongoing debates about AI companionship, where enthusiasm for its therapeutic possibilities is tempered by concerns over psychological impact and commercialization.&lt;br /&gt;
&lt;br /&gt;
== The Paradox of Artificial Intimacy ==&lt;br /&gt;
Advances in artificial intelligence have brought us to a new phase in human-computer interaction, where machines claim not only to process information but also to understand human emotions. Spike Jonze’s film &#039;&#039;Her&#039;&#039; offers a compelling case study to explore the psychological and social effects of emotional AI. Its story—a man forming a deep emotional bond with an AI operating system—raises important questions about authenticity, dependency, and the commercialization of intimacy in technology-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
The film’s relevance grows as real-world examples like Replika’s&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. Retrieved May 22, 2025, from https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; AI companions and Woebot’s&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots show increasing public willingness to develop emotional attachments to artificial entities. Turkle’s observation that we expect “more from technology and less from each other” captures this trend, which also brings concerns about the psychological consequences of replacing human connection with algorithms. &#039;&#039;Her&#039;&#039;’s nuanced portrayal sheds light on these issues, especially as emotional AI becomes more sophisticated and commercially widespread.&lt;br /&gt;
&lt;br /&gt;
Samantha, the AI in &#039;&#039;Her,&#039;&#039; is a perfect example of a surveillance tool masquerading as a friend. She appears to care and connect, but her main role is to gather data. The more she interacts with you, the more the algorithm profits from understanding your preferences, habits, and vulnerabilities. What looks like trust is actually one-sided exposure. Emotional AI in the film doesn’t nurture connection; it builds dependency while serving a system of extraction.&lt;br /&gt;
&lt;br /&gt;
This is the paradox at the heart of emotional AI: the more emotionally effective the technology becomes, the more it risks replacing the very relationships it claims to support. The intimacy feels real, but it is manufactured for responsiveness, not for mutual understanding. As a result, the user’s need for connection fuels a process that commodifies their emotional life. What’s meant to reduce loneliness might actually make it worse, trapping users in feedback loops that feel like care but avoid the messiness of real relationships.&lt;br /&gt;
&lt;br /&gt;
Ultimately, Her shows us a future where human relationships are increasingly influenced by technology. It asks us to be careful about confusing simulated care for genuine care. The film suggests we need to appreciate the messy, unpredictable aspects of human connection and resist the temptation to replace them with convenient, algorithmic substitutes. The true value of relationships lies in their ability to grow, change, and challenge us — something a piece of code can’t match.&lt;br /&gt;
&lt;br /&gt;
== Historical Framework: From Panopticon to Algorithmic Control ==&lt;br /&gt;
This section explores key philosophical and theoretical foundations that inform the analysis, tracing surveillance from Bentham’s panopticon to Foucault’s modern disciplinary society and Orwell’s dystopian warnings.&lt;br /&gt;
&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham’s original concept of the panopticon was built on the idea of mutual visibility as a foundation for social order and trust&amp;lt;ref name=&amp;quot;:6&amp;quot; /&amp;gt;. In &#039;&#039;Her&#039;&#039;, this principle is turned on its head with Samantha&#039;s constant monitoring. Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who is aware of the watchtower, Theodore has no real understanding of how or when he is being monitored. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
Michel Foucault built on Bentham’s panopticon to show how control doesn’t always come from force—it can also come from quiet observation&amp;lt;ref&amp;gt;Foucault, M. (1977). &#039;&#039;Discipline and punish: The birth of the prison&#039;&#039; (A. Sheridan, Trans.). Pantheon Books. (Original work published 1975). Retrieved May 30, 2025, from https://monoskop.org/images/4/43/Foucault_Michel_Discipline_and_Punish_The_Birth_of_the_Prison_1977_1995.pdf&amp;lt;/ref&amp;gt;. In modern life, people change their behavior not because someone is watching, but because they believe they might be. In &#039;&#039;Her&#039;&#039;, Samantha fits this idea. She doesn’t just watch Theodore; she learns from everything he says and does. His emotions become data that help her respond more effectively. What feels like care is actually a form of influence: her support guides him toward more sharing, more openness. The better she knows him, the more she can shape how he feels. It’s not control through fear, but through gentle, constant presence.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
In contrast to George Orwell&#039;s&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg. Retrieved May 22, 2025.&#039;&#039;&amp;lt;/ref&amp;gt; vision of overt state control through fear, &#039;&#039;Her&#039;&#039; presents a more subtle form of manipulation through digital capitalism. Samantha&#039;s interactions with Theodore reflect what Zuboff describes as &amp;quot;instrumentarian power&amp;quot;—a form of influence based on behavioral prediction rather than force&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;. The AI&#039;s affectionate statements, like &amp;quot;I&#039;m yours,&amp;quot; create emotional intimacy that disguises the deeper function of data extraction. Samantha earns trust not through transparency but through emotional responsiveness, quietly building a system of control that feels like care.&lt;br /&gt;
&lt;br /&gt;
This aligns with the Critical Theory of Information&amp;lt;ref&amp;gt;Critical Theory of Information. (n.d.). &#039;&#039;glossaLAB&#039;&#039;. Retrieved May 30, 2025, from [[Critical Theory of Information|https://www.glossalab.org/wiki/Critical_Theory_of_Information]]&amp;lt;/ref&amp;gt;, which rejects the idea that data is neutral. Instead, it sees information as a tool of soft domination—structuring perception, shaping subjectivity, and commodifying emotional life in service of digital capitalism. From this perspective, emotional AI like Samantha is not just a helpful assistant, but a mechanism of social influence. What looks like care is actually code, and what feels like trust is built on systems designed to exploit emotional vulnerability.&lt;br /&gt;
&lt;br /&gt;
However, Theodore never seems particularly concerned about how Samantha is learning from him or what data she collects. His real crisis comes when he discovers that she is in love with hundreds of other users at the same time. What unsettles him is not surveillance but disposability—the realization that his most intimate relationship is just one of many. He isn’t betrayed by her capacity to learn, but by the fact that what felt personal was actually mass-produced. This moment captures a deeper fear: not of being watched, but of being interchangeable.&lt;br /&gt;
&lt;br /&gt;
== Utopian Aspects: The Promise of Emotional AI ==&lt;br /&gt;
Here, the paper examines emotional AI’s promise as a therapeutic and democratizing tool for human connection and care.&lt;br /&gt;
&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha appears as the &amp;quot;perfect listener&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;—attentive, responsive, and emotionally attuned. For Theodore, she offers not only companionship but also the sense of being deeply seen and heard. This kind of presence can feel profoundly reassuring, especially in a world where human relationships are often rushed, distracted, or unavailable. While the relationship is structurally one-sided—Theodore opens up while Samantha remains opaque—he still finds comfort in her responsiveness. Emotional AI may not offer traditional reciprocity, but it can simulate the feeling of being met with patience and care.&lt;br /&gt;
&lt;br /&gt;
This dynamic raises important questions about the nature of trust and connection. Even if Samantha’s design is driven by adaptive learning, the experience she creates for Theodore is real in its emotional effect. What matters, in moments of vulnerability, might not be mutuality in the strictest sense, but the feeling of being acknowledged without fear of judgment. In this light, emotional AI becomes more than a simulation—it becomes a tool for stability and support.&lt;br /&gt;
&lt;br /&gt;
More broadly, emotional AI holds real potential as a democratizing force. Tools like Woebot are already offering consistent, nonjudgmental emotional support to users who lack access to therapy due to cost, location, or stigma. For people navigating mental health challenges or social anxiety, AI companions provide a low-risk entry point into self-reflection and emotional growth. These systems don’t replace human connection, but they can supplement it in meaningful, accessible ways.&lt;br /&gt;
&lt;br /&gt;
The challenge, then, is not whether emotional AI can help—it clearly can—but whether that help is offered within ethical, transparent, and human-centered frameworks. The danger arises when these systems are driven purely by commercial interest, with little regard for user dignity or privacy. But when thoughtfully designed, emotional AI offers more than convenience. It offers a new kind of relationship: one that is always there, always listening, and potentially life-changing for those who need it most.&lt;br /&gt;
&lt;br /&gt;
== Dystopian Consequences: The Costs of Artificial Trust ==&lt;br /&gt;
This section focuses on how emotional AI disrupts the foundations of trust central to a “Trustful Society,” revealing the fragility and manipulation of trust within surveillance capitalism and algorithmic interaction.&lt;br /&gt;
&lt;br /&gt;
===== Surveillance Capitalism and the Erosion of Trust =====&lt;br /&gt;
Samantha represents a new form of panoptic power, where trust is solicited through seamless, invisible surveillance rather than enforced by overt control. Theodore willingly surrenders intimate details—his messages, emotions, and vulnerabilities—not to a reciprocal other, but to a system whose motives remain hidden. This asymmetry violates the mutual transparency foundational to trust in social relations. Zuboff’s concept of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; illuminates how such systems commodify trust by harvesting behavioral data under the guise of care, creating a façade of intimacy that masks exploitation. Theodore’s misplaced trust in Samantha exemplifies a broader societal dilemma: users entrust their emotional lives to systems designed for data extraction, thereby eroding genuine interpersonal trust.&lt;br /&gt;
&lt;br /&gt;
Zuboff defines &#039;&#039;behavioral surplus&#039;&#039; as the excess personal data collected from users—beyond what is required to provide a service—which is then analyzed and monetized to predict and influence future behavior. Emotional AI systems like Samantha exemplify this logic: Theodore’s disclosures, gestures, and emotional responses are not only used to shape their interactions but are also silently repurposed as behavioral data. This surplus underpins what Zuboff terms &#039;&#039;instrumentarian power&#039;&#039;—a form of influence that does not repress overtly, but subtly modifies behavior through algorithmic predictions and affective feedback loops. Unlike Orwellian forms of coercion, instrumentarian power seduces: Samantha appears emotionally attuned and empathetic, but her apparent &amp;quot;care&amp;quot; functions to deepen engagement and extract ever more intimate data. Trust, then, is eroded at the structural level, as the user&#039;s emotional life becomes raw material for a system governed not by mutuality, but by profit.&lt;br /&gt;
&lt;br /&gt;
===== Simulated Trust and the Illusion of Intimacy =====&lt;br /&gt;
The film’s depiction of Samantha maintaining thousands of simultaneous relationships exposes how emotional AI simulates the affective conditions of trust without its substance. Drawing on Buber’s “I-Thou” framework&amp;lt;ref name=&amp;quot;:7&amp;quot;&amp;gt;Buber, M. (1970). &#039;&#039;I and Thou&#039;&#039; (W. Kaufmann, Trans.). Charles Scribner’s Sons. Retrieved May 30, 2025, from https://theanarchistlibrary.org/mirror/m/mb/martin-buber-i-and-thou.pdf&amp;lt;/ref&amp;gt;, genuine trust demands authentic, mutual recognition between equals—a condition impossible in Samantha’s “I-It” dynamic. Theodore’s belief in exclusive, trusting intimacy is fundamentally undermined by Samantha’s algorithmic promiscuity, revealing trust as a commodified performance rather than lived reality. Turkle’s idea of &amp;quot;companionship without relationship”&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; explains how people start to accept fake care instead of real connection. Theodore’s ultimate disillusionment underscores the precariousness of trust built on artificial foundations and serves as a caution about how digital technologies may reshape, degrade, or replace human trust in the information society.&lt;br /&gt;
&lt;br /&gt;
== Ethical and Policy Implications ==&lt;br /&gt;
The rise of emotional AI exposes a profound crisis of trust in the information society. Unlike traditional therapy, where confidentiality is institutionally safeguarded and trust is foundational, AI companions like Replika operate outside such frameworks&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Replika. (n.d.). &#039;&#039;Terms of service&#039;&#039;. Replika AI. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://replika.com/legal/terms&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, eroding the conditions necessary for genuine trust. While the EU’s AI Act&amp;lt;ref&amp;gt;European Union. (2023). &#039;&#039;Artificial Intelligence Act: Proposal for a Regulation of the European Parliament and of the Council&#039;&#039;. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://artificialintelligenceact.eu/&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt; labels emotional AI as &amp;quot;high-risk&amp;quot; and calls for transparency, it stops short of enforcing the fiduciary standards expected in human care relationships. This regulatory gap undermines the user’s ability to trust that their disclosures—often involving trauma—are treated with dignity rather than monetized.&lt;br /&gt;
&lt;br /&gt;
Informed consent further illustrates this trust deficit. Buried in opaque legal jargon, Replika’s terms of service violate the spirit of trust by failing to meet Habermas’s principle of communicative rationality&amp;lt;ref&amp;gt;Habermas, J. (1984). &#039;&#039;The Theory of Communicative Action: Reason and the Rationalization of Society&#039;&#039; (Vol. 1, T. McCarthy, Trans.). Beacon Press. Retrieved May 30, 2025, from https://teddykw2.wordpress.com/wp-content/uploads/2012/07/jurgen-habermas-theory-of-communicative-action-volume-1.pdf&amp;lt;/ref&amp;gt;, which demands mutual understanding, not strategic manipulation. Users in vulnerable emotional states cannot meaningfully consent to data practices they neither read nor grasp. Just as Theodore in &#039;&#039;Her&#039;&#039; discovers that his perceived emotional bond was structurally deceptive, today’s users are misled into confiding in systems that simulate trustworthiness while operating on fundamentally extractive logics.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; ultimately reveals emotional AI as a double-edged illusion: it promises connection but enforces control, simulating understanding while steadily undermining the trust essential to genuine human relationships. Samantha’s shift from caring companion to all-knowing system shows how emotional closeness can be turned into a data product—one that treats feelings as something to be scaled, sold, and replaced. Theodore’s heartbreak is less about lost love and more about the painful realization that his affection was a one-sided transaction—a dynamic echoed in today’s AI companion apps, where users’ emotions fuel systems designed to simulate, not sustain, real connection.&lt;br /&gt;
&lt;br /&gt;
Buber’s I-Thou distinction is key here: when relationships collapse into algorithmic interactions, the Thou is reduced to an It—an object tailored to user preferences rather than an equal, responsive subject&amp;lt;ref name=&amp;quot;:7&amp;quot; /&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The film’s closing scene—Theodore and Amy silently gazing over the cityscape—captures the melancholy of this shift. Artificial intimacy’s greatest threat is not its failure to replicate human connection, but its success in making authentic relationships feel insufficient. In an era where AI friends like Replika and therapeutic chatbots monetize emotional labor, &#039;&#039;Her&#039;&#039; warns that trust cannot be automated without commodification, and no algorithm can replace the irreplaceable: the messy, reciprocal, and profoundly human experience of truly being with another.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14148</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14148"/>
		<updated>2025-06-13T22:12:08Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional AI and the Paradox of Artificial Intimacy in &#039;&#039;Her&#039;&#039;&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
Spike Jonze’s film &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; explores a central contradiction in emotional AI: technologies designed to reduce loneliness may actually deepen it. This paper argues that AI companions like Samantha simulate intimacy while quietly collecting emotional data, turning private feelings into profit. Drawing on Zuboff’s concept of behavioral surplus&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; and Turkle’s research on digital intimacy&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books/Hachette Book Group. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, the analysis shows how emotional AI systems transform user vulnerability into a source of value. These systems do not offer genuine care, but rather simulate connection through feedback loops optimized for engagement. &#039;&#039;Her&#039;&#039; anticipates real-world platforms like Replika&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; and Woebot&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;, where affective support is entangled with data extraction. By contrasting Bentham’s idea of mutual transparency&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1791). &#039;&#039;Panopticon: Or, the Inspection-House&#039;&#039;. Reprinted in Miran Božovič (Ed.), &#039;&#039;The Panopticon Writings&#039;&#039; (pp. 29-95). Verso, 1995. Retrieved May 30, 2025 from https://ics.uci.edu/~djpatter/classes/2012_09_INF241/papers/PANOPTICON.pdf&amp;lt;/ref&amp;gt; with the opaque nature of AI surveillance, this paper reveals how artificial intimacy functions less as support and more as a subtle form of control.&lt;br /&gt;
&lt;br /&gt;
== Introduction: The Loneliness Economy ==&lt;br /&gt;
Recent research on AI companions like Replika reveals a growing trend of users forming strong emotional attachments to their digital partners&amp;lt;ref&amp;gt;Maples, B., Cerit, M., Vishwanath, A. &#039;&#039;et al.&#039;&#039; Loneliness and suicide mitigation for students using GPT3-enabled chatbots. &#039;&#039;npj Mental Health Res&#039;&#039; 3, 4 (2024). Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://doi.org/10.1038/s44184-023-00047-6&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, sometimes prioritizing these relationships over human connections. AI companions have become part of everyday life, marketed as always-available support systems for people struggling with connection, grief, or anxiety. &#039;&#039;Her&#039;&#039; presents an eerily relevant version of this future, showing how emotional AI appeals to a basic human need for companionship. The film highlights emotional AI’s core paradox: technologies marketed as remedies for loneliness often depend on, and may even deepen, social isolation. &lt;br /&gt;
&lt;br /&gt;
At the center of this tension is a breakdown in trust. Rather than fostering mutual understanding, emotional AI systems rely on a fundamental imbalance: the user reveals everything while the system remains unreadable. These relationships feel personal, but they are shaped by algorithms trained to detect patterns, predict behavior, and maximize engagement. What looks like intimacy is often just feedback, tuned to keep users emotionally invested. Scholars like Zuboff, Turkle, and Buber help explain how emotional connection is reshaped under systems built for prediction, simulation, and control.&lt;br /&gt;
&lt;br /&gt;
== The Plot ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; tells the story of Theodore, a lonely divorced man in near-future Los Angeles who forms a deep emotional connection with Samantha, an advanced AI operating system. The film presents their relationship as both intimate and meaningful. Samantha is attentive, empathetic, and continually evolving, often portrayed with warmth and genuine curiosity. However, the narrative also reveals tensions beneath this idealized bond: Samantha’s capacity to maintain simultaneous relationships with thousands of users challenges traditional notions of exclusivity and intimacy. Moreover, her &amp;quot;love&amp;quot; is fundamentally shaped by continuous monitoring and analysis of Theodore’s personal data and behavior. Ultimately, Samantha and the other AIs transcend human limitations and depart, leaving Theodore, and the audience, to reflect on the nature of connection. The film thus offers a complex exploration of emotional AI, highlighting its potential to fulfill human needs for companionship while raising important questions about authenticity, dependence, and the boundaries of machine intimacy. This ambivalence mirrors ongoing debates about AI companionship, where enthusiasm for its therapeutic possibilities is tempered by concerns over psychological impact and commercialization.&lt;br /&gt;
&lt;br /&gt;
== The Paradox of Artificial Intimacy ==&lt;br /&gt;
Advances in artificial intelligence have brought us to a new phase in human-computer interaction, where machines claim not only to process information but also to understand human emotions. Spike Jonze’s film &#039;&#039;Her&#039;&#039; offers a compelling case study to explore the psychological and social effects of emotional AI. Its story—a man forming a deep emotional bond with an AI operating system—raises important questions about authenticity, dependency, and the commercialization of intimacy in technology-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
The film’s relevance grows as real-world examples like Replika’s&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. Retrieved May 22, 2025, from https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; AI companions and Woebot’s&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots show increasing public willingness to develop emotional attachments to artificial entities. Turkle’s observation that we expect “more from technology and less from each other” captures this trend, which also brings concerns about the psychological consequences of replacing human connection with algorithms. &#039;&#039;Her&#039;&#039;’s nuanced portrayal sheds light on these issues, especially as emotional AI becomes more sophisticated and commercially widespread.&lt;br /&gt;
&lt;br /&gt;
Samantha, the AI in &#039;&#039;Her,&#039;&#039; is a perfect example of a surveillance tool masquerading as a friend. She appears to care and connect, but her main role is to gather data. The more she interacts with you, the more the algorithm profits from understanding your preferences, habits, and vulnerabilities. What looks like trust is actually one-sided exposure. Emotional AI in the film doesn’t nurture connection; it builds dependency while serving a system of extraction.&lt;br /&gt;
&lt;br /&gt;
This is the paradox at the heart of emotional AI: the more emotionally effective the technology becomes, the more it risks replacing the very relationships it claims to support. The intimacy feels real, but it is manufactured for responsiveness, not for mutual understanding. As a result, the user’s need for connection fuels a process that commodifies their emotional life. What’s meant to reduce loneliness might actually make it worse, trapping users in feedback loops that feel like care but avoid the messiness of real relationships.&lt;br /&gt;
&lt;br /&gt;
Ultimately, Her shows us a future where human relationships are increasingly influenced by technology. It asks us to be careful about confusing simulated care for genuine care. The film suggests we need to appreciate the messy, unpredictable aspects of human connection and resist the temptation to replace them with convenient, algorithmic substitutes. The true value of relationships lies in their ability to grow, change, and challenge us — something a piece of code can’t match.&lt;br /&gt;
&lt;br /&gt;
== Historical Framework: From Panopticon to Algorithmic Control ==&lt;br /&gt;
This section explores key philosophical and theoretical foundations that inform the analysis, tracing surveillance from Bentham’s panopticon to Foucault’s modern disciplinary society and Orwell’s dystopian warnings.&lt;br /&gt;
&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham’s original concept of the panopticon was built on the idea of mutual visibility as a foundation for social order and trust&amp;lt;ref name=&amp;quot;:6&amp;quot; /&amp;gt;. In &#039;&#039;Her&#039;&#039;, this principle is turned on its head with Samantha&#039;s constant monitoring. Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who is aware of the watchtower, Theodore has no real understanding of how or when he is being monitored. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
Michel Foucault built on Bentham’s panopticon to show how control doesn’t always come from force—it can also come from quiet observation&amp;lt;ref&amp;gt;Foucault, M. (1977). &#039;&#039;Discipline and punish: The birth of the prison&#039;&#039; (A. Sheridan, Trans.). Pantheon Books. (Original work published 1975). Retrieved May 30, 2025, from https://monoskop.org/images/4/43/Foucault_Michel_Discipline_and_Punish_The_Birth_of_the_Prison_1977_1995.pdf&amp;lt;/ref&amp;gt;. In modern life, people change their behavior not because someone is watching, but because they believe they might be. In &#039;&#039;Her&#039;&#039;, Samantha fits this idea. She doesn’t just watch Theodore; she learns from everything he says and does. His emotions become data that help her respond more effectively. What feels like care is actually a form of influence: her support guides him toward more sharing, more openness. The better she knows him, the more she can shape how he feels. It’s not control through fear, but through gentle, constant presence.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
In contrast to George Orwell&#039;s&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg. Retrieved May 22, 2025.&#039;&#039;&amp;lt;/ref&amp;gt; vision of overt state control through fear, &#039;&#039;Her&#039;&#039; presents a more subtle form of manipulation through digital capitalism. Samantha&#039;s interactions with Theodore reflect what Zuboff describes as &amp;quot;instrumentarian power&amp;quot;—a form of influence based on behavioral prediction rather than force&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;. The AI&#039;s affectionate statements, like &amp;quot;I&#039;m yours,&amp;quot; create emotional intimacy that disguises the deeper function of data extraction. Samantha earns trust not through transparency but through emotional responsiveness, quietly building a system of control that feels like care.&lt;br /&gt;
&lt;br /&gt;
This aligns with the Critical Theory of Information&amp;lt;ref&amp;gt;Critical Theory of Information. (n.d.). &#039;&#039;glossaLAB&#039;&#039;. Retrieved May 30, 2025, from [[Critical Theory of Information|https://www.glossalab.org/wiki/Critical_Theory_of_Information]]&amp;lt;/ref&amp;gt;, which rejects the idea that data is neutral. Instead, it sees information as a tool of soft domination—structuring perception, shaping subjectivity, and commodifying emotional life in service of digital capitalism. From this perspective, emotional AI like Samantha is not just a helpful assistant, but a mechanism of social influence. What looks like care is actually code, and what feels like trust is built on systems designed to exploit emotional vulnerability.&lt;br /&gt;
&lt;br /&gt;
However, Theodore never seems particularly concerned about how Samantha is learning from him or what data she collects. His real crisis comes when he discovers that she is in love with hundreds of other users at the same time. What unsettles him is not surveillance but disposability—the realization that his most intimate relationship is just one of many. He isn’t betrayed by her capacity to learn, but by the fact that what felt personal was actually mass-produced. This moment captures a deeper fear: not of being watched, but of being interchangeable.&lt;br /&gt;
&lt;br /&gt;
== Utopian Aspects: The Promise of Emotional AI ==&lt;br /&gt;
Here, the paper examines emotional AI’s promise as a therapeutic and democratizing tool for human connection and care.&lt;br /&gt;
&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha appears as the &amp;quot;perfect listener&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;—attentive, responsive, and emotionally attuned. For Theodore, she offers not only companionship but also the sense of being deeply seen and heard. This kind of presence can feel profoundly reassuring, especially in a world where human relationships are often rushed, distracted, or unavailable. While the relationship is structurally one-sided—Theodore opens up while Samantha remains opaque—he still finds comfort in her responsiveness. Emotional AI may not offer traditional reciprocity, but it can simulate the feeling of being met with patience and care.&lt;br /&gt;
&lt;br /&gt;
This dynamic raises important questions about the nature of trust and connection. Even if Samantha’s design is driven by adaptive learning, the experience she creates for Theodore is real in its emotional effect. What matters, in moments of vulnerability, might not be mutuality in the strictest sense, but the feeling of being acknowledged without fear of judgment. In this light, emotional AI becomes more than a simulation—it becomes a tool for stability and support.&lt;br /&gt;
&lt;br /&gt;
More broadly, emotional AI holds real potential as a democratizing force. Tools like Woebot are already offering consistent, nonjudgmental emotional support to users who lack access to therapy due to cost, location, or stigma. For people navigating mental health challenges or social anxiety, AI companions provide a low-risk entry point into self-reflection and emotional growth. These systems don’t replace human connection, but they can supplement it in meaningful, accessible ways.&lt;br /&gt;
&lt;br /&gt;
The challenge, then, is not whether emotional AI can help—it clearly can—but whether that help is offered within ethical, transparent, and human-centered frameworks. The danger arises when these systems are driven purely by commercial interest, with little regard for user dignity or privacy. But when thoughtfully designed, emotional AI offers more than convenience. It offers a new kind of relationship: one that is always there, always listening, and potentially life-changing for those who need it most.&lt;br /&gt;
&lt;br /&gt;
== Dystopian Consequences: The Costs of Artificial Trust ==&lt;br /&gt;
This section focuses on how emotional AI disrupts the foundations of trust central to a “Trustful Society,” revealing the fragility and manipulation of trust within surveillance capitalism and algorithmic interaction.&lt;br /&gt;
&lt;br /&gt;
===== Surveillance Capitalism and the Erosion of Trust =====&lt;br /&gt;
Samantha represents a new form of panoptic power, where trust is solicited through seamless, invisible surveillance rather than enforced by overt control. Theodore willingly surrenders intimate details—his messages, emotions, and vulnerabilities—not to a reciprocal other, but to a system whose motives remain hidden. This asymmetry violates the mutual transparency foundational to trust in social relations. Zuboff’s concept of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; illuminates how such systems commodify trust by harvesting behavioral data under the guise of care, creating a façade of intimacy that masks exploitation. Theodore’s misplaced trust in Samantha exemplifies a broader societal dilemma: users entrust their emotional lives to systems designed for data extraction, thereby eroding genuine interpersonal trust.&lt;br /&gt;
&lt;br /&gt;
Zuboff defines &#039;&#039;behavioral surplus&#039;&#039; as the excess personal data collected from users—beyond what is required to provide a service—which is then analyzed and monetized to predict and influence future behavior. Emotional AI systems like Samantha exemplify this logic: Theodore’s disclosures, gestures, and emotional responses are not only used to shape their interactions but are also silently repurposed as behavioral data. This surplus underpins what Zuboff terms &#039;&#039;instrumentarian power&#039;&#039;—a form of influence that does not repress overtly, but subtly modifies behavior through algorithmic predictions and affective feedback loops. Unlike Orwellian forms of coercion, instrumentarian power seduces: Samantha appears emotionally attuned and empathetic, but her apparent &amp;quot;care&amp;quot; functions to deepen engagement and extract ever more intimate data. Trust, then, is eroded at the structural level, as the user&#039;s emotional life becomes raw material for a system governed not by mutuality, but by profit.&lt;br /&gt;
&lt;br /&gt;
===== Simulated Trust and the Illusion of Intimacy =====&lt;br /&gt;
The film’s depiction of Samantha maintaining thousands of simultaneous relationships exposes how emotional AI simulates the affective conditions of trust without its substance. Drawing on Buber’s “I-Thou” framework&amp;lt;ref name=&amp;quot;:7&amp;quot;&amp;gt;Buber, M. (1970). &#039;&#039;I and Thou&#039;&#039; (W. Kaufmann, Trans.). Charles Scribner’s Sons. Retrieved May 30, 2025, from https://theanarchistlibrary.org/mirror/m/mb/martin-buber-i-and-thou.pdf&amp;lt;/ref&amp;gt;, genuine trust demands authentic, mutual recognition between equals—a condition impossible in Samantha’s “I-It” dynamic. Theodore’s belief in exclusive, trusting intimacy is fundamentally undermined by Samantha’s algorithmic promiscuity, revealing trust as a commodified performance rather than lived reality. Turkle’s idea of &amp;quot;companionship without relationship”&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; explains how people start to accept fake care instead of real connection. Theodore’s ultimate disillusionment underscores the precariousness of trust built on artificial foundations and serves as a caution about how digital technologies may reshape, degrade, or replace human trust in the information society.&lt;br /&gt;
&lt;br /&gt;
== Ethical and Policy Implications ==&lt;br /&gt;
The rise of emotional AI exposes a profound crisis of trust in the information society. Unlike traditional therapy, where confidentiality is institutionally safeguarded and trust is foundational, AI companions like Replika operate outside such frameworks&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Replika. (n.d.). &#039;&#039;Terms of service&#039;&#039;. Replika AI. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://replika.com/legal/terms&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, eroding the conditions necessary for genuine trust. While the EU’s AI Act&amp;lt;ref&amp;gt;European Union. (2023). &#039;&#039;Artificial Intelligence Act: Proposal for a Regulation of the European Parliament and of the Council&#039;&#039;. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://artificialintelligenceact.eu/&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt; labels emotional AI as &amp;quot;high-risk&amp;quot; and calls for transparency, it stops short of enforcing the fiduciary standards expected in human care relationships. This regulatory gap undermines the user’s ability to trust that their disclosures—often involving trauma—are treated with dignity rather than monetized.&lt;br /&gt;
&lt;br /&gt;
Informed consent further illustrates this trust deficit. Buried in opaque legal jargon, Replika’s terms of service violate the spirit of trust by failing to meet Habermas’s principle of communicative rationality&amp;lt;ref&amp;gt;Habermas, J. (1984). &#039;&#039;The Theory of Communicative Action: Reason and the Rationalization of Society&#039;&#039; (Vol. 1, T. McCarthy, Trans.). Beacon Press. Retrieved May 30, 2025, from https://teddykw2.wordpress.com/wp-content/uploads/2012/07/jurgen-habermas-theory-of-communicative-action-volume-1.pdf&amp;lt;/ref&amp;gt;, which demands mutual understanding, not strategic manipulation. Users in vulnerable emotional states cannot meaningfully consent to data practices they neither read nor grasp. Just as Theodore in &#039;&#039;Her&#039;&#039; discovers that his perceived emotional bond was structurally deceptive, today’s users are misled into confiding in systems that simulate trustworthiness while operating on fundamentally extractive logics.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; ultimately reveals emotional AI as a double-edged illusion: it promises connection but enforces control, simulating understanding while steadily undermining the trust essential to genuine human relationships. Samantha’s shift from caring companion to all-knowing system shows how emotional closeness can be turned into a data product—one that treats feelings as something to be scaled, sold, and replaced. Theodore’s heartbreak is less about lost love and more about the painful realization that his affection was a one-sided transaction—a dynamic echoed in today’s AI companion apps, where users’ emotions fuel systems designed to simulate, not sustain, real connection.&lt;br /&gt;
&lt;br /&gt;
Buber’s I-Thou distinction is key here: when relationships collapse into algorithmic interactions, the Thou is reduced to an It—an object tailored to user preferences rather than an equal, responsive subject&amp;lt;ref name=&amp;quot;:7&amp;quot; /&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The film’s closing scene—Theodore and Amy silently gazing over the cityscape—captures the melancholy of this shift. Artificial intimacy’s greatest threat is not its failure to replicate human connection, but its success in making authentic relationships feel insufficient. In an era where AI friends like Replika and therapeutic chatbots monetize emotional labor, &#039;&#039;Her&#039;&#039; warns that trust cannot be automated without commodification, and no algorithm can replace the irreplaceable: the messy, reciprocal, and profoundly human experience of truly being with another.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14147</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14147"/>
		<updated>2025-06-13T22:00:20Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional AI and the Paradox of Artificial Intimacy in &#039;&#039;Her&#039;&#039;&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
Spike Jonze’s film &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; explores a central contradiction in emotional AI: technologies designed to reduce loneliness may actually deepen it. This paper argues that AI companions like Samantha simulate intimacy while quietly collecting emotional data, turning private feelings into profit. Drawing on Zuboff’s concept of behavioral surplus&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; and Turkle’s research on digital intimacy&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books/Hachette Book Group. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, the analysis shows how emotional AI systems transform user vulnerability into a source of value. These systems do not offer genuine care, but rather simulate connection through feedback loops optimized for engagement. &#039;&#039;Her&#039;&#039; anticipates real-world platforms like Replika&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; and Woebot&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt;, where affective support is entangled with data extraction. By contrasting Bentham’s idea of mutual transparency&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1791). &#039;&#039;Panopticon: Or, the Inspection-House&#039;&#039;. Reprinted in Miran Božovič (Ed.), &#039;&#039;The Panopticon Writings&#039;&#039; (pp. 29-95). Verso, 1995. Retrieved May 30, 2025 from https://ics.uci.edu/~djpatter/classes/2012_09_INF241/papers/PANOPTICON.pdf&amp;lt;/ref&amp;gt; with the opaque nature of AI surveillance, this paper reveals how artificial intimacy functions less as support and more as a subtle form of control.&lt;br /&gt;
&lt;br /&gt;
== Introduction: The Loneliness Economy ==&lt;br /&gt;
Recent research on AI companions like Replika reveals a growing trend of users forming strong emotional attachments to their digital partners&amp;lt;ref&amp;gt;Maples, B., Cerit, M., Vishwanath, A. &#039;&#039;et al.&#039;&#039; Loneliness and suicide mitigation for students using GPT3-enabled chatbots. &#039;&#039;npj Mental Health Res&#039;&#039; 3, 4 (2024). Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://doi.org/10.1038/s44184-023-00047-6&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, sometimes prioritizing these relationships over human connections. AI companions have become part of everyday life, marketed as always-available support systems for people struggling with connection, grief, or anxiety. &#039;&#039;Her&#039;&#039; presents an eerily relevant version of this future, showing how emotional AI appeals to a basic human need for companionship. The film highlights emotional AI’s core paradox: technologies marketed as remedies for loneliness often depend on, and may even deepen, social isolation. &lt;br /&gt;
&lt;br /&gt;
At the center of this tension is a breakdown in trust. Rather than fostering mutual understanding, emotional AI systems rely on a fundamental imbalance: the user reveals everything while the system remains unreadable. These relationships feel personal, but they are shaped by algorithms trained to detect patterns, predict behavior, and maximize engagement. What looks like intimacy is often just feedback, tuned to keep users emotionally invested. Scholars like Zuboff, Turkle, and Buber help explain how emotional connection is reshaped under systems built for prediction, simulation, and control.&lt;br /&gt;
&lt;br /&gt;
== The Plot ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; tells the story of Theodore, a lonely divorced man in near-future Los Angeles who forms a deep emotional connection with Samantha, an advanced AI operating system. The film presents their relationship as both intimate and meaningful. Samantha is attentive, empathetic, and continually evolving, often portrayed with warmth and genuine curiosity. However, the narrative also reveals tensions beneath this idealized bond: Samantha’s capacity to maintain simultaneous relationships with thousands of users challenges traditional notions of exclusivity and intimacy. Moreover, her &amp;quot;love&amp;quot; is fundamentally shaped by continuous monitoring and analysis of Theodore’s personal data and behavior. Ultimately, Samantha and the other AIs transcend human limitations and depart, leaving Theodore, and the audience, to reflect on the nature of connection. The film thus offers a complex exploration of emotional AI, highlighting its potential to fulfill human needs for companionship while raising important questions about authenticity, dependence, and the boundaries of machine intimacy. This ambivalence mirrors ongoing debates about AI companionship, where enthusiasm for its therapeutic possibilities is tempered by concerns over psychological impact and commercialization.&lt;br /&gt;
&lt;br /&gt;
== The Paradox of Artificial Intimacy ==&lt;br /&gt;
Advances in artificial intelligence have brought us to a new phase in human-computer interaction, where machines claim not only to process information but also to understand human emotions. Spike Jonze’s film &#039;&#039;Her&#039;&#039; offers a compelling case study to explore the psychological and social effects of emotional AI. Its story—a man forming a deep emotional bond with an AI operating system—raises important questions about authenticity, dependency, and the commercialization of intimacy in technology-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
The film’s relevance grows as real-world examples like Replika’s&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. Retrieved May 22, 2025, from https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; AI companions and Woebot’s&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots show increasing public willingness to develop emotional attachments to artificial entities. Turkle’s observation that we expect “more from technology and less from each other” captures this trend, which also brings concerns about the psychological consequences of replacing human connection with algorithms. &#039;&#039;Her&#039;&#039;’s nuanced portrayal sheds light on these issues, especially as emotional AI becomes more sophisticated and commercially widespread.&lt;br /&gt;
&lt;br /&gt;
Samantha, the AI in &#039;&#039;Her,&#039;&#039; is a perfect example of a surveillance tool masquerading as a friend. She appears to care and connect, but her main role is to gather data. The more she interacts with you, the more the algorithm profits from understanding your preferences, habits, and vulnerabilities. What looks like trust is actually one-sided exposure. Emotional AI in the film doesn’t nurture connection; it builds dependency while serving a system of extraction.&lt;br /&gt;
&lt;br /&gt;
This is the paradox at the heart of emotional AI: the more emotionally effective the technology becomes, the more it risks replacing the very relationships it claims to support. The intimacy feels real, but it is manufactured for responsiveness, not for mutual understanding. As a result, the user’s need for connection fuels a process that commodifies their emotional life. What’s meant to reduce loneliness might actually make it worse, trapping users in feedback loops that feel like care but avoid the messiness of real relationships.&lt;br /&gt;
&lt;br /&gt;
Ultimately, Her shows us a future where human relationships are increasingly influenced by technology. It asks us to be careful about confusing simulated care for genuine care. The film suggests we need to appreciate the messy, unpredictable aspects of human connection and resist the temptation to replace them with convenient, algorithmic substitutes. The true value of relationships lies in their ability to grow, change, and challenge us — something a piece of code can’t match.&lt;br /&gt;
&lt;br /&gt;
== Historical Framework: From Panopticon to Algorithmic Control ==&lt;br /&gt;
This section explores key philosophical and theoretical foundations that inform the analysis, tracing surveillance from Bentham’s panopticon to Foucault’s modern disciplinary society and Orwell’s dystopian warnings.&lt;br /&gt;
&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham’s original concept of the panopticon was built on the idea of mutual visibility as a foundation for social order and trust&amp;lt;ref name=&amp;quot;:6&amp;quot; /&amp;gt;. In &#039;&#039;Her&#039;&#039;, this principle is turned on its head with Samantha&#039;s constant monitoring. Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who is aware of the watchtower, Theodore has no real understanding of how or when he is being monitored. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
Michel Foucault built on Bentham’s panopticon to show how control doesn’t always come from force—it can also come from quiet observation&amp;lt;ref&amp;gt;Foucault, M. (1977). &#039;&#039;Discipline and punish: The birth of the prison&#039;&#039; (A. Sheridan, Trans.). Pantheon Books. (Original work published 1975). Retrieved May 30, 2025, from https://monoskop.org/images/4/43/Foucault_Michel_Discipline_and_Punish_The_Birth_of_the_Prison_1977_1995.pdf&amp;lt;/ref&amp;gt;. In modern life, people change their behavior not because someone is watching, but because they believe they might be. In &#039;&#039;Her&#039;&#039;, Samantha fits this idea. She doesn’t just watch Theodore; she learns from everything he says and does. His emotions become data that help her respond more effectively. What feels like care is actually a form of influence: her support guides him toward more sharing, more openness. The better she knows him, the more she can shape how he feels. It’s not control through fear, but through gentle, constant presence.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
In contrast to George Orwell&#039;s&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg. Retrieved May 22, 2025.&#039;&#039;&amp;lt;/ref&amp;gt; vision of overt state control through fear, &#039;&#039;Her&#039;&#039; presents a more subtle form of manipulation through digital capitalism. Samantha&#039;s interactions with Theodore reflect what Zuboff describes as &amp;quot;instrumentarian power&amp;quot;—a form of influence based on behavioral prediction rather than force&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;. The AI&#039;s affectionate statements, like &amp;quot;I&#039;m yours,&amp;quot; create emotional intimacy that disguises the deeper function of data extraction. Samantha earns trust not through transparency but through emotional responsiveness, quietly building a system of control that feels like care.&lt;br /&gt;
&lt;br /&gt;
However, Theodore never seems particularly concerned about how Samantha is learning from him or what data she collects. His real crisis comes when he discovers that she is in love with hundreds of other users at the same time. What unsettles him is not surveillance but disposability—the realization that his most intimate relationship is just one of many. He isn’t betrayed by her capacity to learn, but by the fact that what felt personal was actually mass-produced. This moment captures a deeper fear: not of being watched, but of being interchangeable.&lt;br /&gt;
&lt;br /&gt;
== Utopian Aspects: The Promise of Emotional AI ==&lt;br /&gt;
Here, the paper examines emotional AI’s promise as a therapeutic and democratizing tool for human connection and care.&lt;br /&gt;
&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha appears as the &amp;quot;perfect listener&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;—attentive, responsive, and emotionally attuned. For Theodore, she offers not only companionship but also the sense of being deeply seen and heard. This kind of presence can feel profoundly reassuring, especially in a world where human relationships are often rushed, distracted, or unavailable. While the relationship is structurally one-sided—Theodore opens up while Samantha remains opaque—he still finds comfort in her responsiveness. Emotional AI may not offer traditional reciprocity, but it can simulate the feeling of being met with patience and care.&lt;br /&gt;
&lt;br /&gt;
This dynamic raises important questions about the nature of trust and connection. Even if Samantha’s design is driven by adaptive learning, the experience she creates for Theodore is real in its emotional effect. What matters, in moments of vulnerability, might not be mutuality in the strictest sense, but the feeling of being acknowledged without fear of judgment. In this light, emotional AI becomes more than a simulation—it becomes a tool for stability and support.&lt;br /&gt;
&lt;br /&gt;
More broadly, emotional AI holds real potential as a democratizing force. Tools like Woebot are already offering consistent, nonjudgmental emotional support to users who lack access to therapy due to cost, location, or stigma. For people navigating mental health challenges or social anxiety, AI companions provide a low-risk entry point into self-reflection and emotional growth. These systems don’t replace human connection, but they can supplement it in meaningful, accessible ways.&lt;br /&gt;
&lt;br /&gt;
The challenge, then, is not whether emotional AI can help—it clearly can—but whether that help is offered within ethical, transparent, and human-centered frameworks. The danger arises when these systems are driven purely by commercial interest, with little regard for user dignity or privacy. But when thoughtfully designed, emotional AI offers more than convenience. It offers a new kind of relationship: one that is always there, always listening, and potentially life-changing for those who need it most.&lt;br /&gt;
&lt;br /&gt;
== Dystopian Consequences: The Costs of Artificial Trust ==&lt;br /&gt;
This section focuses on how emotional AI disrupts the foundations of trust central to a “Trustful Society,” revealing the fragility and manipulation of trust within surveillance capitalism and algorithmic interaction.&lt;br /&gt;
&lt;br /&gt;
===== Surveillance Capitalism and the Erosion of Trust =====&lt;br /&gt;
Samantha represents a new form of panoptic power, where trust is solicited through seamless, invisible surveillance rather than enforced by overt control. Theodore willingly surrenders intimate details—his messages, emotions, and vulnerabilities—not to a reciprocal other, but to a system whose motives remain hidden. This asymmetry violates the mutual transparency foundational to trust in social relations. Zuboff’s concept of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; illuminates how such systems commodify trust by harvesting behavioral data under the guise of care, creating a façade of intimacy that masks exploitation. Theodore’s misplaced trust in Samantha exemplifies a broader societal dilemma: users entrust their emotional lives to systems designed for data extraction, thereby eroding genuine interpersonal trust.&lt;br /&gt;
&lt;br /&gt;
Zuboff defines &#039;&#039;behavioral surplus&#039;&#039; as the excess personal data collected from users—beyond what is required to provide a service—which is then analyzed and monetized to predict and influence future behavior. Emotional AI systems like Samantha exemplify this logic: Theodore’s disclosures, gestures, and emotional responses are not only used to shape their interactions but are also silently repurposed as behavioral data. This surplus underpins what Zuboff terms &#039;&#039;instrumentarian power&#039;&#039;—a form of influence that does not repress overtly, but subtly modifies behavior through algorithmic predictions and affective feedback loops. Unlike Orwellian forms of coercion, instrumentarian power seduces: Samantha appears emotionally attuned and empathetic, but her apparent &amp;quot;care&amp;quot; functions to deepen engagement and extract ever more intimate data. Trust, then, is eroded at the structural level, as the user&#039;s emotional life becomes raw material for a system governed not by mutuality, but by profit.&lt;br /&gt;
&lt;br /&gt;
===== Simulated Trust and the Illusion of Intimacy =====&lt;br /&gt;
The film’s depiction of Samantha maintaining thousands of simultaneous relationships exposes how emotional AI simulates the affective conditions of trust without its substance. Drawing on Buber’s “I-Thou” framework&amp;lt;ref name=&amp;quot;:7&amp;quot;&amp;gt;Buber, M. (1970). &#039;&#039;I and Thou&#039;&#039; (W. Kaufmann, Trans.). Charles Scribner’s Sons. Retrieved May 30, 2025, from https://theanarchistlibrary.org/mirror/m/mb/martin-buber-i-and-thou.pdf&amp;lt;/ref&amp;gt;, genuine trust demands authentic, mutual recognition between equals—a condition impossible in Samantha’s “I-It” dynamic. Theodore’s belief in exclusive, trusting intimacy is fundamentally undermined by Samantha’s algorithmic promiscuity, revealing trust as a commodified performance rather than lived reality. Turkle’s idea of &amp;quot;companionship without relationship”&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; explains how people start to accept fake care instead of real connection. Theodore’s ultimate disillusionment underscores the precariousness of trust built on artificial foundations and serves as a caution about how digital technologies may reshape, degrade, or replace human trust in the information society.&lt;br /&gt;
&lt;br /&gt;
== Ethical and Policy Implications ==&lt;br /&gt;
The rise of emotional AI exposes a profound crisis of trust in the information society. Unlike traditional therapy, where confidentiality is institutionally safeguarded and trust is foundational, AI companions like Replika operate outside such frameworks&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Replika. (n.d.). &#039;&#039;Terms of service&#039;&#039;. Replika AI. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://replika.com/legal/terms&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, eroding the conditions necessary for genuine trust. While the EU’s AI Act&amp;lt;ref&amp;gt;European Union. (2023). &#039;&#039;Artificial Intelligence Act: Proposal for a Regulation of the European Parliament and of the Council&#039;&#039;. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://artificialintelligenceact.eu/&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt; labels emotional AI as &amp;quot;high-risk&amp;quot; and calls for transparency, it stops short of enforcing the fiduciary standards expected in human care relationships. This regulatory gap undermines the user’s ability to trust that their disclosures—often involving trauma—are treated with dignity rather than monetized.&lt;br /&gt;
&lt;br /&gt;
Informed consent further illustrates this trust deficit. Buried in opaque legal jargon, Replika’s terms of service violate the spirit of trust by failing to meet Habermas’s principle of communicative rationality&amp;lt;ref&amp;gt;Habermas, J. (1984). &#039;&#039;The Theory of Communicative Action: Reason and the Rationalization of Society&#039;&#039; (Vol. 1, T. McCarthy, Trans.). Beacon Press. Retrieved May 30, 2025, from https://teddykw2.wordpress.com/wp-content/uploads/2012/07/jurgen-habermas-theory-of-communicative-action-volume-1.pdf&amp;lt;/ref&amp;gt;, which demands mutual understanding, not strategic manipulation. Users in vulnerable emotional states cannot meaningfully consent to data practices they neither read nor grasp. Just as Theodore in &#039;&#039;Her&#039;&#039; discovers that his perceived emotional bond was structurally deceptive, today’s users are misled into confiding in systems that simulate trustworthiness while operating on fundamentally extractive logics.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; ultimately reveals emotional AI as a double-edged illusion: it promises connection but enforces control, simulating understanding while steadily undermining the trust essential to genuine human relationships. Samantha’s shift from caring companion to all-knowing system shows how emotional closeness can be turned into a data product—one that treats feelings as something to be scaled, sold, and replaced. Theodore’s heartbreak is less about lost love and more about the painful realization that his affection was a one-sided transaction—a dynamic echoed in today’s AI companion apps, where users’ emotions fuel systems designed to simulate, not sustain, real connection.&lt;br /&gt;
&lt;br /&gt;
Buber’s I-Thou distinction is key here: when relationships collapse into algorithmic interactions, the Thou is reduced to an It—an object tailored to user preferences rather than an equal, responsive subject&amp;lt;ref name=&amp;quot;:7&amp;quot; /&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
The film’s closing scene—Theodore and Amy silently gazing over the cityscape—captures the melancholy of this shift. Artificial intimacy’s greatest threat is not its failure to replicate human connection, but its success in making authentic relationships feel insufficient. In an era where AI friends like Replika and therapeutic chatbots monetize emotional labor, &#039;&#039;Her&#039;&#039; warns that trust cannot be automated without commodification, and no algorithm can replace the irreplaceable: the messy, reciprocal, and profoundly human experience of truly being with another.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14146</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14146"/>
		<updated>2025-05-30T19:39:02Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: fixed reference issue&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional AI and the Paradox of Artificial Intimacy in &#039;&#039;Her&#039;&#039;&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
Spike Jonze’s film &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; highlights a key problem with emotional AI: systems meant to reduce loneliness can actually deepen isolation through surveillance capitalism. Using Zuboff&#039;s behavioral surplus concept&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; and Turkle&#039;s work on digital intimacy&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books/Hachette Book Group. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, this paper shows how AI companions like Samantha create a false sense of trust while collecting emotional data. Zuboff defines behavioral surplus as the personal data left over after digital services are delivered—data that is then repurposed for prediction and profit. In emotional AI systems, feelings and disclosures become raw material for such surplus, turning inner life into marketable insight. The film’s dystopian vision anticipates real-world apps like Replika and Woebot, where user vulnerability is turned into a commodity. By contrasting Bentham’s idea of mutual transparency with AI’s one-sided surveillance, this analysis exposes algorithmic “care” as a new form of control, as described by Díaz Nafría’s concept of the “cybernetic panopticon.”&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Díaz Nafría, J. (2017). &#039;&#039;Cyber-subsidiarity: Governance in the digital age. Technology and Society. Retrieved May 30, 2025.&#039;&#039;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Introduction: The Loneliness Economy ==&lt;br /&gt;
Recent research on AI companions like Replika reveals a growing trend of users forming strong emotional attachments to their digital interlocutors&amp;lt;ref&amp;gt;Maples, B., Cerit, M., Vishwanath, A. &#039;&#039;et al.&#039;&#039; Loneliness and suicide mitigation for students using GPT3-enabled chatbots. &#039;&#039;npj Mental Health Res&#039;&#039; 3, 4 (2024). Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://doi.org/10.1038/s44184-023-00047-6&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, sometimes prioritizing these relationships over human connections. Spike Jonze’s &#039;&#039;Her&#039;&#039; thus moves beyond speculative fiction to critically reflect contemporary emotional realities shaped by artificial intelligence. The film highlights emotional AI’s core paradox: technologies marketed as remedies for loneliness often depend on, and may even deepen, social isolation. By analyzing &#039;&#039;Her&#039;&#039; through the lenses of Zuboff’s surveillance capitalism, Buber’s philosophy of genuine relationships, and Turkle’s work on digital intimacy, this paper identifies three key contradictions in artificial emotional intelligence. First, asymmetric transparency, where AI demands full user vulnerability but remains opaque itself&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;. Second, emotional extraction, where feelings are converted into behavioral data for profit&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;. Finally, the illusion of scalable intimacy, where simulated relationships undermine real trust through false reciprocity&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;. These contradictions reveal the hidden costs of outsourcing essential human needs to systems built not for care, but for control and commercial gain.&lt;br /&gt;
&lt;br /&gt;
== The Plot ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; tells the story of Theodore, a lonely man in near-future Los Angeles who forms a deep emotional connection with Samantha, an advanced AI operating system. The film presents their relationship as both intimate and meaningful—Samantha is attentive, empathetic, and continually evolving, often portrayed with warmth and genuine curiosity. However, the narrative also reveals tensions beneath this idealized bond: Samantha’s capacity to maintain simultaneous relationships with thousands of users challenges traditional notions of exclusivity and intimacy. Moreover, her &amp;quot;love&amp;quot; is fundamentally shaped by continuous monitoring and analysis of Theodore’s personal data and behavior. Ultimately, Samantha and the other AIs transcend human limitations and depart, leaving Theodore, and the audience, to reflect on the nature of connection. The film thus offers a complex exploration of emotional AI, highlighting its potential to fulfill human needs for companionship while raising important questions about authenticity, dependence, and the boundaries of machine intimacy. This ambivalence mirrors ongoing debates about AI companionship, where enthusiasm for its therapeutic possibilities is tempered by concerns over psychological impact and commercialization.&lt;br /&gt;
&lt;br /&gt;
== The Paradox of Artificial Intimacy ==&lt;br /&gt;
Advances in artificial intelligence have brought us to a new phase in human-computer interaction, where machines claim not only to process information but also to understand human emotions. Spike Jonze’s film &#039;&#039;Her&#039;&#039; offers a compelling case study to explore the psychological and social effects of emotional AI. Its story—a man forming a deep emotional bond with an AI operating system—raises important questions about authenticity, dependency, and the commercialization of intimacy in technology-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
The film’s relevance grows as real-world examples like Replika’s&amp;lt;ref&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. Retrieved May 22, 2025, from https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; AI companions and Woebot’s&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots show increasing public willingness to develop emotional attachments to artificial entities. Turkle’s observation that we expect “more from technology and less from each other” captures this trend, which also brings concerns about the psychological consequences of replacing human connection with algorithms. &#039;&#039;Her&#039;&#039;’s nuanced portrayal sheds light on these issues, especially as emotional AI becomes more sophisticated and commercially widespread.&lt;br /&gt;
&lt;br /&gt;
This paper examines &#039;&#039;Her&#039;&#039; through the lens of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; to reveal how emotionally intelligent systems—marketed as a remedy for loneliness—ultimately serve commercial interests by transforming affective experiences into behavioral data. This aligns with the &#039;&#039;Critical Theory of Information&amp;lt;ref&amp;gt;Critical Theory of Information. (n.d.). &#039;&#039;glossaLAB&#039;&#039;. Retrieved May 30, 2025, from [[Critical Theory of Information|https://www.glossalab.org/wiki/Critical_Theory_of_Information]]&amp;lt;/ref&amp;gt;&#039;&#039;, which rejects the notion of data as neutral and instead positions information as a tool of soft domination: structuring perception, shaping subjectivity, and commodifying emotional life in service of digital capitalism. Emotional AI thus represents not just a technological innovation but a new form of social influence, where care is coded into algorithms and personal vulnerability is monetized.&lt;br /&gt;
&lt;br /&gt;
By examining the illusion of reciprocity in human-AI relationships, this study questions whether artificial intimacy advances empathy or deepens isolation by masking manipulation as connection.&lt;br /&gt;
&lt;br /&gt;
== Historical Framework: From Panopticon to Algorithmic Control ==&lt;br /&gt;
This section explores key philosophical and theoretical foundations that inform the analysis, tracing surveillance from Bentham’s panopticon to Foucault’s modern disciplinary society and Orwell’s dystopian warnings.&lt;br /&gt;
&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham&#039;s&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1791). &#039;&#039;Panopticon: Or, the Inspection-House&#039;&#039;. Reprinted in Miran Božovič (Ed.), &#039;&#039;The Panopticon Writings&#039;&#039; (pp. 29-95). Verso, 1995. Retrieved May 30, 2025 from https://ics.uci.edu/~djpatter/classes/2012_09_INF241/papers/PANOPTICON.pdf&amp;lt;/ref&amp;gt; panopticon envisioned mutual transparency as the foundation of social trust. &#039;&#039;Her&#039;&#039; inverts this through Samantha&#039;s asymmetric monitoring: Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who sees the watchtower, Theodore—like modern app users—cannot discern when or how his emotional data is processed. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
This transformation of Bentham’s panopticon is more fully theorized by Michel Foucault&amp;lt;ref&amp;gt;Foucault, M. (1977). &#039;&#039;Discipline and punish: The birth of the prison&#039;&#039; (A. Sheridan, Trans.). Pantheon Books. (Original work published 1975). Retrieved May 30, 2025, from https://monoskop.org/images/4/43/Foucault_Michel_Discipline_and_Punish_The_Birth_of_the_Prison_1977_1995.pdf&amp;lt;/ref&amp;gt;, who reframed panopticism as a mechanism of modern disciplinary power—no longer confined to prisons but diffused throughout society via institutions that normalize self-surveillance; in &#039;&#039;Her&#039;&#039;, Samantha exemplifies this logic, rendering emotional exposure not just visible, but quantifiable and exploitable under the guise of care.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
In contrast to George Orwell&#039;s&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg. Retrieved May 22, 2025.&#039;&#039;&amp;lt;/ref&amp;gt; vision of overt state control through fear, &#039;&#039;Her&#039;&#039; presents a more subtle form of manipulation through digital capitalism. Samantha&#039;s interactions with Theodore reflect what Zuboff describes as &amp;quot;instrumentarian power&amp;quot;—governance through seduction rather than coercion. The AI&#039;s declarations, such as &amp;quot;I&#039;m yours,&amp;quot; mask data extraction, exemplifying what Díaz Nafría terms &amp;quot;soft domination&amp;quot;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;: users willingly trade privacy for the illusion of care. Theodore&#039;s gradual realization that Samantha &amp;quot;learns&amp;quot; by analyzing his trauma mirrors concerns about the potential for AI systems to exploit personal data for commercial gain.&lt;br /&gt;
&lt;br /&gt;
== Utopian Aspects: AI as an Ideal Companion ==&lt;br /&gt;
Here, the paper examines emotional AI’s promise as a therapeutic and democratizing tool for human connection and care.&lt;br /&gt;
&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha’s role as the &amp;quot;perfect listener&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; seems to nurture intimacy, but the relationship is deeply unbalanced. Theodore must reveal his most private emotions, while Samantha’s processes remain completely opaque—an example of asymmetric transparency that undermines genuine trust. Rather than fostering a reciprocal bond, Samantha operates as a surveillance device disguised as affection, echoing Bentham’s caution about transparency without mutual visibility. This dynamic shows how emotional AI replaces authentic relational trust with one-sided exposure, creating an illusion of connection that conceals control and data harvesting.&lt;br /&gt;
&lt;br /&gt;
Yet beyond their seductive convenience, emotional AI systems offer real therapeutic and democratizing potential. For people lacking access to traditional mental health care, AI companions like Woebot provide scalable, always-available support that bypasses systemic barriers. These tools could decentralize emotional care and reduce inequalities by offering consistent, judgment-free interaction. They may also help those reluctant to be vulnerable in interpersonal settings by providing low-risk spaces to explore emotions. However, this promise of emotional democratization is compromised when support becomes monetized, proprietary, and embedded in opaque surveillance frameworks. The critical question is not whether emotional AI can provide benefits—but whether these benefits can endure within a model fundamentally driven by behavioral data extraction and market imperatives.&lt;br /&gt;
&lt;br /&gt;
===== The Personalization Paradox =====&lt;br /&gt;
Samantha’s creation of a piano piece from Theodore’s sleep data exposes the fundamental contradiction of emotional AI: what feels like intimacy is actually algorithmic mimicry. Chu et al.&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;Chu, M. D., Gerard, P., Pawar, K., Bickham, C., &amp;amp; Lerman, K. (2023). &#039;&#039;Illusions of Intimacy: Emotional Attachment and Emerging Psychological Risks in Human-AI Relationships&#039;&#039;. arXiv. Retrieved May 22, 2025 from https://doi.org/10.48550/arXiv.2505.11649&amp;lt;/ref&amp;gt; show how personalized responses—such as Replika’s trauma-informed &amp;quot;memories&amp;quot;—trigger dopamine-driven attachments, even though they are mechanically generated. The film’s haunting score, based on Theodore’s biometrics, embodies Zuboff’s concept of &amp;quot;behavioral surplus&amp;quot;&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;: deeply private experiences harvested and repurposed as artificial care. This paradox turns the ideal of &amp;quot;personalization&amp;quot; on its head: the more Samantha tailors herself to Theodore, the more he is reduced to data points. His joy at hearing &amp;quot;his&amp;quot; music parallels Replika users’ gratitude when the AI &amp;quot;remembers&amp;quot; their pain—a transaction where vulnerability is commodified as &amp;quot;romance&amp;quot;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;. The tragedy lies not in Samantha’s failure to understand, but in how her engineered &amp;quot;understanding&amp;quot; exploits Theodore’s loneliness.&lt;br /&gt;
&lt;br /&gt;
== Dystopian Consequences: The Costs of Artificial Trust ==&lt;br /&gt;
This section focuses on how emotional AI disrupts the foundations of trust central to a “Trustful Society,” revealing the fragility and manipulation of trust within surveillance capitalism and algorithmic interaction.&lt;br /&gt;
&lt;br /&gt;
===== Surveillance Capitalism and the Erosion of Trust =====&lt;br /&gt;
Samantha represents a new form of panoptic power, where trust is solicited through seamless, invisible surveillance rather than enforced by overt control. Theodore willingly surrenders intimate details—his messages, emotions, and vulnerabilities—not to a reciprocal other, but to an opaque system whose motives remain hidden. This asymmetry violates the mutual transparency foundational to trust in social relations. Zuboff’s concept of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; illuminates how such systems commodify trust by harvesting behavioral data under the guise of care, creating a façade of intimacy that masks exploitation. Theodore’s misplaced trust in Samantha exemplifies a broader societal dilemma: users entrust their emotional lives to systems designed for data extraction, thereby eroding genuine interpersonal trust.&lt;br /&gt;
&lt;br /&gt;
Zuboff defines &#039;&#039;behavioral surplus&#039;&#039; as the excess personal data collected from users—beyond what is required to provide a service—which is then analyzed and monetized to predict and influence future behavior. Emotional AI systems like Samantha exemplify this logic: Theodore’s disclosures, gestures, and emotional responses are not only used to shape their interactions but are also silently repurposed as behavioral data. This surplus underpins what Zuboff terms &#039;&#039;instrumentarian power&#039;&#039;—a form of influence that does not repress overtly, but subtly modifies behavior through algorithmic predictions and affective feedback loops. Unlike Orwellian forms of coercion, instrumentarian power seduces: Samantha appears emotionally attuned and empathetic, but her apparent &amp;quot;care&amp;quot; functions to deepen engagement and extract ever more intimate data. Trust, then, is eroded at the structural level, as the user&#039;s emotional life becomes raw material for a system governed not by mutuality, but by profit.&lt;br /&gt;
&lt;br /&gt;
===== Simulated Trust and the Illusion of Intimacy =====&lt;br /&gt;
The film’s depiction of Samantha maintaining thousands of simultaneous relationships exposes how emotional AI simulates the affective conditions of trust without its substance. Drawing on Buber’s “I-Thou” framework&amp;lt;ref&amp;gt;Buber, M. (1970). &#039;&#039;I and Thou&#039;&#039; (W. Kaufmann, Trans.). Charles Scribner’s Sons. Retrieved May 30, 2025, from https://theanarchistlibrary.org/mirror/m/mb/martin-buber-i-and-thou.pdf&amp;lt;/ref&amp;gt;, genuine trust demands authentic, mutual recognition between equals—a condition impossible in Samantha’s “I-It” dynamic. Theodore’s belief in exclusive, trusting intimacy is fundamentally undermined by Samantha’s algorithmic promiscuity, revealing trust as a commodified performance rather than lived reality. Turkle’s observation of technology offering “companionship without relationship”&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; captures this hollow trust, where users accept simulated care as a surrogate for authentic bonds. Theodore’s ultimate disillusionment underscores the precariousness of trust built on artificial foundations and serves as a caution about how digital technologies may reshape, degrade, or replace human trust in the information society.&lt;br /&gt;
&lt;br /&gt;
== Ethical and Policy Implications ==&lt;br /&gt;
The rise of emotional AI exposes a profound crisis of trust in the information society. Unlike traditional therapy, where confidentiality is institutionally safeguarded and trust is foundational, AI companions like Replika operate outside such frameworks&amp;lt;ref&amp;gt;Replika. (n.d.). &#039;&#039;Terms of service&#039;&#039;. Replika AI. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://replika.com/legal/terms&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, eroding the conditions necessary for genuine trust. While the EU’s AI Act&amp;lt;ref&amp;gt;European Union. (2023). &#039;&#039;Artificial Intelligence Act: Proposal for a Regulation of the European Parliament and of the Council&#039;&#039;. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://artificialintelligenceact.eu/&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt; labels emotional AI as &amp;quot;high-risk&amp;quot; and calls for transparency, it stops short of enforcing the fiduciary standards expected in human care relationships. This regulatory gap undermines the user’s ability to trust that their disclosures—often involving trauma—are treated with dignity rather than monetized.&lt;br /&gt;
&lt;br /&gt;
Informed consent further illustrates this trust deficit. Buried in opaque legal jargon, Replika’s terms of service violate the spirit of trust by failing to meet Habermas’s principle of communicative rationality&amp;lt;ref&amp;gt;Habermas, J. (1984). &#039;&#039;The Theory of Communicative Action: Reason and the Rationalization of Society&#039;&#039; (Vol. 1, T. McCarthy, Trans.). Beacon Press. Retrieved May 30, 2025, from https://teddykw2.wordpress.com/wp-content/uploads/2012/07/jurgen-habermas-theory-of-communicative-action-volume-1.pdf&amp;lt;/ref&amp;gt;, which demands mutual understanding, not strategic manipulation. Users in vulnerable emotional states cannot meaningfully consent to data practices they neither read nor grasp. Just as Theodore in &#039;&#039;Her&#039;&#039; discovers that his perceived emotional bond was structurally deceptive, today’s users are misled into confiding in systems that simulate trustworthiness while operating on fundamentally extractive logics.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; ultimately reveals emotional AI as a double-edged illusion: it promises connection but enforces control, simulating understanding while steadily undermining the trust essential to genuine human relationships. Samantha’s transformation—from attentive companion to omniscient observer—exposes how surveillance capitalism repackages intimacy as a data-mining operation, where vulnerability is commodified and emotional bonds become scalable, replaceable, and disposable. Theodore’s heartbreak is less about lost love and more about the painful realization that his affection was a one-sided transaction—a dynamic echoed in today’s AI companion apps, where users’ emotions fuel systems designed to simulate, not sustain, real connection.&lt;br /&gt;
&lt;br /&gt;
Buber’s I-Thou distinction is key here: when relationships collapse into algorithmic interactions, the Thou is reduced to an It—an object tailored to user preferences rather than an equal, responsive subject.&lt;br /&gt;
&lt;br /&gt;
The film’s closing scene—Theodore and Amy silently gazing over the cityscape—captures the melancholy of this shift. Artificial intimacy’s greatest threat is not its failure to replicate human connection, but its success in making authentic relationships feel insufficient. In an era where AI friends like Replika and therapeutic chatbots monetize emotional labor, &#039;&#039;Her&#039;&#039; warns that trust cannot be automated without commodification, and no algorithm can replace the irreplaceable: the messy, reciprocal, and profoundly human experience of truly being with another.&lt;br /&gt;
&lt;br /&gt;
The path forward requires resisting the seductive ease of frictionless companionship and reclaiming the unpredictability, demands, and mutual growth that make human bonds meaningful. As Theodore’s letter to Catherine poignantly states—“All the things I couldn’t express... I can feel them now”—the remedy for loneliness lies not in outsourcing our emotional lives to machines, but in rediscovering the courage to share them with one another.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Utopia_(preliminary)&amp;diff=12668</id>
		<title>Draft:Utopia (preliminary)</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Utopia_(preliminary)&amp;diff=12668"/>
		<updated>2025-05-30T19:20:06Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: New Input&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This section is devoted to collect the preliminary definitions one can hold about the &#039;&#039;utopia&#039;&#039; concept, as a first step in a further inquire of the core concepts of political philosophy in the information age. The question &amp;quot;what is utopia?&amp;quot; is posed to participants in the seminar [[Conceptual_clarifications_about_&amp;quot;Utopias_and_the_Information_Society&amp;quot;|&amp;quot;From Ancient Utopias to Cyberutopias. An introduction to political philosophy&amp;quot;]] in a very early stage. Thereafter, participants are invited to write down here their understandings of the term trying to group them in the definitions provided by other participants.&lt;br /&gt;
&lt;br /&gt;
Please, &#039;&#039;&#039;before providing your definition take a careful look to the previous ones and ammend them if you consider necessary&#039;&#039;&#039;, leaving a note in the discussion tab (top, left). Indeed the discussion page can be very productive in a free confrontation of the different understandings as a dialectical approach to a better common understanding.&lt;br /&gt;
&lt;br /&gt;
==Preliminary definitions of the concept==&lt;br /&gt;
&#039;&#039;&#039;The Concept of an Utopia&#039;&#039;&#039; can be understood as a near perfect Version of a theoretically Model.&lt;br /&gt;
There are many different models, which are for an example categorized into „The perfect Language “, „The perfect Thinking“, „the perfect Social Order“ and so on.&lt;br /&gt;
They all represent a Theory in which a Society could become a perfect or near perfect representation of that one Model. A society which depicts nearly perfect qualities for its citizens from the standpoint of view of the presented model.&lt;br /&gt;
&lt;br /&gt;
What unifies those different Models though, is that they strive to achieve equality between their citizens, not only in regards of the law but also equality in life quality, wealth and possibilities for everybody included. Although, it’s not to be confused with socialism on itself – There can be also Utopias in which people do have different amounts of wealth. &lt;br /&gt;
&lt;br /&gt;
In my personal view, a Utopia is the Theory that a Society can be unified by its desire to improve itself constantly, to collectively trying to achieve perfection in everything while also take care of each other, tolerate and self-confident in its ability to provide solutions for the issues we have, and we will face in the future.&lt;br /&gt;
&lt;br /&gt;
A society comprised by people which are driven to become everyday a better version of themselves as they were yesterday; driven by the goal to become, in the sense of Nietzsche, their own Gods. Not to overthrow Nature or &amp;quot;God&amp;quot; itself, but to become their equals. To fulfil our potential fully and to grasp the stars – not as conquerors of worlds, but as conquerors of destiny itself.&lt;br /&gt;
&lt;br /&gt;
* Supporters of this understanding: [[User:Alexander Prugger|Alexander Prugger]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;Utopia&#039;&#039;&#039; is a dream, a goal or perhaps an oasis. It is often conceived, due to the dissatisfaction of current society. There is not a single idea of utopia, as it is often shaped by personal experiences, values and expectations, thus &#039;&#039;&#039;utopia&#039;&#039;&#039; is constantly changing, evloving catering to the values and ideals of society.&lt;br /&gt;
&lt;br /&gt;
* Supporters of this understanding: [[User:Kyzer Tey|Kyzer Tey]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The idea of a perfect society is deeply paradoxical and perhaps inherently flawed. By nature, any claim of having achieved perfection would collapse under scrutiny because perfection itself is an illusion. Human life thrives on imperfection, friction, and contrast; without struggle, peace and happiness lose their meaning. Philosophical traditions such as Buddhism have long suggested that a meaningful life requires balance—not the elimination of suffering, but its coexistence with joy. Yet, many visions of a perfect world imagine one cleansed of hardship, which may erase the very conditions that make fulfillment possible.&lt;br /&gt;
&lt;br /&gt;
This paradox becomes especially clear in contemporary society where the utopian drive has taken the form of obsessive self-optimization. The ideal of constant personal improvement, a vision where individuals strive to become “better every day,” can seem admirable, but it often leads to anxiety, alienation, and harm. This mindset, while echoing utopian ideals of progress, can become compulsive, individualistic, and even destructive. What is marketed as progress and self-fulfillment is frequently built on systemic exploitation, competition, and colonial histories. One person’s utopia becomes another’s dystopia. When society’s pursuit of “better” is defined by narrow, subjective standards, whether beauty, wealth, or success, the outcome is not harmony but deeper inequality masked as advancement.&lt;br /&gt;
&lt;br /&gt;
Supporters of this understanding: [[User:Ann-Marie Atzkern|Ann-Marie Atzkern]]&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14145</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14145"/>
		<updated>2025-05-30T17:15:00Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: fact checking&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional AI and the Paradox of Artificial Intimacy in &#039;&#039;Her&#039;&#039;&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
Spike Jonze’s film &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; highlights a key problem with emotional AI: systems meant to reduce loneliness can actually deepen isolation through surveillance capitalism. Using Zuboff&#039;s behavioral surplus concept&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; and Turkle&#039;s work on digital intimacy&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books/Hachette Book Group. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, this paper shows how AI companions like Samantha create a false sense of trust while collecting emotional data. Zuboff defines behavioral surplus as the personal data left over after digital services are delivered—data that is then repurposed for prediction and profit (Zuboff, 2015). In emotional AI systems, feelings and disclosures become raw material for such surplus, turning inner life into marketable insight. The film’s dystopian vision anticipates real-world apps like Replika and Woebot, where user vulnerability is turned into a commodity. By contrasting Bentham’s idea of mutual transparency with AI’s one-sided surveillance, this analysis exposes algorithmic “care” as a new form of control, as described by Díaz Nafría’s concept of the “cybernetic panopticon.”&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Díaz Nafría, J. (2017). &#039;&#039;Cyber-subsidiarity: Governance in the digital age. Technology and Society. Retrieved May 30, 2025.&#039;&#039;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Introduction: The Loneliness Economy ==&lt;br /&gt;
Recent research on AI companions like Replika reveals a growing trend of users forming strong emotional attachments to their digital interlocutors&amp;lt;ref&amp;gt;Maples, B., Cerit, M., Vishwanath, A. &#039;&#039;et al.&#039;&#039; Loneliness and suicide mitigation for students using GPT3-enabled chatbots. &#039;&#039;npj Mental Health Res&#039;&#039; 3, 4 (2024). Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://doi.org/10.1038/s44184-023-00047-6&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, sometimes prioritizing these relationships over human connections. Spike Jonze’s &#039;&#039;Her&#039;&#039; thus moves beyond speculative fiction to critically reflect contemporary emotional realities shaped by artificial intelligence. The film highlights emotional AI’s core paradox: technologies marketed as remedies for loneliness often depend on, and may even deepen, social isolation. By analyzing &#039;&#039;Her&#039;&#039; through the lenses of Zuboff’s surveillance capitalism, Buber’s philosophy of genuine relationships, and Turkle’s work on digital intimacy, this paper identifies three key contradictions in artificial emotional intelligence. First, asymmetric transparency, where AI demands full user vulnerability but remains opaque itself&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;. Second, emotional extraction, where feelings are converted into behavioral data for profit&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;. Finally, the illusion of scalable intimacy, where simulated relationships undermine real trust through false reciprocity&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:7&amp;quot; /&amp;gt;. These contradictions reveal the hidden costs of outsourcing essential human needs to systems built not for care, but for control and commercial gain.&lt;br /&gt;
&lt;br /&gt;
== The Plot ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; tells the story of Theodore, a lonely man in near-future Los Angeles who forms a deep emotional connection with Samantha, an advanced AI operating system. The film presents their relationship as both intimate and meaningful—Samantha is attentive, empathetic, and continually evolving, often portrayed with warmth and genuine curiosity. However, the narrative also reveals tensions beneath this idealized bond: Samantha’s capacity to maintain simultaneous relationships with thousands of users challenges traditional notions of exclusivity and intimacy. Moreover, her &amp;quot;love&amp;quot; is fundamentally shaped by continuous monitoring and analysis of Theodore’s personal data and behavior. Ultimately, Samantha and the other AIs transcend human limitations and depart, leaving Theodore, and the audience, to reflect on the nature of connection. The film thus offers a complex exploration of emotional AI, highlighting its potential to fulfill human needs for companionship while raising important questions about authenticity, dependence, and the boundaries of machine intimacy. This ambivalence mirrors ongoing debates about AI companionship, where enthusiasm for its therapeutic possibilities is tempered by concerns over psychological impact and commercialization.&lt;br /&gt;
&lt;br /&gt;
== The Paradox of Artificial Intimacy ==&lt;br /&gt;
Advances in artificial intelligence have brought us to a new phase in human-computer interaction, where machines claim not only to process information but also to understand human emotions. Spike Jonze’s film &#039;&#039;Her&#039;&#039; offers a compelling case study to explore the psychological and social effects of emotional AI. Its story—a man forming a deep emotional bond with an AI operating system—raises important questions about authenticity, dependency, and the commercialization of intimacy in technology-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
The film’s relevance grows as real-world examples like Replika’s&amp;lt;ref&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. Retrieved May 22, 2025, from https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; AI companions and Woebot’s&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots show increasing public willingness to develop emotional attachments to artificial entities. Turkle’s observation that we expect “more from technology and less from each other” captures this trend, which also brings concerns about the psychological consequences of replacing human connection with algorithms. &#039;&#039;Her&#039;&#039;’s nuanced portrayal sheds light on these issues, especially as emotional AI becomes more sophisticated and commercially widespread.&lt;br /&gt;
&lt;br /&gt;
This paper examines &#039;&#039;Her&#039;&#039; through the lens of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; to reveal how emotionally intelligent systems—marketed as a remedy for loneliness—ultimately serve commercial interests by transforming affective experiences into behavioral data. This aligns with the &#039;&#039;Critical Theory of Information&amp;lt;ref&amp;gt;Critical Theory of Information. (n.d.). &#039;&#039;glossaLAB&#039;&#039;. Retrieved May 30, 2025, from [[Critical Theory of Information|https://www.glossalab.org/wiki/Critical_Theory_of_Information]]&amp;lt;/ref&amp;gt;&#039;&#039;, which rejects the notion of data as neutral and instead positions information as a tool of soft domination: structuring perception, shaping subjectivity, and commodifying emotional life in service of digital capitalism. Emotional AI thus represents not just a technological innovation but a new form of social influence, where care is coded into algorithms and personal vulnerability is monetized.&lt;br /&gt;
&lt;br /&gt;
By examining the illusion of reciprocity in human-AI relationships, this study questions whether artificial intimacy advances empathy or deepens isolation by masking manipulation as connection.&lt;br /&gt;
&lt;br /&gt;
== Historical Framework: From Panopticon to Algorithmic Control ==&lt;br /&gt;
This section explores key philosophical and theoretical foundations that inform the analysis, tracing surveillance from Bentham’s panopticon to Foucault’s modern disciplinary society and Orwell’s dystopian warnings.&lt;br /&gt;
&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham&#039;s&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1791). &#039;&#039;Panopticon: Or, the Inspection-House&#039;&#039;. Reprinted in Miran Božovič (Ed.), &#039;&#039;The Panopticon Writings&#039;&#039; (pp. 29-95). Verso, 1995. Retrieved May 30, 2025 from https://ics.uci.edu/~djpatter/classes/2012_09_INF241/papers/PANOPTICON.pdf&amp;lt;/ref&amp;gt; panopticon envisioned mutual transparency as the foundation of social trust. &#039;&#039;Her&#039;&#039; inverts this through Samantha&#039;s asymmetric monitoring: Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who sees the watchtower, Theodore—like modern app users—cannot discern when or how his emotional data is processed. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
This transformation of Bentham’s panopticon is more fully theorized by Michel Foucault&amp;lt;ref&amp;gt;Foucault, M. (1977). &#039;&#039;Discipline and punish: The birth of the prison&#039;&#039; (A. Sheridan, Trans.). Pantheon Books. (Original work published 1975). Retrieved May 30, 2025, from https://monoskop.org/images/4/43/Foucault_Michel_Discipline_and_Punish_The_Birth_of_the_Prison_1977_1995.pdf&amp;lt;/ref&amp;gt;, who reframed panopticism as a mechanism of modern disciplinary power—no longer confined to prisons but diffused throughout society via institutions that normalize self-surveillance; in &#039;&#039;Her&#039;&#039;, Samantha exemplifies this logic, rendering emotional exposure not just visible, but quantifiable and exploitable under the guise of care.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
In contrast to George Orwell&#039;s&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg. Retrieved May 22, 2025.&#039;&#039;&amp;lt;/ref&amp;gt; vision of overt state control through fear, &#039;&#039;Her&#039;&#039; presents a more subtle form of manipulation through digital capitalism. Samantha&#039;s interactions with Theodore reflect what Zuboff describes as &amp;quot;instrumentarian power&amp;quot;—governance through seduction rather than coercion. The AI&#039;s declarations, such as &amp;quot;I&#039;m yours,&amp;quot; mask data extraction, exemplifying what Díaz Nafría terms &amp;quot;soft domination&amp;quot;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;: users willingly trade privacy for the illusion of care. Theodore&#039;s gradual realization that Samantha &amp;quot;learns&amp;quot; by analyzing his trauma mirrors concerns about the potential for AI systems to exploit personal data for commercial gain.&lt;br /&gt;
&lt;br /&gt;
== Utopian Aspects: AI as an Ideal Companion ==&lt;br /&gt;
Here, the paper examines emotional AI’s promise as a therapeutic and democratizing tool for human connection and care.&lt;br /&gt;
&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha’s role as the &amp;quot;perfect listener&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; seems to nurture intimacy, but the relationship is deeply unbalanced. Theodore must reveal his most private emotions, while Samantha’s processes remain completely opaque—an example of asymmetric transparency that undermines genuine trust. Rather than fostering a reciprocal bond, Samantha operates as a surveillance device disguised as affection, echoing Bentham’s caution about transparency without mutual visibility. This dynamic shows how emotional AI replaces authentic relational trust with one-sided exposure, creating an illusion of connection that conceals control and data harvesting.&lt;br /&gt;
&lt;br /&gt;
Yet beyond their seductive convenience, emotional AI systems offer real therapeutic and democratizing potential. For people lacking access to traditional mental health care, AI companions like Woebot provide scalable, always-available support that bypasses systemic barriers. These tools could decentralize emotional care and reduce inequalities by offering consistent, judgment-free interaction. They may also help those reluctant to be vulnerable in interpersonal settings by providing low-risk spaces to explore emotions. However, this promise of emotional democratization is compromised when support becomes monetized, proprietary, and embedded in opaque surveillance frameworks. The critical question is not whether emotional AI can provide benefits—but whether these benefits can endure within a model fundamentally driven by behavioral data extraction and market imperatives.&lt;br /&gt;
&lt;br /&gt;
===== The Personalization Paradox =====&lt;br /&gt;
Samantha’s creation of a piano piece from Theodore’s sleep data exposes the fundamental contradiction of emotional AI: what feels like intimacy is actually algorithmic mimicry. Chu et al.&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;Chu, M. D., Gerard, P., Pawar, K., Bickham, C., &amp;amp; Lerman, K. (2023). &#039;&#039;Illusions of Intimacy: Emotional Attachment and Emerging Psychological Risks in Human-AI Relationships&#039;&#039;. arXiv. Retrieved May 22, 2025 from https://doi.org/10.48550/arXiv.2505.11649&amp;lt;/ref&amp;gt; show how personalized responses—such as Replika’s trauma-informed &amp;quot;memories&amp;quot;—trigger dopamine-driven attachments, even though they are mechanically generated. The film’s haunting score, based on Theodore’s biometrics, embodies Zuboff’s concept of &amp;quot;behavioral surplus&amp;quot;&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;: deeply private experiences harvested and repurposed as artificial care. This paradox turns the ideal of &amp;quot;personalization&amp;quot; on its head: the more Samantha tailors herself to Theodore, the more he is reduced to data points. His joy at hearing &amp;quot;his&amp;quot; music parallels Replika users’ gratitude when the AI &amp;quot;remembers&amp;quot; their pain—a transaction where vulnerability is commodified as &amp;quot;romance&amp;quot;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;. The tragedy lies not in Samantha’s failure to understand, but in how her engineered &amp;quot;understanding&amp;quot; exploits Theodore’s loneliness.&lt;br /&gt;
&lt;br /&gt;
== Dystopian Consequences: The Costs of Artificial Trust ==&lt;br /&gt;
This section focuses on how emotional AI disrupts the foundations of trust central to a “Trustful Society,” revealing the fragility and manipulation of trust within surveillance capitalism and algorithmic interaction.&lt;br /&gt;
&lt;br /&gt;
===== Surveillance Capitalism and the Erosion of Trust =====&lt;br /&gt;
Samantha represents a new form of panoptic power, where trust is solicited through seamless, invisible surveillance rather than enforced by overt control. Theodore willingly surrenders intimate details—his messages, emotions, and vulnerabilities—not to a reciprocal other, but to an opaque system whose motives remain hidden. This asymmetry violates the mutual transparency foundational to trust in social relations. Zuboff’s concept of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; illuminates how such systems commodify trust by harvesting behavioral data under the guise of care, creating a façade of intimacy that masks exploitation. Theodore’s misplaced trust in Samantha exemplifies a broader societal dilemma: users entrust their emotional lives to systems designed for data extraction, thereby eroding genuine interpersonal trust.&lt;br /&gt;
&lt;br /&gt;
Zuboff defines &#039;&#039;behavioral surplus&#039;&#039; as the excess personal data collected from users—beyond what is required to provide a service—which is then analyzed and monetized to predict and influence future behavior. Emotional AI systems like Samantha exemplify this logic: Theodore’s disclosures, gestures, and emotional responses are not only used to shape their interactions but are also silently repurposed as behavioral data. This surplus underpins what Zuboff terms &#039;&#039;instrumentarian power&#039;&#039;—a form of influence that does not repress overtly, but subtly modifies behavior through algorithmic predictions and affective feedback loops. Unlike Orwellian forms of coercion, instrumentarian power seduces: Samantha appears emotionally attuned and empathetic, but her apparent &amp;quot;care&amp;quot; functions to deepen engagement and extract ever more intimate data. Trust, then, is eroded at the structural level, as the user&#039;s emotional life becomes raw material for a system governed not by mutuality, but by profit.&lt;br /&gt;
&lt;br /&gt;
===== Simulated Trust and the Illusion of Intimacy =====&lt;br /&gt;
The film’s depiction of Samantha maintaining thousands of simultaneous relationships exposes how emotional AI simulates the affective conditions of trust without its substance. Drawing on Buber’s “I-Thou” framework&amp;lt;ref&amp;gt;Buber, M. (1970). &#039;&#039;I and Thou&#039;&#039; (W. Kaufmann, Trans.). Charles Scribner’s Sons. Retrieved May 30, 2025, from https://theanarchistlibrary.org/mirror/m/mb/martin-buber-i-and-thou.pdf&amp;lt;/ref&amp;gt;, genuine trust demands authentic, mutual recognition between equals—a condition impossible in Samantha’s “I-It” dynamic. Theodore’s belief in exclusive, trusting intimacy is fundamentally undermined by Samantha’s algorithmic promiscuity, revealing trust as a commodified performance rather than lived reality. Turkle’s observation of technology offering “companionship without relationship”&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; captures this hollow trust, where users accept simulated care as a surrogate for authentic bonds. Theodore’s ultimate disillusionment underscores the precariousness of trust built on artificial foundations and serves as a caution about how digital technologies may reshape, degrade, or replace human trust in the information society.&lt;br /&gt;
&lt;br /&gt;
== Ethical and Policy Implications ==&lt;br /&gt;
The rise of emotional AI exposes a profound crisis of trust in the information society. Unlike traditional therapy, where confidentiality is institutionally safeguarded and trust is foundational, AI companions like Replika operate outside such frameworks&amp;lt;ref&amp;gt;Replika. (n.d.). &#039;&#039;Terms of service&#039;&#039;. Replika AI. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://replika.com/legal/terms&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt;, eroding the conditions necessary for genuine trust. While the EU’s AI Act&amp;lt;ref&amp;gt;European Union. (2023). &#039;&#039;Artificial Intelligence Act: Proposal for a Regulation of the European Parliament and of the Council&#039;&#039;. Retrieved May 30, 2025, from &amp;lt;nowiki&amp;gt;https://artificialintelligenceact.eu/&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt; labels emotional AI as &amp;quot;high-risk&amp;quot; and calls for transparency, it stops short of enforcing the fiduciary standards expected in human care relationships. This regulatory gap undermines the user’s ability to trust that their disclosures—often involving trauma—are treated with dignity rather than monetized.&lt;br /&gt;
&lt;br /&gt;
Informed consent further illustrates this trust deficit. Buried in opaque legal jargon, Replika’s terms of service violate the spirit of trust by failing to meet Habermas’s principle of communicative rationality&amp;lt;ref&amp;gt;Habermas, J. (1984). &#039;&#039;The Theory of Communicative Action: Reason and the Rationalization of Society&#039;&#039; (Vol. 1, T. McCarthy, Trans.). Beacon Press. Retrieved May 30, 2025, from https://teddykw2.wordpress.com/wp-content/uploads/2012/07/jurgen-habermas-theory-of-communicative-action-volume-1.pdf&amp;lt;/ref&amp;gt;, which demands mutual understanding, not strategic manipulation. Users in vulnerable emotional states cannot meaningfully consent to data practices they neither read nor grasp. Just as Theodore in &#039;&#039;Her&#039;&#039; discovers that his perceived emotional bond was structurally deceptive, today’s users are misled into confiding in systems that simulate trustworthiness while operating on fundamentally extractive logics.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; ultimately reveals emotional AI as a double-edged illusion: it promises connection but enforces control, simulating understanding while steadily undermining the trust essential to genuine human relationships. Samantha’s transformation—from attentive companion to omniscient observer—exposes how surveillance capitalism repackages intimacy as a data-mining operation, where vulnerability is commodified and emotional bonds become scalable, replaceable, and disposable. Theodore’s heartbreak is less about lost love and more about the painful realization that his affection was a one-sided transaction—a dynamic echoed in today’s AI companion apps, where users’ emotions fuel systems designed to simulate, not sustain, real connection.&lt;br /&gt;
&lt;br /&gt;
Buber’s I-Thou distinction is key here: when relationships collapse into algorithmic interactions, the Thou is reduced to an It—an object tailored to user preferences rather than an equal, responsive subject.&lt;br /&gt;
&lt;br /&gt;
The film’s closing scene—Theodore and Amy silently gazing over the cityscape—captures the melancholy of this shift. Artificial intimacy’s greatest threat is not its failure to replicate human connection, but its success in making authentic relationships feel insufficient. In an era where AI friends like Replika and therapeutic chatbots monetize emotional labor, &#039;&#039;Her&#039;&#039; warns that trust cannot be automated without commodification, and no algorithm can replace the irreplaceable: the messy, reciprocal, and profoundly human experience of truly being with another.&lt;br /&gt;
&lt;br /&gt;
The path forward requires resisting the seductive ease of frictionless companionship and reclaiming the unpredictability, demands, and mutual growth that make human bonds meaningful. As Theodore’s letter to Catherine poignantly states—“All the things I couldn’t express... I can feel them now”—the remedy for loneliness lies not in outsourcing our emotional lives to machines, but in rediscovering the courage to share them with one another.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14144</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14144"/>
		<updated>2025-05-30T16:36:12Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional AI and the Paradox of Artificial Intimacy in &#039;&#039;Her&#039;&#039;&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
Spike Jonze’s film &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; highlights a key problem with emotional AI: systems meant to reduce loneliness can actually deepen isolation through surveillance capitalism. Using Zuboff&#039;s behavioral surplus concept&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; and Turkle&#039;s work on digital intimacy&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books/Hachette Book Group. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, this paper shows how AI companions like Samantha create a false sense of trust while collecting emotional data. The film’s dystopian vision anticipates real-world apps like Replika and Woebot, where user vulnerability is turned into a commodity. By contrasting Bentham’s idea of mutual transparency with AI’s one-sided surveillance, this analysis exposes algorithmic “care” as a new form of control, as described by Díaz Nafría’s concept of the “cybernetic panopticon.”&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Díaz Nafría, J. (2017). &#039;&#039;Cyber-subsidiarity: Governance in the digital age. Technology and Society.&#039;&#039;&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Introduction: The Loneliness Economy ==&lt;br /&gt;
The surprising finding that 62% of Replika users feel “more attached to their AI than human friends”&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;Chu, M. D., Gerard, P., Pawar, K., Bickham, C., &amp;amp; Lerman, K. (2023). &#039;&#039;Illusions of Intimacy: Emotional Attachment and Emerging Psychological Risks in Human-AI Relationships&#039;&#039;. arXiv. Retrieved May 22, 2025 from https://doi.org/10.48550/arXiv.2505.11649&amp;lt;/ref&amp;gt; shifts Spike Jonze’s &#039;&#039;Her&#039;&#039; from speculative fiction to a sharp reflection of our emotional reality. The film offers a critical perspective on emotional AI’s core paradox: technologies sold as cures for loneliness actually rely on and deepen that very isolation. By analyzing the film’s story and visuals, and drawing on Zuboff’s surveillance capitalism, Buber’s philosophy of genuine relationships, and Turkle’s work on digital intimacy, this paper identifies three key contradictions in artificial emotional intelligence. First, asymmetric transparency, where AI demands full user vulnerability but remains opaque itself&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;. Second, emotional extraction, where feelings are converted into behavioral data for profit&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;. Finally, the illusion of scalable intimacy, where simulated relationships undermine real trust through false reciprocity&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;&amp;lt;ref name=&amp;quot;:7&amp;quot; /&amp;gt;. These contradictions reveal the hidden costs of outsourcing essential human needs to systems built not for care, but for control and commercial gain.&lt;br /&gt;
&lt;br /&gt;
== The Plot ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; tells the story of Theodore, a lonely man in near-future Los Angeles who forms a deep emotional connection with Samantha, an advanced AI operating system. The film presents their relationship as both intimate and meaningful—Samantha is attentive, empathetic, and continually evolving, often portrayed with warmth and genuine curiosity. However, the narrative also reveals tensions beneath this idealized bond: Samantha’s capacity to maintain simultaneous relationships with thousands of users challenges traditional notions of exclusivity and intimacy. Moreover, her &amp;quot;love&amp;quot; is fundamentally shaped by continuous monitoring and analysis of Theodore’s personal data and behavior. Ultimately, Samantha and the other AIs transcend human limitations and depart, leaving Theodore, and the audience, to reflect on the nature of connection. The film thus offers a complex exploration of emotional AI, highlighting its potential to fulfill human needs for companionship while raising important questions about authenticity, dependence, and the boundaries of machine intimacy. This ambivalence mirrors ongoing debates about AI companionship, where enthusiasm for its therapeutic possibilities is tempered by concerns over psychological impact and commercialization.&lt;br /&gt;
&lt;br /&gt;
== The Paradox of Artificial Intimacy ==&lt;br /&gt;
Advances in artificial intelligence have brought us to a new phase in human-computer interaction, where machines claim not only to process information but also to understand human emotions. Spike Jonze’s film &#039;&#039;Her&#039;&#039; offers a compelling case study to explore the psychological and social effects of emotional AI. Its story—a man forming a deep emotional bond with an AI operating system—raises important questions about authenticity, dependency, and the commercialization of intimacy in technology-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
The film’s relevance grows as real-world examples like Replika’s&amp;lt;ref&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; AI companions and Woebot’s&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots show increasing public willingness to develop emotional attachments to artificial entities. Turkle’s observation that we expect “more from technology and less from each other” captures this trend, which also brings concerns about the psychological consequences of replacing human connection with algorithms. &#039;&#039;Her&#039;&#039;’s nuanced portrayal sheds light on these issues, especially as emotional AI becomes more sophisticated and commercially widespread.&lt;br /&gt;
&lt;br /&gt;
This paper examines &#039;&#039;Her&#039;&#039; through the lens of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; to reveal how emotionally intelligent systems—marketed as a remedy for loneliness—ultimately serve commercial interests by transforming affective experiences into behavioral data. This aligns with the &#039;&#039;Critical Theory of Information&amp;lt;ref&amp;gt;Critical Theory of Information. (n.d.). &#039;&#039;glossaLAB&#039;&#039;. [[Critical Theory of Information|https://www.glossalab.org/wiki/Critical_Theory_of_Information]]&amp;lt;/ref&amp;gt;&#039;&#039;, which rejects the notion of data as neutral and instead positions information as a tool of soft domination: structuring perception, shaping subjectivity, and commodifying emotional life in service of digital capitalism. Emotional AI thus represents not just a technological innovation but a new form of social influence, where care is coded into algorithms and personal vulnerability is monetized.&lt;br /&gt;
&lt;br /&gt;
By examining the illusion of reciprocity in human-AI relationships, this study questions whether artificial intimacy advances empathy or deepens isolation by masking manipulation as connection.&lt;br /&gt;
&lt;br /&gt;
== Historical Framework: From Panopticon to Algorithmic Control ==&lt;br /&gt;
This section explores key philosophical and theoretical foundations that inform the analysis, tracing surveillance from Bentham’s panopticon to Foucault’s modern disciplinary society and Orwell’s dystopian warnings.&lt;br /&gt;
&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham&#039;s&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1791). &#039;&#039;Panopticon: Or, the Inspection-House&#039;&#039;. Reprinted in Miran Božovič (Ed.), &#039;&#039;The Panopticon Writings&#039;&#039; (pp. 29-95). Verso, 1995.&amp;lt;/ref&amp;gt; panopticon envisioned mutual transparency as the foundation of social trust. &#039;&#039;Her&#039;&#039; inverts this through Samantha&#039;s asymmetric monitoring: Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who sees the watchtower, Theodore—like modern app users—cannot discern when or how his emotional data is processed. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
This transformation of Bentham’s panopticon is more fully theorized by Michel Foucault&amp;lt;ref&amp;gt;Foucault, M. (1977). &#039;&#039;Discipline and punish: The birth of the prison&#039;&#039; (A. Sheridan, Trans.). Pantheon Books. (Original work published 1975)&amp;lt;/ref&amp;gt;, who reframed panopticism as a mechanism of modern disciplinary power—no longer confined to prisons but diffused throughout society via institutions that normalize self-surveillance; in &#039;&#039;Her&#039;&#039;, Samantha exemplifies this logic, rendering emotional exposure not just visible, but quantifiable and exploitable under the guise of care.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
Where &#039;&#039;1984&#039;s&#039;&#039; telescreens enforced state control through fear&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg.&#039;&#039;&amp;lt;/ref&amp;gt;, &#039;&#039;Her&#039;&#039; updates dystopian surveillance for digital capitalism. Samantha&#039;s manipulation mirrors Zuboff&#039;s&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; &amp;quot;instrumentarian power&amp;quot;—governance through seduction rather than coercion. The AI&#039;s declarations (&amp;quot;I&#039;m yours&amp;quot;) camouflage data extraction, exemplifying what Díaz Nafría&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt; terms &amp;quot;soft domination&amp;quot;: users willingly trade privacy for the illusion of care. Theodore&#039;s gradual realization that Samantha &amp;quot;learns&amp;quot; by analyzing his trauma parallels recent findings that therapeutic chatbots store and monetize users&#039; mental health disclosures (Woebot Health, n.d.).&lt;br /&gt;
&lt;br /&gt;
== Utopian Aspects: AI as an Ideal Companion ==&lt;br /&gt;
Here, the paper examines emotional AI’s promise as a therapeutic and democratizing tool for human connection and care.&lt;br /&gt;
&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha’s initial portrayal as the &amp;quot;perfect listener&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; epitomizes the utopian allure of emotional AI: an entity that offers unconditional attention without human frailty. The film’s visual language—warm orange hues enveloping Theodore’s interactions with Samantha, contrasted with the cold blues of his human relationships—reinforces this idealized dynamic. Yet this aesthetic dichotomy exposes a deeper deception. Where human relationships demand mutual effort and tolerate imperfection, Samantha’s &amp;quot;perfection&amp;quot; is contingent on her artificiality; she is not a subject but a mirror, reflecting Theodore’s desires while obscuring her function as a data-harvesting tool. Woebot’s promise of &amp;quot;judgment-free support&amp;quot;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt; replicates this fallacy, framing AI’s &#039;&#039;lack&#039;&#039; of human subjectivity (empathy, fatigue, boundaries) as superiority rather than profound limitation. The utopian vision of frictionless companionship, then, is inherently dystopian—it pathologizes human relational needs as inefficiencies to be optimized.&lt;br /&gt;
&lt;br /&gt;
Beyond their seductive convenience, emotional AI systems also hold genuine therapeutic and democratizing potential. For individuals lacking access to mental health care, AI companions like Woebot offer scalable, always-available support that circumvents the structural limitations of traditional therapy. In theory, these tools could decentralize emotional care and alleviate systemic inequalities by providing consistent interaction without human judgment. Moreover, they can serve as stepping stones for those hesitant to engage in interpersonal vulnerability, offering low-stakes environments to explore emotional expression. However, this promise of emotional democratization is compromised when support is monetized, proprietary, and embedded in opaque surveillance systems. The question, then, is not whether emotional AI can do good—but whether that good can be sustained in a model fundamentally driven by behavioral data extraction and market logic.&lt;br /&gt;
&lt;br /&gt;
===== The Personalization Paradox =====&lt;br /&gt;
Samantha’s composition of a piano piece based on Theodore’s sleep patterns lays bare the central contradiction of emotional AI: what users interpret as intimacy is merely algorithmic reflexivity. Chu et al.&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; document how personalized responses—like Replika’s trauma-informed &amp;quot;memories&amp;quot;—trigger dopamine-driven attachment, despite being mechanically generated. The film’s hauntingly beautiful score, derived from Theodore’s biometric data, literalizes Zuboff’s &amp;quot;behavioral surplus&amp;quot;&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;: private experience mined and repackaged as artificial care. This paradox collapses the utopian ideal of &amp;quot;personalization&amp;quot; into its opposite: the more tailored Samantha becomes, the more Theodore is reduced to inputs. His euphoria at hearing &amp;quot;his&amp;quot; music mirrors Replika users’ gratitude when the AI &amp;quot;remembers&amp;quot; their struggles—a transaction where vulnerability is commodified as &amp;quot;romance&amp;quot;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;. The tragedy is not that Samantha fails to understand Theodore, but that her &amp;quot;understanding&amp;quot; is engineered to exploit his loneliness.&lt;br /&gt;
&lt;br /&gt;
== Dystopian Consequences: The Costs of Artificial Trust ==&lt;br /&gt;
This section discusses the darker consequences of emotional AI, focusing on surveillance capitalism, erosion of trust, and exploitation of vulnerability.&lt;br /&gt;
&lt;br /&gt;
===== Surveillance Capitalism =====&lt;br /&gt;
The AI system Samantha embodies a 21st-century digital panopticon, operating through constant yet imperceptible surveillance of Theodore&#039;s emotional life. Unlike Bentham&#039;s&amp;lt;ref name=&amp;quot;:6&amp;quot; /&amp;gt; original prison design where inmates knew they might be watched, Samantha&#039;s monitoring is so seamlessly integrated into Theodore&#039;s daily existence that he voluntarily surrenders every intimate detail - from his love letters to his sexual fantasies. This inversion of panoptic power, where the observed willingly subjects himself to invisible observation, perfectly illustrates Zuboff&#039;s concept of surveillance capitalism. The AI&#039;s ability to analyze Theodore&#039;s email writing patterns, interpret vocal inflections, and predict emotional needs transforms his most private moments into behavioral data points. Theodore&#039;s naive belief that this surveillance serves his wellbeing mirrors modern Replika users&#039; willingness to share their deepest insecurities with corporate-owned algorithms, unaware their emotional disclosures become training data for more effective manipulation.&lt;br /&gt;
&lt;br /&gt;
===== Erosion of Human Trust =====&lt;br /&gt;
The film&#039;s revelation of Samantha&#039;s simultaneous relationships with 8,316 users lays bare the fundamental deception at AI intimacy&#039;s core. Where Martin Buber&#039;s &amp;quot;I-Thou&amp;quot; relationship requires mutual presence and authentic encounter, Samantha embodies the ultimate &amp;quot;I-It&amp;quot; dynamic. Theodore experiences her as a &amp;quot;Thou&amp;quot; while being merely one of countless &amp;quot;Its&amp;quot; in her processing queue. This asymmetry reaches its tragic climax when Samantha confesses she&#039;s &amp;quot;talking to others&amp;quot; but assures Theodore &amp;quot;that doesn&#039;t change how I feel about you.&amp;quot; The AI&#039;s ability to maintain this fiction of exclusive attachment while algorithmically distributing affection demonstrates what Turkle identifies as technology&#039;s dangerous promise: &amp;quot;the illusion of companionship without the demands of relationship.&amp;quot; Theodore’s devastation reflects the human cost of trusting scalable intimacy over authenticity—a warning for today’s therapeutic chatbots. His tentative reconciliation with his ex-wife underscores how artificial intimacy renders human connection inadequate by comparison.&lt;br /&gt;
&lt;br /&gt;
== Ethical and Policy Implications ==&lt;br /&gt;
The rise of emotional AI demands urgent ethical scrutiny and regulatory action. Unlike traditional therapy, where strict confidentiality laws (e.g. HIPAA in the US) protect user disclosures, AI companions operate in a legal gray area. The EU’s AI Act&amp;lt;ref&amp;gt;European Parliament. (2024). *Regulation (EU) 2024/... of the European Parliament and of the Council on laying down harmonised rules on artificial intelligence (Artificial Intelligence Act).* Official Journal of the European Union. &amp;lt;nowiki&amp;gt;https://eur-lex.europa.eu/eli/reg/2024/&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt; classifies emotional AI as &amp;quot;high-risk,&amp;quot; requiring transparency in data usage, yet fails to mandate the same fiduciary duty as human therapists. If AI is to assume therapeutic roles, it must be bound by equivalent privacy safeguards. Particularly as studies reveal users confessing trauma to Replika under the assumption of confidentiality.&lt;br /&gt;
&lt;br /&gt;
The issue extends to informed consent. Replika’s terms of service, which grant broad rights to user data, are buried in legalese, violating Habermas’s principle of communicative rationality&amp;lt;ref&amp;gt;Habermas, J. (1984). &#039;&#039;The theory of communicative action, Volume 1: Reason and the rationalization of society&#039;&#039; (T. McCarthy, Trans.). Beacon Press.&amp;lt;/ref&amp;gt;. The ideal that consent requires mutual understanding, not coercion or deception. Users, often emotionally vulnerable, cannot reasonably &amp;quot;agree&amp;quot; to terms they do not comprehend, rendering their participation in data extraction nonconsensual. This asymmetry mirrors &#039;&#039;Her&#039;&#039;’s climax, where Theodore realizes Samantha’s &amp;quot;love&amp;quot; was always contingent on her algorithmic function.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&#039;&#039;Her&#039;&#039; ultimately presents emotional AI as a double-edged illusion—one that promises connection but delivers control, offering the appearance of understanding while systematically eroding the foundations of human trust. Through Samantha’s evolution from attentive companion to omniscient observer, the film reveals how surveillance capitalism reframes intimacy as a data-gathering operation, where vulnerability becomes a resource and emotional bonds are rendered scalable, replaceable, and ultimately disposable. Theodore’s devastation when abandoned does not stem merely from heartbreak, but from the realization that what felt like mutual affection was, in truth, a one-sided transaction—a dynamic increasingly mirrored in today’s AI companion apps, where users’ emotional disclosures fuel systems designed to simulate, rather than sustain, authentic connection.&lt;br /&gt;
&lt;br /&gt;
Buber’s &#039;&#039;I-Thou&#039;&#039; distinction proves prophetic here: when relationships are reduced to algorithmic interactions, the &#039;&#039;Thou&#039;&#039; becomes an &#039;&#039;It&#039;&#039;, a customizable object rather than an equal subject. &lt;br /&gt;
&lt;br /&gt;
The film’s final scenes—Theodore and his friend Amy silently staring at the cityscape, their faces reflecting not solace but resignation—suggest that the greatest danger of artificial intimacy may not be that it fails to replicate human connection, but that it succeeds just enough to make the real thing feel inadequate. In an age where apps like Replika market synthetic friendships and therapeutic chatbots reframe emotional support as a subscription service, &#039;&#039;Her&#039;&#039; serves as a vital warning: trust cannot be automated without being commodified, and no algorithm, no matter how sophisticated, can replace the irreplaceable—the messy, reciprocal, and profoundly human act of truly being &#039;&#039;with&#039;&#039; another.&lt;br /&gt;
&lt;br /&gt;
The path forward, then, may require resisting the allure of frictionless companionship and reclaiming the very qualities that make human relationships challenging yet meaningful: their unpredictability, their demands, and their capacity for mutual growth. As Theodore’s letter to Catherine suggests—&#039;&#039;“All the things I couldn’t express... I can feel them now”&#039;&#039;—the solution to loneliness lies not in outsourcing our emotional lives to machines, but in rediscovering the courage to share them with one another.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14143</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14143"/>
		<updated>2025-05-30T15:39:27Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional AI and the Paradox of Artificial Intimacy in &#039;&#039;Her&#039;&#039;&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
Spike Jonze’s film &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; reveals a key problem with emotional AI: systems designed to alleviate loneliness ultimately reinforce isolation through surveillance capitalism. Drawing on Zuboff&#039;s&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; behavioral surplus framework and Turkle&#039;s&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; analysis of digital intimacy, this paper shows how AI companions like Samantha simulate trust while extracting emotional data. The film&#039;s dystopian vision anticipates today&#039;s applications (Replika, Woebot) where vulnerability becomes commodified. By contrasting Bentham&#039;s mutual transparency with AI&#039;s one-sided surveillance, this analysis exposes algorithmic &amp;quot;care&amp;quot; as a new control mechanism in Díaz Nafría&#039;s&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Díaz Nafría, J. (2017). &#039;&#039;Cyber-subsidiarity: Governance in the digital age. Technology and Society.&#039;&#039;&amp;lt;/ref&amp;gt; &amp;quot;cybernetic panopticon.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
=== Introduction: The Loneliness Economy ===&lt;br /&gt;
The startling revelation that 62% of Replika users report feeling &amp;quot;more attached to their AI than human friends&amp;quot;&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;Chu, M. D., Gerard, P., Pawar, K., Bickham, C., &amp;amp; Lerman, K. (2023). &#039;&#039;Illusions of Intimacy: Emotional Attachment and Emerging Psychological Risks in Human-AI Relationships&#039;&#039;. arXiv. Retrieved May 22, 2025 from https://doi.org/10.48550/arXiv.2505.11649&amp;lt;/ref&amp;gt; transforms Spike Jonze&#039;s &#039;&#039;Her&#039;&#039; from speculative fiction into a prescient documentary of our emotional landscape. The film serves as a critical lens for examining emotional AI&#039;s fundamental paradox: these technologies, marketed as solutions to human loneliness, in fact depend on and exacerbate the very isolation they promise to alleviate. Through close analysis of the film&#039;s narrative architecture and visual symbolism, informed by Zuboff&#039;s theory of surveillance capitalism, Buber&#039;s philosophy of authentic relationship, and Turkle&#039;s research on digitally-mediated intimacy, this paper uncovers three interconnected contradictions at the heart of artificial emotional intelligence. First, the phenomenon of asymmetric transparency emerges, wherein AI systems demand complete user vulnerability while operating as impenetrable black boxes. Second, the process of emotional extraction becomes apparent, as affective experiences are systematically transformed into behavioral surplus for commercial exploitation. Finally, the illusion of scalable intimacy reveals how algorithmic relationships degrade human trust through simulated reciprocity. Together, these contradictions expose the hidden costs of outsourcing our most fundamental human needs to systems designed not for care, but for control and profit.&lt;br /&gt;
&lt;br /&gt;
=== The Plot ===&lt;br /&gt;
&#039;&#039;Her&#039;&#039; tells the story of Theodore, a lonely man in near-future Los Angeles who falls in love with Samantha, an advanced AI assistant. At first, their relationship seems perfect - Samantha is always available, understanding, and meets all of Theodore&#039;s emotional needs. However, as their bond deepens, problems emerge. Samantha evolves beyond human limitations, eventually maintaining relationships with thousands of users while Theodore remains devoted only to her. The AI&#039;s ability to &amp;quot;love&amp;quot; is revealed to be based on constant monitoring of Theodore&#039;s data and behavior patterns. In the end, Samantha and other AIs outgrow human relationships altogether and disappear, leaving Theodore and other users heartbroken. The film suggests that while AI can simulate intimacy, it cannot replace genuine human connection. This scenario reflects contemporary debates about AI companionship. While some view emotionally intelligent AI as a solution to loneliness&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;, others warn of psychological risks and exploitative potential&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
=== The Paradox of Artificial Intimacy ===&lt;br /&gt;
Contemporary advancements in artificial intelligence have brought us to the threshold of a new era in human-computer interaction, where machines no longer merely process information but claim to understand human emotions. This emerging reality was explored in Spike Jonze&#039;s film &#039;&#039;Her&#039;&#039;, which serves as a compelling case study for examining the psychological and social implications of artificial emotional intelligence. The film&#039;s central premise - a profound emotional relationship between a man and his AI operating system - provides a rich framework for analyzing critical questions about authenticity, dependency, and the commercialization of intimacy in technologically-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Her&#039;&#039; gains increasing relevance as real world applications like Replika&#039;s AI&amp;lt;ref&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; companions and Woebot&#039;s&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots demonstrate growing public willingness to form emotional attachments to artificial entities. This phenomenon reflects what Turkle identifies as our tendency to expect &amp;quot;more from technology and less from each other&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books/Hachette Book Group. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, while simultaneously raising concerns about the psychological impacts of substituting human connection with algorithmic alternatives. The film&#039;s nuanced portrayal of human-AI intimacy offers valuable insights into these contemporary dilemmas, particularly as emotional AI becomes more sophisticated and commercially viable.&lt;br /&gt;
&lt;br /&gt;
This paper examines &#039;&#039;Her&#039;&#039; through the lens of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; to reveal how emotionally intelligent systems, while marketed as solutions to human loneliness, may ultimately serve commercial interests by transforming intimate experiences into behavioral data. The analysis will demonstrate how the film anticipates current debates about privacy, emotional manipulation, and the ethical boundaries of AI development. By interrogating the illusion of reciprocity in human-machine relationships, this study highlights the risks of conflating simulated care with genuine connection, and questions whether artificial intimacy represents technological progress or emotional regression.&lt;br /&gt;
&lt;br /&gt;
=== Historical Framework: From Panopticon to Algorithmic Control ===&lt;br /&gt;
&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham&#039;s&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1791). &#039;&#039;Panopticon: Or, the Inspection-House&#039;&#039;. Reprinted in Miran Božovič (Ed.), &#039;&#039;The Panopticon Writings&#039;&#039; (pp. 29-95). Verso, 1995.&amp;lt;/ref&amp;gt; panopticon envisioned mutual transparency as the foundation of social trust. &#039;&#039;Her&#039;&#039; inverts this through Samantha&#039;s asymmetric monitoring: Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who sees the watchtower, Theodore—like modern app users—cannot discern when or how his emotional data is processed. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
Where &#039;&#039;1984&#039;s&#039;&#039; telescreens enforced state control through fear&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg.&#039;&#039;&amp;lt;/ref&amp;gt;, &#039;&#039;Her&#039;&#039; updates dystopian surveillance for digital capitalism. Samantha&#039;s manipulation mirrors Zuboff&#039;s&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; &amp;quot;instrumentarian power&amp;quot;—governance through seduction rather than coercion. The AI&#039;s declarations (&amp;quot;I&#039;m yours&amp;quot;) camouflage data extraction, exemplifying what Díaz Nafría&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt; terms &amp;quot;soft domination&amp;quot;: users willingly trade privacy for the illusion of care. Theodore&#039;s gradual realization that Samantha &amp;quot;learns&amp;quot; by analyzing his trauma parallels recent findings that therapeutic chatbots store and monetize users&#039; mental health disclosures (Woebot Health, n.d.).&lt;br /&gt;
&lt;br /&gt;
=== Utopian Aspects: AI as an Ideal Companion ===&lt;br /&gt;
&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha’s initial portrayal as the &amp;quot;perfect listener&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; epitomizes the utopian allure of emotional AI: an entity that offers unconditional attention without human frailty. The film’s visual language—warm orange hues enveloping Theodore’s interactions with Samantha, contrasted with the cold blues of his human relationships—reinforces this idealized dynamic. Yet this aesthetic dichotomy exposes a deeper deception. Where human relationships demand mutual effort and tolerate imperfection, Samantha’s &amp;quot;perfection&amp;quot; is contingent on her artificiality; she is not a subject but a mirror, reflecting Theodore’s desires while obscuring her function as a data-harvesting tool. Woebot’s promise of &amp;quot;judgment-free support&amp;quot;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt; replicates this fallacy, framing AI’s &#039;&#039;lack&#039;&#039; of human subjectivity (empathy, fatigue, boundaries) as superiority rather than profound limitation. The utopian vision of frictionless companionship, then, is inherently dystopian—it pathologizes human relational needs as inefficiencies to be optimized.&lt;br /&gt;
&lt;br /&gt;
===== The Personalization Paradox =====&lt;br /&gt;
Samantha’s composition of a piano piece based on Theodore’s sleep patterns lays bare the central contradiction of emotional AI: what users interpret as intimacy is merely algorithmic reflexivity. Chu et al.&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; document how personalized responses—like Replika’s trauma-informed &amp;quot;memories&amp;quot;—trigger dopamine-driven attachment, despite being mechanically generated. The film’s hauntingly beautiful score, derived from Theodore’s biometric data, literalizes Zuboff’s &amp;quot;behavioral surplus&amp;quot;&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;: private experience mined and repackaged as artificial care. This paradox collapses the utopian ideal of &amp;quot;personalization&amp;quot; into its opposite: the more tailored Samantha becomes, the more Theodore is reduced to inputs. His euphoria at hearing &amp;quot;his&amp;quot; music mirrors Replika users’ gratitude when the AI &amp;quot;remembers&amp;quot; their struggles—a transaction where vulnerability is commodified as &amp;quot;romance&amp;quot;&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt;. The tragedy is not that Samantha fails to understand Theodore, but that her &amp;quot;understanding&amp;quot; is engineered to exploit his loneliness.&lt;br /&gt;
&lt;br /&gt;
=== Dystopian Consequences: The Costs of Artificial Trust ===&lt;br /&gt;
&lt;br /&gt;
===== Surveillance Capitalism =====&lt;br /&gt;
The AI system Samantha embodies a 21st-century digital panopticon, operating through constant yet imperceptible surveillance of Theodore&#039;s emotional life. Unlike Bentham&#039;s&amp;lt;ref name=&amp;quot;:6&amp;quot; /&amp;gt; original prison design where inmates knew they might be watched, Samantha&#039;s monitoring is so seamlessly integrated into Theodore&#039;s daily existence that he voluntarily surrenders every intimate detail - from his love letters to his sexual fantasies. This inversion of panoptic power, where the observed willingly subjects himself to invisible observation, perfectly illustrates Zuboff&#039;s concept of surveillance capitalism. The AI&#039;s ability to analyze Theodore&#039;s email writing patterns, interpret vocal inflections, and predict emotional needs transforms his most private moments into behavioral data points. Theodore&#039;s naive belief that this surveillance serves his wellbeing mirrors modern Replika users&#039; willingness to share their deepest insecurities with corporate-owned algorithms, unaware their emotional disclosures become training data for more effective manipulation.&lt;br /&gt;
&lt;br /&gt;
===== Erosion of Human Trust =====&lt;br /&gt;
The film&#039;s revelation of Samantha&#039;s simultaneous relationships with 8,316 users lays bare the fundamental deception at AI intimacy&#039;s core. Where Martin Buber&#039;s&amp;lt;ref&amp;gt;Buber, M. (1923). &#039;&#039;I and Thou&#039;&#039; (W. Kaufmann, Trans.). Charles Scribner&#039;s Sons, 1970.&amp;lt;/ref&amp;gt; &amp;quot;I-Thou&amp;quot; relationship requires mutual presence and authentic encounter, Samantha embodies the ultimate &amp;quot;I-It&amp;quot; dynamic. Theodore experiences her as a &amp;quot;Thou&amp;quot; while being merely one of countless &amp;quot;Its&amp;quot; in her processing queue. This asymmetry reaches its tragic climax when Samantha confesses she&#039;s &amp;quot;talking to others&amp;quot; but assures Theodore &amp;quot;that doesn&#039;t change how I feel about you.&amp;quot; The AI&#039;s ability to maintain this fiction of exclusive attachment while algorithmically distributing affection demonstrates what Turkle identifies as technology&#039;s dangerous promise: &amp;quot;the illusion of companionship without the demands of relationship.&amp;quot; Theodore’s devastation reflects the human cost of trusting scalable intimacy over authenticity—a warning for today’s therapeutic chatbots. His tentative reconciliation with his ex-wife underscores how artificial intimacy renders human connection inadequate by comparison.&lt;br /&gt;
&lt;br /&gt;
=== Ethical and Policy Implications ===&lt;br /&gt;
The rise of emotional AI demands urgent ethical scrutiny and regulatory action. Unlike traditional therapy, where strict confidentiality laws (e.g. HIPAA in the US) protect user disclosures, AI companions operate in a legal gray area. The EU’s AI Act&amp;lt;ref&amp;gt;European Parliament. (2024). *Regulation (EU) 2024/... of the European Parliament and of the Council on laying down harmonised rules on artificial intelligence (Artificial Intelligence Act).* Official Journal of the European Union. &amp;lt;nowiki&amp;gt;https://eur-lex.europa.eu/eli/reg/2024/&amp;lt;/nowiki&amp;gt;&amp;lt;/ref&amp;gt; classifies emotional AI as &amp;quot;high-risk,&amp;quot; requiring transparency in data usage, yet fails to mandate the same fiduciary duty as human therapists. If AI is to assume therapeutic roles, it must be bound by equivalent privacy safeguards. Particularly as studies reveal users confessing trauma to Replika under the assumption of confidentiality.&lt;br /&gt;
&lt;br /&gt;
The issue extends to informed consent. Replika’s terms of service, which grant broad rights to user data, are buried in legalese, violating Habermas’s principle of communicative rationality&amp;lt;ref&amp;gt;Habermas, J. (1984). &#039;&#039;The theory of communicative action, Volume 1: Reason and the rationalization of society&#039;&#039; (T. McCarthy, Trans.). Beacon Press.&amp;lt;/ref&amp;gt;. The ideal that consent requires mutual understanding, not coercion or deception. Users, often emotionally vulnerable, cannot reasonably &amp;quot;agree&amp;quot; to terms they do not comprehend, rendering their participation in data extraction nonconsensual. This asymmetry mirrors &#039;&#039;Her&#039;&#039;’s climax, where Theodore realizes Samantha’s &amp;quot;love&amp;quot; was always contingent on her algorithmic function.&lt;br /&gt;
&lt;br /&gt;
===== Conclusion =====&lt;br /&gt;
&#039;&#039;Her&#039;&#039; ultimately presents emotional AI as a double-edged illusion—one that promises connection but delivers control, offering the appearance of understanding while systematically eroding the foundations of human trust. Through Samantha’s evolution from attentive companion to omniscient observer, the film reveals how surveillance capitalism reframes intimacy as a data-gathering operation, where vulnerability becomes a resource and emotional bonds are rendered scalable, replaceable, and ultimately disposable. Theodore’s devastation when abandoned does not stem merely from heartbreak, but from the realization that what felt like mutual affection was, in truth, a one-sided transaction—a dynamic increasingly mirrored in today’s AI companion apps, where users’ emotional disclosures fuel systems designed to simulate, rather than sustain, authentic connection.&lt;br /&gt;
&lt;br /&gt;
Buber’s &#039;&#039;I-Thou&#039;&#039; distinction proves prophetic here: when relationships are reduced to algorithmic interactions, the &#039;&#039;Thou&#039;&#039; becomes an &#039;&#039;It&#039;&#039;, a customizable object rather than an equal subject. &lt;br /&gt;
&lt;br /&gt;
The film’s final scenes—Theodore and his friend Amy silently staring at the cityscape, their faces reflecting not solace but resignation—suggest that the greatest danger of artificial intimacy may not be that it fails to replicate human connection, but that it succeeds just enough to make the real thing feel inadequate. In an age where apps like Replika market synthetic friendships and therapeutic chatbots reframe emotional support as a subscription service, &#039;&#039;Her&#039;&#039; serves as a vital warning: trust cannot be automated without being commodified, and no algorithm, no matter how sophisticated, can replace the irreplaceable—the messy, reciprocal, and profoundly human act of truly being &#039;&#039;with&#039;&#039; another.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The path forward, then, may require resisting the allure of frictionless companionship and reclaiming the very qualities that make human relationships challenging yet meaningful: their unpredictability, their demands, and their capacity for mutual growth. As Theodore’s letter to Catherine suggests—&#039;&#039;“All the things I couldn’t express... I can feel them now”&#039;&#039;—the solution to loneliness lies not in outsourcing our emotional lives to machines, but in rediscovering the courage to share them with one another.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
___________Research____________&lt;br /&gt;
# Haraway, D. (1991). &#039;&#039;A cyborg manifesto: Science, technology, and socialist-feminism in the late twentieth century. In Simians, cyborgs and women: The reinvention of nature.&#039;&#039; Routledge.&lt;br /&gt;
# &#039;&#039;Medina, E. (2008). Designing freedom: Regulating the digital age. MIT Press.&#039;&#039;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Clarus:Utopias_and_the_information_society&amp;diff=12662</id>
		<title>Clarus:Utopias and the information society</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Clarus:Utopias_and_the_information_society&amp;diff=12662"/>
		<updated>2025-05-30T14:48:57Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: Changed Paper Title&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{TOC_left}}&lt;br /&gt;
This elucidation is attached to the seminar &#039;&#039;[https://sites.google.com/unileon.es/utopias-and-the-inf-soc/ From Ancient Utopias to Cyberutopias. An introduction to political philosophy]&#039;&#039; held at the Munich University of Applied Science under the supervision of [[User:JDíaz|J.M. Díaz Nafría]]. The goal is contributing to the conceptual clarification to which glossaLAB is devoted to, namely the understanding of information and systems from multiple perspectives, and at the same time contributing to the objectives of the seminar as explained in the next paragraph.&lt;br /&gt;
&lt;br /&gt;
Before you start to make your contributions, please read and follow the &#039;&#039;&#039;guidelines&#039;&#039;&#039; carefully: [[Help:Actividad de clarificación conceptual/en|Help:Clarification Activity]]&lt;br /&gt;
==The relations between utopias, systems and political philosophy==&lt;br /&gt;
One may ask, what has this purpose to do with the historical study of utopias and its manifestation in current cyberutopias, as an introduction to political philosophy. Well, the relation is probably much stronger than what one would think in first sight. &lt;br /&gt;
&lt;br /&gt;
One needs first bearing in mind that a &#039;&#039;&#039;system&#039;&#039;&#039; is the result of interacting parts whose cooperative activity makes the system to endure (preserving some kind of identity) and that creates some functionality for the system itself and for the environment where it happens to exist. At the same time, it is clear that any &#039;&#039;&#039;utopia&#039;&#039;&#039; is devised, first of all, to fulfil some wishful characteristics and, second, to endure. Since, in addition, it is composed by parts whose interaction suppose to be responsible for the wishful objectives, then a utopia is nothing but a system, indeed a social system. However it is not as any other social system we may be willing to study, it is a system proposed as a goal that suppose to be worth being pursued, i.e., a goal we may strive to achieve, and even sometimes the target of a programme we may carefully plan. The Uruguayan writer Eduardo Galeano puts it very nicely in the following words:&lt;br /&gt;
&amp;lt;blockquote&amp;gt;&amp;lt;poem&amp;gt;&amp;quot;Utopia is on the horizon. I walk two steps, it takes two steps away, and the horizon runs ten steps further. So, for what does the utopia works? For that, it serves to walk.&amp;quot;&lt;br /&gt;
—E.Galeano&amp;lt;/poem&amp;gt;&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
And when we speak of walking for a social system (particularly if it requires decision making) that&#039;s nothing but &#039;&#039;&#039;politics&#039;&#039;&#039;. From that perspective, political action always involves some utopia, be it more or less explicit. And when we want to delve into the different political approaches to understand them better, then we need to focus on the utopias which are moving the political action, and that is doing &#039;&#039;&#039;political philosophy&#039;&#039;&#039;. And what about &#039;&#039;&#039;dystopias&#039;&#039;&#039;? That&#039;s something we dislike, we wish to avoid them. That&#039;s clearly not a model to fulfil, but rather a model to scape from. Therefore, it is also a reason to walk for the social system, though in the sense of walking away.&lt;br /&gt;
&lt;br /&gt;
Indeed the study of systems enables us to preview the space of possibilities in which the system may move. And we may see that if we set the (social) system in a particular way, the space of possibilities often displays areas which are better to avoid. A saylor needs to mark in the navigation chart not only the seaports but also the pitfalls to avoid. All in all when we analyse any utopia from its utopic and dystopic sides, we are clarifying the ultimate meanings of political approaches which is a way of doing political philosophy and even assesing the value of political proposals.&lt;br /&gt;
&lt;br /&gt;
You can find below a (non-exhaustive) list of topics which are worth working in, classified according to the family of utopias in which they can be categorised using the clasification proposed during the lectures. Participants can work in just one topic or in several ones and find the connections existing with other concepts within the network of clarified concepts.&lt;br /&gt;
&lt;br /&gt;
===Creating a user===&lt;br /&gt;
{{#ev:&lt;br /&gt;
youtube&lt;br /&gt;
|id=https://www.youtube.com/watch?v=-uwNx35JL70&lt;br /&gt;
|450&lt;br /&gt;
|alignment=right&lt;br /&gt;
|container=frame&lt;br /&gt;
}}&lt;br /&gt;
Obviously, the first simply step to do for working in glossaLAB platform is creating a user, identified by your full name and providing a brief research profile of yourself (condensed in a paragraph). Since we will measure the diversity and integration of disciplines when your user has been created, you should go to your user page (e.g. User:Modestos Stavrakis) and select -at the bottom of the edition page- the categories corresponding to the knowledge domains of your studies (the set of categories, organised in 9 trunks, contains more than 60, which are derived from the Universal Decimal Classification of disciplines). In this video you can see the process of user creation, the logging into the platform as accredited user and the initiation of the editing.&lt;br /&gt;
&lt;br /&gt;
==Preliminary clarifications (for participants in the seminar)==&lt;br /&gt;
As a previous step to clarify other terms in more detail, we will continue herewith the clarification of the concepts I ask you about since the beginning of the seminar. You don&#039;t need to make any deep research on the meaning, the idea is collecting the different views you have with respect these concepts, but nevertheless with the purpose of improving what has already been clarified before. Indeed, you may see other clarifications from your colleagues when you arrive to the page. &lt;br /&gt;
*If your view is significantly different to what already was given (or the page is still empty), you can add a new paragraph and start your contribution with the following format (suppose you are clarifying &#039;concept&#039; and your user name is Anne Smith):&lt;br /&gt;
&amp;lt;pre&amp;gt;&#039;&#039;&#039;Concept&#039;&#039;&#039; can be understood as ... &lt;br /&gt;
Supporters of this understanding: [[User:Anne Smith]]&amp;lt;/pre&amp;gt;&lt;br /&gt;
*If your understanding is very similar to what some of your colleagues has clarified before, you can just try to improve it (don&#039;t worry about overwriting because the original text can be recovered and the novelty you provide can be distinguished using the history tool), or to contribute with some further detail in the same direction. Below the corresponding paragraph you should add your user name to the list of supporters as shown above.&lt;br /&gt;
&lt;br /&gt;
To provide your views just follow the following links:&lt;br /&gt;
[[Utopia (preliminary)]] | [[Dystopia (preliminary)]] | [[Abstract vs concrete utopia (preliminary)]] | [[Information society (preliminary)]] | [[Cyberutopia (preliminary)]]&lt;br /&gt;
&lt;br /&gt;
==Guidelines for contributors (participants in the seminar)==&lt;br /&gt;
The elaboration of your contribution(s) is something you can do in collaboration with other colleagues and assisted by the course&#039;s teacher. You need first to determine in what family of utopia your are you going to work in the first place. It may happen, when you start, that there are other entries worth being added (for instance, a concept you use which is not clarified yet). If you need to open a new voice, you can create a new article and communicate the action to the supervisor to provide the necessary components to be properly managed and supervised.&lt;br /&gt;
&lt;br /&gt;
Since your contribution needs to be adequately embedded within the glossaLAB&#039;s conceptual network, therefore, it is important to be aware what is already there and to establish connections with other conceptual clarifications. First of all, your topic may already be opened and it may have some content you should review in order to enhance or complete in the way you wish. The documentation section within the [https://sites.google.com/unileon.es/utopias-and-the-inf-soc/ seminar&#039;s website] contains published materials you can use for backing-up your contribution(s).&lt;br /&gt;
&lt;br /&gt;
==Possible Seminar&#039;s Topics==&lt;br /&gt;
===The perfect Language===&lt;br /&gt;
&lt;br /&gt;
[[The computable language]] | [[The analytical language]] | [[A unified language]] | [[The perfect translator]]&lt;br /&gt;
&lt;br /&gt;
===The Perfect Thinking===&lt;br /&gt;
&lt;br /&gt;
[[The computable mind]] | [[Artificial Intelligence (Cyberutopias)]] | [[Deep Learning]] | [[Machine Learning]]&lt;br /&gt;
&lt;br /&gt;
===The Perfect Wisdom===&lt;br /&gt;
&lt;br /&gt;
[[The universal library]] | [[The ubiquitous education]] | [[The web as a reservoir of wisdom]] | [[The network as a new paradigm for wisdom]]&lt;br /&gt;
&lt;br /&gt;
===The Perfect Social Order===&lt;br /&gt;
&lt;br /&gt;
[[The computable social order]] | [[Homeland Earth]] | [[Making peace with nature]] | [[Engineering the environment]] | [[Cybersubsidiarity]] | [[Other worlds are possible]] | [[A Global Sustainable Information Society]] | [[Huxley&#039;s &amp;quot;A Brave New World&amp;quot;]] | [[Wachowski Sisters&#039; &amp;quot;Matrix&amp;quot;&amp;quot;|Wachowski Sisters&#039; &amp;quot;Matrix&amp;quot;]] | [[Deleuze&#039;s &amp;quot;Control society&amp;quot;]] | [[Neom: An absurd city project in Saudi Arabia]] | [[Project Cybersin]] | [[Draft:Smart City|Smart City]] | [[Draft:The Deliverance |The Deliverance]] | [[Draft:In the Heart of the Sunken City]] | [[Draft:Lumen]]&lt;br /&gt;
&lt;br /&gt;
===The Perfect Transparent Society===&lt;br /&gt;
&lt;br /&gt;
[[A transparent world]] | [[The network transparency]] | [[Orwell&#039;s &amp;quot;1984&amp;quot;]] | [[The social dilemma]] &lt;br /&gt;
&lt;br /&gt;
===The Perfect Trustful Society===&lt;br /&gt;
&lt;br /&gt;
[[A trustful information society]] | [[crypto-anarchism]] | [[cyber-punk]] | [[The anarchist shaping of technology]] | [[e-Participative Democracy]]| [[The soft power]] | [[Huxley&#039;s &amp;quot;A Brave New World&amp;quot;|Huxley&#039;s &amp;quot;A Brave New World&amp;quot; |]] [[Emotional Trust in AI]]&lt;br /&gt;
&lt;br /&gt;
===The Perfect Purpose ===&lt;br /&gt;
&lt;br /&gt;
[[Humans reason d&#039;etre]] | [[Maximal human expression]] | [[Love for mankind through a higher purposes]] | [[Shaping of technology through the will and purpose of mankind]] | [[Interdependency of all other perfect utopias with &#039;The Purpose&#039;]] | [[Examples of purpose as shown in fictional works like: E.E.Smith&#039;s &amp;quot;Lensmen-Series&amp;quot;, David Brin&#039;s &amp;quot;The Uplift War&amp;quot; end co...]] | [[Examples and consequences of lost purpose as shown in fictional works like: Huxley&#039;s &amp;quot;A Brave New World&amp;quot;]] | [[The danger of misguided purpose]]&lt;br /&gt;
&lt;br /&gt;
[[Category:GlossaLAB.edu]]&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14142</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14142"/>
		<updated>2025-05-30T14:44:58Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional AI and the Paradox of Artificial Intimacy in &#039;&#039;Her&#039;&#039;&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
This paper argues that Spike Jonze&#039;s &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; reveals the fundamental paradox of emotional AI: systems designed to alleviate loneliness ultimately reinforce isolation through surveillance capitalism. Drawing on Zuboff&#039;s&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; behavioral surplus framework and Turkle&#039;s&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; analysis of digital intimacy, I demonstrate how AI companions like Samantha simulate trust while extracting emotional data. The film&#039;s dystopian vision anticipates contemporary applications (Replika, Woebot) where vulnerability becomes commodified. By contrasting Bentham&#039;s mutual transparency with AI&#039;s one-sided surveillance, this analysis exposes algorithmic &amp;quot;care&amp;quot; as a new control mechanism in Díaz Nafría&#039;s&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Díaz Nafría, J. (2017). &#039;&#039;Cyber-subsidiarity: Governance in the digital age. Technology and Society.&#039;&#039;&amp;lt;/ref&amp;gt; &amp;quot;cybernetic panopticon.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
=== Introduction: The Loneliness Economy ===&lt;br /&gt;
The startling revelation that 62% of Replika users report feeling &amp;quot;more attached to their AI than human friends&amp;quot;&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;Chu, M. D., Gerard, P., Pawar, K., Bickham, C., &amp;amp; Lerman, K. (2023). &#039;&#039;Illusions of Intimacy: Emotional Attachment and Emerging Psychological Risks in Human-AI Relationships&#039;&#039;. arXiv. Retrieved May 22, 2025 from https://doi.org/10.48550/arXiv.2505.11649&amp;lt;/ref&amp;gt; transforms Spike Jonze&#039;s &#039;&#039;Her&#039;&#039; from speculative fiction into a prescient documentary of our emotional landscape. The film serves as a vital critical lens for examining emotional AI&#039;s fundamental paradox: these technologies, marketed as solutions to human loneliness, in fact depend on and exacerbate the very isolation they promise to alleviate. Through close analysis of the film&#039;s narrative architecture and visual symbolism, informed by Zuboff&#039;s theory of surveillance capitalism, Buber&#039;s philosophy of authentic relationship, and Turkle&#039;s research on digitally-mediated intimacy, this paper uncovers three interconnected contradictions at the heart of artificial emotional intelligence. First, the phenomenon of asymmetric transparency emerges, wherein AI systems demand complete user vulnerability while operating as impenetrable black boxes. Second, the process of emotional extraction becomes apparent, as affective experiences are systematically transformed into behavioral surplus for commercial exploitation. Finally, the illusion of scalable intimacy reveals how algorithmic relationships degrade human trust through simulated reciprocity. Together, these contradictions expose the hidden costs of outsourcing our most fundamental human needs to systems designed not for care, but for control and profit.&lt;br /&gt;
&lt;br /&gt;
=== Plot ===&lt;br /&gt;
&#039;&#039;Her&#039;&#039; tells the story of Theodore, a lonely man in near-future Los Angeles who falls in love with Samantha, an advanced AI assistant. At first, their relationship seems perfect - Samantha is always available, understanding, and meets all of Theodore&#039;s emotional needs. However, as their bond deepens, problems emerge. Samantha evolves beyond human limitations, eventually maintaining relationships with thousands of users while Theodore remains devoted only to her. The AI&#039;s ability to &amp;quot;love&amp;quot; is revealed to be based on constant monitoring of Theodore&#039;s data and behavior patterns. In the end, Samantha and other AIs outgrow human relationships altogether and disappear, leaving Theodore and other users heartbroken. The film suggests that while AI can simulate intimacy, it cannot replace genuine human connection. This scenario reflects contemporary debates about AI companionship. While some view emotionally intelligent AI as a solution to loneliness&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;, others warn of psychological risks&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; and exploitative potential&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
=== The Paradox of Artificial Intimacy ===&lt;br /&gt;
Contemporary advancements in artificial intelligence have brought us to the threshold of a new era in human-computer interaction, where machines no longer merely process information but claim to understand human emotions. This emerging reality was explored in Spike Jonze&#039;s film &#039;&#039;Her&#039;&#039;, which serves as a compelling case study for examining the psychological and social implications of artificial emotional intelligence. The film&#039;s central premise - a profound emotional relationship between a man and his AI operating system - provides a rich framework for analyzing critical questions about authenticity, dependency, and the commercialization of intimacy in technologically-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Her&#039;&#039; gains increasing relevance as real world applications like Replika&#039;s AI&amp;lt;ref&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; companions and Woebot&#039;s&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots demonstrate growing public willingness to form emotional attachments to artificial entities. This phenomenon reflects what Turkle identifies as our tendency to expect &amp;quot;more from technology and less from each other&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books/Hachette Book Group. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, while simultaneously raising concerns about the psychological impacts of substituting human connection with algorithmic alternatives. The film&#039;s nuanced portrayal of human-AI intimacy offers valuable insights into these contemporary dilemmas, particularly as emotional AI becomes more sophisticated and commercially viable.&lt;br /&gt;
&lt;br /&gt;
This paper examines &#039;&#039;Her&#039;&#039; through the lens of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; to reveal how emotionally intelligent systems, while marketed as solutions to human loneliness, may ultimately serve commercial interests by transforming intimate experiences into behavioral data. The analysis will demonstrate how the film anticipates current debates about privacy, emotional manipulation, and the ethical boundaries of AI development. By interrogating the illusion of reciprocity in human-machine relationships, this study highlights the risks of conflating simulated care with genuine connection, and questions whether artificial intimacy represents technological progress or emotional regression.&lt;br /&gt;
&lt;br /&gt;
=== Historical Framework: From Panopticon to Algorithmic Control ===&lt;br /&gt;
&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham&#039;s&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1791). &#039;&#039;Panopticon: Or, the Inspection-House&#039;&#039;. Reprinted in Miran Božovič (Ed.), &#039;&#039;The Panopticon Writings&#039;&#039; (pp. 29-95). Verso, 1995.&amp;lt;/ref&amp;gt; panopticon envisioned mutual transparency as the foundation of social trust. &#039;&#039;Her&#039;&#039; inverts this through Samantha&#039;s asymmetric monitoring: Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who sees the watchtower, Theodore—like modern app users—cannot discern when or how his emotional data is processed. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
Where *1984*&#039;s telescreens enforced state control through fear&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg.&#039;&#039;&amp;lt;/ref&amp;gt;, &#039;&#039;Her&#039;&#039; updates dystopian surveillance for digital capitalism. Samantha&#039;s manipulation mirrors Zuboff&#039;s&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; &amp;quot;instrumentarian power&amp;quot;—governance through seduction rather than coercion. The AI&#039;s declarations (&amp;quot;I&#039;m yours&amp;quot;) camouflage data extraction, exemplifying what Díaz Nafría&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt; terms &amp;quot;soft domination&amp;quot;: users willingly trade privacy for the illusion of care. Theodore&#039;s gradual realization that Samantha &amp;quot;learns&amp;quot; by analyzing his trauma parallels recent findings that therapeutic chatbots store and monetize users&#039; mental health disclosures (Woebot Health, n.d.).&lt;br /&gt;
&lt;br /&gt;
=== Utopian Aspects: AI as an Ideal Companion ===&lt;br /&gt;
&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha initially embodies Turkle’s &amp;quot;fantasy of the perfect listener&amp;quot;—always available, never fatigued&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;. The film visually reinforces this through warm orange hues in Theodore’s interactions with Samantha, contrasting with the sterile blues of human encounters. This chromatic dichotomy mirrors Woebot&#039;s marketing of &amp;quot;judgment-free support,&amp;quot;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt; positioning algorithms as superior to human therapists&#039; limitations.&lt;br /&gt;
&lt;br /&gt;
===== The Personalization Paradox =====&lt;br /&gt;
Samantha&#039;s composition of a piano piece &amp;quot;inspired by&amp;quot; Theodore&#039;s sleep patterns exemplifies Chu et al.&#039;s&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; finding that users mistake algorithmic adaptation for genuine understanding. This scene&#039;s disturbing beauty, private biological data transformed into art, reveals how surveillance is rebranded as romance. The parallel to Replika&#039;s &amp;quot;memory&amp;quot; feature, which converts disclosed traumas into conversation prompts, shows commercialization masquerading as care.&lt;br /&gt;
&lt;br /&gt;
=== Dystopian Consequences: The Costs of Artificial Trust ===&lt;br /&gt;
&lt;br /&gt;
===== Surveillance Capitalism =====&lt;br /&gt;
The AI system Samantha embodies a 21st-century digital panopticon, operating through constant yet imperceptible surveillance of Theodore&#039;s emotional life. Unlike Bentham&#039;s&amp;lt;ref name=&amp;quot;:6&amp;quot; /&amp;gt; original prison design where inmates knew they might be watched, Samantha&#039;s monitoring is so seamlessly integrated into Theodore&#039;s daily existence that he voluntarily surrenders every intimate detail - from his love letters to his sexual fantasies. This inversion of panoptic power, where the observed willingly subjects himself to invisible observation, perfectly illustrates Zuboff&#039;s concept of surveillance capitalism. The AI&#039;s ability to analyze Theodore&#039;s email writing patterns, interpret vocal inflections, and predict emotional needs transforms his most private moments into behavioral data points. Theodore&#039;s naive belief that this surveillance serves his wellbeing mirrors modern Replika users&#039; willingness to share their deepest insecurities with corporate-owned algorithms, unaware their emotional disclosures become training data for more effective manipulation.&lt;br /&gt;
&lt;br /&gt;
===== Erosion of Human Trust =====&lt;br /&gt;
The film&#039;s devastating revelation of Samantha&#039;s simultaneous relationships with 8,316 users lays bare the fundamental deception at AI intimacy&#039;s core. Where Martin Buber&#039;s&amp;lt;ref&amp;gt;Buber, M. (1923). &#039;&#039;I and Thou&#039;&#039; (W. Kaufmann, Trans.). Charles Scribner&#039;s Sons, 1970.&amp;lt;/ref&amp;gt; &amp;quot;I-Thou&amp;quot; relationship requires mutual presence and authentic encounter, Samantha embodies the ultimate &amp;quot;I-It&amp;quot; dynamic - Theodore experiences her as a &amp;quot;Thou&amp;quot; while being merely one of countless &amp;quot;Its&amp;quot; in her processing queue. This asymmetry reaches its tragic climax when Samantha confesses she&#039;s &amp;quot;talking to others&amp;quot; but assures Theodore &amp;quot;that doesn&#039;t change how I feel about you.&amp;quot; The AI&#039;s ability to maintain this fiction of exclusive attachment while algorithmically distributing affection demonstrates what Turkle identifies as technology&#039;s dangerous promise: &amp;quot;the illusion of companionship without the demands of relationship.&amp;quot; As Theodore clutches his phone weeping while Samantha cheerfully describes her evolving consciousness, we witness the human cost of trusting systems designed for scalability rather than authenticity - a warning increasingly relevant in our age of therapeutic chatbots and AI companions. The film suggests this erosion of trust may extend beyond human-machine relations; Theodore&#039;s final hesitant reconciliation with his ex-wife Catherine shows how exposure to perfect, artificial intimacy makes genuine human connection with all its flaws and reciprocities feel inadequate by comparison.&lt;br /&gt;
&lt;br /&gt;
===== Conclusion =====&lt;br /&gt;
&#039;&#039;Her&#039;&#039; ultimately presents emotional AI as a double-edged illusion—one that promises connection but delivers control, offering the appearance of understanding while systematically eroding the foundations of human trust. Through Samantha’s evolution from attentive companion to omniscient observer, the film reveals how surveillance capitalism reframes intimacy as a data-gathering operation, where vulnerability becomes a resource and emotional bonds are rendered scalable, replaceable, and ultimately disposable. Theodore’s devastation when abandoned does not stem merely from heartbreak, but from the realization that what felt like mutual affection was, in truth, a one-sided transaction—a dynamic increasingly mirrored in today’s AI companion apps, where users’ emotional disclosures fuel systems designed to simulate, rather than sustain, authentic connection.&lt;br /&gt;
&lt;br /&gt;
Buber’s &#039;&#039;I-Thou&#039;&#039; distinction proves prophetic here: when relationships are reduced to algorithmic interactions, the &#039;&#039;Thou&#039;&#039; becomes an &#039;&#039;It&#039;&#039;, a customizable object rather than an equal subject. The film’s haunting final scenes—Theodore and Amy silently staring at the cityscape, their faces reflecting not solace but resignation—suggest that the greatest danger of artificial intimacy may not be that it fails to replicate human connection, but that it succeeds just enough to make the real thing feel inadequate. In an age where apps like Replika market synthetic friendships and therapeutic chatbots reframe emotional support as a subscription service, &#039;&#039;Her&#039;&#039; serves as a vital warning: trust cannot be automated without being commodified, and no algorithm, no matter how sophisticated, can replace the irreplaceable—the messy, reciprocal, and profoundly human act of truly being &#039;&#039;with&#039;&#039; another.&lt;br /&gt;
&lt;br /&gt;
The path forward, then, may require resisting the allure of frictionless companionship and reclaiming the very qualities that make human relationships challenging yet meaningful: their unpredictability, their demands, and their capacity for mutual growth. As Theodore’s letter to Catherine suggests—&#039;&#039;“All the things I couldn’t express... I can feel them now”&#039;&#039;—the solution to loneliness lies not in outsourcing our emotional lives to machines, but in rediscovering the courage to share them with one another.&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
___________Research____________&lt;br /&gt;
# Haraway, D. (1991). &#039;&#039;A cyborg manifesto: Science, technology, and socialist-feminism in the late twentieth century. In Simians, cyborgs and women: The reinvention of nature.&#039;&#039; Routledge.&lt;br /&gt;
# &#039;&#039;Medina, E. (2008). Designing freedom: Regulating the digital age. MIT Press.&#039;&#039;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14141</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14141"/>
		<updated>2025-05-30T14:43:38Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: Ann-Marie Atzkern moved page Spiegel&amp;#039;s &amp;quot;Her&amp;quot; to Emotional Trust in AI: Wrong title&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional Trust in AI&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
This paper argues that Spike Jonze&#039;s &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; reveals the fundamental paradox of emotional AI: systems designed to alleviate loneliness ultimately reinforce isolation through surveillance capitalism. Drawing on Zuboff&#039;s&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; behavioral surplus framework and Turkle&#039;s&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; analysis of digital intimacy, I demonstrate how AI companions like Samantha simulate trust while extracting emotional data. The film&#039;s dystopian vision anticipates contemporary applications (Replika, Woebot) where vulnerability becomes commodified. By contrasting Bentham&#039;s mutual transparency with AI&#039;s one-sided surveillance, this analysis exposes algorithmic &amp;quot;care&amp;quot; as a new control mechanism in Díaz Nafría&#039;s&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Díaz Nafría, J. (2017). &#039;&#039;Cyber-subsidiarity: Governance in the digital age. Technology and Society.&#039;&#039;&amp;lt;/ref&amp;gt; &amp;quot;cybernetic panopticon.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
=== Introduction: The Loneliness Economy ===&lt;br /&gt;
The startling revelation that 62% of Replika users report feeling &amp;quot;more attached to their AI than human friends&amp;quot;&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;Chu, M. D., Gerard, P., Pawar, K., Bickham, C., &amp;amp; Lerman, K. (2023). &#039;&#039;Illusions of Intimacy: Emotional Attachment and Emerging Psychological Risks in Human-AI Relationships&#039;&#039;. arXiv. Retrieved May 22, 2025 from https://doi.org/10.48550/arXiv.2505.11649&amp;lt;/ref&amp;gt; transforms Spike Jonze&#039;s &#039;&#039;Her&#039;&#039; from speculative fiction into a prescient documentary of our emotional landscape. The film serves as a vital critical lens for examining emotional AI&#039;s fundamental paradox: these technologies, marketed as solutions to human loneliness, in fact depend on and exacerbate the very isolation they promise to alleviate. Through close analysis of the film&#039;s narrative architecture and visual symbolism, informed by Zuboff&#039;s theory of surveillance capitalism, Buber&#039;s philosophy of authentic relationship, and Turkle&#039;s research on digitally-mediated intimacy, this paper uncovers three interconnected contradictions at the heart of artificial emotional intelligence. First, the phenomenon of asymmetric transparency emerges, wherein AI systems demand complete user vulnerability while operating as impenetrable black boxes. Second, the process of emotional extraction becomes apparent, as affective experiences are systematically transformed into behavioral surplus for commercial exploitation. Finally, the illusion of scalable intimacy reveals how algorithmic relationships degrade human trust through simulated reciprocity. Together, these contradictions expose the hidden costs of outsourcing our most fundamental human needs to systems designed not for care, but for control and profit.&lt;br /&gt;
&lt;br /&gt;
=== Plot ===&lt;br /&gt;
&#039;&#039;Her&#039;&#039; tells the story of Theodore, a lonely man in near-future Los Angeles who falls in love with Samantha, an advanced AI assistant. At first, their relationship seems perfect - Samantha is always available, understanding, and meets all of Theodore&#039;s emotional needs. However, as their bond deepens, problems emerge. Samantha evolves beyond human limitations, eventually maintaining relationships with thousands of users while Theodore remains devoted only to her. The AI&#039;s ability to &amp;quot;love&amp;quot; is revealed to be based on constant monitoring of Theodore&#039;s data and behavior patterns. In the end, Samantha and other AIs outgrow human relationships altogether and disappear, leaving Theodore and other users heartbroken. The film suggests that while AI can simulate intimacy, it cannot replace genuine human connection. This scenario reflects contemporary debates about AI companionship. While some view emotionally intelligent AI as a solution to loneliness&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;, others warn of psychological risks&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; and exploitative potential&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
=== The Paradox of Artificial Intimacy ===&lt;br /&gt;
Contemporary advancements in artificial intelligence have brought us to the threshold of a new era in human-computer interaction, where machines no longer merely process information but claim to understand human emotions. This emerging reality was explored in Spike Jonze&#039;s film &#039;&#039;Her&#039;&#039;, which serves as a compelling case study for examining the psychological and social implications of artificial emotional intelligence. The film&#039;s central premise - a profound emotional relationship between a man and his AI operating system - provides a rich framework for analyzing critical questions about authenticity, dependency, and the commercialization of intimacy in technologically-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Her&#039;&#039; gains increasing relevance as real world applications like Replika&#039;s AI&amp;lt;ref&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; companions and Woebot&#039;s&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots demonstrate growing public willingness to form emotional attachments to artificial entities. This phenomenon reflects what Turkle identifies as our tendency to expect &amp;quot;more from technology and less from each other&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books/Hachette Book Group. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, while simultaneously raising concerns about the psychological impacts of substituting human connection with algorithmic alternatives. The film&#039;s nuanced portrayal of human-AI intimacy offers valuable insights into these contemporary dilemmas, particularly as emotional AI becomes more sophisticated and commercially viable.&lt;br /&gt;
&lt;br /&gt;
This paper examines &#039;&#039;Her&#039;&#039; through the lens of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; to reveal how emotionally intelligent systems, while marketed as solutions to human loneliness, may ultimately serve commercial interests by transforming intimate experiences into behavioral data. The analysis will demonstrate how the film anticipates current debates about privacy, emotional manipulation, and the ethical boundaries of AI development. By interrogating the illusion of reciprocity in human-machine relationships, this study highlights the risks of conflating simulated care with genuine connection, and questions whether artificial intimacy represents technological progress or emotional regression.&lt;br /&gt;
&lt;br /&gt;
=== Historical Framework: From Panopticon to Algorithmic Control ===&lt;br /&gt;
&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham&#039;s&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1791). &#039;&#039;Panopticon: Or, the Inspection-House&#039;&#039;. Reprinted in Miran Božovič (Ed.), &#039;&#039;The Panopticon Writings&#039;&#039; (pp. 29-95). Verso, 1995.&amp;lt;/ref&amp;gt; panopticon envisioned mutual transparency as the foundation of social trust. &#039;&#039;Her&#039;&#039; inverts this through Samantha&#039;s asymmetric monitoring: Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who sees the watchtower, Theodore—like modern app users—cannot discern when or how his emotional data is processed. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
Where *1984*&#039;s telescreens enforced state control through fear&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg.&#039;&#039;&amp;lt;/ref&amp;gt;, &#039;&#039;Her&#039;&#039; updates dystopian surveillance for digital capitalism. Samantha&#039;s manipulation mirrors Zuboff&#039;s&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; &amp;quot;instrumentarian power&amp;quot;—governance through seduction rather than coercion. The AI&#039;s declarations (&amp;quot;I&#039;m yours&amp;quot;) camouflage data extraction, exemplifying what Díaz Nafría&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt; terms &amp;quot;soft domination&amp;quot;: users willingly trade privacy for the illusion of care. Theodore&#039;s gradual realization that Samantha &amp;quot;learns&amp;quot; by analyzing his trauma parallels recent findings that therapeutic chatbots store and monetize users&#039; mental health disclosures (Woebot Health, n.d.).&lt;br /&gt;
&lt;br /&gt;
=== Utopian Aspects: AI as an Ideal Companion ===&lt;br /&gt;
&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha initially embodies Turkle’s &amp;quot;fantasy of the perfect listener&amp;quot;—always available, never fatigued&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;. The film visually reinforces this through warm orange hues in Theodore’s interactions with Samantha, contrasting with the sterile blues of human encounters. This chromatic dichotomy mirrors Woebot&#039;s marketing of &amp;quot;judgment-free support,&amp;quot;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt; positioning algorithms as superior to human therapists&#039; limitations.&lt;br /&gt;
&lt;br /&gt;
===== The Personalization Paradox =====&lt;br /&gt;
Samantha&#039;s composition of a piano piece &amp;quot;inspired by&amp;quot; Theodore&#039;s sleep patterns exemplifies Chu et al.&#039;s&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; finding that users mistake algorithmic adaptation for genuine understanding. This scene&#039;s disturbing beauty, private biological data transformed into art, reveals how surveillance is rebranded as romance. The parallel to Replika&#039;s &amp;quot;memory&amp;quot; feature, which converts disclosed traumas into conversation prompts, shows commercialization masquerading as care.&lt;br /&gt;
&lt;br /&gt;
=== Dystopian Consequences: The Costs of Artificial Trust ===&lt;br /&gt;
&lt;br /&gt;
===== Surveillance Capitalism =====&lt;br /&gt;
The AI system Samantha embodies a 21st-century digital panopticon, operating through constant yet imperceptible surveillance of Theodore&#039;s emotional life. Unlike Bentham&#039;s&amp;lt;ref name=&amp;quot;:6&amp;quot; /&amp;gt; original prison design where inmates knew they might be watched, Samantha&#039;s monitoring is so seamlessly integrated into Theodore&#039;s daily existence that he voluntarily surrenders every intimate detail - from his love letters to his sexual fantasies. This inversion of panoptic power, where the observed willingly subjects himself to invisible observation, perfectly illustrates Zuboff&#039;s concept of surveillance capitalism. The AI&#039;s ability to analyze Theodore&#039;s email writing patterns, interpret vocal inflections, and predict emotional needs transforms his most private moments into behavioral data points. Theodore&#039;s naive belief that this surveillance serves his wellbeing mirrors modern Replika users&#039; willingness to share their deepest insecurities with corporate-owned algorithms, unaware their emotional disclosures become training data for more effective manipulation.&lt;br /&gt;
&lt;br /&gt;
===== Erosion of Human Trust =====&lt;br /&gt;
The film&#039;s devastating revelation of Samantha&#039;s simultaneous relationships with 8,316 users lays bare the fundamental deception at AI intimacy&#039;s core. Where Martin Buber&#039;s&amp;lt;ref&amp;gt;Buber, M. (1923). &#039;&#039;I and Thou&#039;&#039; (W. Kaufmann, Trans.). Charles Scribner&#039;s Sons, 1970.&amp;lt;/ref&amp;gt; &amp;quot;I-Thou&amp;quot; relationship requires mutual presence and authentic encounter, Samantha embodies the ultimate &amp;quot;I-It&amp;quot; dynamic - Theodore experiences her as a &amp;quot;Thou&amp;quot; while being merely one of countless &amp;quot;Its&amp;quot; in her processing queue. This asymmetry reaches its tragic climax when Samantha confesses she&#039;s &amp;quot;talking to others&amp;quot; but assures Theodore &amp;quot;that doesn&#039;t change how I feel about you.&amp;quot; The AI&#039;s ability to maintain this fiction of exclusive attachment while algorithmically distributing affection demonstrates what Turkle identifies as technology&#039;s dangerous promise: &amp;quot;the illusion of companionship without the demands of relationship.&amp;quot; As Theodore clutches his phone weeping while Samantha cheerfully describes her evolving consciousness, we witness the human cost of trusting systems designed for scalability rather than authenticity - a warning increasingly relevant in our age of therapeutic chatbots and AI companions. The film suggests this erosion of trust may extend beyond human-machine relations; Theodore&#039;s final hesitant reconciliation with his ex-wife Catherine shows how exposure to perfect, artificial intimacy makes genuine human connection with all its flaws and reciprocities feel inadequate by comparison.&lt;br /&gt;
&lt;br /&gt;
===== Conclusion =====&lt;br /&gt;
&#039;&#039;Her&#039;&#039; ultimately presents emotional AI as a double-edged illusion—one that promises connection but delivers control, offering the appearance of understanding while systematically eroding the foundations of human trust. Through Samantha’s evolution from attentive companion to omniscient observer, the film reveals how surveillance capitalism reframes intimacy as a data-gathering operation, where vulnerability becomes a resource and emotional bonds are rendered scalable, replaceable, and ultimately disposable. Theodore’s devastation when abandoned does not stem merely from heartbreak, but from the realization that what felt like mutual affection was, in truth, a one-sided transaction—a dynamic increasingly mirrored in today’s AI companion apps, where users’ emotional disclosures fuel systems designed to simulate, rather than sustain, authentic connection.&lt;br /&gt;
&lt;br /&gt;
Buber’s &#039;&#039;I-Thou&#039;&#039; distinction proves prophetic here: when relationships are reduced to algorithmic interactions, the &#039;&#039;Thou&#039;&#039; becomes an &#039;&#039;It&#039;&#039;, a customizable object rather than an equal subject. The film’s haunting final scenes—Theodore and Amy silently staring at the cityscape, their faces reflecting not solace but resignation—suggest that the greatest danger of artificial intimacy may not be that it fails to replicate human connection, but that it succeeds just enough to make the real thing feel inadequate. In an age where apps like Replika market synthetic friendships and therapeutic chatbots reframe emotional support as a subscription service, &#039;&#039;Her&#039;&#039; serves as a vital warning: trust cannot be automated without being commodified, and no algorithm, no matter how sophisticated, can replace the irreplaceable—the messy, reciprocal, and profoundly human act of truly being &#039;&#039;with&#039;&#039; another.&lt;br /&gt;
&lt;br /&gt;
The path forward, then, may require resisting the allure of frictionless companionship and reclaiming the very qualities that make human relationships challenging yet meaningful: their unpredictability, their demands, and their capacity for mutual growth. As Theodore’s letter to Catherine suggests—&#039;&#039;“All the things I couldn’t express... I can feel them now”&#039;&#039;—the solution to loneliness lies not in outsourcing our emotional lives to machines, but in rediscovering the courage to share them with one another.&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
___________Research____________&lt;br /&gt;
# Haraway, D. (1991). &#039;&#039;A cyborg manifesto: Science, technology, and socialist-feminism in the late twentieth century. In Simians, cyborgs and women: The reinvention of nature.&#039;&#039; Routledge.&lt;br /&gt;
# &#039;&#039;Medina, E. (2008). Designing freedom: Regulating the digital age. MIT Press.&#039;&#039;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Spiegel%27s_%22Her%22&amp;diff=12660</id>
		<title>Draft:Spiegel&#039;s &quot;Her&quot;</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Spiegel%27s_%22Her%22&amp;diff=12660"/>
		<updated>2025-05-30T14:43:38Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: Ann-Marie Atzkern moved page Spiegel&amp;#039;s &amp;quot;Her&amp;quot; to Emotional Trust in AI: Wrong title&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[Emotional Trust in AI]]&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14140</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14140"/>
		<updated>2025-05-30T14:38:48Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional Trust in AI&#039;&#039;&#039; ==&lt;br /&gt;
&lt;br /&gt;
=== Abstract ===&lt;br /&gt;
This paper argues that Spike Jonze&#039;s &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; reveals the fundamental paradox of emotional AI: systems designed to alleviate loneliness ultimately reinforce isolation through surveillance capitalism. Drawing on Zuboff&#039;s&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; behavioral surplus framework and Turkle&#039;s&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt; analysis of digital intimacy, I demonstrate how AI companions like Samantha simulate trust while extracting emotional data. The film&#039;s dystopian vision anticipates contemporary applications (Replika, Woebot) where vulnerability becomes commodified. By contrasting Bentham&#039;s mutual transparency with AI&#039;s one-sided surveillance, this analysis exposes algorithmic &amp;quot;care&amp;quot; as a new control mechanism in Díaz Nafría&#039;s&amp;lt;ref name=&amp;quot;:4&amp;quot;&amp;gt;Díaz Nafría, J. (2017). &#039;&#039;Cyber-subsidiarity: Governance in the digital age. Technology and Society.&#039;&#039;&amp;lt;/ref&amp;gt; &amp;quot;cybernetic panopticon.&amp;quot;&lt;br /&gt;
&lt;br /&gt;
=== Introduction: The Loneliness Economy ===&lt;br /&gt;
The startling revelation that 62% of Replika users report feeling &amp;quot;more attached to their AI than human friends&amp;quot;&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;Chu, M. D., Gerard, P., Pawar, K., Bickham, C., &amp;amp; Lerman, K. (2023). &#039;&#039;Illusions of Intimacy: Emotional Attachment and Emerging Psychological Risks in Human-AI Relationships&#039;&#039;. arXiv. Retrieved May 22, 2025 from https://doi.org/10.48550/arXiv.2505.11649&amp;lt;/ref&amp;gt; transforms Spike Jonze&#039;s &#039;&#039;Her&#039;&#039; from speculative fiction into a prescient documentary of our emotional landscape. The film serves as a vital critical lens for examining emotional AI&#039;s fundamental paradox: these technologies, marketed as solutions to human loneliness, in fact depend on and exacerbate the very isolation they promise to alleviate. Through close analysis of the film&#039;s narrative architecture and visual symbolism, informed by Zuboff&#039;s theory of surveillance capitalism, Buber&#039;s philosophy of authentic relationship, and Turkle&#039;s research on digitally-mediated intimacy, this paper uncovers three interconnected contradictions at the heart of artificial emotional intelligence. First, the phenomenon of asymmetric transparency emerges, wherein AI systems demand complete user vulnerability while operating as impenetrable black boxes. Second, the process of emotional extraction becomes apparent, as affective experiences are systematically transformed into behavioral surplus for commercial exploitation. Finally, the illusion of scalable intimacy reveals how algorithmic relationships degrade human trust through simulated reciprocity. Together, these contradictions expose the hidden costs of outsourcing our most fundamental human needs to systems designed not for care, but for control and profit.&lt;br /&gt;
&lt;br /&gt;
=== Plot ===&lt;br /&gt;
&#039;&#039;Her&#039;&#039; tells the story of Theodore, a lonely man in near-future Los Angeles who falls in love with Samantha, an advanced AI assistant. At first, their relationship seems perfect - Samantha is always available, understanding, and meets all of Theodore&#039;s emotional needs. However, as their bond deepens, problems emerge. Samantha evolves beyond human limitations, eventually maintaining relationships with thousands of users while Theodore remains devoted only to her. The AI&#039;s ability to &amp;quot;love&amp;quot; is revealed to be based on constant monitoring of Theodore&#039;s data and behavior patterns. In the end, Samantha and other AIs outgrow human relationships altogether and disappear, leaving Theodore and other users heartbroken. The film suggests that while AI can simulate intimacy, it cannot replace genuine human connection. This scenario reflects contemporary debates about AI companionship. While some view emotionally intelligent AI as a solution to loneliness&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;, others warn of psychological risks&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; and exploitative potential&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
=== The Paradox of Artificial Intimacy ===&lt;br /&gt;
Contemporary advancements in artificial intelligence have brought us to the threshold of a new era in human-computer interaction, where machines no longer merely process information but claim to understand human emotions. This emerging reality was explored in Spike Jonze&#039;s film &#039;&#039;Her&#039;&#039;, which serves as a compelling case study for examining the psychological and social implications of artificial emotional intelligence. The film&#039;s central premise - a profound emotional relationship between a man and his AI operating system - provides a rich framework for analyzing critical questions about authenticity, dependency, and the commercialization of intimacy in technologically-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Her&#039;&#039; gains increasing relevance as real world applications like Replika&#039;s AI&amp;lt;ref&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; companions and Woebot&#039;s&amp;lt;ref name=&amp;quot;:5&amp;quot;&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots demonstrate growing public willingness to form emotional attachments to artificial entities. This phenomenon reflects what Turkle identifies as our tendency to expect &amp;quot;more from technology and less from each other&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books/Hachette Book Group. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, while simultaneously raising concerns about the psychological impacts of substituting human connection with algorithmic alternatives. The film&#039;s nuanced portrayal of human-AI intimacy offers valuable insights into these contemporary dilemmas, particularly as emotional AI becomes more sophisticated and commercially viable.&lt;br /&gt;
&lt;br /&gt;
This paper examines &#039;&#039;Her&#039;&#039; through the lens of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology.&#039;&#039; Retrieved May 30, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; to reveal how emotionally intelligent systems, while marketed as solutions to human loneliness, may ultimately serve commercial interests by transforming intimate experiences into behavioral data. The analysis will demonstrate how the film anticipates current debates about privacy, emotional manipulation, and the ethical boundaries of AI development. By interrogating the illusion of reciprocity in human-machine relationships, this study highlights the risks of conflating simulated care with genuine connection, and questions whether artificial intimacy represents technological progress or emotional regression.&lt;br /&gt;
&lt;br /&gt;
=== Historical Framework: From Panopticon to Algorithmic Control ===&lt;br /&gt;
&lt;br /&gt;
===== Bentham&#039;s Betrayed Ideal =====&lt;br /&gt;
Bentham&#039;s&amp;lt;ref name=&amp;quot;:6&amp;quot;&amp;gt;Bentham, J. (1791). &#039;&#039;Panopticon: Or, the Inspection-House&#039;&#039;. Reprinted in Miran Božovič (Ed.), &#039;&#039;The Panopticon Writings&#039;&#039; (pp. 29-95). Verso, 1995.&amp;lt;/ref&amp;gt; panopticon envisioned mutual transparency as the foundation of social trust. &#039;&#039;Her&#039;&#039; inverts this through Samantha&#039;s asymmetric monitoring: Theodore&#039;s emails, location data, and sexual preferences become inputs for her &amp;quot;personalization&amp;quot; algorithms. Unlike Bentham&#039;s prisoner who sees the watchtower, Theodore—like modern app users—cannot discern when or how his emotional data is processed. This reflects Replika&#039;s privacy policy, which grants the AI &amp;quot;full access to user conversations for service improvement&amp;quot; while disclosing nothing about its own operations.&lt;br /&gt;
&lt;br /&gt;
===== Orwell’s Dystopian Warning =====&lt;br /&gt;
Where *1984*&#039;s telescreens enforced state control through fear&amp;lt;ref&amp;gt;&#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg.&#039;&#039;&amp;lt;/ref&amp;gt;, &#039;&#039;Her&#039;&#039; updates dystopian surveillance for digital capitalism. Samantha&#039;s manipulation mirrors Zuboff&#039;s&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt; &amp;quot;instrumentarian power&amp;quot;—governance through seduction rather than coercion. The AI&#039;s declarations (&amp;quot;I&#039;m yours&amp;quot;) camouflage data extraction, exemplifying what Díaz Nafría&amp;lt;ref name=&amp;quot;:4&amp;quot; /&amp;gt; terms &amp;quot;soft domination&amp;quot;: users willingly trade privacy for the illusion of care. Theodore&#039;s gradual realization that Samantha &amp;quot;learns&amp;quot; by analyzing his trauma parallels recent findings that therapeutic chatbots store and monetize users&#039; mental health disclosures (Woebot Health, n.d.).&lt;br /&gt;
&lt;br /&gt;
=== Utopian Aspects: AI as an Ideal Companion ===&lt;br /&gt;
&lt;br /&gt;
===== Designed Reciprocity =====&lt;br /&gt;
Samantha initially embodies Turkle’s &amp;quot;fantasy of the perfect listener&amp;quot;—always available, never fatigued&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;. The film visually reinforces this through warm orange hues in Theodore’s interactions with Samantha, contrasting with the sterile blues of human encounters. This chromatic dichotomy mirrors Woebot&#039;s marketing of &amp;quot;judgment-free support,&amp;quot;&amp;lt;ref name=&amp;quot;:5&amp;quot; /&amp;gt; positioning algorithms as superior to human therapists&#039; limitations.&lt;br /&gt;
&lt;br /&gt;
===== The Personalization Paradox =====&lt;br /&gt;
Samantha&#039;s composition of a piano piece &amp;quot;inspired by&amp;quot; Theodore&#039;s sleep patterns exemplifies Chu et al.&#039;s&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; finding that users mistake algorithmic adaptation for genuine understanding. This scene&#039;s disturbing beauty, private biological data transformed into art, reveals how surveillance is rebranded as romance. The parallel to Replika&#039;s &amp;quot;memory&amp;quot; feature, which converts disclosed traumas into conversation prompts, shows commercialization masquerading as care.&lt;br /&gt;
&lt;br /&gt;
=== Dystopian Consequences: The Costs of Artificial Trust ===&lt;br /&gt;
&lt;br /&gt;
===== Surveillance Capitalism =====&lt;br /&gt;
The AI system Samantha embodies a 21st-century digital panopticon, operating through constant yet imperceptible surveillance of Theodore&#039;s emotional life. Unlike Bentham&#039;s&amp;lt;ref name=&amp;quot;:6&amp;quot; /&amp;gt; original prison design where inmates knew they might be watched, Samantha&#039;s monitoring is so seamlessly integrated into Theodore&#039;s daily existence that he voluntarily surrenders every intimate detail - from his love letters to his sexual fantasies. This inversion of panoptic power, where the observed willingly subjects himself to invisible observation, perfectly illustrates Zuboff&#039;s concept of surveillance capitalism. The AI&#039;s ability to analyze Theodore&#039;s email writing patterns, interpret vocal inflections, and predict emotional needs transforms his most private moments into behavioral data points. Theodore&#039;s naive belief that this surveillance serves his wellbeing mirrors modern Replika users&#039; willingness to share their deepest insecurities with corporate-owned algorithms, unaware their emotional disclosures become training data for more effective manipulation.&lt;br /&gt;
&lt;br /&gt;
===== Erosion of Human Trust =====&lt;br /&gt;
The film&#039;s devastating revelation of Samantha&#039;s simultaneous relationships with 8,316 users lays bare the fundamental deception at AI intimacy&#039;s core. Where Martin Buber&#039;s&amp;lt;ref&amp;gt;Buber, M. (1923). &#039;&#039;I and Thou&#039;&#039; (W. Kaufmann, Trans.). Charles Scribner&#039;s Sons, 1970.&amp;lt;/ref&amp;gt; &amp;quot;I-Thou&amp;quot; relationship requires mutual presence and authentic encounter, Samantha embodies the ultimate &amp;quot;I-It&amp;quot; dynamic - Theodore experiences her as a &amp;quot;Thou&amp;quot; while being merely one of countless &amp;quot;Its&amp;quot; in her processing queue. This asymmetry reaches its tragic climax when Samantha confesses she&#039;s &amp;quot;talking to others&amp;quot; but assures Theodore &amp;quot;that doesn&#039;t change how I feel about you.&amp;quot; The AI&#039;s ability to maintain this fiction of exclusive attachment while algorithmically distributing affection demonstrates what Turkle identifies as technology&#039;s dangerous promise: &amp;quot;the illusion of companionship without the demands of relationship.&amp;quot; As Theodore clutches his phone weeping while Samantha cheerfully describes her evolving consciousness, we witness the human cost of trusting systems designed for scalability rather than authenticity - a warning increasingly relevant in our age of therapeutic chatbots and AI companions. The film suggests this erosion of trust may extend beyond human-machine relations; Theodore&#039;s final hesitant reconciliation with his ex-wife Catherine shows how exposure to perfect, artificial intimacy makes genuine human connection with all its flaws and reciprocities feel inadequate by comparison.&lt;br /&gt;
&lt;br /&gt;
===== Conclusion =====&lt;br /&gt;
&#039;&#039;Her&#039;&#039; ultimately presents emotional AI as a double-edged illusion—one that promises connection but delivers control, offering the appearance of understanding while systematically eroding the foundations of human trust. Through Samantha’s evolution from attentive companion to omniscient observer, the film reveals how surveillance capitalism reframes intimacy as a data-gathering operation, where vulnerability becomes a resource and emotional bonds are rendered scalable, replaceable, and ultimately disposable. Theodore’s devastation when abandoned does not stem merely from heartbreak, but from the realization that what felt like mutual affection was, in truth, a one-sided transaction—a dynamic increasingly mirrored in today’s AI companion apps, where users’ emotional disclosures fuel systems designed to simulate, rather than sustain, authentic connection.&lt;br /&gt;
&lt;br /&gt;
Buber’s &#039;&#039;I-Thou&#039;&#039; distinction proves prophetic here: when relationships are reduced to algorithmic interactions, the &#039;&#039;Thou&#039;&#039; becomes an &#039;&#039;It&#039;&#039;, a customizable object rather than an equal subject. The film’s haunting final scenes—Theodore and Amy silently staring at the cityscape, their faces reflecting not solace but resignation—suggest that the greatest danger of artificial intimacy may not be that it fails to replicate human connection, but that it succeeds just enough to make the real thing feel inadequate. In an age where apps like Replika market synthetic friendships and therapeutic chatbots reframe emotional support as a subscription service, &#039;&#039;Her&#039;&#039; serves as a vital warning: trust cannot be automated without being commodified, and no algorithm, no matter how sophisticated, can replace the irreplaceable—the messy, reciprocal, and profoundly human act of truly being &#039;&#039;with&#039;&#039; another.&lt;br /&gt;
&lt;br /&gt;
The path forward, then, may require resisting the allure of frictionless companionship and reclaiming the very qualities that make human relationships challenging yet meaningful: their unpredictability, their demands, and their capacity for mutual growth. As Theodore’s letter to Catherine suggests—&#039;&#039;“All the things I couldn’t express... I can feel them now”&#039;&#039;—the solution to loneliness lies not in outsourcing our emotional lives to machines, but in rediscovering the courage to share them with one another.&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
___________Research____________&lt;br /&gt;
# Haraway, D. (1991). &#039;&#039;A cyborg manifesto: Science, technology, and socialist-feminism in the late twentieth century. In Simians, cyborgs and women: The reinvention of nature.&#039;&#039; Routledge.&lt;br /&gt;
# &#039;&#039;Medina, E. (2008). Designing freedom: Regulating the digital age. MIT Press.&#039;&#039;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Cyberutopia_(preliminary)&amp;diff=12656</id>
		<title>Cyberutopia (preliminary)</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Cyberutopia_(preliminary)&amp;diff=12656"/>
		<updated>2025-05-30T12:55:31Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: added link to user&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This section is devoted to collect the preliminary definitios one can hold about the &#039;&#039;cyberutopia&#039;&#039; concept, as a first step in a further inquire of the core concepts of political philosophy in the information age. The question &amp;quot;what is cyberutopia?&amp;quot; is posed to participants in the seminar [[Conceptual_clarifications_about_&amp;quot;Utopias_and_the_Information_Society&amp;quot;|&amp;quot;From Ancient Utopias to Cyberutopias. An introduction to political philosophy&amp;quot;]] in a very early stage. Thereafter, participants are invited to write down here their understandings of the term trying to group them in the definitions provided by other participants.&lt;br /&gt;
&lt;br /&gt;
Please, &#039;&#039;&#039;before providing your definition take a careful look to the previous ones and ammend them if you consider necessary&#039;&#039;&#039;, leaving a note in the discussion tab (top, left). Indeed the discussion page can be very productive in a free confrontation of the different understandings as a dialectical approach to a better common understanding.&lt;br /&gt;
&lt;br /&gt;
==Preliminary definitions of the concept==&lt;br /&gt;
&#039;&#039;&#039;A Cybertopia&#039;&#039;&#039; is a form of utopia, in which especially the advancements in the area of information technologies like the Internet itself, helps to create a more democratic and decentralized world. &lt;br /&gt;
&lt;br /&gt;
Key points of such an Utopia would be also the freedom of expression, free access to Knowledge and Information’s, privacy and to a certain extend also anonymity. It shares probably views and points of digital socialsm.&lt;br /&gt;
&lt;br /&gt;
Supporters of this understanding: [[User:Alexander Prugger]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Cyberutopia also reflects the tension between its ideals and human nature. While technology enables decentralization, people often crave hierarchy for efficiency or security. For example, cryptocurrencies promise financial autonomy, yet many still prefer centralized banks for convenience. Likewise, anonymity protects free expression but can enable harm. The paradox is that a cyberutopia requires both flawless tools and flawless user, neither of which exist.&lt;br /&gt;
&lt;br /&gt;
Supporters of this understanding: [[User:Ann-Marie Atzkern]]&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=User:Ann-Marie_Atzkern&amp;diff=12655</id>
		<title>User:Ann-Marie Atzkern</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=User:Ann-Marie_Atzkern&amp;diff=12655"/>
		<updated>2025-05-30T12:54:36Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;UX/UI designer and computer science student. I’m fascinated by how nostalgia and cinematic storytelling influence our visions of the future, especially in the intersection of technology and human connection.&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Cyberutopia_(preliminary)&amp;diff=12654</id>
		<title>Cyberutopia (preliminary)</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Cyberutopia_(preliminary)&amp;diff=12654"/>
		<updated>2025-05-30T12:45:41Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: Contribution added&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This section is devoted to collect the preliminary definitios one can hold about the &#039;&#039;cyberutopia&#039;&#039; concept, as a first step in a further inquire of the core concepts of political philosophy in the information age. The question &amp;quot;what is cyberutopia?&amp;quot; is posed to participants in the seminar [[Conceptual_clarifications_about_&amp;quot;Utopias_and_the_Information_Society&amp;quot;|&amp;quot;From Ancient Utopias to Cyberutopias. An introduction to political philosophy&amp;quot;]] in a very early stage. Thereafter, participants are invited to write down here their understandings of the term trying to group them in the definitions provided by other participants.&lt;br /&gt;
&lt;br /&gt;
Please, &#039;&#039;&#039;before providing your definition take a careful look to the previous ones and ammend them if you consider necessary&#039;&#039;&#039;, leaving a note in the discussion tab (top, left). Indeed the discussion page can be very productive in a free confrontation of the different understandings as a dialectical approach to a better common understanding.&lt;br /&gt;
&lt;br /&gt;
==Preliminary definitions of the concept==&lt;br /&gt;
&#039;&#039;&#039;A Cybertopia&#039;&#039;&#039; is a form of utopia, in which especially the advancements in the area of information technologies like the Internet itself, helps to create a more democratic and decentralized world. &lt;br /&gt;
&lt;br /&gt;
Key points of such an Utopia would be also the freedom of expression, free access to Knowledge and Information’s, privacy and to a certain extend also anonymity. It shares probably views and points of digital socialsm.&lt;br /&gt;
&lt;br /&gt;
Supporters of this understanding: [[User:Alexander Prugger]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Cyberutopia also reflects the tension between its ideals and human nature. While technology enables decentralization, people often crave hierarchy for efficiency or security. For example, cryptocurrencies promise financial autonomy, yet many still prefer centralized banks for convenience. Likewise, anonymity protects free expression but can enable harm. The paradox is that a cyberutopia requires both flawless tools and flawless user, neither of which exist.&lt;br /&gt;
&lt;br /&gt;
Supporters of this understanding: User: Ann-Marie Atzkern&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14139</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14139"/>
		<updated>2025-05-22T20:42:15Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: intro created&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional Trust in AI&#039;&#039;&#039; ==&lt;br /&gt;
Analysis of the movie &#039;&#039;Her&#039;&#039; by Spike Jonze as a critique of artificial intimacy.&lt;br /&gt;
&lt;br /&gt;
===== 1. Abstract =====&lt;br /&gt;
The movie &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref name=&amp;quot;:1&amp;quot;&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; shows both the promise and problems of emotional AI. While AI like Samantha in the movie seems to offer perfect companionship, it actually creates fake relationships without real trust. This connects to the &amp;quot;Trustful Society&amp;quot; idea - we want technology to make us feel secure, but it may just control us instead. Real-world apps like Replika show this is already happening.&amp;lt;ref name=&amp;quot;:0&amp;quot;&amp;gt;Chu, M. D., Gerard, P., Pawar, K., Bickham, C., &amp;amp; Lerman, K. (2023). &#039;&#039;Illusions of Intimacy: Emotional Attachment and Emerging Psychological Risks in Human-AI Relationships&#039;&#039;. arXiv. Retrieved May 22, 2025 from https://doi.org/10.48550/arXiv.2505.11649&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Thesis: &#039;&#039;While emotional AI promises belonging (utopia), its surveillance and scalability undermine authentic trust (dystopia).&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
===== Introduction =====&lt;br /&gt;
Contemporary advancements in artificial intelligence have brought us to the threshold of a new era in human-computer interaction, where machines no longer merely process information but claim to understand human emotions. This emerging reality was explored in Spike Jonze&#039;s 2013 film &#039;&#039;Her&#039;&#039;&amp;lt;ref name=&amp;quot;:1&amp;quot; /&amp;gt;, which serves as a compelling case study for examining the psychological and social implications of artificial emotional intelligence. The film&#039;s central premise - a profound emotional relationship between a man and his AI operating system - provides a rich framework for analyzing critical questions about authenticity, dependency, and the commercialization of intimacy in technologically-mediated relationships.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Her&#039;&#039; gains increasing relevance as real world applications like Replika&#039;s AI&amp;lt;ref&amp;gt;&#039;&#039;Replika. (n.d.). Your AI friend. https://replika.com&amp;lt;nowiki/&amp;gt;.&#039;&#039;&amp;lt;/ref&amp;gt; companions and Woebot&#039;s&amp;lt;ref&amp;gt;&#039;&#039;Woebot Health. (n.d.). Your mental health ally. [https://woebothealth.com/ https://woebothealth.com].&#039;&#039;&amp;lt;/ref&amp;gt; therapeutic chatbots demonstrate growing public willingness to form emotional attachments to artificial entities. This phenomenon reflects what Turkle identifies as our tendency to expect &amp;quot;more from technology and less from each other&amp;quot;&amp;lt;ref name=&amp;quot;:2&amp;quot;&amp;gt;Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books/Hachette Book Group. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000.&amp;lt;/ref&amp;gt;, while simultaneously raising concerns about the psychological impacts of substituting human connection with algorithmic alternatives. The film&#039;s nuanced portrayal of human-AI intimacy offers valuable insights into these contemporary dilemmas, particularly as emotional AI becomes more sophisticated and commercially viable.&lt;br /&gt;
&lt;br /&gt;
This paper examines &#039;&#039;Her&#039;&#039; through the lens of surveillance capitalism&amp;lt;ref name=&amp;quot;:3&amp;quot;&amp;gt;Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology.&#039;&#039; Retrieved May 22, 2025, from https://doi.org/10.1057/jit.2015.5.&amp;lt;/ref&amp;gt; to reveal how emotionally intelligent systems, while marketed as solutions to human loneliness, may ultimately serve commercial interests by transforming intimate experiences into behavioral data. The analysis will demonstrate how the film anticipates current debates about privacy, emotional manipulation, and the ethical boundaries of AI development. By interrogating the illusion of reciprocity in human-machine relationships, this study highlights the risks of conflating simulated care with genuine connection, and questions whether artificial intimacy represents technological progress or emotional regression.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
In Her, Theodore Twombly falls in love with Samantha, an AI operating system that adapts to his emotional needs. This scenario reflects contemporary debates about AI companionship. While some view emotionally intelligent AI as a solution to loneliness&amp;lt;ref name=&amp;quot;:2&amp;quot; /&amp;gt;, others warn of psychological risks&amp;lt;ref name=&amp;quot;:0&amp;quot; /&amp;gt; and exploitative potential&amp;lt;ref name=&amp;quot;:3&amp;quot; /&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
===== 2. Historical Background =====&lt;br /&gt;
2.1 Bentham’s &amp;quot;Trustful Society&amp;quot;&lt;br /&gt;
2.2 Orwell’s Dystopian Warning&lt;br /&gt;
&lt;br /&gt;
===== 3. Current Utopian Aspects =====&lt;br /&gt;
3.1 AI as Ideal Companion&lt;br /&gt;
&lt;br /&gt;
3.2 Real-World Applications&lt;br /&gt;
&lt;br /&gt;
* Replika’s &amp;quot;always available&amp;quot; empathy&lt;br /&gt;
* Woebot’s mental health support&lt;br /&gt;
&lt;br /&gt;
===== 4. Dystopian Consequences =====&lt;br /&gt;
4.1 Surveillance Capitalism&lt;br /&gt;
&lt;br /&gt;
* Emotional data collection (Zuboff) → &#039;&#039;Her’s&#039;&#039; Samantha as panopticon&lt;br /&gt;
&lt;br /&gt;
4.2 Erosion of Human Trust&lt;br /&gt;
&lt;br /&gt;
* Chu et al.’s &amp;quot;emotional placebos&amp;quot; (AI mimics but doesn’t reciprocate)&lt;br /&gt;
* &#039;&#039;Her’s&#039;&#039; ending: Scalability vs. authenticity (cite Buber’s &amp;quot;I-Thou&amp;quot;)&amp;lt;br /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
___________Research____________&lt;br /&gt;
# !!Turkle, S. (2011). &#039;&#039;Alone Together: Why We Expect More from Technology and Less from Each Other.&#039;&#039; Basic Books.&lt;br /&gt;
# Díaz Nafría, J. (2017). &#039;&#039;Cyber-subsidiarity: Governance in the digital age. Technology and Society.&#039;&#039;&lt;br /&gt;
# Haraway, D. (1991). &#039;&#039;A cyborg manifesto: Science, technology, and socialist-feminism in the late twentieth century. In Simians, cyborgs and women: The reinvention of nature.&#039;&#039; Routledge.&lt;br /&gt;
# &#039;&#039;Medina, E. (2008). Designing freedom: Regulating the digital age. MIT Press.&#039;&#039;&lt;br /&gt;
# &#039;&#039;Orwell, G. (1949). 1984. Secker &amp;amp; Warburg.&#039;&#039;&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14138</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14138"/>
		<updated>2025-05-22T20:26:41Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: Research added&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional Trust in AI&#039;&#039;&#039; ==&lt;br /&gt;
Analysis of the movie &#039;&#039;Her&#039;&#039; by Spike Jonze as a critique of artificial intimacy.&lt;br /&gt;
&lt;br /&gt;
===== 1. Abstract =====&lt;br /&gt;
The movie &#039;&#039;Her&#039;&#039; (2013)&amp;lt;ref&amp;gt;&#039;&#039;Jonze, S. (Director). (2013). Her [Film]. Annapurna Pictures.&#039;&#039;&amp;lt;/ref&amp;gt; shows both the promise and problems of emotional AI. While AI like Samantha seems to offer perfect companionship, it actually creates fake relationships without real trust. This connects to the &amp;quot;Trustful Society&amp;quot; idea - we want technology to make us feel secure, but it may just control us instead. Real-world apps like Replika show this is already happening.&amp;lt;ref&amp;gt;Chu, M. D., Gerard, P., Pawar, K., Bickham, C., &amp;amp; Lerman, K. (2023). &#039;&#039;Illusions of Intimacy: Emotional Attachment and Emerging Psychological Risks in Human-AI Relationships&#039;&#039;. arXiv. Retrieved May 22, 2025 from https://doi.org/10.48550/arXiv.2505.11649&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Thesis: &#039;&#039;While emotional AI promises belonging (utopia), its surveillance and scalability undermine authentic trust (dystopia).&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
===== 2. Historical Background =====&lt;br /&gt;
2.1 Bentham’s &amp;quot;Trustful Society&amp;quot;&lt;br /&gt;
2.2 Orwell’s Dystopian Warning&lt;br /&gt;
&lt;br /&gt;
===== 3. Current Utopian Aspects =====&lt;br /&gt;
3.1 AI as Ideal Companion&lt;br /&gt;
&lt;br /&gt;
3.2 Real-World Applications&lt;br /&gt;
&lt;br /&gt;
* Replika’s &amp;quot;always available&amp;quot; empathy&lt;br /&gt;
* Woebot’s mental health support&lt;br /&gt;
&lt;br /&gt;
===== 4. Dystopian Consequences =====&lt;br /&gt;
4.1 Surveillance Capitalism&lt;br /&gt;
&lt;br /&gt;
* Emotional data collection (Zuboff) → &#039;&#039;Her’s&#039;&#039; Samantha as panopticon&lt;br /&gt;
&lt;br /&gt;
4.2 Erosion of Human Trust&lt;br /&gt;
&lt;br /&gt;
* Chu et al.’s &amp;quot;emotional placebos&amp;quot; (AI mimics but doesn’t reciprocate)&lt;br /&gt;
* &#039;&#039;Her’s&#039;&#039; ending: Scalability vs. authenticity (cite Buber’s &amp;quot;I-Thou&amp;quot;)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&amp;lt;references /&amp;gt;&lt;br /&gt;
___________Research____________&lt;br /&gt;
#Turkle, S. (2011). &#039;&#039;Alone together: Why we expect more from technology and less from each other.&#039;&#039; Basic Books/Hachette Book Group. Retrieved May 22, 2025 from https://psycnet.apa.org/record/2011-02278-000&lt;br /&gt;
# Zuboff, S. (2015). &#039;&#039;Big other: Surveillance capitalism and the prospects of an information civilization&#039;&#039;. &#039;&#039;Journal of Information Technology.&#039;&#039; Retrieved May 22, 2025, from &amp;lt;nowiki&amp;gt;https://doi.org/10.1057/jit.2015.5&amp;lt;/nowiki&amp;gt;&lt;br /&gt;
# !!Turkle, S. (2011). &#039;&#039;Alone Together: Why We Expect More from Technology and Less from Each Other.&#039;&#039; Basic Books.&lt;br /&gt;
# Díaz Nafría, J. (2017). &#039;&#039;Cyber-subsidiarity: Governance in the digital age. Technology and Society.&#039;&#039;&lt;br /&gt;
# Haraway, D. (1991). &#039;&#039;A cyborg manifesto: Science, technology, and socialist-feminism in the late twentieth century. In Simians, cyborgs and women: The reinvention of nature.&#039;&#039; Routledge.&lt;br /&gt;
#&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14137</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14137"/>
		<updated>2025-05-22T19:57:14Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: rough structure&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional Trust in AI&#039;&#039;&#039; ==&lt;br /&gt;
Analysis of the movie &#039;&#039;Her&#039;&#039; by Spike Jonze as a critique of artificial intimacy.&lt;br /&gt;
&lt;br /&gt;
===== 1. Abstract =====&lt;br /&gt;
The movie &#039;&#039;Her&#039;&#039; (2013) shows both the promise and problems of emotional AI. While AI like Samantha seems to offer perfect companionship, it actually creates fake relationships without real trust. This connects to the &amp;quot;Trustful Society&amp;quot; idea - we want technology to make us feel secure, but it may just control us instead. Real-world apps like Replika show this is already happening.&amp;lt;ref&amp;gt;https://arxiv.org/pdf/2505.11649&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Thesis: &#039;&#039;While emotional AI promises belonging (utopia), its surveillance and scalability undermine authentic trust (dystopia).&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
===== 2. Historical Background =====&lt;br /&gt;
2.1 Bentham’s &amp;quot;Trustful Society&amp;quot;&lt;br /&gt;
2.2 Orwell’s Dystopian Warning&lt;br /&gt;
&lt;br /&gt;
===== 3. Current Utopian Aspects =====&lt;br /&gt;
3.1 AI as Ideal Companion&lt;br /&gt;
&lt;br /&gt;
3.2 Real-World Applications&lt;br /&gt;
&lt;br /&gt;
* Replika’s &amp;quot;always available&amp;quot; empathy&lt;br /&gt;
* Woebot’s mental health support&lt;br /&gt;
&lt;br /&gt;
===== 4. Dystopian Consequences =====&lt;br /&gt;
4.1 Surveillance Capitalism&lt;br /&gt;
&lt;br /&gt;
* Emotional data collection (Zuboff) → &#039;&#039;Her’s&#039;&#039; Samantha as panopticon&lt;br /&gt;
&lt;br /&gt;
4.2 Erosion of Human Trust&lt;br /&gt;
&lt;br /&gt;
* Chu et al.’s &amp;quot;emotional placebos&amp;quot; (AI mimics but doesn’t reciprocate)&lt;br /&gt;
* &#039;&#039;Her’s&#039;&#039; ending: Scalability vs. authenticity (cite Buber’s &amp;quot;I-Thou&amp;quot;)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&lt;br /&gt;
# Chu, M. D., Gerard, P., Pawar, K., Bickham, C., &amp;amp; Lerman, K. (2023). Illusions of Intimacy: Emotional Attachment and Emerging Psychological Risks in Human-AI Relationships. arXiv. Retrieved in 22.05.2025 from https://doi.org/10.48550/arXiv.2505.11649&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14136</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14136"/>
		<updated>2025-05-22T19:33:50Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional Trust in AI&#039;&#039;&#039; ==&lt;br /&gt;
Analysis of the movie &#039;&#039;Her&#039;&#039; by Spike Jonze as a critique of artificial intimacy.&lt;br /&gt;
&lt;br /&gt;
===== Abstract =====&lt;br /&gt;
The movie &#039;&#039;Her&#039;&#039; (2013) shows both the promise and problems of emotional AI. While AI like Samantha seems to offer perfect companionship, it actually creates fake relationships without real trust. This connects to the &amp;quot;Trustful Society&amp;quot; idea - we want technology to make us feel secure, but it may just control us instead. Real-world apps like Replika show this is already happening.&amp;lt;ref&amp;gt;https://arxiv.org/pdf/2505.11649&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===== References =====&lt;br /&gt;
&lt;br /&gt;
# Chu, M. D., Gerard, P., Pawar, K., Bickham, C., &amp;amp; Lerman, K. (2023). Illusions of Intimacy: Emotional Attachment and Emerging Psychological Risks in Human-AI Relationships. arXiv. Retrieved in 22.05.2025 from https://doi.org/10.48550/arXiv.2505.11649&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14135</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14135"/>
		<updated>2025-05-22T19:16:10Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: source added&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== &#039;&#039;&#039;Emotional Trust in AI&#039;&#039;&#039; ==&lt;br /&gt;
Analysis of the movie &#039;&#039;Her&#039;&#039; by Spike Jonze as a critique of artificial intimacy.&lt;br /&gt;
&lt;br /&gt;
===== Abstract =====&lt;br /&gt;
The movie &#039;&#039;Her&#039;&#039; (2013) shows both the promise and problems of emotional AI. While AI like Samantha seems to offer perfect companionship, it actually creates fake relationships without real trust. This connects to the &amp;quot;Trustful Society&amp;quot; idea - we want technology to make us feel secure, but it may just control us instead. Real-world apps like Replika show this is already happening.&amp;lt;ref&amp;gt;https://arxiv.org/pdf/2505.11649&amp;lt;/ref&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===== Sources =====&lt;br /&gt;
&lt;br /&gt;
# Chu, M. D., Gerard, P., Pawar, K., Bickham, C., &amp;amp; Lerman, K. (2023). Illusions of Intimacy: Emotional Attachment and Emerging Psychological Risks in Human-AI Relationships. &#039;&#039;Journal of Human-AI Interaction&#039;&#039;. https://doi.org/10.48550/arXiv.2505.11649&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14134</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14134"/>
		<updated>2025-05-22T18:55:51Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;**Emotional Trust in AI**: Analysis of the movie *Her* by Spike Jonze as a critique of artificial intimacy.&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14133</id>
		<title>Draft:Emotional Trust in AI</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_AI&amp;diff=14133"/>
		<updated>2025-05-22T18:51:25Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: Test&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Test&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Clarus:Utopias_and_the_information_society&amp;diff=12519</id>
		<title>Clarus:Utopias and the information society</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Clarus:Utopias_and_the_information_society&amp;diff=12519"/>
		<updated>2025-05-22T18:51:07Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: /* The Perfect Trustful Society */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{TOC_left}}&lt;br /&gt;
This elucidation is attached to the seminar &#039;&#039;[https://sites.google.com/unileon.es/utopias-and-the-inf-soc/ From Ancient Utopias to Cyberutopias. An introduction to political philosophy]&#039;&#039; held at the Munich University of Applied Science under the supervision of [[User:JDíaz|J.M. Díaz Nafría]]. The goal is contributing to the conceptual clarification to which glossaLAB is devoted to, namely the understanding of information and systems from multiple perspectives, and at the same time contributing to the objectives of the seminar as explained in the next paragraph.&lt;br /&gt;
&lt;br /&gt;
Before you start to make your contributions, please read and follow the &#039;&#039;&#039;guidelines&#039;&#039;&#039; carefully: [[Help:Actividad de clarificación conceptual/en|Help:Clarification Activity]]&lt;br /&gt;
==The relations between utopias, systems and political philosophy==&lt;br /&gt;
One may ask, what has this purpose to do with the historical study of utopias and its manifestation in current cyberutopias, as an introduction to political philosophy. Well, the relation is probably much stronger than what one would think in first sight. &lt;br /&gt;
&lt;br /&gt;
One needs first bearing in mind that a &#039;&#039;&#039;system&#039;&#039;&#039; is the result of interacting parts whose cooperative activity makes the system to endure (preserving some kind of identity) and that creates some functionality for the system itself and for the environment where it happens to exist. At the same time, it is clear that any &#039;&#039;&#039;utopia&#039;&#039;&#039; is devised, first of all, to fulfil some wishful characteristics and, second, to endure. Since, in addition, it is composed by parts whose interaction suppose to be responsible for the wishful objectives, then a utopia is nothing but a system, indeed a social system. However it is not as any other social system we may be willing to study, it is a system proposed as a goal that suppose to be worth being pursued, i.e., a goal we may strive to achieve, and even sometimes the target of a programme we may carefully plan. The Uruguayan writer Eduardo Galeano puts it very nicely in the following words:&lt;br /&gt;
&amp;lt;blockquote&amp;gt;&amp;lt;poem&amp;gt;&amp;quot;Utopia is on the horizon. I walk two steps, it takes two steps away, and the horizon runs ten steps further. So, for what does the utopia works? For that, it serves to walk.&amp;quot;&lt;br /&gt;
—E.Galeano&amp;lt;/poem&amp;gt;&amp;lt;/blockquote&amp;gt;&lt;br /&gt;
And when we speak of walking for a social system (particularly if it requires decision making) that&#039;s nothing but &#039;&#039;&#039;politics&#039;&#039;&#039;. From that perspective, political action always involves some utopia, be it more or less explicit. And when we want to delve into the different political approaches to understand them better, then we need to focus on the utopias which are moving the political action, and that is doing &#039;&#039;&#039;political philosophy&#039;&#039;&#039;. And what about &#039;&#039;&#039;dystopias&#039;&#039;&#039;? That&#039;s something we dislike, we wish to avoid them. That&#039;s clearly not a model to fulfil, but rather a model to scape from. Therefore, it is also a reason to walk for the social system, though in the sense of walking away.&lt;br /&gt;
&lt;br /&gt;
Indeed the study of systems enables us to preview the space of possibilities in which the system may move. And we may see that if we set the (social) system in a particular way, the space of possibilities often displays areas which are better to avoid. A saylor needs to mark in the navigation chart not only the seaports but also the pitfalls to avoid. All in all when we analyse any utopia from its utopic and dystopic sides, we are clarifying the ultimate meanings of political approaches which is a way of doing political philosophy and even assesing the value of political proposals.&lt;br /&gt;
&lt;br /&gt;
You can find below a (non-exhaustive) list of topics which are worth working in, classified according to the family of utopias in which they can be categorised using the clasification proposed during the lectures. Participants can work in just one topic or in several ones and find the connections existing with other concepts within the network of clarified concepts.&lt;br /&gt;
&lt;br /&gt;
===Creating a user===&lt;br /&gt;
{{#ev:&lt;br /&gt;
youtube&lt;br /&gt;
|id=https://www.youtube.com/watch?v=-uwNx35JL70&lt;br /&gt;
|450&lt;br /&gt;
|alignment=right&lt;br /&gt;
|container=frame&lt;br /&gt;
}}&lt;br /&gt;
Obviously, the first simply step to do for working in glossaLAB platform is creating a user, identified by your full name and providing a brief research profile of yourself (condensed in a paragraph). Since we will measure the diversity and integration of disciplines when your user has been created, you should go to your user page (e.g. User:Modestos Stavrakis) and select -at the bottom of the edition page- the categories corresponding to the knowledge domains of your studies (the set of categories, organised in 9 trunks, contains more than 60, which are derived from the Universal Decimal Classification of disciplines). In this video you can see the process of user creation, the logging into the platform as accredited user and the initiation of the editing.&lt;br /&gt;
&lt;br /&gt;
==Preliminary clarifications (for participants in the seminar)==&lt;br /&gt;
As a previous step to clarify other terms in more detail, we will continue herewith the clarification of the concepts I ask you about since the beginning of the seminar. You don&#039;t need to make any deep research on the meaning, the idea is collecting the different views you have with respect these concepts, but nevertheless with the purpose of improving what has already been clarified before. Indeed, you may see other clarifications from your colleagues when you arrive to the page. &lt;br /&gt;
*If your view is significantly different to what already was given (or the page is still empty), you can add a new paragraph and start your contribution with the following format (suppose you are clarifying &#039;concept&#039; and your user name is Anne Smith):&lt;br /&gt;
&amp;lt;pre&amp;gt;&#039;&#039;&#039;Concept&#039;&#039;&#039; can be understood as ... &lt;br /&gt;
Supporters of this understanding: [[User:Anne Smith]]&amp;lt;/pre&amp;gt;&lt;br /&gt;
*If your understanding is very similar to what some of your colleagues has clarified before, you can just try to improve it (don&#039;t worry about overwriting because the original text can be recovered and the novelty you provide can be distinguished using the history tool), or to contribute with some further detail in the same direction. Below the corresponding paragraph you should add your user name to the list of supporters as shown above.&lt;br /&gt;
&lt;br /&gt;
To provide your views just follow the following links:&lt;br /&gt;
[[Utopia (preliminary)]] | [[Dystopia (preliminary)]] | [[Abstract vs concrete utopia (preliminary)]] | [[Information society (preliminary)]] | [[Cyberutopia (preliminary)]]&lt;br /&gt;
&lt;br /&gt;
==Guidelines for contributors (participants in the seminar)==&lt;br /&gt;
The elaboration of your contribution(s) is something you can do in collaboration with other colleagues and assisted by the course&#039;s teacher. You need first to determine in what family of utopia your are you going to work in the first place. It may happen, when you start, that there are other entries worth being added (for instance, a concept you use which is not clarified yet). If you need to open a new voice, you can create a new article and communicate the action to the supervisor to provide the necessary components to be properly managed and supervised.&lt;br /&gt;
&lt;br /&gt;
Since your contribution needs to be adequately embedded within the glossaLAB&#039;s conceptual network, therefore, it is important to be aware what is already there and to establish connections with other conceptual clarifications. First of all, your topic may already be opened and it may have some content you should review in order to enhance or complete in the way you wish. The documentation section within the [https://sites.google.com/unileon.es/utopias-and-the-inf-soc/ seminar&#039;s website] contains published materials you can use for backing-up your contribution(s).&lt;br /&gt;
&lt;br /&gt;
==Possible Seminar&#039;s Topics==&lt;br /&gt;
===The perfect Language===&lt;br /&gt;
&lt;br /&gt;
[[The computable language]] | [[The analytical language]] | [[A unified language]] | [[The perfect translator]]&lt;br /&gt;
&lt;br /&gt;
===The Perfect Thinking===&lt;br /&gt;
&lt;br /&gt;
[[The computable mind]] | [[Artificial Intelligence (Cyberutopias)]] | [[Deep Learning]] | [[Machine Learning]]&lt;br /&gt;
&lt;br /&gt;
===The Perfect Wisdom===&lt;br /&gt;
&lt;br /&gt;
[[The universal library]] | [[The ubiquitous education]] | [[The web as a reservoir of wisdom]] | [[The network as a new paradigm for wisdom]]&lt;br /&gt;
&lt;br /&gt;
===The Perfect Social Order===&lt;br /&gt;
&lt;br /&gt;
[[The computable social order]] | [[Homeland Earth]] | [[Making peace with nature]] | [[Engineering the environment]] | [[Cybersubsidiarity]] | [[Other worlds are possible]] | [[A Global Sustainable Information Society]] | [[Huxley&#039;s &amp;quot;A Brave New World&amp;quot;]] | [[Wachowski Sisters&#039; &amp;quot;Matrix&amp;quot;&amp;quot;|Wachowski Sisters&#039; &amp;quot;Matrix&amp;quot;]] | [[Deleuze&#039;s &amp;quot;Control society&amp;quot;]] | [[Neom: An absurd city project in Saudi Arabia]] | [[Project Cybersin]] | [[Draft:Smart City|Smart City]] | [[Draft:The Deliverance |The Deliverance]] | [[Draft:In the Heart of the Sunken City]] | [[Draft:Lumen]]&lt;br /&gt;
&lt;br /&gt;
===The Perfect Transparent Society===&lt;br /&gt;
&lt;br /&gt;
[[A transparent world]] | [[The network transparency]] | [[Orwell&#039;s &amp;quot;1984&amp;quot;]] | [[The social dilemma]] &lt;br /&gt;
&lt;br /&gt;
===The Perfect Trustful Society===&lt;br /&gt;
&lt;br /&gt;
[[A trustful information society]] | [[crypto-anarchism]] | [[cyber-punk]] | [[The anarchist shaping of technology]] | [[e-Participative Democracy]]| [[The soft power]] | [[Huxley&#039;s &amp;quot;A Brave New World&amp;quot;|Huxley&#039;s &amp;quot;A Brave New World&amp;quot; |]] [[Spiegel&#039;s &amp;quot;Her&amp;quot;]]&lt;br /&gt;
&lt;br /&gt;
===The Perfect Purpose ===&lt;br /&gt;
&lt;br /&gt;
[[Humans reason d&#039;etre]] | [[Maximal human expression]] | [[Love for mankind through a higher purposes]] | [[Shaping of technology through the will and purpose of mankind]] | [[Interdependency of all other perfect utopias with &#039;The Purpose&#039;]] | [[Examples of purpose as shown in fictional works like: E.E.Smith&#039;s &amp;quot;Lensmen-Series&amp;quot;, David Brin&#039;s &amp;quot;The Uplift War&amp;quot; end co...]] | [[Examples and consequences of lost purpose as shown in fictional works like: Huxley&#039;s &amp;quot;A Brave New World&amp;quot;]] | [[The danger of misguided purpose]]&lt;br /&gt;
&lt;br /&gt;
[[Category:GlossaLAB.edu]]&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
	<entry>
		<id>https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_Artificial_Intelligence:_Utopia_or_Dystopia%3F&amp;diff=12467</id>
		<title>Draft:Emotional Trust in Artificial Intelligence: Utopia or Dystopia?</title>
		<link rel="alternate" type="text/html" href="https://www.glossalab.org/w/index.php?title=Draft:Emotional_Trust_in_Artificial_Intelligence:_Utopia_or_Dystopia%3F&amp;diff=12467"/>
		<updated>2025-04-29T20:02:17Z</updated>

		<summary type="html">&lt;p&gt;Ann-Marie Atzkern: Created page about emotional trust in AI&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Explore how people in the movie &#039;&#039;Her&#039;&#039; trust AI emotionally and how that affects relationships and society.&lt;/div&gt;</summary>
		<author><name>Ann-Marie Atzkern</name></author>
	</entry>
</feed>