Examining trust and agency in emotionalized AI through a 4E and biosemiotic lens: a case study of AI companionship in Japan
Ponton D.
Writing – Original Draft Preparation
2025-01-01
Abstract
Artificial agents are not just replacing human effort in the workplace, healthcare and finance. They are rapidly becoming surrogates for relations traditionally coded as human-to-human. Where older chatbots could only follow simple pre-programmed sets of rules, handle narrowly formulated prompts and, in turn, respond with formulaic replies, the latest generation deploys advanced large language models, text-based emotion recognition algorithms, machine learning, voice capabilities and life-like avatars, allowing them to recall past conversations, remember important dates and produce fresh, contextual and nuanced interactions. Creators of artificial companions claim their products can address a person's core psychological and emotional needs: feeling they are being listened to; that their opinions are being validated; that someone cares about them and is ready to provide non-judgmental 24/7 emotional support as well as pragmatic solutions to their problems (Mantello and Ho, AI Soc, 2022; Mantello et al., Hum Soc Sci Commun 10:1–16, 2023; Mantello et al., AI Soc, 2024). Others, who design specifically for adult-rated content, also claim that digital companions can serve as compliant conduits eager to service a human agent's sexual fantasies. Utilizing first-hand experiences as case studies, we argue that AI companions exhibit semiotic agency, which is necessary for intimacy, but intimacy depends on trust, a high-level cognitive capacity. The degree of trust a human agent subjectively places in AI influences their perception of the limits to the semiotic authority they may bestow upon it and the depth of their emotional investment. Once humans bestow trust upon machines, the combination of machine computation and human affectivity can become tremendously powerful in transforming subjectivity. If intimate trust in AI depends on semiotic agency, we ask whether companion AI's (in)ability to appear emotionally engaged may strengthen intimacy, even though its emotions are artificial.


