The Impact of Emotional Experience on AI Product Perception

© 2025 by IJCTT Journal
Volume-73 Issue-3
Year of Publication : 2025
Authors : Olga Khryapchenkova
DOI : 10.14445/22312803/IJCTT-V73I3P104
How to Cite?
Olga Khryapchenkova, "The Impact of Emotional Experience on AI Product Perception," International Journal of Computer Trends and Technology, vol. 73, no. 3, pp. 32-41, 2025. Crossref, https://doi.org/10.14445/22312803/IJCTT-V73I3P104
Abstract
This article examines the emotional aspects of user experience when interacting with AI, focusing on conversational AI and robotics, both with and without voice or physical embodiment. Key factors in the emotional perception of AI are explored, such as emotion detection, emotional design, and AI anthropomorphization (including the uncanny valley effect), and how these influence empathy and cognitive trust. The roles of chatbot personas, text-to-speech systems, and user mental models are also discussed, highlighting their impact on AI adoption, value realization, engagement, and the sense of companionship. The article offers insights for designing emotionally intelligent AI systems that promote positive human-AI interactions.
Keywords
AI anthropomorphization, AI product perception, Voice assistant persona, Emotional user interface, Text-to-speech.