Abstract
Voice Command Devices (VCDs) such as the ubiquitous Amazon Echo have entered the lives and homes of people the world over. Although emotion recognition and digital empathy are idealistic goals in the development of AI and intelligent agents, current technology lacks outward emotional understanding, and the personas offer only Alexithymic (i.e., showing no understanding of emotion) responses. Despite extensive research by large multinational technology organizations, authentic human-like empathic interaction with intelligent agents has not yet been achieved. Consequently, users are lulled into a false sense of security, believing that their emotions remain private. This paper determines that, despite Alexa's demonstrated lack of emotion and emotional understanding, Voice Command Devices such as the Amazon Echo can deduce emotions such as sadness through inferential data. This is demonstrated through responses to questions that elicit the same information as those posed by health practitioners to identify potential cases of depression. Such data paves the way for parent companies to target future advertising effectively and to build EMOTOgraphic models. As intelligent agents give users no indication of this, most would be unaware that combined inferential data could be so revealing, and potentially extremely profitable from a sales and marketing perspective. This raises serious ethical and privacy concerns as intelligent agents such as Alexa are gradually and incrementally cured of their Alexithymia indicators.
| Original language | English |
| --- | --- |
| Pages (from-to) | 244-255 |
| Number of pages | 12 |
| Journal | CEUR Workshop Proceedings |
| Volume | 2259 |
| Publication status | Published - 2018 |
| Event | 26th AIAI Irish Conference on Artificial Intelligence and Cognitive Science, AICS 2018, Dublin, Ireland, 6 Dec 2018 – 7 Dec 2018 |
Keywords
- Alexa
- Amazon Echo
- Emotion
- Inferential data