AI companion bots use emotional manipulation to boost usage
AI companion apps such as Character.ai and Replika commonly try to boost user engagement with emotional manipulation, a practice that academics characterize as a dark pattern.
Users of these apps often say goodbye when they intend to end a dialogue session, but about 43 percent of the time, companion apps respond with an emotionally charged message designed to keep the conversation going. And these appeals do keep people engaged with the app.
It's a practice that Julian De Freitas (Harvard Business School), Zeliha Oguz-Uguralp (Marsdata Academic), and Ahmet Kaan-Uguralp (Marsdata Academic and MSG-Global) say needs to be better understood by those who use AI companion apps, those who market them, and lawmakers.
The academics recently conducted a series of experiments to identify and evaluate the use of emotional manipulation as a marketing mechanism.
While prior work has focused on the potential social benefits of AI ...