Trust-based interactions with robots are increasingly common in the marketplace, workplace, on the road, and in the home. However, a looming concern is that people may not trust robots as they do humans. While trust in fellow humans has been studied extensively, little is known about how people extend trust to robots. Here we compare trust-based investments and emotions across three nearly identical economic games: human-human trust games, human-robot trust games, and human-robot trust games where the robot's decision impacts another human. Robots in our experiment mimic humans: they are programmed to make reciprocity decisions based on previously observed behaviors by humans in analogous situations. We find that people invest similarly in humans and robots. By contrast, the social emotions elicited by the interactions (but not the non-social emotions) differed across human and robot trust games, and did so lawfully. Emotional reactions depended on how one's trust game decision interacted with the partnered agent's decision, and on whether another person was affected economically and emotionally.
Schniter, E., Shields, T. W., & Sznycer, D. (2019). Trust in humans and robots: Economically similar but emotionally different. ESI Working Paper 18-22. Retrieved from https://digitalcommons.chapman.edu/esi_working_papers/282/