Moral Psychology in Human-Robot Interaction
Maartje de Graaf
Date: 16:00 – 16:30, Thursday, 18.02.2021
Location: MS Teams
Title: Moral Psychology in Human-Robot Interaction
Abstract: As the integration of robots into society gains prominence, it becomes increasingly important to examine ethical questions such as how humans should design, deploy, and treat robots. Specifically, as robots enter human communities, they become part of social structures governed by norms: the social, moral, and legal rules of how to (not) behave in specific contexts. Community members are expected to follow these norms, and if robots become members of human communities —even in highly restricted roles— people will expect robots to follow norms as well. Which norms robots should follow, and how those norms could be implemented in robotic architectures, is currently unknown. Robots will only be able to integrate into our elaborate normative systems if they are built to read, and participate in, those systems. By exploring people’s psychological understanding of robots as intentional moral agents, we can draft the guidelines needed to implement norm capacity correctly and effectively in robot systems. In my talk, I will present some recent findings regarding morally sensitive behaviors in human-robot interaction, with a specific focus on norm violation, moral judgments, trust assessment, and social response. These results reveal our expectations regarding robots taking part in our human social system of norms, which may offer some guidance for how robots could deal with our human expectations regarding their role in these normative systems.