Automation is not a moral deus ex machina: electrophysiology of moral reasoning toward machine and human agents

Published: December 22, 2022
Publisher's note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.

Authors
F. Cassioli, L. Angioletti, M. Balconi

Abstract
The diffusion of automated decision-making systems could represent a critical crossroads for future society. Automated technology could feasibly be involved in morally charged decisions, with major ethical consequences. In the present study, participants (n=34) took part in a task composed of moral dilemmas in which the agent (human vs. machine) and the type of behavior (action vs. inaction) were randomized. Responses in terms of the evaluation of the morality, consciousness, responsibility, intentionality, and emotional impact of the agent's behavior, reaction times (RTs), and EEG data (delta, theta, alpha, beta, gamma powers) were collected. The data showed that participants apply different moral rules depending on the agent: humans are considered more moral, responsible, intentional, and conscious than machines. Interestingly, the emotional impact derived from the moral behavior was perceived as more severe for humans, with decreased RTs. In the EEG data, increased gamma power was detected when subjects evaluated the intentionality and the emotional impact of machines compared to humans. Higher beta power in the frontal and fronto-central regions was detected during the evaluation of the emotional impact derived from machines. Moreover, right temporal activation was found when judging the emotional impact caused by humans. Lastly, a generalized alpha desynchronization occurred in the left occipital area when subjects evaluated the responsibility derived from inaction behaviors. The present results provide evidence for the existence of different norms when judging the moral behavior of machine and human agents, pointing to a possible asymmetry in moral judgment at the cognitive and emotional levels.

How to Cite

Cassioli, F., Angioletti, L., & Balconi, M. (2022). Automation is not a moral deus ex machina: electrophysiology of moral reasoning toward machine and human agents. Medicina E Morale, 71(4), 391–411. https://doi.org/10.4081/mem.2022.1217