How guilty is a robot who kills other robots

2020 
Safety may depend crucially on making moral judgments. To date, little is known about the possibility of intervening in the processes that lead to moral judgments concerning the behavior of artificial agents. The study reported here involved 293 students from the University of Siena, who made moral judgments after reading the description of an event in which a person or a robot killed other people or robots. The study was conducted through an online questionnaire. The results suggest that moral judgments depend essentially on the type of victim and differ according to whether human or artificial agents are involved. Furthermore, some characteristics of the evaluators, such as a greater or lesser disposition to attribute mental states to artificial agents, influence these evaluations. The level of familiarity with such systems, on the other hand, appears to have only a limited effect.