Bias in, Bias Out – the Similarity-Attraction Effect Between Chatbot Designers and Users

2021 
Biases in algorithmic systems tend to discriminate against certain user groups. To ensure that the decisions made by these systems are fair, it is necessary to understand how biases in human cognition and language find their way into a system and affect user perception. In this study, we examined the emergence of such biases in the development and design of chatbots, focusing on the gender identities of chatbot designers and users. To this end, 13 participants designed chatbot dialogues, which were then presented to 421 participants in an online survey. The interaction between designers' and users' genders affected users' affective reactions: when the genders of users and designers matched, users showed more positive and less negative affect. Designers' experience did not reduce this similarity-attraction effect of gender identity, and gender-fair language did not elicit more positive or less negative affect among women. The results underline the need to consider the gender identities of designers and their impact on language so that non-discriminatory systems can be designed.