Gender as Emotive AI and the Case of ‘Nadia’: Regulatory and Ethical Implications

2021 
This article unpacks the regulatory and ethical problematics of 'Nadia', an artificial intelligence (AI) powered virtual assistant developed for use in the Australian Government's National Disability Insurance Agency (NDIA). We explore how Nadia is gendered female; utilises high-risk AI technologies, including emotive AI and machine learning, to monitor highly sensitive health and biometric data; and was developed for use by a group deemed vulnerable to further human rights violations under international law. Drawing on the human rights frameworks of the EU and the European Convention on Human Rights, particularly the rights to data protection and privacy, we explore how a system like Nadia interferes with, and potentially violates, the fundamental human rights of vulnerable groups, and discuss what regulatory provisions and frameworks could have been put in place to safeguard Nadia.