
Developing a project idea – what application area of data and AI ethics might you want to address? What type of ethical intervention could your project propose? You might have multiple options, but how do you narrow this down, so it is feasible in the timeframe?

When considering application areas of Data and AI Ethics, I am very interested in the representativeness of datasets, and how this limits AI applications. As highlighted by Kiemde & Kora (2020, p.2), “for fair AI, the quality of data and representativeness of data are of great importance”. There has been considerable media coverage of AI applications that predominantly recognise white faces, or that prefer male candidates over female candidates. On a basic level, datasets have to represent the population they are intended to serve, and should be audited against minimum thresholds of representation to ensure that AI does not reproduce the representation gaps present in its training data. When AI is deployed in contexts with different demographics, it should be adjusted to reflect the representation of that context. 
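As a rough illustration of what such an audit might look like, the sketch below compares each group's share of a dataset against its share of a reference population, flagging groups that fall below a minimum ratio. The attribute, benchmark shares, and threshold are all hypothetical placeholders, not a prescribed methodology.

```python
from collections import Counter

def audit_representation(records, attribute, benchmarks, min_ratio=0.8):
    """Flag groups whose share of the dataset falls below min_ratio
    times their share of the reference population (benchmarks)."""
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    flagged = {}
    for group, pop_share in benchmarks.items():
        data_share = counts.get(group, 0) / total
        if data_share < min_ratio * pop_share:
            flagged[group] = round(data_share, 3)
    return flagged

# Hypothetical example: women make up 30% of the dataset but 50% of
# the population, so they fall below the 0.8 * 0.5 = 0.4 threshold.
records = [{"gender": "male"}] * 70 + [{"gender": "female"}] * 30
benchmarks = {"male": 0.5, "female": 0.5}  # assumed population shares
print(audit_representation(records, "gender", benchmarks))
# → {'female': 0.3}
```

A check like this only catches gaps along attributes that are recorded at all; the "data blindness" discussed below is precisely the case where the missing groups never appear in the data in the first place.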

When I was living in South Africa, I became acutely aware of intersectional issues in data representation. For those experiencing multilayered inequality, the introduction of new AI technology could exacerbate these inequalities – a black woman theoretically experiences lack of representation on two levels, as opposed to a white woman, who experiences lack of representation only in the dimension of gender (Gwagwa et al., 2020). Moreover, “data blindness” occurs when people fall outside of a formal lens or setting, and are therefore excluded from ‘representative’ population statistics and, consequently, ‘representative’ datasets (Gwagwa et al., 2020). Living or running a business in an informal setting, as is common in developing countries, adds another intersectional layer to considerations of representation. When these individuals are excluded or limited by the representativeness of datasets, they are also excluded or limited from AI services and social support instruments. This limitation is exacerbated by lack of access to these AI systems, which could effectively render these individuals ‘invisible’. Another example can be seen in senior citizens, who may not be able to participate in crowdsourced data collection campaigns, excluding a vital perspective of elders in the community. 

With the resources, knowledge and time available to me, it is difficult to imagine an ethical intervention that I could implement to mitigate this issue. Directly, I could address it by attempting to survey individuals who may not be tech savvy or who fall outside of formal lenses, creating an open dataset which includes underserved populations. However, this is a significant task, given that a dataset requires many data points to be useful. To fit within the timeframe, I could narrow the task down by surveying one underserved group, such as smaller or informal businesses, or senior citizens. Alternatively, I could attempt to educate these individuals on how to participate in data collection projects, or compile a resource listing ways to reach underserved citizens who cannot participate online, drawing attention to which groups lack online access and are therefore ill-considered in AI solutions. Nevertheless, I believe this issue is more pronounced in the African context, especially when considering informal businesses. During the course of this year, I hope to brainstorm which interventions could truly serve the underrepresented, and to consult with academics on which interventions are feasible, realistic, and genuinely useful. 
