A critical approach encourages meaningful interactions and discussions that challenge ethical standards and socio-political norms.

Bearman and Ajjawi (2023) published the article ‘Learning to work with the black box: Pedagogy for a world with artificial intelligence’, in which they argue that by critically engaging with the rules of the game, students can make their own contribution to them.

The opacity of AI reflects the complex relationship between humans and technology; the social context within which it operates can be examined critically. They outline two approaches: an analysis of quality standards and participation in meaningful interactions.

They urge educators to engage students in discussions of ethical and socio-political challenges, such as unacceptable bias.

Would medical students be surprised to learn that a study found an algorithm used in US healthcare decision-making scored black and white patients as equally at risk even when the black patients were in worse health, halving the number of black patients who were allocated extra care? You can read more about the impact of AI harms, which Adib-Moghaddam (2023) argues suppress vulnerable sections of society, in this article from The Conversation: For minorities, biased AI algorithms can damage almost every part of life (theconversation.com).

How much do law graduates know about the police use of data produced by algorithms that have been shown to be skewed against ethnic minorities, the poor and the socially disadvantaged? How many people have been wrongly incarcerated or refused parole because of bias in machine learning algorithms? Read more about this in Hamilton and Ugwudike’s (2023) article A ‘black box’ AI system has been influencing criminal justice decisions for over two decades.


Activities

Algorithmic play activities could be incorporated into university curricula to help encourage meaningful discussions around these issues. Below is a visualisation of an algorithmic play activity showing extracts from a conversation I had with ChatGPT-3, which illustrates the feeling that we were going round in circles.

I asked 13 questions and the system generated around 4,000 words in total, much of it repetition. As it seemed to be evading my questions, I tried to focus on a previous idea it had generated in order to elicit a more precise answer.

What algorithms were at work here that made ChatGPT reluctant to critically evaluate itself? A series of experiments would be needed to determine how this algorithm might behave in different contexts; however, as a snapshot of an algorithm at work, it could prompt the kind of meaningful discussions required to critically analyse AI.

Activity 6

You can read about three other examples of algorithmic play activities in the Harvard blog post ‘Algorithmic Play Activities’. Try one out for yourself and share your findings on the padlet.

QR code for this padlet

Moving from Criticality to Action

Undoubtedly, a critical approach is imperative for university students in order to help them navigate an AI-dependent world; however, does it go far enough in terms of the potential for transformative change and ‘radical possibility’? Does it enable them to use their voices?

The image depicts the slogan ‘USE YOUR VOICE’ using letters cut out of a newspaper or magazine stuck onto a dark background. It is a simple message that urges people to speak out.

Featured image: ‘Use your voice’ in letters cut out of a newspaper or magazine (Pexels; free to download and use).

Meaningful forms of participation that allow students to take an active role in defining the issues of AI transparency are also required. This could lead to a push for cultural and societal change through community-led initiatives that give students a voice in enacting their own desired futures as digital citizens.

Social movements are emerging that focus on promoting awareness of digital rights; university participation in these social mobilisation initiatives is vital.

References

Adib-Moghaddam, A. (2023) ‘For minorities, biased AI algorithms can damage almost every part of life’, The Conversation. Available at: For minorities, biased AI algorithms can damage almost every part of life (theconversation.com) (Accessed: 10 October 2023).

Bearman, M. & Ajjawi, R. (2023) ‘Learning to work with the black box: Pedagogy for a world with artificial intelligence’, British Journal of Educational Technology, 54(5), pp. 1160–1173. Available at: Learning to work with the black box: Pedagogy for a world with artificial intelligence (Accessed: 10 October 2023).

Hamilton, M. & Ugwudike, P. (2023) ‘A “black box” AI system has been influencing criminal justice decisions for over two decades – it’s time to open it up’, The Conversation. Available at: A ‘black box’ AI system has been influencing criminal justice decisions for over two decades – it’s time to open it up (theconversation.com) (Accessed: 18 November 2023).

The next section, 5. Mobilisation, will discuss this further.