In this extra post, Vassilis Galanos invites us to reflect on the sociopolitical backdrop behind the fast adoption of technologies like Generative Artificial Intelligence in Higher Education, such as the casualisation of academic employment in a landscape pressured by research excellence frameworks. Dr Galanos is a Teaching Fellow in Science, Technology and Innovation Studies at the School of Social and Political Science.
“One fails to see, although it could hardly be more obvious, that pessimism is not a problem but a symptom” (Nietzsche 1968: 24).
Reacting against or enthusiastically adopting Generative Artificial Intelligence (GenAI) in Higher Education amounts to sweeping a dusty old problem under an already damp carpet. It becomes imperative to critically examine the factors influencing contemporary technological changes that every now and again seem to pose an existential threat, while typically acting as “novelty traps” (Rayner 2004). The massive adoption rates of generative AI technologies, such as ChatGPT, warrant a deeper analysis, especially when juxtaposed with recent trends in the working conditions of academic staff and the behavioural patterns of students. This post aims to unravel the sociopolitical substrata that not only support but also accelerate these sociotechnical phenomena.
The revelations from two recent publications highlight a distressing correlation that merits urgent attention. On 1st February 2024, The Guardian reported that more than half of UK undergraduates confess to using (Gen)AI to assist with their essays (Adams 2024). Just two days prior, the UK’s University and College Union (UCU) shed light on another dire statistic: around two-thirds (66%) of research staff find themselves on precarious fixed-term contracts, many of which last less than a year (University and College Union 2024). That is an almost 25% increase since the publication of Barnes & O’Hara’s (1999) study on the management of academics on short-term contracts. I suggest that these figures should not be treated as mere coincidences but as indicative of a deeper, systemic issue within the landscape of Higher Education and academic employment. Simbürger & Neary (2016) warned us of “taxi professors” in Chile eight years ago – and while the “gig economy” is perhaps not the best metaphor for academic environments, its use by the UCU is telling of an understudied perception of the Higher Education profession. Warnings about the “university in ruins” have also been in place for almost three decades (Readings 1996), with university life steering its existence towards research excellence frameworks, stretched between the poles of a Nation-State-serving and a Globalisation-serving academic landscape.
Paul Virilio’s theory of “turbocapitalism” serves as an apt lens through which to view these developments. Virilio identified the accelerating pace of life under capitalism, marked by a fragmented working schedule and a collective fixation on speed and efficiency. This phenomenon has permeated the academic world, mirroring the broader societal trend towards instant gratification and the consumption of bite-sized content – a tiktokisation of everyday life: we live in a microchip megamachine. In such an environment, the rapid adoption of GenAI technologies like ChatGPT is a predictable outcome, as students and researchers seek to meet the demanding expectations placed upon them. As mentioned, the gig economy may not be an apt metaphor, but it does play a role in the co-orchestration of a turbocapitalist university. Baldwin (2021) has recently shown how contemporary university business models are associated with a long history of capturing native lands, the militarisation of university goals, and increased gentrification and precarisation of the local workforce, while Rabiei-Dastjerdi et al. (2022) exemplify how such gentrification processes are deeply correlated with short-let platform economy models, with universities standing in the middle of this sociotechnical shaping – Edinburgh locals, University students, and newly appointed staff members have learned this the hard way during the ongoing housing crisis.
Furthermore, the concept of alienation, as articulated by Karl Marx, provides valuable insights into the predicaments faced by academic workers. Many academics find themselves estranged from their work, engaged in tasks unrelated to their fields of interest or expertise. This detachment, coupled with the high-pressure environment of academia, creates a fertile ground for the adoption of AI technologies capable of producing work that passes the “Turing test”. In essence, the academic becomes disconnected from their intellectual output, viewing it as a mere commodity or a box-to-be-ticked rather than a reflection of their scholarly engagement. This aspect of Generative AI, as an extension of a broader alienating process when it is used in the workplace, is indeed overlooked by critical labour scholars (Hui 2023).
These observations underscore the necessity of moving beyond a technologically deterministic perspective. GenAI did not emerge in a vacuum but rather as a response to existing, mutually reinforcing social, political, and cultural work conditions and technical demands. The challenge, therefore, lies in shaping an actionable research agenda that addresses these underlying substrata. This entails a commitment to fostering slow research and rich engagement with subject matter, thereby moving away from a purely transactional model of education. What would that look like? Berg & Seeber (2016) have shown in their Slow Professor initial examples of tactics that can be embedded in academic culture to reduce anxiety and stress: remaining offline, questioning the quantification of how much we do, challenging the expectation of outcome timeliness, remaining playful in academic endeavours, saving time to do nothing as part of doing something, and reducing the time spent talking about time. We should, nevertheless, consider the power hierarchies embedded in such initiatives – as Vostal suggests when introducing the volume Inquiring into Academic Timescapes (2021), “slow academia” is often a privilege of the less casualised academic staff, and new modes of “chronosolidarity” have to be established. As Fleming (2021) contends, at “Edu-factory” speed there is little or no hope of remaining slow.
A collaborative effort among educators at all levels, policymakers, and technology developers is essential to counter the infrastructural challenge of academic turbocapitalism. A multi-stakeholder approach should prioritise the development of policies that support secure and meaningful academic employment, thereby reducing the reliance on “permanent temporariness” (Rasmussen & Tove 2012) – or rewarding current short-term contract holders by offering them permanent roles, as proposed by various scholars investigating academic job insecurity (Barnes & O’Hara 1999, Ylijoki 2010). Additionally, educational curricula must evolve to encourage deep, critical engagement with subject matter, rather than the mere acquisition of credentials for employability. Moreover, the aforementioned hypothesis should be tested: more conversations need to take place, in qualitative or quantitative, anonymous or eponymous formats, about the correlation between uses of automation and labour casualisation. How can a tech industry-academia-government partnership for a slow and experiential appreciation of research be fostered?
It is crucial that securing time offline, and time for meaningful interaction with colleagues and one’s research subject, is further supported by cultivating conditions for responsible technology use within academia. Efforts are already being directed towards establishing clear guidelines for the use of GenAI, ensuring that these technologies supplement rather than supplant the intellectual labour of students and researchers, but these are not enough: they lack key elements of broader infrastructural knowledge about these technologies’ environmental footprint, the intellectual property trade-offs involved in their use, and the often unseen and equally precarious labour underpinning their efficiency and maintenance (Shaji George et al 2023, Swist & Gulson 2023). Researchers and technologists should work together to ensure transparency about GenAI’s bio-geographic value chain and labour process. Learning about how GenAI is produced might be more valuable than learning how best to use it, as it can assist in appreciating the value of one’s own creativity.
Almost a year ago, I proposed in Teaching Matters what I saw as the best ways to integrate GenAI in education while maintaining awareness of the academic culture on which its flourishing relies and of its social underpinnings – my argument has not changed since then – but we now have more evidence about the policies we need to establish to regulate, not ChatGPT or GenAI, but a generation of academics and students who are treated like AI algorithms. Casualised algo-demics grading student essay-generators feed into a loop that might well lead to the academic version of GenAI’s greatest present technical threat: model collapse, the degradation of a system’s performance because of its increasing reliance on synthetic data, also known as the “curse of recursion” (Shumailov et al 2023).

In conclusion, the interplay between the adoption of generative AI in higher education and the sociopolitical context that fuels this trend demands a nuanced, comprehensive response. By acknowledging the broader forces at play, including the interacting registers of the precarisation of academic labour and psychological well-being, the environmental register, and the cultural valorisation of speed (Xi & Galanos 2024), we can begin to reimagine a future of Higher Education that values depth, engagement, and meaningful scholarship. It is only through such a collective, reflective effort that we can navigate the challenges and opportunities presented by technological advancements in a manner that benefits the academic community and society, each individual psyche, and the surrounding environment at large.
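To make the “curse of recursion” concrete, here is a minimal toy sketch, assuming only Python and NumPy (my own illustration, not code from Shumailov et al): a one-dimensional Gaussian “model” is repeatedly re-fitted to samples drawn from its own previous generation, and its spread gradually collapses. The sample size and number of generations are arbitrary illustrative choices.

```python
import numpy as np

# Toy illustration of "model collapse" / the "curse of recursion":
# a simple Gaussian "model" is repeatedly re-fitted to synthetic data
# sampled from its own previous generation.

rng = np.random.default_rng(0)

n_samples = 100      # size of each generation's "training set" (arbitrary)
generations = 2000   # number of retraining rounds on purely synthetic data

# Generation 0: "real" data drawn from a standard normal distribution.
data = rng.normal(loc=0.0, scale=1.0, size=n_samples)

for gen in range(generations):
    # "Train" the model: estimate mean and standard deviation from the current data.
    mu, sigma = data.mean(), data.std()
    if gen % 400 == 0 or gen == generations - 1:
        print(f"generation {gen:4d}: mean = {mu:+.3f}, std = {sigma:.3e}")
    # The next generation is trained only on data sampled from this model.
    data = rng.normal(loc=mu, scale=sigma, size=n_samples)

# The printed standard deviation shrinks over the generations: the model
# progressively forgets the variability (the tails) of the original data.
```

The analogy drawn in this post is similar in spirit: a loop in which casualised graders and essay-generating tools feed on each other’s synthetic output risks the same kind of narrowing.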
- Want to join the conversation? Participate in the ‘Staff-Student Workshop on Generative AI and the University’ this Thursday (Feb 22nd 2024): https://www.eventbrite.co.uk/e/staff-student-workshop-on-generative-ai-and-the-university-tickets-818315733387
Acknowledgment: I want to thank Dr Karen Gregory for sharing Simbürger & Neary (2016) with me, Prof Mark Paterson for In the Shadow of the Ivory Tower, Cassius Frazer Smith for Dark Academia, and Dr Joséphine Foucher for valuable feedback on this post’s first draft.
References
Adams, R. (2024). More than half of UK undergraduates say they use AI to help with essays. The Guardian. 01 February 2024. https://www.theguardian.com/technology/2024/feb/01/more-than-half-uk-undergraduates-ai-essays-artificial-intelligence
Baldwin, D. L. (2021). In the shadow of the ivory tower: How universities are plundering our cities. Bold Type Books.
Barnes, N., & O’Hara, S. (1999). Managing academics on short term contracts. Higher Education Quarterly, 53(3), 229-239. https://onlinelibrary.wiley.com/doi/epdf/10.1111/1468-2273.00128
Berg, M., & Seeber, B. K. (2016). The slow professor: Challenging the culture of speed in the academy. University of Toronto Press.
Fleming, P. (2021). Dark Academia: How Universities Die. London: Pluto Press.
Hui, Y. (2023). ChatGPT, or the Eschatology of machines. E-Flux Journal 137.
Nietzsche, F. (1968). The Will to Power. Trans. W. Kaufmann and R. J. Hollingdale. New York: Vintage Books.
Galanos, V. (2023). ChatGPTeaching or ChatGPCheating? Arguments from a semester with large language models in class. Teaching Matters. 18 and 25 April 2023. Part 1: https://www.teaching-matters-blog.ed.ac.uk/chatgpteaching-or-chatgpcheating-arguments-from-a-semester-with-large-language-models-in-class-part-1/
Rabiei-Dastjerdi, H., McArdle, G., & Hynes, W. (2022). Which came first, the gentrification or the Airbnb? Identifying spatial patterns of neighbourhood change using Airbnb data. Habitat International, 125, 102582. https://www.sciencedirect.com/science/article/pii/S0197397522000790
Rasmussen, B., & Tove, H. (2012). Permanent temporariness? Changes in social contracts in knowledge work. Nordic Journal of Working Life Studies, 2(1), 5-22. https://rossy.ruc.dk/index.php/njwls/article/view/2349
Rayner, S. (2004). The novelty trap: Why does institutional learning about new technologies seem so difficult? Industry and Higher Education, 18(6), 349-355. https://journals.sagepub.com/doi/abs/10.5367/0000000042683601
Readings, B. (1996). The University in Ruins. Harvard University Press.
Simbürger, E., & Neary, M. (2016). Taxi professors: academic labour in Chile, a critical-practical response to the politics of worker identity. Workplace: A Journal for Academic Labor, (28).
Shaji George, A., Hovan George, A., & Gabrio Martin, A. (2023). The Environmental Impact of AI: A Case Study of Water Consumption by ChatGPT. Partners Universal International Innovation Journal, 1(2), 97-104. https://puiij.com/index.php/research/article/view/39
Shumailov, I., Shumaylov, Z., Zhao, Y., Gal, Y., Papernot, N., & Anderson, R. (2023). The curse of recursion: Training on generated data makes models forget. arXiv preprint. https://arxiv.org/pdf/2305.17493.pdf
Swist, T., & Gulson, K. N. (2023). Instituting socio-technical education futures: encounters with/through technical democracy, data justice, and imaginaries. Learning, Media and Technology, 48(2), 181-186. https://www.tandfonline.com/doi/full/10.1080/17439884.2023.2205225
University and College Union (2024). 66% of Research Staff on Insecure Contracts: New report exposes ‘gig-economy’ reality of prestigious university research departments. UCU News Website. 31 January 2024. https://www.ucu.org.uk/article/13444/New-report-exposes-gig-economy-reality-of-prestigious-university-research-departments
Virilio, P. (2012). The Great Accelerator. Translated by Julie Rose. Cambridge and Malden: Polity.
Vostal, F. (Ed.) (2021). Inquiring into Academic Timescapes. Emerald Publishing Limited.
Xi, I. & Galanos, V. (2024, forthcoming). Facing GAIa: Tale of Three ChatGPToxicities. Deleuze & Guattari Studies.
Ylijoki, O. H. (2010). Future orientations in episodic labour: Short-term academics as a case in point. Time & Society, 19(3), 365-386.
Vassilis Galanos
Vassilis Galanos (it/ve/vem) is a Teaching and Research Fellow at the Edinburgh College of Art, University of Edinburgh, currently researching Generative AI risks to journalism as part of the BRAID UK project. Vassilis researches and publishes on the interplay of expectations and expertise in the development of AI, robotics, and internet technologies, with further interests in cybernetics, media theory, invented religions, oriental and continental philosophy, community-led initiatives, and art. Vassilis serves as associate editor of the journal Technology Analysis and Strategic Management and as of June will be Lecturer in Digital Work at the University of Stirling. Vassilis often plays the mouth harp, using it to invite students back from class breaks.
Twitter handle: @fractaloidconvo