The seduction of coding in a data-driven society
I attended the second seminar in the Code Acts in Education series, entitled ‘Learning through code/learning to code’, held at the University of Edinburgh on Friday 9th May 2014. The event aimed to examine the sociotechnical interaction of software code and educational institutions. In the midst of mostly social scientists, informaticists, and educationalists, I had to adjust my vocabulary: digital epistemological techno-utopianism was the order of the day! Ben Williamson, University of Stirling, set the scene by stating that the world runs on code: software has become an increasingly powerful influence on all aspects of everyday life, but is academia in danger of being manipulated by the “algorithmists” of commercial technology companies?
Jenny Ozga, University of Oxford, spoke of the proliferation of software code that allows us to analyse, visualise, and communicate educational data but, alongside this seductive shift towards engaging pictures and graphs, we need to consider the validity of the data behind them. In her talk, entitled ‘Governing By Inspection: Coded Knowledge’, Jenny presented data on the schools inspectorate in England and suggested that, as governing becomes more networked, big data sites reconstitute knowledge as policy-forming rather than policy-informing in an increasingly politicised arena. She cautioned us about the dangers of de-contextualising data, and acknowledged the tension inspectors face in arriving at a judgement between embodied evaluation (during school visits) and disembodied, de-contextualised performance data (e.g. RAISEonline).
Next up was bright young thing Matt Finn, who is completing his PhD at the University of Durham. He presented ‘Forging Futures in a Data-Based School’, the school in question being Parkside Academy in County Durham. Here, an abundance of data intended to enable judgements about learning is binding teachers’ and pupils’ futures together. So far, so good, but this breaks down when visions of the future diverge: there are misalignments between pupils as they perceive themselves, as they are known to teachers through personal interaction, and as they are represented through their data. Matt concluded by saying there is a real need for research into the way people experience and make sense of data.
Simon Buckingham-Shum, Open University, presented on ‘Theory-Free Learning Analytics?’ and emphasised the importance of intention, meaning, and context when interpreting data from learning analytics. He reminded us that data are loaded with political and ethical implications, and that numerous factors impact on learning analytics: ontological, algorithmic, semiotic, and political. I liked Simon’s analogy of learning analytics as “digital mirrors”: allowing people to see an algorithmic identity will most likely cause them to construct themselves differently. He ended by stressing the need for humans to follow up and create a dialogue about the data captured.
In ‘Multimodal profusion in MOOCs’, Jeremy Knox, University of Edinburgh, rounded off the day’s presentations with a discourse on how sociomaterial entanglements within the E-Learning and Digital Cultures MOOC counter an over-emphasis on human agency, acknowledging the ways software and algorithms co-produce digital work rather than serving as simple tools for human use.
As seductive as Jeremy’s post-humanist thesis was, I am a firm believer in the essential role of humans in making sense of all that coding and big data have to offer. This seemed to be the common thread running through the day’s talks and break-out discussions, and I left the conference feeling enlightened… and most definitely not obsolete!
Paula Smith, IS-TEL Secondee