I Predict a Riot

Prediction is a word I am starting to see and hear a lot more recently. It's not new: my fitness tracker predicts my steps each day and my expected sleep patterns, my smart lights predict when I will be home, and my smart thermostat also likes to know when I am home, ensuring I don't walk into a cold house.

Sounds great, doesn't it? A warm, well-lit house, and me achieving my daily steps and optimal health. All these things I have bought and opted into: I have read the T&Cs (skim read, they are usually extensive and dull), checked for a privacy policy and understand what the technology does. However, I have little control over what the company does with my digital self (my data), and limited control over the rules used to create some of the predictions.

This data is me, and it is being used in silos to create predictions of my behaviour based on my previous behaviours. Kinda makes sense. However, what happens when I am working at home, or I want to have a lazy day, or I want the heat on all day (we all know the first thing we do on Xmas day is get the heating on)? The data I have produced contradicts the algorithm-generated prediction, but nothing happens to me as it's just a fitness tracker, a thermostat or a light bulb. I may get the odd annoying notification reminding me to walk more, but it's a nudge in a context I understand and have created.

Last week we had Dr Jeremy Knox on the podcast (podcast information at the bottom of the page), who spoke about the datafication of society, data ethics and learning analytics. In preparation, Michael Gallagher (my fellow podcast presenter, guerilla radio enthusiast and education guru) recommended I read Jeremy's paper: 'Data Power in Education: Exploring Critical Awareness with the "Learning Analytics Report Card"'.

One of the things that has stuck in my mind from speaking to Jeremy, and from the paper, is the potential use of data by institutions central to democracy and society, such as the justice system.

Discussing a future scenario of crime prediction using big data, Mayer-Schonberger and Cukier (2013, 161) claim, "[t]o accuse a person of some possible future behaviour is to negate the very foundation of justice: that one must have done something before we can hold him accountable for it." They go on to suggest, "The danger is much broader than criminal justice; it covers all areas of society, all instances of human judgement in which big-data predictions are used to decide whether people are culpable for future acts or not" (Mayer-Schonberger and Cukier 2013, 162).

– Knox, J. (2017). Data Power in Education: Exploring Critical Awareness with the "Learning Analytics Report Card".

So let's imagine the latest imaginary gadget: the stay safe gadget. A wearable, trendy piece of technology that links you to your surroundings (IoT), has full access to your daily tasks and purchase history, and groups you into geographical, demographic and psychographic groups to help keep you safe. The stay safe gadget uses algorithms to help you stay safe based on all these behaviours and presumptions drawn from your data and the other users' data (in turn keeping them safe too).

So Myles Blaney likes to keep on top of trends and has a stay safe gadget. He has been waking through the night recently (recorded by his fitness watch), has frequently visited the doctor, has been assigned counselling but attended infrequently (recorded in a central database accessible by his stay safe gadget, with the sessions recorded and attendance tracked by facial recognition), has recently started to follow a neighbour (via geo-tracking), lives in an area with high crime statistics and recently bought a set of knives.

Based only on the information provided above, what do you think? I hope you would be asking for context, further details and so on; however, imagine an algorithm that has been programmed to base predictions on a complex structure of rules. A black box of technical language and intertwined rules that spits out a prediction based on numbers and calculations. A black box that will be used for everyone: to create groups, find patterns, predict your safety and predict how safe others are around you.
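To make the "black box" concrete, here is a deliberately crude sketch of how such a rule-based scorer might work. Everything here is invented for illustration: the rule names, weights and threshold are assumptions, not any real product's logic. Real systems would be vastly more complex, but the point stands: the output is a bare label with no context and no explanation.

```python
# Hypothetical illustration only: a toy rule-based risk scorer.
# All rule names, weights and the threshold are invented for this
# example; no real system is being described here.

def risk_score(signals: dict) -> float:
    """Sum the weights of every rule that fires into one opaque number."""
    rules = {
        "disturbed_sleep": 0.2,
        "missed_counselling": 0.3,
        "high_crime_area": 0.25,
        "followed_neighbour": 0.35,
        "bought_knives": 0.4,
    }
    return sum(weight for rule, weight in rules.items() if signals.get(rule))

def predict(signals: dict, threshold: float = 1.0) -> str:
    # The output is a bare label: no context, no "why", no appeal.
    return "FLAG" if risk_score(signals) >= threshold else "SAFE"

# Myles's data, stripped of all the context that explains it:
myles = {
    "disturbed_sleep": True,
    "missed_counselling": True,
    "high_crime_area": True,
    "followed_neighbour": True,
    "bought_knives": True,
}
print(predict(myles))  # FLAG - the algorithm never asks for the backstory
```

Notice that nothing in the code can distinguish a knife bought while helping a blind neighbour from anything more sinister; the signals are all the model sees.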

Let's revisit Myles and his situation. He has been waking through the night because he was recently hit by a car while on his bike and has nightmares about the incident. He has been to the doctor, who recommended painkillers and some counselling to help get Myles back on his bike; however, Myles works and struggles to attend the sessions. Myles started a new job which involves him renting a flat away from home, and to keep costs down it's not in a great area. Myles's neighbour is partially blind, and twice a week he helps them to the shop, where Myles bought some knives because the flat's knives were useless.

So would my stay safe gadget check for the above, or would I be flagged, arrested, watched? Would my data self be tainted, and how would this impact the human me? Prediction or hindsight?

[Image: Minority Report]

So should we be concerned about the ever-growing datafication of everything around us and how quickly we are building models that supposedly predict my future (I am not going to discuss the bias in models, or the lack of subject-matter understanding among the humans who create predictive models for areas where they have no knowledge)? Should we be concerned about the whitewashing of ethics by large corporations to make them suit their needs, and the lack of a central set of ethics that all should adhere to?

We are starting to see more companies utilising broader data sets and more advanced technology to construct predictions in education. The below tweet from Jesse Stommel (who recently spoke @ ALT) highlights the concerns regarding a new product which would measure student engagement and advise the teacher when to intervene. A society assimilated and stereotyped by predictive modelling and patterns, where the justice system can predict a riot… or create one.

 

PS Podcast information for those that made it to the bottom of the blog 🙂
