Falling in love with proxies

“Data Power in Education: Exploring Critical Awareness with the ‘Learning Analytics Report Card’” (Knox 2017) discusses, amongst other issues, the limitations of measuring proxies for learning: observable behaviour that is easier to measure than learning itself and thus stands in as an approximation of it (Godin 2012).

In the literature I read for my assignment, which examined the use of robot teachers in school classrooms, the issue of learning analytics was raised repeatedly (Sharkey 2016; Selwyn 2018; Watters 2015). Robots can observe and record individual student behaviour, such as hand raising in class, to assess levels of engagement. Student engagement is used as a proxy for learning: an engaged student is a learning student (although this assumption is debatable (Coe 2014)). Robots can also measure activity that is not visible to the human teacher; biometric bracelets that detect slight physiological changes can be worn by students to measure emotional arousal. Robots can use this data to detect waning student interest and spice up their lessons accordingly (Sharkey 2016). Knox interrogates the emphasis that learning analytics places on the correlation of data, ‘the what’, at the expense of understanding causation, ‘the why’ (Knox 2017). In the robot example, students are raising hands, and there are physiological indications that students are (or are not) aroused, but without understanding why, the question arises: are we measuring student engagement, or are we measuring what we think is student engagement? Hand raising in the primary school can also correlate with frequent bathroom visits or ‘telling on’. Likewise, emotional arousal could indicate exciting peer interaction, or simply excitement at the presence of the robot itself.

The limitations of measuring learning proxies are not confined to the field of learning analytics; they already exist in traditional classroom practice. In my years as a primary school teacher I became increasingly frustrated with the culture of measurement in our education system. Biesta’s ‘Good Education in an Age of Measurement’ (Biesta 2009) helped me to understand my frustrations. Biesta asserts that the focus on measurement makes us lose sight of purpose. As he puts it, ‘we value what we measure and don’t measure what we value’. His arguments explained the profound sense of futility I experienced as I administered a relentless stream of assessments. I would argue that this culture of measurement is fertile ground for the growth of learning analytics.

Knox cites the example of Course Signals, a learning analytics system that informs ‘what’ the relationship is between the student and the learning outcome (failing or passing) but does not explain ‘why’. He goes on to state that ‘students are urged to foster uncritical relationships with data reporting, and to understand their educational progress in narrowly defined ways’ (Knox 2017). I contend that this is already the case in traditional reporting systems in education, where the grades on a student’s term report reflect the extent to which learning outcomes have been achieved.

At this point you may be thinking that I have misunderstood the point Knox was making: assessment doesn’t rely on correlation to draw conclusions about passing or failing, since one is able to analyse the assessment itself to understand what caused the student to pass or fail. Such a notion relies heavily on the (misguided) assumption that performance in the assessment is an adequate proxy for learning (Coe 2014).

It takes great skill to design a meaningful assessment: content should be weighted to reflect the structure of the course, and tasks should cover a range of cognitive levels, which also require weighting. Teachers in South Africa produce their own assessments, which are mostly issued without any meaningful moderation. Even assuming the teacher manages to design a correctly weighted assessment, the medium of assessment (test, project, oral presentation, role play or poster, to name a few) affects performance depending on the student’s level of literacy, social skills, public speaking skills or computer literacy. One also has to consider the context in which the test is delivered. Did the student have (a decent) breakfast? Did they get enough sleep? Are they emotionally stable? Do they feel safe in their classroom?

In the end we are left with a term report that represents educational progress in an incredibly narrow way.  More often than not an assessment mark tells us how well a particular child could perform a particular assessment task, on a particular day, in a particular setting.  Are we in fact measuring learning or are we measuring what we think is learning?

I think this quote says it all:

When we fall in love with a proxy, we spend our time improving the proxy instead of focusing on our original (more important) goal instead. (Godin 2012)

Learning analytics is a good fit for the culture of assessment in our schools: we have become so obsessed with measuring what we think is learning that we have lost sight of the original aims of education.

 

Biesta, Gert. 2009. “Good Education in an Age of Measurement: On the Need to Reconnect with the Question of Purpose in Education.” Educational Assessment, Evaluation and Accountability (formerly: Journal of Personnel Evaluation in Education) 21 (1): 33–46.

Coe, Robert. 2014. “What Makes Great Teaching.” presented at the IB World Regional Conference (AEM), Den Haag NL. https://www.ibo.org/globalassets/events/aem/conferences/2015/robert-coe.pdf.

Godin, Seth. 2012. “Avoiding the False Proxy Trap.” Seth’s Blog. November 9, 2012. https://seths.blog/2012/11/avoiding-the-false-proxy-trap/.

Knox, Jeremy. 2017. “Data Power in Education: Exploring Critical Awareness with the ‘Learning Analytics Report Card.’” Television & New Media 18 (8): 734–52.

Selwyn, Neil. 2018. “Robots in the Classroom? Preparing for the Automation of Teaching.” The BERA Blog. 2018. https://www.bera.ac.uk/blog/robots-in-the-classroom-preparing-for-the-automation-of-teaching.

Sharkey, Amanda J. C. 2016. “Should We Welcome Robot Teachers?” Ethics and Information Technology 18 (4): 283–97.

Watters, Audrey. 2015. “Teaching Machines and Turing Machines: The History of the Future of Labor and Learning.” Hack Education. August 10, 2015. http://hackeducation.com/2015/08/10/digpedlab.

 
