
Data and education – observations from the shop floor

When I became Director of Studies at Peponi House Prep School, one of my jobs was to set up a data tracking system for the pupils, using standardised testing. Peponi House was a little school in the suburbs of Nairobi which had suddenly grown exponentially, becoming an accredited member of the Independent Association of Prep Schools (IAPS). It was clear that relying on teacher intuition and a mishmash of formative and summative assessments would no longer cut the mustard. Kenyan middle-class parents are aspirational and expect that their children will go on to boarding school and university in the UK and the USA. They expected the school to justify educational decisions with material evidence, as did the UK school inspectors. Which was where I came in.

So, we started to go down the route of collecting data, analysing it and then, crucially, doing something with it.

How did I go about establishing a system from nothing? My educational agenda had to align with the data science (Knox et al. 2020), so the first thing I had to do was decide what we wanted to know.

We wanted:

  • to compare our whole-school outcomes with those of comparable schools in the UK – were we offering an education akin to schools in the UK? We were working in a vacuum in Kenya, without points of reference.
  • to plot year group progress.
  • to compare year groups as they progressed.
  • to look at individual academic progress.

We wanted the data as a tool to evaluate the performance of both the pupils and, I’m afraid to say, the teachers. We wanted better efficiency and more transparency (Eynon). However, I knew that the data could not simply be a quick ‘fix’ and, as Eynon also points out, it needed to be used to ‘empower, support and facilitate’.

In a nutshell, to fast-forward five years, our use of data was a resounding success. Academic achievement radically improved. Staff morale improved. Teaching methods improved. We were able to identify gifted and talented pupils. We were able to take pupils off the Learning Support register. We were able to catch pupils before they ‘fell’. We were able to track progress and make accurate predictions. Our pupil numbers rose, parental support improved and we were awarded ‘Outstanding’ in all areas in the school inspection. (As a British-curriculum school, we fall under the same jurisdiction as British schools.)

What did we do that was so successful?

Rather than go deeply into detail about the exact tests we used to gather data, I’d like to make a few key points.

Timings

This proved crucial. Rather than test the pupils at the end of the academic year, as is traditionally the case, I decided to test them at the beginning. This meant that the teachers could use the data generated in a timely fashion. It makes no sense to sit standardised tests in June: the teacher is exhausted; the teacher writes a report based on some of the findings but has run out of time to follow up on them; the pupils go on holiday; the new year arrives; the teacher has new pupils; and unless said teacher is extremely dedicated, the data gathered in June collects dust in a file whilst the teacher ‘gets to know’ the new pupils, and so on. The testing becomes a stand-alone exercise that only serves to give a snapshot of achievement in a given week. Beyond that it has little worth. By testing in September, everyone is motivated to use the data to inform decisions and teaching practice. You still get the snapshot but, and this is the crucial thing, the pupils benefit directly as the year progresses. There is much more buy-in on the part of the teachers.

Age of testing and what you test

In my experience, if you are testing children under the age of 11, you need to take the results with a large pinch of salt. Some children are late developers, some are lazy, some are backed up by overly motivated mothers. For monitoring purposes, until 11 years old, it is much more interesting to measure a child’s developed ability. To that end, we used a set of tests called Performance Indicators in Primary Schools (PIPS), run by the Centre for Evaluation and Monitoring (at Durham University, and now just passed to Cambridge University). These tests have recently been re-named InCAS.


The results from our testing flagged up all sorts of information that we were able to act upon. Most importantly, they showed an individual’s achievement relative to their developed ability; in other words, they showed us which individuals were under-achieving. This is very different from a CAT test.
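To illustrate the comparison being made here, the under-achievement flag can be sketched in a few lines of Python. Everything below – the pupil names, the scores and the gap threshold – is hypothetical and is in no way CEM’s actual methodology; the point is simply that an attainment score is set against a developed-ability score on the same standardised scale, and a large shortfall triggers a follow-up.

```python
# Hypothetical sketch: flag pupils whose attainment trails their
# developed-ability score by more than a chosen threshold.
# Scores and threshold are illustrative only.

def flag_underachievers(pupils, gap_threshold=10):
    """pupils is a list of (name, ability_score, attainment_score)
    tuples on the same standardised scale. Return (name, gap) pairs
    for pupils whose shortfall exceeds gap_threshold."""
    flagged = []
    for name, ability, attainment in pupils:
        gap = ability - attainment
        if gap > gap_threshold:
            flagged.append((name, gap))
    return flagged

pupils = [
    ("A", 118, 102),  # able, but attainment lags: gap of 16
    ("B", 95, 97),    # working at or above expectation
    ("C", 130, 125),  # small gap, within tolerance
]
print(flag_underachievers(pupils))  # [('A', 16)]
```

Pupil A would never be caught by a raw attainment cut-off – a score of 102 looks perfectly respectable – which is exactly why measuring achievement against developed ability, rather than in isolation, mattered to us.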

Training of teachers

It was vital that the teachers played an active role in the standardised testing. To that end, they were encouraged to run the tests themselves. There was training in how to interpret the results and what they meant. This was followed up with class observations and peer-to-peer training and support, so that teachers knew how to differentiate their teaching accordingly. The data became a support rather than a simple answer.

Yet very little changed…

In terms of what the teachers were doing in the classroom, very little actually changed. This will be the focus of my next blog.

I am a great supporter of standardised testing if and only if it is used to inform and support the teacher. It takes hours of work to use the data correctly. As an integral part of teaching and learning, rather than a stand-alone feature, data has an important role to play.

 

1 reply to “Data and education – observations from the shop floor”

  1. pevans2 says:

    It is good to read about a positive experience in developing and using a system of data collection, collation and analysis with clear educational benefits. I’ll look forward to the next post on teacher practices.

