
Developing a holistic marking rubric

Image of people walking over large grid from a distance
Image credit: Annie Spratt, Unsplash, CC0

In this post, Teaching Fellows Brodie Runciman and Gary Standinger describe their experience of developing a holistic marking rubric, which takes into account broader performance criteria for their students. Brodie and Gary share this work on behalf of the Physical Education and Curriculum Pedagogy 3 course tutor team, which is part of the MA (Hons) Physical Education at Moray House School of Education and Sport. This post belongs to the Mar-May Learning & Teaching Enhancement theme: Assessment and feedback revisited↗️.


Given the challenging circumstances of a 3-week turnaround, a mid-semester assessment, and a high volume of submissions, we (the course marking team) used a marking rubric for the first time in our course: Physical Education and Curriculum Pedagogy 3. The assessment in question was a reflective piece of writing and, as in Dr Alison Cullinane’s experience, its personal and subjective nature meant we found the rubric particularly helpful in making the marking process more systematic, objective, and consistent.

Initially, we created an analytic rubric based on exemplars provided by The University of Edinburgh Reflectors’ Toolkit, the assessment criteria for the specific piece of work (see Figure 1 further below), and the University’s common marking scheme.

However, as we began testing the rubric, it raised concerns among staff: it risked neglecting students’ responses as a whole and didn’t give enough credit for some of their work. To address this, we created a holistic rubric to broaden the performance criteria and accommodate more open and divergent responses. This holistic rubric was also used to check and adjust the initial mark generated by the analytic rubric, ensuring the score accurately reflected the grade band. If a student’s answer didn’t fit the pre-defined categories of the analytic rubric, it could still be evidenced and given credit against the holistic rubric.

At this point, we found it difficult to describe the varying levels of judgement for each criterion. We decided against simple word changes like ‘satisfactory’, ‘good’, ‘very good’, and so on, because such labels didn’t capture the differences in performance or help students understand their strengths and weaknesses. Instead, we worked with the assessment criteria and course learning outcomes to write a well-defined descriptor for each performance level in the assessment.

Nevertheless, our initial testing phase revealed that the total mark the rubric generated did not always align with the grade cut-off points of the University’s common marking scheme. Surprisingly, rubrics from other courses at the University had the same problem. After spending time with a calculator, we modified our rubric, as shown in Figure 1 below; a short sketch of the alignment arithmetic follows the figure:

Table outlining assessment criteria
Figure 1: Assessment criteria rubric
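
To make that checking step concrete, here is a minimal sketch of the arithmetic involved, written in Python. The grade bands are those of the University’s common marking scheme, but the criteria, weights, and scores are invented for illustration; they are not our actual rubric.

```python
# A sketch of the band-alignment check, with invented weights and scores.
# The cut-offs below are the University's common marking scheme bands;
# the criteria themselves are hypothetical, not our actual rubric.

# Each criterion: (weight as % of the total, marker's score out of 100)
criteria = {
    "Depth of reflection":   (40, 65),
    "Use of evidence":       (30, 58),
    "Structure and clarity": (30, 62),
}

# Common marking scheme bands: (lowest mark in band, band label)
BANDS = [(90, "A1"), (80, "A2"), (70, "A3"), (60, "B"), (50, "C"),
         (40, "D"), (30, "E"), (20, "F"), (10, "G"), (0, "H")]

def total_mark(criteria):
    # Weighted total: the same kind of calculation Learn performs once
    # the rubric is embedded and markers enter a score per criterion.
    return sum(w * s for w, s in criteria.values()) / 100

def band(mark):
    # First band whose cut-off the mark meets or exceeds.
    return next(label for cutoff, label in BANDS if mark >= cutoff)

mark = total_mark(criteria)
print(f"Total: {mark:.1f}, band {band(mark)}")  # Total: 62.0, band B
```

Checking band alignment for plausible score combinations like this is, in effect, what we did with the calculator: wherever a combination of descriptors produced a total on the wrong side of a cut-off, the rubric needed re-weighting.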

We then faced a final challenge: uploading our rubric to the Learn assessment dropbox so that markers could input a mark alongside each criterion. After consulting with other Course Organisers and ICT, we successfully embedded the rubric, enabling the automatic generation of a final mark from the scores entered against each criterion (demonstrated below).

What did students and staff think?

Students appreciated having a well-defined rubric before they engaged with the task, which explained what was expected of them. However, they wanted to know how to earn more marks in future work. Upon reflection, the rubric effectively communicated ‘what went well’ and ‘what could be better’ but was less effective in guiding students on how to improve future responses. As for us as staff, the rubric helped provide quicker feedback and increased our confidence in assigning marks.

What next?

Looking ahead, we plan to address students’ concerns about improving their future learning by adding a personalised comment to the rubric feedback, titled ‘What will improve future submissions?’. After gathering further feedback from students, we will allocate seminar time next year to clarify assessment expectations, as suggested by Deborah Holt in her post, Co-creating a grade related criteria matrix with students. This will include demonstrating how staff mark their work, using exemplar materials, and working with the rubric to improve transparency and fairness in marking.

Finally, we will consider standardising the rubric for portfolio and reflective writing tasks across the programme. Doing so will enable students to reflect on previous rubric feedback and use it to inform their next piece of work.

Summary

Overall, we discovered the importance of testing and continuously tweaking our rubric in consultation with students to enhance their understanding of how they should be assessed. Although our initial drafts did not capture everything needed, they laid the foundation for ongoing improvement. We are now in the process of fine-tuning our rubric for the following year.


Photo of the author
Gary Standinger

Gary Standinger is a Teaching Fellow at Moray House School of Education and Sport, and teaches on the MA Physical Education Programme.

Photo of the author
Brodie Runciman

Brodie Runciman is a Teaching Fellow at Moray House School of Education and Sport, and teaches on the MA Physical Education Programme.
