Assessments, Groupwork, and the connectedness of things….
On 4th June the University hosted a meeting for the ceLTIc Project to share experiences of, and work on, the peer assessment tool WebPA. Speakers from Queen Margaret University, Hull, Aberdeen, Edinburgh and Imperial College outlined projects using WebPA, some running for a number of years, some still pilots. Their experience yielded suggestions for others thinking of introducing peer-assessed group work, covering not just the tool itself but the pedagogic context in which such work can thrive and become a usefully integrated part of student learning. Participants included attendees from UWS, Dublin and Newcastle, eager to hear from those with more experience of the tool.
There were two aspects of the meeting that I found most interesting and useful: Neil Gordon's illustrations from Hull, drawn from his long experience of teaching with WebPA, of how the relative weighting of the peer-assessed component affects students' final marks; and the summaries of student and staff feedback that every speaker provided.
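For readers unfamiliar with how that weighting works: as I understand the published WebPA algorithm, each student's ratings of their group members are converted into fractional scores, summed into a WebPA score (where 1.0 means an exactly average contribution), and only a chosen proportion of the group mark is then moderated by that score. The sketch below is my own illustration rather than anything presented at the meeting; the names, ratings, group mark and weightings are invented purely to show how strongly the weighting choice can swing individual marks.

```python
# A rough sketch of WebPA-style weighted marking, as I understand the published
# WebPA algorithm. All data below is invented for illustration.

def webpa_scores(ratings):
    """ratings maps each assessor to the raw scores they gave each group member
    (self-assessment included). Returns each member's WebPA score, where 1.0
    represents an exactly average contribution."""
    scores = {member: 0.0 for member in ratings}
    for given in ratings.values():
        total = sum(given.values())
        for member, raw in given.items():
            scores[member] += raw / total        # fractional score from this assessor
    return scores

def weighted_mark(group_mark, webpa_score, weighting):
    """Split the group mark into a peer-moderated part and a fixed part."""
    return group_mark * weighting * webpa_score + group_mark * (1 - weighting)

if __name__ == "__main__":
    # Three students rate everyone in their group (including themselves) out of 5.
    ratings = {
        "Ann": {"Ann": 4, "Bob": 4, "Cat": 2},
        "Bob": {"Ann": 5, "Bob": 3, "Cat": 2},
        "Cat": {"Ann": 4, "Bob": 4, "Cat": 2},
    }
    group_mark = 60
    scores = webpa_scores(ratings)
    for weighting in (0.25, 0.5, 1.0):           # proportion of the mark that is peer assessed
        marks = {m: round(weighted_mark(group_mark, s, weighting), 1)
                 for m, s in scores.items()}
        print(f"weighting {weighting:.2f}: {marks}")
```

With these invented numbers the spread between the strongest and weakest contributor grows from about 10 marks at a 25% weighting to over 40 marks when the whole group mark is peer-moderated, which is exactly the kind of effect the weighting decision has to balance.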
Against a background where many aspects of “traditional” assessment are being questioned, and where employers increasingly expect students to demonstrate team-working and critical/reflective skills, group work and peer assessment are attracting growing interest in HE. Staff praised WebPA for alleviating much of the administrative tedium of managing peer-marked group work, and for giving students an easy-to-use tool.
Stephen Vickers, the developer behind the LTI integration at the heart of the ceLTIc project, outlined the improved integrations the project has delivered, and some open questions about the future of those integrations, and of WebPA’s users, now that the project is ending. Now that there is such a simple and smooth integration between WebPA and VLEs, especially Blackboard Learn, many of the technical barriers to adoption have been removed.
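For context, the launch that makes this kind of integration possible is not WebPA-specific: an LTI 1.x “basic launch” is simply an OAuth 1.0a signed form POST from the VLE to the tool. The sketch below is a generic illustration of such a launch, not the ceLTIc project’s own code; the launch URL, consumer key, shared secret and parameter values are all placeholders.

```python
# A minimal sketch of an IMS LTI 1.x basic launch: an OAuth 1.0a (HMAC-SHA1)
# signed set of form parameters POSTed from the VLE to the tool. Generic
# illustration only; all identifiers and credentials below are placeholders.
import base64, hashlib, hmac, time, uuid
from urllib.parse import quote

def percent_encode(value):
    # RFC 3986 percent-encoding, as required by OAuth 1.0a
    return quote(str(value), safe="-._~")

def sign_lti_launch(launch_url, params, consumer_key, shared_secret):
    """Return the launch parameters with an oauth_signature added."""
    oauth = {
        "oauth_consumer_key": consumer_key,
        "oauth_nonce": uuid.uuid4().hex,
        "oauth_timestamp": str(int(time.time())),
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_version": "1.0",
        "oauth_callback": "about:blank",
    }
    all_params = {**params, **oauth}
    # Signature base string: METHOD & encoded URL & encoded, sorted parameter string
    param_string = "&".join(
        f"{percent_encode(k)}={percent_encode(v)}"
        for k, v in sorted(all_params.items())
    )
    base_string = "&".join(["POST", percent_encode(launch_url), percent_encode(param_string)])
    key = f"{percent_encode(shared_secret)}&"     # no token secret in an LTI launch
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    all_params["oauth_signature"] = base64.b64encode(digest).decode()
    return all_params

launch = sign_lti_launch(
    "https://webpa.example.ac.uk/lti/launch",     # placeholder tool launch URL
    {
        "lti_message_type": "basic-lti-launch-request",
        "lti_version": "LTI-1p0",
        "resource_link_id": "group-assessment-101",
        "user_id": "s1234567",
        "roles": "Learner",
        "context_id": "module-abc",
    },
    consumer_key="my-vle-key",                    # placeholder credentials
    shared_secret="my-shared-secret",
)
# POSTing `launch` as a form to the tool's launch URL completes the single sign-on.
```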
What remains, of course, is discussion about the suitability of this form of assessment for different disciplines, levels of study and types of cohort. It was interesting that someone working with a group of school-leavers preparing for university study had little difficulty in gaining student acceptance, while other staff working with older (and possibly more sophisticated) groups reported that their students would question the assignment, their own participation, and especially their grades. So there was some discussion about how best to prepare students for these forms of peer review, and what feedback and grades to release to them. The conclusions I took away were, as ever, the need to explain to students very clearly why the activity is of value, what criteria will be used to grade it, and how staff will spot attempts to “game” the system.
Shireen Lock from Imperial College presented the findings of a survey of staff there who had used WebPA, amounting to a “wish list” for further development. The SIG plans to meet and discuss development and service options, including prioritising developments and seeking funding support.
The mood of the meeting was very positive, with agreement that WebPA is extremely useful for its single designed purpose, especially when integrated with other student teaching and learning systems. The support and maintenance overhead seems to be low for a system which can be made available institution-wide. As ever, the need is less for technical expertise than for pedagogic understanding of how to introduce and integrate these forms of assessment appropriately.
The sessions were recorded and are available at the following URL: http://listenagain.stir.ac.uk/media/keep/webpa/listenagain.php
More information about the ceLTIc (creating environments for Learning using Tightly Integrated Components) project – http://www.celtic-project.org/
More information about WebPA – http://webpaproject.lboro.ac.uk/