Title: Designing a Course and Teaching Evaluation Platform for the School of Informatics

Authors: Selina Zhan and Cristina Alexandru

Theme: Partnerships

Course and teaching evaluation is essential for quality assurance in higher education. For the teaching team, it can help identify what goes well, what does not, and what could be improved in the course and its teaching approaches. Any changes implemented as a result can enhance students’ academic experience. Additionally, students who are interested in a course can benefit from the feedback collected from previous course students, and from the course organisers’ responses to it, when deciding whether to study that course. For the university, course and teaching evaluation can inform curricular decisions.

In the School of Informatics, academics are required to collect feedback from their students. To this end, various processes exist, including mid-term feedback (via a short online questionnaire), the end-of-semester Course Enhancement Questionnaire, student representative meetings, and the “Have Your Say” box for course students on Learn. These involve multiple systems which are not integrated. Furthermore, several of these processes have recognised disadvantages: for example, the provided questionnaires do not address every academic’s needs, and some students experience ‘questionnaire fatigue’. Academics are also expected to devise their own additional ways of evaluating their course and teaching. This results in great variability in how such evaluation is performed and in the usefulness of its outcomes.

To address these issues, the aim of this project was to design, together with academics, teaching support staff and students in the School of Informatics, a centralised course and teaching evaluation platform for them. The system is intended to facilitate all stages of course and teaching evaluation, and to support conducting this evaluation not only with students, but also with peer academics or through self-reflection. The following stages were pursued:

  1. Review of related systems and literature;
  2. Requirements gathering from academics and students in the school;
  3. Design of a prototype of the system in Figma, considering the high- and medium-priority requirements;
  4. Evaluation of the usability and potential impact of the prototype with academics and students in the school.

Our poster details the above steps, provides an overview of the final prototype, and offers directions for future work, including adapting this prototype for other schools.