Digital Skills for CSE – a data-driven success story
In a previous blog post I shared my thoughts on the importance of knowing your learners. Understanding who does – and who doesn’t – attend training is an important step towards identifying organisational digital skills gaps, and it helps us tailor and market our services more effectively. From a management perspective, this type of data is incredibly valuable for decision making, as it helps us determine where our services could make the biggest impact without placing undue additional strain on our resources.
This spring we had the opportunity to deliver a targeted programme of training for one of the staff groups with comparatively low attendance rates across our Digital Skills Programme – colleagues in the College of Science and Engineering (CSE). The initiative came from the Learning & Development Partner within the College, who was looking to create a programme of skills development based on staff needs. The L&D Partner had already surveyed College staff on the skills they would find most beneficial to learn, and approached us to discuss whether we could deliver training on any of the digital skills identified. Together we conducted a mapping exercise, matching our existing courses to the identified skills gaps, and ‘Digital Skills for CSE’ was born: a programme of 18 training sessions delivered over two months, specifically for CSE staff audiences.
In total, 202 CSE colleagues attended the programme – approximately 20% of the total annual CSE staff attendance across the entire Digital Skills Programme. ‘Digital Skills for CSE’ had a mean attendance of 11.2 people per session, slightly higher than our usual average. The no-show rate was 25%, lower than our usual no-show rate of 30% to 50%, which varies depending on the course, time of year, and demographic breakdown of attendees (we usually see a higher no-show rate among student attendees).
Feedback from attendees was overwhelmingly positive: 96% of respondents stated that the course was relevant to their needs, and 98% agreed that attending had given them a greater understanding of the topic:
“Duration was the perfect length for an introductory session. The trainer explained topics carefully and was good to see “live” examples. Being able to revisit the slides/video/workbook will be a huge help going forward until I become more proficient.”
“This was exactly the level of content that I was looking for just now. There was the opportunity to ask questions throughout, the trainer checking in on the audience for their understanding and chance for break, questions etc.”
“It was very well presented and the speed and level was appropriate for my level of computer knowledge.”
“The training was well structured and easy to follow, the trainer was very knowledgeable, provided ample examples and really helpful features. Speaking pace was excellent too. The tool itself looks really useful, cannot wait to start using it and can already see many applications in my role. Thanks!”
I’m very pleased with how well this programme of training was received. By leading with learner needs and collaborating with local stakeholders, we were able to tailor a programme from our existing offering that addressed identified skills gaps and raised awareness of our services. We plan to continue working with CSE colleagues on future iterations of ‘Digital Skills for CSE’, and I hope we can explore similar collaborations with other parts of the University, particularly those with hard-to-reach audience groups.