Week 8: A journey of two MOOCs
For the Week 8 activity, I eventually decided to put not one but two MOOCs through their paces. Initially I signed up for ‘Digital Skills: User Experience’ on FutureLearn because I wanted to find out what lay beneath the UX buzzword and consider the extent of its relevance to my professional context, the MScDE programme and the design of online educational platforms and courses. But then I feared that a FutureLearn course might be too obvious a choice and therefore not necessarily the most interesting subject for analysis. I also recalled a MOOC that I came very close to completing about a year and a half ago and quite liked the idea of resurrecting – ‘ELT (English Language Teaching) in the Digital Age’ by ELTJam. So I went for both, which then gave me the added opportunity to identify any interesting parallels and differences between two courses on different but related topics offered by different providers.
Basic course information:
‘Digital Skills: User Experience’ is a 6-hour course spread over 3 weeks which started in February but allows continuous enrolment. Participants range from absolute novices with no prior knowledge of UX to professionals working in related fields with some knowledge of UX.
https://www.futurelearn.com/courses/digital-skills-user-experience
‘ELT in the Digital Age’ is a 5-7 hour course divided into 6 ‘episodes’ (plus an intro and ‘epilogue’ section) which started two years ago but remains open for enrolment. Participants are practising English language teaching professionals – teachers, trainers and materials writers.
https://eltjam.academy/p/elt-in-the-digital-age
Video walk-through:
My first attempt at a screen capture video using Kaltura Capture Space/Media Hopper. The video is a bit longer than I intended so please feel free to fast forward!
Evaluation:
Overall, both MOOCs appear to be well-designed in terms of the course interface, structure, learning objectives and content delivery using a range of appropriate materials. As I mentioned in the second half of the video, the aspect in which I think the two MOOCs most noticeably differ is the comments threads. While in the User Experience course the threads generally consist of reams of standalone responses to the questions posed in each section, in the ELT course participants frequently responded to the contributions of others, which generated some genuine, if not lengthy, conversational exchange. The factors which may account for this difference are applicable considerations for online course design in general: participants’ existing level of knowledge and/or experience related to the course subject; the nature and frequency of course instructors’ contributions to comments threads or discussion forums; and the type of questions posed to participants for comment.

Arguably, the ELT course generated more conversation between participants because, as current practitioners in the field, they knew more about and therefore had more to say about the topics under discussion. The course instructor also played a very active role (at least in the first year of the course) in responding enthusiastically to posts, which seemed to encourage further interaction between participants. Finally, the questions posed in the ELT course perhaps tended to be more open and discursive than those in the UX course, allowing more scope for personalised responses, which led to more varied and interesting threads and provided greater incentive for participation and interaction.
The fact that the presence of the instructor in the ELT course seemed to be such an instrumental driver of interaction between participants calls into question the notion of intuitive and effortless self-directed learning supposedly being facilitated by open educational resources such as MOOCs (Bayne et al., 2015). If MOOC participants were all capable of directing their own learning effectively, and assuming that conversation aids learning, surely then the presence or absence of an instructor in a comments thread would have a less significant bearing on the nature and quality of the interaction taking place.
After reading a blog post written by the academic lead for FutureLearn about the platform’s inception and founding principles (Sharples, 2017) – apparently underpinned by Pask’s Conversation Theory – I thought it would be an interesting exercise to compare his vision of the platform to my experience of the UX MOOC:
‘we designed FutureLearn for learning as conversation, and in such a way that learning would improve with scale, so that the more people who signed up, the better the learning experience would be’
I’m not sure how he envisaged this working in practice; surely the more subscribers, the greater the volume of comments and the less likely the prospect of comments taking the form of contributions to a conversation. I wonder if he would ascribe similar learning value to the production of standalone comments which typifies the comments threads I read in the UX MOOC.
‘Every course involves conversation as a core element … There are also dedicated discussions, in which learners reflect on the week’s activity, describe how they performed on assessments, or answer an open-ended question about the course.’
In the reflection section of the UX MOOC, numerous participants responded to the questions but, despite the explicit reminder to review other learners’ comments and reply, the comments thread featured little or no ‘dedicated discussion’.
‘And online study groups allow learners to work together on a task and discuss their learning goals.’
In the UX MOOC, I couldn’t see any sign of such a group or any collaboration between learners. It’s likely that this idea turned out to be unworkable for online courses with thousands of subscribers.
‘Even student assessment has a conversational component. Learners write short structured reviews of other students’ assignments, and in return they receive reviews of their assignments from their peers.’
Likewise, I saw no evidence of this in the UX MOOC. Another idea unworkable at scale?
‘Quizzes and tests are marked by computer, but the results come with pre-written responses from the educator.’
This reminded me a little of the automated feedback generated by the LARC. Tests such as these could help learners to remember and consolidate their understanding of key points from the course, but only if the questions and options are carefully designed. Some participants in the UX MOOC complained that confusing capitalisation in one of the quiz options caused them to answer the question incorrectly. It was surprising to see course participants paying more attention to the minutiae of the course than the designers apparently did!
‘On average, a third of learners on a FutureLearn course contribute comments and replies.’
As an aside, this statistic struck me as roughly equal to the proportion of the IDEL cohort who have contributed regularly to the discussion forums. It made me wonder if this is a commonly occurring figure for online course forum participation… if so, I can only assume/hope that the majority of learning is taking place elsewhere (see my previous post)!
Bayne, S., Knox, J. and Ross, J. (2015) Open education: the need for a critical approach, Learning, Media and Technology, 40(3), pp. 247-250.
Sharples, M. (2017) ‘In FutureLearn’s MOOCs, Conversation Powers Learning at Massive Scale’. Available at: https://spectrum.ieee.org/tech-talk/at-work/education/conversation-powers-personalized-learning-in-futurelearn-mooc?fbclid=IwAR3acza3Xn-I_z4CuPGXItGm-gw2GEyh0_amKN_1HziBY1VFtVLOMZC3uNU