Lessons in Online Course Design and Implementation



Reflections on Practice


LAM Wanli, Aileen, LUU Tran Huynh Loan, and CHONG Peck Marn, Sarah


Centre for English Language Communication, National University of Singapore



Correspondence
Name: LAM Wanli, Aileen
Address: Centre for English Language Communication, National University of Singapore, 10 Architecture Drive, Singapore 117511
Email: aileenlam@nus.edu.sg


Recommended Citation:
Lam, A. W. L., Luu, T. H. L., & Chong, S. P. M. (2020). Lessons in online course design and implementation. Asian Journal of the Scholarship of Teaching and Learning, 10(2), 245-255.



Abstract

Massive Open Online Courses (MOOCs) are deemed a “potentially important educational practice with significant impact on the future of online learning” (Siemens, 2015, as cited in Eriksson et al., 2017, p. 133). The Centre for English Language Communication (CELC) at the National University of Singapore (NUS) has also rolled out several internal online courses to “harness good technology-enabled pedagogical practices for the enhancement of learning outcomes” (National University of Singapore, n.d.). This paper documents the team’s reflections on the design and implementation of the MOOC ELC002 “Effective Online Writing”. Analysing the six data points generated throughout the MOOC’s development process against the Funnel of Participation (Clow, 2013) framework, we found that learners enrolled in an ungraded online course primarily because of the content’s perceived usefulness, technological affordances, socio-psychological factors such as self-efficacy, and even logistical factors such as timing. Aligning the course with these motivations may help MOOC developers effectively promote the course (awareness) and pitch the content at the appropriate level (registration). Implementing learner-centric learning activities is recommended to engage and sustain learners throughout the course (activity), so that they ultimately acquire meaningful skills and knowledge (progress).

Keywords: MOOCs, online courses, online learning, student engagement, technology-enhanced education

Introduction

With 81 million learners enrolled in 9,400 courses by more than 800 universities worldwide (Shah, 2018), Massive Open Online Courses (MOOCs) “have become an industry in their own right” (Littlejohn & Hood, 2018, p. 2). At the National University of Singapore (NUS), various internal online courses and flipped modules have been rolled out to “harness good technology-enabled pedagogical practices for the enhancement of learning outcomes” (National University of Singapore, n.d.) and ELC002 “Effective Online Writing”, an internal online course developed by our team at the Centre for English Language Communication (CELC), was one of them.
 
ELC002 was first piloted as a course titled “Influential Social Publishing through Blogs”, focusing on social publishing, search engine optimisation, and the language skills needed for an effective blog. However, the unsatisfactory enrolment (33 sign-ups), low engagement, defined as the “interaction with the content and people in the course” (Walji et al., 2016), and low completion rate (6%), defined as the percentage of learners completing all the learning materials, prompted us to examine the possible issues and seek solutions for successful MOOC development.

According to the literature, attrition rates are a persistent issue with most MOOCs (Adamopoulos, 2013; Exoo & Exoo, 2013). Clow (2013), through his Funnel of Participation framework, posits that MOOC learners typically go through four stages: (1) awareness of a MOOC, (2) registration for it, (3) activity, which refers to engagement with the course, and (4) progress, which refers to meaningful learning (Figure 1). In view of its relevance to ELC002, this framework was used to explore the reasons behind the drastic decrease in participation at every stage and to propose possible remediation.


Figure 1. Funnel of Participation in MOOCs (Clow, 2013).

As illustrated in the flowchart in Figure 2, we began by considering the engagement data (Data Point 1, or DP1) for the first run of ELC002 and formulated our research questions. With these, we designed and conducted a student perception survey (DP2) to better understand our target audience’s needs. Thereafter, we revamped and relaunched ELC002, along with a pre-course survey (DP3), post-course survey (DP5), and in-depth interviews (DP6) based on modified research questions grounded in the literature on learners’ motivations, levels of engagement and rates of completion within the context of MOOCs (Wang & Baker, 2015; Xiong et al., 2015; Yang et al., 2017; Zheng et al., 2015). A full description of the data points is in Appendix 1.


Figure 2. Research questions and data points.

This paper aims to highlight some useful lessons in MOOC curriculum development and implementation at every stage of the Funnel of Participation.

Stage 1: Awareness—Choose a Relevant Promotional Platform

In the perception survey (n=175) (DP2), participants indicated that they obtained information about NUS courses primarily via email, followed by IVLE (NUS’ learning management system at the time of the study), and not as much from NUS websites, video hosting platforms and social media channels such as YouTube, Facebook, Instagram, and Twitter. Hence, ELC002’s second run was publicised through mass emails and on IVLE, in addition to the usual NUS websites and video hosting platforms.

Indeed, 54.4% of pre-course survey participants (n=69) learnt about the course’s second run through email, while 42.6% did so through the announcement on IVLE (DP3). This points to the importance of choosing promotional platforms relevant and appropriate to the target audience. Other viable options include social media marketing, press releases, reviews and search engine optimisation (Despujol Zabala, 2019).

Stage 2: Registration—Meet the Audience's Needs and Interests

Learners are motivated to sign up for and complete a course by several factors. These include the content and/or the technology, such as the convenience or novelty of e-learning (Xiong et al., 2015; Zheng et al., 2015), self-efficacy, or learners’ confidence in their ability to complete the course (Wang & Baker, 2015; Ajzen, 2002), as well as social influences (Zheng et al., 2015; Ajzen, 2002).

Content

The perception survey (DP2, n=175) and pre-course survey (DP3, n=69) (Figure 2) highlight the reasons for registration. Firstly, as illustrated in Figure 3, DP2 data show that participants wanted to learn online writing skills applicable to “various online platforms” (72%) as opposed to only blogs (52%). This prompted us to rename our course to ELC002 “Effective Online Writing” and modify the learning outcomes to align with a wider range of online content (Appendix 2). The substantial rise in enrolment, from 33 to 166 participants, suggests a higher interest in the revamped course.


Figure 3. Likelihood of taking types of online courses offered by CELC (DP2) (n=175).

The pre-course survey findings (DP3, n=69), as illustrated in Figure 4, affirmed that the novelty of the topic (83.4%) and the course’s perceived usefulness in enhancing registrants’ curricula vitae for job applications (72.5%) and academic performance (61.4%) were significant in enrolment decisions (Appendix 3). Clearly, the shift from blogging to online writing skills enhanced the course content’s perceived usefulness and motivated registration.


Figure 4. Motivations to register for ELC002 (DP3) (n=69).

Technology, self-efficacy, social influences

DP3 also highlighted technology-driven interest in the online learning platform in terms of its convenience (80.3%) and the registrants’ curiosity about it (65.7%). Self-efficacy-wise, 78% signed up due to their confidence in completing the course. Interestingly, only a small percentage of registrants signed up due to peer influence (2.6%), although a larger proportion (41%) desired to join a community of interest.

Other factors

Lastly, timing is important. According to the pre-course survey findings, 79.3% desired to learn new skills during the summer break (DP3), corroborating DP2 data (69.1%). The improved enrolment rate for the second run, launched during the term break, also suggests the need to consider timing strategically, since learners might not prioritise free and ungraded online courses over other commitments.

In all, MOOC developers should consider learners’ perceived usefulness of the content, technology-driven interest, self-efficacy, and other social and logistical factors that influence their decision to register for an online course.  

Stage 3: Activity—Keep Learners Engaged Through Learner-centric Materials

Engagement plays an important role in retention as students interact with the learning materials (de Freitas et al., 2015), “obtain feedback, take part in active and collaborative learning, and interact with educators and other students” (Walji et al., 2016, p. 210). Instructor-learner interactions (Peltier et al., 2007) and the instructor’s presence (Dixson, 2010) are also found to be significant in teaching and learning. Therefore, in addition to the videos, quizzes and readings typical of extended or traditional MOOCs (xMOOCs), which promote one-way information transfer, both runs of ELC002 featured online discussions and email communication from instructors to learners to promote collaborative learning, networking and instructor presence, a semblance of connectivist MOOCs (cMOOCs) (Daniel, 2012; Downes, 2012; Clow, 2013).

However, as illustrated in Figure 5, DP4 data showed a sharp decline in participation in the second run from Week 1 to Week 2 before stabilising over the last three weeks, mirroring reported MOOC attrition and completion rates (Baker et al., 2014; Tseng et al., 2016).


Figure 5. Participation rates of ELC002 (DP4) (n=166).

Clearly, our efforts to connect with learners over email and facilitate real-time forum discussions, strategies reported to be perceived positively by MOOC learners (Walji et al., 2016), were not fruitful.

To find out why this happened, we analysed the post-course survey (DP5 – see Figure 6) and interviews (DP6 – see Appendix 6 for a sample of the interview questions). The data revealed that a lack of time, lack of learning communities, selective learning, and basic content were the main reasons for the low engagement rate. While a lack of time has been acknowledged to be a “bottleneck” of MOOC completion rates (Eriksson et al., 2017), the lack of learning communities, selective learning (Ho et al., 2014) and basic content present some noteworthy implications for MOOC curriculum development and promotion.

Firstly, the lack of learning communities suggests that connecting with learners via emails and facilitating online discussions may not be enough to engage learners with diverse backgrounds, interests and capabilities. Instead, more novel ways to connect with learners such as enabling video comments for lecture videos, leveraging social media (Walji et al., 2016) as well as designing activities with more learner autonomy could stimulate interaction online (Davidson, 2017). Furthermore, given that some learners prefer more interaction with peers and instructors while others prefer individualistic activities (watching lecture videos or attempting quizzes individually), a wide range of activities is needed to support both self-paced and collaborative learning in a sustainable community of inquiry (Garrison et al., 2000).

Secondly, instead of the typical sequential progression suggested by Moskal et al. (2015), the selective learning documented in the engagement data (DP4) suggests that some learners may not intend to finish the entire online course (Milligan & Littlejohn, 2017). Instead, they could (1) audit the content by simply watching the lecture videos or doing the readings, (2) sample a few units, (3) complete all, or (4) disengage halfway (Kizilcec et al., 2013). It could thus be argued that MOOC completion rates should not necessarily be calculated from the number of learners completing all available units, but from the number of topics learners choose to complete. In addition, learners may not always work through the MOOC syllabus in a linear pattern. Hence, each unit should be designed to stand alone, organised around a particular theme.

Thirdly, while the majority of post-course respondents (87.5%) agreed or strongly agreed that the course content was insightful and “not a conventional thing seen in everyday life,” some learners found it “too basic”, “too simple”, and “not very informative”, echoing the post-course survey finding that only 25% of respondents found the content challenging. We mapped participants’ comments against their background knowledge of online writing based on the interviews, and found that those who appreciated the course did not have much experience with online writing, whereas those who found it “too basic” had some experience in online writing or computing. This calls for a clear indication of the course scope and level of difficulty in the course description for subsequent course iterations.

In summary, while MOOC developers cannot control aspects like learners’ profiles, time constraints or other challenges, they can try to mitigate the reduction in engagement and meet various learner needs via syllabus design. While the entire syllabus could progress in a linear fashion, which will cater to the needs of a learner keen to pick up the skills from the start, it could, at the same time, adopt a unit-based structure—with standalone topics—comprising learner-centric content and activities that support selective, self-paced, collaborative, and non-linear learning patterns (Kizilcec et al., 2013). This design would therefore suggest that unit-based completion rates may be a more appropriate gauge of MOOC success.

Stage 4: Progress—Find Out if the Learning was Meaningful

The progress stage should see learners acquiring skills and knowledge (Clow, 2013). According to the post-course survey (DP5), as illustrated in Figure 6, 87.5% of respondents found the overall course content insightful and 75% found the topics of search engine optimisation, as well as the style and structure of online writing, most useful. This corroborates what interviewees recalled about the knowledge acquired (DP6). Their mentions of key concepts such as search engine optimisation, the content value ladder and the originality of online content, copyright infringement, the F-shaped pattern of online writing, ethical language use, and the use of active versus passive voice in writing could demonstrate learning gains.


Figure 6. Perceived usefulness by topic (DP5) (n=16).

In addition, the average successful attempt rate across all the quizzes was 82.8% (Table 1). Despite the small number of quiz takers (between 25 and 59), the findings still suggest learning gains.

Table 1 
Quiz attempts

For future MOOCs, a pre- and post-course assessment (McGrath et al., 2015) with a larger sample size may be more useful to measure actual learning gains.

Conclusion

While there are limitations in the generalisability of the findings due to the small sample drawn from the NUS learner community, what we observed and documented still highlights plausible issues in MOOC development. Future efforts should focus on understanding learners’ needs and preferences in terms of content, proficiency levels, online learning patterns, promotional platforms, and timing. Learners are motivated to enrol in an ungraded online course by the content’s perceived usefulness, technological affordances, self-efficacy and timing. Aligning the course with these motivations may help MOOC developers effectively promote the course (awareness), pitch the content at the appropriate level (registration), engage learners using learner-centric activities tailored for different learning styles, and promote a learning community (activity) to encourage learners to engage more with the materials and subsequently acquire meaningful skills and knowledge (progress).

References

Adamopoulos, P. (2013). What makes a great MOOC? An interdisciplinary analysis of student retention in online courses. Thirty-fourth International Conference on Information Systems, Milan. http://pages.stern.nyu.edu/~padamopo/What%20makes%20a%20great%20MOOC.pdf

Ajzen, I. (2002). Perceived behavioral control, self-efficacy, locus of control, and the Theory of Planned Behavior. Journal of Applied Social Psychology, 32(4), 665-683. https://doi.org/10.1111/j.1559-1816.2002.tb00236.x

Baker, R., Evans, B., Greenberg, E., & Dee, T. (2014). Understanding persistence in MOOCs (massive open online courses): Descriptive & experimental evidence. Proceedings of the European MOOC Stakeholder Summit 2014 (EMOOCs 2014), 5-10. https://www.scribd.com/doc/211003258/Proceedings-Moocs-Summit-2014

Clow, D. (2013). MOOCs and the funnel of participation. Proceedings of the Third International Conference on Learning Analytics and Knowledge, 185-189. https://doi.org/10.1145/2460296.2460332

Daniel, J. (2012). Making sense of MOOCs: Musings in a maze of myth, paradox and possibility. Journal of Interactive Media in Education, 2012(3), Article 18. http://dx.doi.org/10.5334/2012-18

Davidson, C. N. (2017). The new education: How to revolutionize the university to prepare students for a world in flux. Hachette UK.

de Freitas, S. I., Morgan, J., & Gibson, D. (2015). Will MOOCs transform learning and teaching in higher education? Engagement and course retention in online learning provision. British Journal of Educational Technology, 46, 455–471. http://dx.doi.org/10.1111/bjet.12268 

Despujol Zabala, I. (2019, April 19). MOOC digital marketing [Video]. Universitat Politecnica de Valencia (UPV) Media. https://media.upv.es/#/portal/video/dd54ead0-6201-11e9-a3b0-cf96b1704ae8 

Dixson, M. (2010). Creating effective student engagement in online courses: What do students find engaging? Journal of the Scholarship of Teaching and Learning, 10(2), 1–13. https://files.eric.ed.gov/fulltext/EJ890707.pdf

Downes, S. (2010). Learning networks and connective knowledge. In Collective intelligence and E-Learning 2.0: Implications of web-based communities and networking (pp. 1-26). IGI Global.

Eriksson, T., Adawi, T., & Stohr, C. (2017). “Time is the bottleneck”: a qualitative study exploring why learners drop out of MOOCs. Journal of Computing in Higher Education, 29, 133-146. http://dx.doi.org/10.1007/s12528-016-9127-8 

Exoo, C. & Exoo, C. F. (2013, October 28). MOOCs: Corporate welfare for credit. Salon. https://www.salon.com/2013/10/28/moocs_corporate_welfare_for_credit/

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7-23. http://dx.doi.org/10.1080/08923640109527071

Ho, A., Reich, J., Nesterko, S., Seaton, D., Mullaney, T., Waldo, J., & Chuang, I. (2014). HarvardX and MITx: The first year of open online courses, fall 2012-summer 2013 (HarvardX and MITx Working Paper No. 1). https://dash.harvard.edu/bitstream/handle/1/11987422/1%20HarvardX%20MITx%20Report.pdf?sequence=1&isAllowed=y

Kizilcec, R., Piech, C., & Schneider, E. (2013). Deconstructing disengagement: Analyzing learner subpopulations in massive open online courses. Proceedings of the Third International Conference on Learning Analytics and Knowledge (LAK ’13), Leuven, Belgium. https://web.stanford.edu/~cpiech/bio/papers/deconstructingDisengagement.pdf

Littlejohn, A., & Hood, N. (2018). Reconceptualising learning in the digital age: The [un]democratising potential of MOOCs. Springer.

McGrath, C. H., Guerin, B., Harte, E., Frearson, M., & Manville, C. (2015). Learning gain in higher education. Santa Monica, CA: RAND Corporation.

Milligan, C., & Littlejohn, A. (2017). Why study on a MOOC? The motives of students and professionals. The International Review of Research in Open and Distributed Learning, 18(2). http://www.irrodl.org/index.php/irrodl/article/view/3033/4086

Moskal, P., Thompson, K., & Futch, L. (2015). Enrollment, engagement, and satisfaction in the BlendKit faculty development open, online course. Online Learning, 19(4), n4. http://dx.doi.org/10.24059/olj.v19i4.555

National University of Singapore. (n.d.). Frequently asked questions note (FAQs) on iBLOCs offered from January to June 2017. http://nus.edu.sg/ibloc/iBLOC_FAQs_2017.html

Peltier, J. W., Schibrowsky, J. A., & Drago, W. (2007). The interdependence of the factors influencing the perceived quality of the online learning experience: A causal model. Journal of Marketing Education, 29(2), 140-153. https://doi.org/10.1177%2F0273475307302016

Shah, D. (2016, December 25). By the numbers: MOOCS in 2016. The Report by Class Central. https://www.classcentral.com/report/mooc-stats-2016/

Tseng, S. F., Tsao, Y. W., Yu, L. C., Chan, C. L., & Lai, K. R. (2016). Who will pass? Analyzing learner behaviors in MOOCs. Research and Practice in Technology Enhanced Learning, 11(1), 8. https://doi.org/10.1186/s41039-016-0033-5

Walji, S., Deacon, A., Small, J., & Czerniewicz, L. (2016). Learning through engagement: MOOCs as an emergent form of provision. Distance Education, 37(2), 208-223. http://dx.doi.org/10.1080/01587919.2016.1184400

Wang, Y., & Baker, R. (2015). Content or platform: Why do students complete MOOCs? Journal of Online Learning and Teaching, 11(1), 17. https://www.upenn.edu/learninganalytics/ryanbaker/Wang-Merlot-JOLT.pdf

Xiong, Y., Li, H., Kornhaber, M. L., Suen, H. K., Pursel, B., & Goins, D. D. (2015). Examining the relations among student motivation, engagement, and retention in a MOOC: A structural equation modeling approach. Global Education Review, 2(3), 23-33. https://files.eric.ed.gov/fulltext/EJ1074099.pdf

Yang, M., Shao, Z., Liu, Q., & Liu, C. (2017). Understanding the quality factors that influence the continuance intention of students toward participation in MOOCs. Education Technology Research Development, 65, 1195-1214. https://doi.org/10.1007/s11423-017-9513-6

Zheng, S., Rosson, M. B., Shih, P. C., & Carroll, J. M. (2015). Understanding student motivation, behaviors and perceptions in MOOCs. Proceedings of the 18th ACM conference on computer supported cooperative work & social computing, 1882-1895. https://doi.org/10.1145/2675133.2675217


About the Corresponding Author

Aileen LAM Wanli is from the Centre for English Language Communication (CELC) in NUS and has more than ten years of teaching and corporate training experience in communications and media. She is passionate about the use of technology in education and has developed online and blended courses. She is currently working on a MOOC which will be launched on edX.