
How Hands-On Assessments Can Boost Retention, Satisfaction, Skill Development, and Career Outcomes in Online Courses

Alexandra Urban

This article is part of the Special Issue: Technologies and Creative Learning, led by Dr. Sharon Mistretta, Johns Hopkins University School of Education, USA


Article Type: Review Paper

Date of acceptance: May 2023

Date of publication: June 2023

DOI: 10.5772/acrt.23

Copyright: © 2023 The Author(s). Licensee IntechOpen. License: CC BY 4.0



Abstract

Hands-on assessments provide active opportunities for students to practice new skills they have just learned. Massive open online course (MOOC) platforms offer a uniquely large dataset for tracking the impact of hands-on assessments on learners’ skill development, satisfaction, and career trajectories. While the existing MOOC literature explores enrollment and demographic data, few studies have investigated the outcomes for learners who engage with different types of assessments within these online courses. This article quantifies the learner impact of hands-on experiences in MOOCs. With innovative analytics and hundreds of millions of course enrollments, online course platforms can shed light on the influence of alternative teaching decisions and assessment types. MOOCs offer data to quantify individual learners’ skill development in different topics before averaging across all course completers. Metrics such as satisfaction utilize learners’ self-reported star ratings of course material. Finally, for career outcomes, MOOC platforms can contact learners after they complete an online course to ask how the content affected their job-related outcomes, such as confidence in their role, receiving a promotion, or starting a new position. Control variables such as course domain, instructor characteristics, and learner demographics provide researchers with a robust dataset and thorough methodology to systematically track the benefits of hands-on opportunities in online content. This article examines the content structure and learning behavior data on a MOOC platform. The goal of this empirical study was to examine the impact of hands-on assessments on learner outcomes, including retention, satisfaction, skill development, and career outcomes.

Keywords

  • MOOCs

  • hands-on assessments

  • alternative assessments

  • skill development

  • learner satisfaction

  • career outcomes

  • empirical study


Introduction

Limari, a mother in Puerto Rico, wanted a boost to her career while retaining the flexibility to be an active parent to her son [1]. She began working as a community manager from home, but it was only after her friend’s suggestion to join Coursera that she had the skills and confidence to start as a content strategy manager. Northwestern University’s professional strategy massive open online courses (MOOCs) provided a concrete path for her to advance her career.

In Bengaluru, India, Vivek needed a change, and Coursera offered him a lifeline [2]. After his mother died when he was four years old and his alcoholic father abandoned him, Vivek was unable to finish his formal education, but he had always wanted to start his own company. By completing programming, machine learning, and algorithm courses on Coursera, he was able to start his own business and hire other self-taught experts. He continues to advocate for the use of online courses and carves out time for himself and his employees to keep taking MOOCs to augment their skills.

Paulina, a medical student in Ghana, knew software and data science tools could help her family and her patients [3]. By completing more than a dozen courses on Coursera, Paulina is solving community health challenges through data extraction, geolocation, and clever analysis. As the daughter of farmers, Paulina loves her medical work but also wants to apply her newly gained data science skills to food production and distribution techniques. Paulina is a prime example of how individuals can use what they learn from online content to improve their broader community in a variety of ways.

Limari, Vivek, and Paulina are just three of the millions of learners who have engaged with online courses since 2012. Over the past decade, MOOCs have enabled anyone with internet access to take courses offered by top universities and companies from anywhere in the world. These open-access, largely video-based courses have played an important role in democratizing higher education, providing faculty-created material at a low price point, and connecting diverse groups of individuals from around the globe [4]. The first MOOC was offered by a university in 2007, and just five years later several companies had been founded with the goal of providing MOOCs to the world, including Coursera, edX, and Udacity [5]. By 2012, more than one million learners had engaged with a MOOC on one of these platforms.

The context of this research was the Coursera platform, one of the largest higher education platforms in the world. Since its founding in 2012, Coursera has been focused on its mission of providing the world’s best learning experience to anyone anywhere. To create the content it offers, Coursera partners with leading universities and companies to leverage their knowledge, skills, and expertise. By the end of 2020, more than 70 million learners from around the world had come to learn skills of the future in 4,000+ MOOCs offered on Coursera [6].

These MOOC learners are a much broader and more diverse group than traditional, on-campus higher education students. The majority of learners in Coursera’s community are 24 to 45 years old [7]. More than 60% of learners who complete at least one course report career impact as their top goal for enrolling, whether that means getting a new role, changing industries, gaining confidence in their current job, or starting their own company [6]. Representing a truly global community, learners on the Coursera platform enroll from more than 200 countries worldwide [8].

Students enrolling in the fully online degree programs hosted on Coursera are similarly global and diverse. These degree students span ages 18 to 65 and represent more than 90 different countries [9]. Plus, more than half of the degree students report working full-time while simultaneously participating in these for-credit learning experiences.

While many of these first MOOCs started with only quizzes, Coursera and others quickly added programming assignments, followed by peer review assessments, to augment the simple exam-style assessments with more hands-on opportunities for learners to demonstrate their skills [6]. In the for-credit courses, Coursera also offers opportunities for tailored feedback from experts via staff-graded individual and team assignments [9].

On the Coursera platform, a variety of hands-on activities can be included. For example, in a web programming course offered by Goldsmiths, University of London, the course team created a “Sleuth” detective game (see figure 1) for students to solve coding challenges [10]. They used the Coursera Labs infrastructure to design and host this series of programming activities and included custom auto-grading to determine students’ scores for each part of the assignment.

Figure 1.

Example of a hands-on assessment for web development.

As a second example, the Google course team created a computer assembly activity to simulate hardware (see figure 2), where learners must put together the pieces of a computer not only in the right place but also in the right order [11]. By using the JavaScript plugin item type on Coursera, the course staff at Google created a hands-on activity for their IT Professional Certificate that had previously been possible only in person with a physical computer.

Figure 2.

Example of a hands-on assessment simulating physical hardware.

As a third and final example, the course team at Michigan State University created hands-on activities for their photography online courses [12]. While hands-on assessments are often associated with science and technology courses, arts and humanities courses can equally benefit from having learners test out the new skills they have gained earlier in the course (see figure 3). Peter Glendinning and team created a peer review assignment where students first submit their own photographs with accompanying explanations before receiving feedback from their peers. Importantly, the instructor created the grading rubric to scaffold and guide learners as they give grades and recommendations to their peers.

Figure 3.

Example of a hands-on assessment for visual art.

Crucially, online hands-on assessments provide opportunities for large groups of students to practice new skills and gain valuable feedback. Hands-on assessments involve actively practicing skills and receiving feedback relevant to the task [13]. These learning activities stand in contrast to passively watching a lecture or answering simple quizzes [14]. Another crucial aspect of hands-on assessments is providing tailored feedback, in the form of either automated or expert guidance [13, 14]. These hands-on activities provide the training students need to meet employers’ expectations [15].

Application-based assessments align more closely with real-world work and better prepare students for industry [16]. Industry-relevant projects were better liked by students, especially those with less traditional backgrounds or less higher education experience [16]. As jobs, industries, and the skills careers require continue to change, online assessments should reflect this dynamic landscape [17]. While often harder to grade, creative assessments with real-world applications lead to greater student innovation across the learning experience [16, 17].

Moving beyond multiple-choice exams into papers, presentations, and group projects better mirrors industry tasks and can not only increase the success of students with less higher education experience but also improve all graduates' employability [18]. Alternative assessments can also help build confidence for some of the groups of students most vulnerable to dropping out [16, 18]. De-emphasizing exams can lead to more equitable outcomes across student populations, and lowering stress can increase students’ ultimate learning gains [19]. Broader and more varied assessments can also increase transfer to on-the-job tasks after graduation [18].

Since the start of the COVID-19 pandemic, educators have had a greater need to move these hands-on activities online, through the use of simulated laboratories, projects, and virtual environments [15, 20, 21]. However, many instructors have found it difficult to transfer in-person hands-on assessments into successful online activities and have found online platforms lacking the capabilities they desired [22, 23]. The flexibility and autonomy afforded by alternative assessments are especially important given the diversity of adult learners now re-skilling through online courses; this adaptability can lead to higher academic achievement across the board [24].

Hands-on assessments are well documented to improve student engagement and satisfaction, especially in more technical courses. Students self-report a significant preference for laboratory activities and other hands-on tasks compared with other science course teaching methods, whether learning veterinary dental procedures or designing control systems for robotic devices [13, 14]. These students also reported the hands-on assessments as highly valuable for their learning [13, 14]. In contrast, students in an undergraduate medical course were highly dissatisfied with the shift online during the COVID-19 pandemic, a dissatisfaction linked to the sudden reduction in hands-on assessments and tailored feedback [23]. When connected to the learning objectives of the course, hands-on assessments can provide useful scaffolding and motivation to keep students engaged [25]. Alternative assessments and hands-on experiences appeared to be associated with higher learner readiness, engagement, and motivation in online instruction during the COVID-19 pandemic [24]. This increased engagement can be a positive first step toward stronger learning outcomes [20].

Hands-on assessments can increase skill development for learners. Through observation in MOOCs, researchers rated creativity skills and reported significant increases after project-based learning activities were implemented in the course and participants had to brainstorm, problem solve, and discuss their ideas with peers [26]. Instructors can also include hands-on activities that involve team projects and discussion to encourage collaboration and students’ sense of community in the course, having students work together to create a final deliverable [20].

This article explores research conducted on Coursera, a company specializing in online education that partners with more than 250 universities and companies globally to offer MOOCs and full degree programs to its 100 million registered learners. By analyzing course structures and learners’ behavior, team members at Coursera can offer data-backed insights to empower educators.

Methods

At one of the largest online learning platforms in the world, Coursera team members have conducted robust research to analyze what works best for teaching and learning online. This work drew on Coursera’s large number of learners and immense community of partners. The MOOC research looked across 200 million course enrollments from 9 years of data. The for-credit research drew on 17,960 unique students across 19 degree programs offered on Coursera, including Bachelor’s and Master’s programs from diverse domains, including Computer Science, Business, Data Science, and Public Health [9].

Specific outcome metrics related to quality also needed to be operationalized. For retention, the rate of course completion was used for MOOCs, and the rate of first-year persistence (i.e., the percentage of students still active in the degree one year later) was used for analyzing the first courses in degree programs. As an indicator of satisfaction, the percentage of course completers awarding a 5-star rating (the maximum) was used. For skill development, Coursera’s own “skill score” was used [27]. To calculate the skill score delta, the difference between each course completer’s documented skill level upon beginning and upon completing the course in question was computed. This change in score, or delta, was then averaged across all completers of that MOOC. For career outcomes, responses to an in-platform survey automatically sent to learners six months after course completion were used to assess impact on individuals’ job, role, and pay. The proportion of course completers from each MOOC reporting some type of career benefit was used as this outcome variable.
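To make these metric definitions concrete, the sketch below computes the three course-level outcomes from per-completer records. This is a minimal illustration in Python with pandas; the column names and values are hypothetical stand-ins, not Coursera’s actual schema, and the real skill score is produced by a Glicko-based algorithm [27] rather than the raw start/end fields assumed here.

```python
import pandas as pd

# Hypothetical per-completer records; these fields are illustrative
# stand-ins, not Coursera's actual data schema.
completers = pd.DataFrame({
    "course_id":      ["mooc_a", "mooc_a", "mooc_b", "mooc_b", "mooc_b"],
    "skill_start":    [0.42, 0.55, 0.30, 0.61, 0.48],   # skill level at enrollment
    "skill_end":      [0.58, 0.70, 0.52, 0.66, 0.63],   # skill level at completion
    "star_rating":    [5, 4, 5, 5, 3],                  # self-reported rating, 1-5
    "career_benefit": [True, False, True, True, False], # 6-month survey response
})

# Skill development: per-completer delta, then averaged per course.
completers["skill_delta"] = completers["skill_end"] - completers["skill_start"]

course_metrics = completers.groupby("course_id").agg(
    avg_skill_gain=("skill_delta", "mean"),                     # skill development
    pct_five_star=("star_rating", lambda r: (r == 5).mean()),   # satisfaction
    career_benefit_rate=("career_benefit", "mean"),             # career outcomes
)
print(course_metrics)
```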

Throughout both analyses, researchers at Coursera wanted to investigate the relationship between different learning design choices and the outcome metrics discussed above. To accomplish this goal, the relationship between each driver of interest and the outcome metric was assessed. To standardize the strength of each relationship, regression models were used to isolate the impact of each potential driver on the outcome metrics related to quality, including retention, satisfaction, and skill development. Throughout the analyses, the researchers controlled for learner demographic information, including age, prior educational attainment, and previous learning behavior on the Coursera platform, all of which can influence learners’ likelihood to succeed in online courses. They also controlled for course-level information, including the university brand and topic being taught, both of which can influence degree retention in a particular program. While the regression models provided accurate summaries of the relationship between each driver and outcome of interest, it became challenging to assess the relative effect of each one, since each driver has a different scale of observable measures.
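As a sketch of this modeling step, the snippet below fits an ordinary least squares regression of one outcome metric on a hands-on driver while holding a demographic control and a course-level control constant. All variable names and values are assumptions for illustration; the study’s actual specification and set of controls are broader.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative course-level rows; every column name is an assumption.
courses = pd.DataFrame({
    "completion_rate": [0.41, 0.58, 0.37, 0.62, 0.49, 0.55, 0.44, 0.60],
    "n_hands_on":      [0, 3, 1, 4, 2, 3, 1, 4],          # hands-on items per course
    "avg_learner_age": [31, 28, 35, 29, 33, 30, 34, 27],  # demographic control
    "domain":          ["CS", "DS", "Business", "CS",
                        "DS", "Business", "CS", "DS"],    # course-level control
})

# OLS isolates the hands-on driver while controlling for learner
# demographics and course attributes such as topic domain.
model = smf.ols(
    "completion_rate ~ n_hands_on + avg_learner_age + C(domain)",
    data=courses,
).fit()

# Coefficient: estimated retention change per additional hands-on item,
# holding the controls fixed.
print(model.params["n_hands_on"])
```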

The drivers explored included variables relating to course length, course composition, teaching decisions, and item specifics. For this article, the author has focused on the assessment drivers, particularly those related to hands-on learning, which was a subset of these initial data analysis projects for MOOCs and degrees. Thus, the drivers discussed here include staff-graded assignments, peer review assignments, programming assignments, and hands-on exercises (which includes staff, peer, and programming opportunities together).

This variety in drivers created challenges for attempts to standardize the impact of each and compare them. While course length ranges from three weeks to sixteen weeks, the length of a single video varies from 3 min to 50 min. To account for each driver operating on such a different scale, a unique design was used to normalize the effect across both the MOOC and degree analyses. Specifically, the estimated difference in the outcome metric was calculated between the 25th percentile and the 75th percentile for each driver of interest [3, 6]. For example, the 25th percentile MOOC engages its median student on only 5 different days, while the 75th percentile MOOC engages its median student on 12 distinct days. The difference between these two percentiles can be reported on a single display for each outcome metric of interest, be it retention or skill development. In addition, to examine the effect that the presence of a certain assignment type has on retention or satisfaction, a binary indicator was used: zero for courses without that assignment type and one for courses with at least one such assignment. Comparing the outcome at zero and at one provides an estimate of the impact of including that assignment type in the learning experience. Moving from the 25th to the 75th percentile measures each driver on a standardized scale of percentile differences instead of absolute values.
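The following self-contained sketch illustrates both normalizations under assumed variable names: for a continuous driver, the fitted outcome difference between its 25th and 75th percentile values; for a presence/absence driver, the coefficient on a 0/1 indicator.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative data: one continuous driver (distinct days on which the
# median student engages) and one binary driver (peer review present).
courses = pd.DataFrame({
    "completion_rate": [0.38, 0.44, 0.52, 0.61, 0.47, 0.58, 0.41, 0.65],
    "engaged_days":    [4, 5, 8, 12, 6, 10, 5, 13],
    "has_peer_review": [0, 0, 1, 1, 0, 1, 0, 1],
})
model = smf.ols(
    "completion_rate ~ engaged_days + has_peer_review", data=courses
).fit()

# Continuous driver: predicted outcome change when moving from the 25th
# to the 75th percentile of observed values (here, roughly 5 to 12 days).
q25, q75 = courses["engaged_days"].quantile([0.25, 0.75])
lift = model.params["engaged_days"] * (q75 - q25)
print(f"25th -> 75th percentile retention lift: {lift:+.3f}")

# Binary driver: the coefficient on the 0/1 flag estimates the effect of
# including at least one assignment of that type.
print(f"Peer review presence effect: {model.params['has_peer_review']:+.3f}")
```

With this framing, every driver’s effect can be read on the same scale: the expected outcome change from moving a course from the lower to the upper quartile of observed designs.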

This method provided the additional benefit of staying within the space of what had already been observed in the sample data, instead of extrapolating beyond the observed sample. A consequence of the chosen technique is that the suggested changes to each driver are reasonably achievable, given that they already occur in existing online course designs live on the Coursera platform. By restraining these findings to the observable data, there is sufficient evidence that these changes are both fruitful and possible.

Results

Hands-on assessments emerged as a significant driver across outcome metrics of interest. In both open-access MOOCs as well as closed degree courses, the inclusion of hands-on activities drove increases in retention, satisfaction, skill development, and career outcomes.

MOOC findings

Within MOOCs, using hands-on projects, including both peer review and programming assignments, demonstrated a 5% increase in satisfaction from learners (see figure 4). Peer reviews provide open-ended prompts for submission and can accommodate any type of project, ranging from poetry to images or even a website. Programming assignments allow sophisticated computer grading of code, depending on the custom graders set up by the course faculty. Even though courses with hands-on assignments usually take longer and require more effort to complete, learners are, on average, more satisfied with the course experience when these more substantial assessments are included.

Figure 4.

MOOC drivers’ influence on learner satisfaction.

Plus, while programming assignments are typically among the most time-consuming items for learners to complete, given their hands-on nature and often challenging coding prompts, using in-browser coding can aid engagement and retention. Some MOOCs require students to set up a coding environment on their own computer and conduct the programming project there before uploading their file submission to the platform. This process often requires downloading new software and wastes time troubleshooting the setup. Alternatively, programming assignments can be set up so the coding is done in the browser, with a full environment configured specifically for the programming languages in use, including opportunities to run the code and view outputs within Coursera Labs. “Coursera Labs allows learners to seamlessly work on projects and assignments in a browser without any environment setup or software downloads—simply click a button to instantly work on programming assignments using today’s most in-demand tools like Jupyter Notebook, RStudio, VS Code, cloud software consoles” [28]. The Coursera Labs infrastructure creates a flexible item type that allows instructors to easily configure their own programming exercises with an in-browser coding environment [6, 28].

Comparing these two options, courses that require off-platform coding for their assignment submissions have an average completion rate of approximately 40%, whereas courses with in-browser coding assessments, including Coursera Labs, have an average completion rate of nearly 60% [6] (see figure 5). By removing the need for students to download and configure their own software off the platform, instructors can increase both satisfaction and retention through in-browser coding tools.

Figure 5.

MOOC completion by the type of programming assignment used.

In technical content, using hands-on coding activities can also drive a 30% increase in skill development for course completers as compared with Computer Science (CS) and Data Science (DS) courses not using active programming projects [6]. Thus, while technical courses with only quizzes and exams are often easier and faster for learners to finish, those learners do not end the course feeling as satisfied or with as positive a change in their CS and DS skills. Furthermore, hands-on practice assessments, where the score does not affect learners’ overall grade, show a 10% increase in course completers’ final skill development from the content [6] (see figure 6). This hands-on practice provides low-stakes opportunities for learners to put their new skills into action without the pressure of a graded assessment.

Figure 6.

MOOC drivers on completers’ average skill increase.

Finally, many learners take MOOCs with the goal of having an impact on their career, and hands-on assessments can help this dream become a reality. In CS and DS courses, having a higher proportion of programming assessments increases course completers’ career outcome rate by 3% [6]. While this may not appear to be a large increase, it is important to remember the scale of Coursera’s platform and how many learners enroll in each course. Even a 1% increase in career outcomes equates to thousands of additional career transformations across the hundreds of CS and DS courses currently live on the platform [6, 8].

Degree program findings

The inclusion of hands-on assessments can also boost year-one retention in fully online degree programs, including Bachelor’s and Master’s degrees spanning Business, Computer Science, Public Health, and Data Science. Having at least one staff-graded assessment in the first term of courses increases students’ likelihood of being active in the degree program one year later by 6% [9] (see figure 7). These assessments provide open-ended prompts for students to submit essays, visualizations, recorded presentations, their own websites, and more. Instructors or teaching assistants then provide tailored summative feedback on each student’s work as part of the final grade. These staff-graded assignments often provide hands-on applications of the skills from the degree course and require substantial effort for students to complete.

Figure 7.

Degree course drivers on student first-year retention.

Furthermore, the controlled regression analysis indicated that the presence of at least one hands-on project, instead of exclusively quizzes, increases first-year retention by 3%. These hands-on projects can include peer review, programming assignments, and individual or group staff-graded assessments. Each of these options can provide open-ended prompts for students to apply their new skills in application-style projects instead of answering multiple-choice questions.

Discussion

Overall, hands-on assessments provide authentic projects that mimic industry use cases of the skills being taught and add purpose to the new knowledge learners have gained. The appropriate use of peer review, programming, and staff-graded assignments each enables learners to demonstrate the skills from the course through application-based projects. These hands-on opportunities show learners the benefits of applying their skills to real-world scenarios and help link academic learning to the on-the-job tasks they will need in their careers. Thus, hands-on assessments can increase individuals’ persistence through the material, satisfaction with the course, ultimate skill development from the learning experience, and eventual impact on their careers.

Implications for future practice

With new technologies, educators can now employ both in-person and online hands-on opportunities to supplement their instructional materials. Crucially, instructors are now armed with quantitative insights into how including these types of hands-on activities in online settings can boost learners’ engagement, satisfaction, skill development, and eventual career outcomes. When considering what to build for an online course and where to spend limited resources, course teams can use the impact metrics presented in this article to bolster the case for developing peer review assessments, programming assignments, and simulations. While these learning activities take time and effort to create, their tangible benefits accrue to students in the cohorts to come and should be taken into account when justifying where course production budgets are spent.

Implications for future research

This article is only the first step in assessing the impact of online hands-on learning experiences. At Coursera, pedagogy and data science experts are continuing to work together to dive more deeply into hands-on activities. The first round of research presented here aimed to quantify the benefits of including hands-on assessments. The next layer of research will require looking under the hood and exploring how to build the best hands-on experiences online, considering both human-tagged and automatically recorded variables about each project or simulation. This type of research requires significant time and attention to detail, but only with detailed tagging and thorough regression analysis can course developers learn which aspects of hands-on experiences are most important to include in their designs. Just as practitioners can aid researchers in the design and execution of their studies, researchers can help practitioners by shedding light on the intricacies of how to implement their course elements.

Acknowledgments

Appreciation and thanks to Alan Hickey, Eric Karsten, and Laura Wandres at Coursera for making this article possible. Many thanks to Dr. Sharon Mistretta whose leadership made this entire edition possible.

Conflict of interest

The author declares no conflict of interest.

References

  1. Learner story: finding flexibility and confidence in a content strategy career. Coursera Blog [Internet]; 2016 [cited 2023 Jan 13]. Available from: https://blog.coursera.org/video/finding-flexibility-and-confidence-in-a-content-strategy-career/.
  2. Learner story: becoming an entrepreneur with Coursera. Coursera Blog [Internet]; 2015 [cited 2023 Jan 13]. Available from: https://blog.coursera.org/video/becoming-an-entrepreneur-with-coursera/.
  3. Get inspiration and advice from Paulina’s story—and see how she’s using AI to improve healthcare in Ghana! Coursera Blog [Internet]; 2021 [cited 2023 Jan 13]. Available from: https://blog.coursera.org/see-how-paulina-is-using-ai-skills-learned-on-coursera-to-help-solve-healthcare-challenges-in-ghana/.
  4. Loizzo J, Ertmer PA. MOOCocracy: the learning culture of massive open online courses. Educ Technol Res Dev. 2016;64: 1013–1032.
  5. Pappano L. The year of the MOOC. The New York Times [Internet]. 2012 Nov.
  6. Hickey A, Urban A, Karsten E. Drivers of quality in online learning. Coursera [Internet]; 2020. Available from: https://about.coursera.org/press/wp-content/uploads/2020/10/Coursera_DriversOfQuality_Book_MCR-1126-V4-lr.pdf.
  7. Glassberg Sands E, Reddick R, Karsten E. Women and skills report: addressing gender gaps through online learning. Coursera [Internet]; 2021. Available from: https://about.coursera.org/press/wp-content/uploads/2021/09/Coursera-Women-and-Skills-Report-2021.pdf.
  8. Urban A. Investigating the gender gap in STEM MOOCs [dissertation]. Baltimore, MD: Johns Hopkins University; 2022.
  9. Urban A, Karsten E, Hickey A. Drivers of retention in online degree programs. Coursera [Internet]; 2022 [cited 2022 Dec 20]. Available from: https://about.coursera.org/press/wp-content/uploads/2022/05/Courseras-Drivers-of-Retention-in-Online-Degree-Programs-Report-1.pdf.
  10. Anstead E, Katan S. Introduction to computer programming. Coursera [Internet]; 2023 [cited 2023 Jan 13]. Available from: https://www.coursera.org/learn/introduction-to-computer-programming.
  11. Google Career Certificates. Google IT support professional certificate. Coursera [Internet]; 2023 [cited 2023 Jan 13]. Available from: https://www.coursera.org/professional-certificates/google-it-support.
  12. Glendinning P, Sullivan MV. Photography basics and beyond: from smartphone to DSLR specialization. Coursera [Internet]; 2023 [cited 2023 Jan 13]. Available from: https://www.coursera.org/specializations/photography-basics.
  13. Thomson A, Young KM, Lygo-Baker S, Lothamer C, Snyder CJ. Evaluation of perceived technical skill development by students during instruction in dental extractions in different laboratory settings—A pilot study. J Vet Med Educ. 2019;46: 399–407.
  14. Inghilleri N. Using a hands-on robotics project to affect skill development in a control analysis course [dissertation]. Atlanta, GA: Georgia Institute of Technology; 2021.
  15. Noel TC, Rubin JE, Acebo Guerrero Y, Davis M, Dietz H, Libertucci J, et al. Keeping the microbiology lab alive: essential microbiology lab skill development in the wake of COVID-19. Can J Microbiol. 2020;66: 603–604.
  16. Peoples C. Creative assessment design on a master of science degree in professional software development. J Comput Sci Educ. 2020;12: 46–57.
  17. Rice R. Let’s make our examination and assessment systems fit for the future and equitable for all. TED [Internet]; 2020 [cited 2022 Apr 28]. Available from: https://www.ted.com/talks/roisin_rice_let_s_make_our_examination_and_assessment_systems_fit_for_the_future_and_equitable_for_all.
  18. Burnell I. Widening participation for non-traditional students: Can using alternative assessment methods level the playing field in higher education? Widening Partic Lifelong Learn. 2019;21: 162–173.
  19. Whitteck E, Fritz D. Toward more equitable assessment. Teaching in Higher Ed. [Internet]; 2021 [cited 2021 Jul 15]. Available from: https://teachinginhighered.com/podcast/toward-more-equitable-assessment/.
  20. Elgie C. Digital transformation of pedagogy in design education in the virtual learning environment. In: DEFSA 16th Conference Proceedings 2021. Stadio Higher Education. Pretoria: DEFSA; 2021. Available from: www.defsa.org.za.
  21. Tyagi H. Online teaching in Delhi-NCR schools in India during Covid-19 pandemic. Indian J Sci Technol. 2020;13: 4036–4054.
  22. Baluyos G, Clarin AR. Experiences of instructors in online teaching: a phenomenological study. EduLine: J Educ Learn Innovation. 2021;1: 99–117.
  23. Islam MI, Jahan SS, Chowdhury MTH, Isha SN, Saha AK, Nath SK, et al. Experience of Bangladeshi dental students towards online learning during the COVID-19 pandemic: a web-based cross-sectional study. Int J Environ Res Public Health. 2022 Jul 1;19(13):7786. doi:10.3390/ijerph19137786.
  24. Ong MHA, Yasin NM, Ibrahim NS. Immersive experience during Covid-19: the mediator role of alternative assessment in online learning environment. Int J Interact Mob Technol. 2021;15: 16–32.
  25. Nortvig AM, Petersen AK, Balle SH. A literature review of the factors influencing E-learning and blended learning in relation to learning outcome, student satisfaction and engagement. Electron J Elearn. 2018;16: 46–55.
  26. Masran MN, Ibrahim NHB, Rahim INBA. PBL through MOOC Vs creativity skills among student. Int J Current Sci Multidiscip Res. 2020;3(6):150–155. Available from: www.ijcsmr.in.
  27. Reddick R. Using a Glicko-based algorithm to measure in-course learning. In: EDM 2019 - Proceedings of the 12th International Conference on Educational Data Mining. Washington, DC: ERIC; 2019. p. 754–759.
  28. Maggioncalda J. Coursera introduces hands-on learning with Coursera Labs. Coursera Blog [Internet]; 2019 [cited 2019 Aug 28]. Available from: https://blog.coursera.org/coursera-introduces-hands-on-learning-with-coursera-labs/.


© The Author(s) 2023. Licensee IntechOpen. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.

