Open access peer-reviewed chapter

Together Apart during the COVID-19 Pandemic: Assessing Students’ Readiness for Online Assessments Using an E-Learning System

Written By

Glenda H.E. Gay

Submitted: 06 September 2020 Reviewed: 22 November 2020 Published: 14 December 2020

DOI: 10.5772/intechopen.95097

From the Edited Volume

E-Learning and Digital Education in the Twenty-First Century

Edited by M. Mahruf C. Shohel


Abstract

Electronic learning (e-learning) is an indispensable management system that supports face-to-face, blended, or fully online courses. In January 2020, 258 students in a second-year management course at a regional university were evaluated on their preparedness for online lectures via e-learning. However, by mid-semester, the COVID-19 pandemic halted face-to-face teaching, pushed final assessments to an online modality, and forced some governments to quickly repatriate their students. This chapter evaluates students’ level of e-learning readiness (e-readiness) and whether it had any effect on their performance in the final assessment. The results show that six percent of the cohort had returned to their home country, six percent had no privacy to take their final online assessments, while 31% depended on Wi-Fi. However, although two-thirds of the cohort preferred the online modality, only a third had acceptable levels of e-readiness. E-ready students felt the disruption in their study routine most, while those who were not e-ready found more time to study after the curfew restrictions were in place. E-ready students attempted their final online assessment earlier than those who were not yet e-ready, but the two groups had similar assessment grades. Evaluating students’ level of e-readiness is vital in providing support for those who have challenges with online learning.

Keywords

  • higher education and e-learning
  • e-learning
  • e-learning and assessment
  • e-learning readiness
  • e-readiness
  • COVID-19

1. Introduction

The spread of SARS-CoV-2 and the disease it causes (COVID-19) has affected citizens globally, including the Caribbean region and its academic community. The pandemic has triggered the implementation of strict protocols, including ‘physical distancing’, in an attempt to reduce interpersonal contact. Educational institutions adopted protocols intended to lessen the spread of the virus among clusters of students in classrooms and the complex networks of social groups often found on a campus [1]. The abrupt cessation of face-to-face classes to prevent further community spread has therefore forced an urgent shift to fully online teaching and learning via the use of e-learning systems (ELSs). Indeed, while online education has been expanding at a rate faster than traditional campus-based programs [2], there is no doubt that the pandemic has heightened the focus on the use of ELSs and online learning [3].

The short time available to transition to online learning using an ELS revealed some shortcomings for instructors and students. These include the unknown level of their preparedness for the ELS, inadequate training to quickly adapt to the online learning environment [3], limited or unstable internet connectivity, and an inability to afford the necessary technology such as a laptop [4, 5]. Irrespective of these challenges, the use of ELSs has become an integral part of higher education institutions (HEIs) [6].

This chapter evaluates the effect of e-learning on a cohort of students at a leading Caribbean higher education institution. Data were captured at the start of the semester and after the campus’s temporary closure because of the COVID-19 pandemic. This research focuses on answering three questions:

  • What is the level of e-readiness for this group of management students at a leading Caribbean higher education institution?

  • How does students’ level of e-readiness differ according to key personal demographics (i.e. age, gender) and academic-related factors (i.e. access to Wi-Fi, study time and private space) in the sample?

  • To what extent has students’ level of e-readiness impacted their ability to complete online final assessments?

The next section provides a comparison of face-to-face, online, and blended learning, as well as a review of e-learning and e-learning readiness. This is followed by the context and methodology. The results section evaluates students’ level of e-learning readiness (e-readiness) and their ability to adapt to online learning to complete their studies. Finally, this chapter assesses how students may have been affected by the digital divide and the measures implemented to ensure a level playing field for all in preparation for their final online assessments.

1.1 Face-to-face, online, and blended learning

The critical questions today regarding fully face-to-face, online, and blended learning involve the conditions and strategies that promote student engagement, satisfaction, and retention [7]. Studies on student satisfaction have produced inconsistent results, which may be affected by variables other than the method of course delivery. However, the value of teaching presence in guiding students through their course of study was significant across all three learning modalities [8].

Students in traditional face-to-face courses have the benefit of in-person interaction, where they can be influenced and more motivated to submit assignments, resulting in greater chances of completing the course [9]. However, even though students may be advocates of technology, many of them are seemingly unwilling to forgo the face-to-face learning experience [4]. Students also tend to be more satisfied with a traditional course than students are with online courses [8].

Online learning, as an alternative to traditional face-to-face learning, takes place partially or entirely over the Internet [10]. It encourages synchronous and asynchronous participation and, more importantly, provides the potential for increased enrolment through access to a wider range of students. However, online learning involves a dependence on technology, an expectation of preparedness for the online learning environment, adequate online communication skills, and a predisposition for independent learning [5, 11, 12]. Although most students can easily adapt to the technology associated with online learning, those who are not adequately prepared for this type of experience tend to have challenges with critical thinking, independent learning and time management [12]. Students in strictly online courses tend to be more self-motivated, since the onus is on them to be self-disciplined in order to complete the course. It is therefore not surprising that time management and motivation were found to positively influence their final exam grade [13]. However, feelings of isolation and loneliness are often reported by students in online courses [12].

A meta-analysis and literature review of 176 studies on students in HEIs conducted between 1996 and 2008 compared the effectiveness of face-to-face and online instruction [10]. Their findings showed that students had slightly better exam grades in online courses and suggested that there is support for online learning as a replacement for face-to-face instruction. However, online and face-to-face learning differed in the length of time students reserved for studying. A more recent literature review of 104 studies published between 2014 and 2019 focused on university students in online courses with respect to course design, student support, faculty pedagogy, student engagement, and student success [7]. Their findings showed that for face-to-face courses, increased interaction with course content was associated with better exam grades and increased student satisfaction. For online courses, faculty feedback was paramount, but factors such as previous academic success, self-motivation, family support, workload management, and technology savviness were key for positive course outcomes and increased student satisfaction.

Blended learning refers to the integration of traditional classroom methods with various online instructional technologies to improve teaching [11, 14]. While it is believed to draw on the best elements of the two methods of learning, HEIs are increasingly using blended learning as a complement to, but not a replacement for, traditional forms of learning [7]. This is because of the need to maintain the classroom experience as much as possible, which is difficult to mirror behind the structure of an ELS [15]. Higher education students seem to prefer blended learning, a preference supported by reduced attrition and improved examination marks [9, 14]. However, mature students were noted to drop out of courses more readily than younger students [9, 12, 14].

A study that compared blended and online learning methods regarding learning outcomes reported that students were able to access online support more easily in blended courses than in online courses. However, the workload and difficulty level in online learning environments were perceived to be more challenging, even though there was no significant difference in final scores [16]. Students’ learning strategies [7], e-readiness [5, 17, 18] and levels of motivation [11, 13] were also noted to have a significant effect on both the learning process and learning outcomes.

1.2 E-learning

Electronic learning (e-learning) is defined as the use of technology (e.g., a computer or electronic device) to provide learning material including tutorials, simulations, and case- or game-based learning modules [19]. It is a system that promotes online learning by storing, processing, and distributing teaching materials. It also supports interaction and communication in the context of teaching and learning [6, 12]. However, it is primarily aimed at fostering students’ independence so that they take responsibility for their learning, thus encouraging them to play a more active role in it [11, 19]. In doing so, e-learning has changed the way learning takes place through a student’s experience and perception of studying, and has become the chosen method in higher education [20].

Academic preparedness is known to play a key role in a student’s desire to persist in his or her current program [21]. However, although reliance on ELSs has become an integral part of HEIs, as shown by consistently increasing enrolments in online learning over a 14-year span [2], many students entering university know little about the preparation required to study effectively via an ELS. That is, students must now ‘learn how to learn, how to study, and how to solve problems’ using this type of modality [18]. The lack of readiness or preparation for the ELS has contributed to higher attrition rates for students in blended or online courses than for traditional face-to-face courses [12]. Two studies reported on the implementation of a mandatory online orientation for students at registration. Students were allowed multiple attempts to pass the orientation before they could be given access to the online or blended course [22, 23]. Both studies reported that the students were generally better prepared for and more confident with their courses, with higher retention rates. The importance of e-learning therefore mandates that students’ preparation for an ELS be assessed in order to provide training or support [24].

1.3 Student e-learning readiness

E-learning readiness (e-readiness) was originally developed to assess the digital divide between developed and developing countries, organisations, or individuals [25]. In an educational context, it refers to the digital divide between students’ competence and confidence in using electronic devices for online communication, their preferences in using technology tools, and their ability to be involved in self-directed learning [12]. It also captures the potential experience of users of an ELS [18, 24]. Students’ readiness for online learning is also reported to have a positive impact on their course performance [16]. Students’ level of e-readiness has therefore become one of the core factors in evaluating success with ELSs [4, 7, 18]. Notably, self-directed learning and student motivation were found to be significant for e-readiness [26, 27]. However, attributes such as gender, university, or courses do not seem to show any effect on student e-readiness [17, 18, 24, 28]. Moreover, whether in technical competence, self-directed learning, motivation for learning or online communication skills, there were no significant gender differences in attitudes and behaviours across these dimensions [28]. However, [17] found that students in the second year of a three-year degree programme were more computer and Internet savvy than students in the first or third year, and that third-year students had better study habits than students at levels one or two.

Since students’ career choices and opportunities depend on completion of their studies, it is important that students’ e-readiness be evaluated to ensure that it is not a hindrance to their success [18]. This evaluation could assess students’ ability to work around any technological challenges and to adapt to collaborative learning and training, as well as determine their motivation for synchronous and asynchronous self-paced learning [24]. A benchmark of students’ abilities would indicate which aspects of their skills need to be improved to meet at least the minimum competencies required for the ELS [5].

Assessing students’ level of e-readiness uses specific criteria to establish a baseline for student success in an online course. It is not used to force students into using the ELS per se, but rather to confirm that they are prepared for and receptive to the ELS [29]. Its purpose is to differentiate students with substantial readiness for the online course from those who exhibit deficiencies across various categories. Students with deficiencies can be provided with support or pre-course training to give them the necessary tools and skills for working in the online environment. The categories generally evaluate students’ communicative and collaborative skills, meta-cognitive skills, technology availability and skills, cognitive skills, and self-directed learning [18]. Most categories can be grouped into students’ technical readiness, study habits, and online learning preference, based on research by [29]. These three categories are used in this study and are described in more detail below.

1.3.1 Technical readiness

Technical readiness evaluates students’ possession of and access to technology, such as devices with appropriate software, access to the Internet and stable network connectivity [5]. Studies have reported poor internet connectivity as one of the biggest technical readiness challenges for students. However, even with students’ extensive use of digital devices, there is still a gap between technology used for entertainment and the technology required for the ELS [17]. Successful learning outcomes in online courses require strong digital literacy skills, which suggests that students’ readiness to use the ELS has some influence on academic achievement [18]. An orientation to the ELS prior to starting a course has been shown to benefit students by providing hands-on exercises using tools that they will be expected to use, as well as helping to rectify any technology issues without the added pressure of assignment deadlines [23]. Therefore, students who are proficient in using computers to access the ELS, and who can comfortably use new technologies with stable Internet access, are considered proficient in this category [29].

1.3.2 Study habits

Study habits involve students’ interaction with the course content, rather than engagement with the instructor or other students. Routinely working through the content allows students to become familiar with getting information from the ELS [26]. Having good study habits therefore fosters the ability to effectively navigate the content during the semester. Students who successfully work through a course on their own tend to be more suited to the online learning environment. This includes having the necessary writing skills and the ability to collaborate online. Students who are not savvy with online communication will therefore be hindered during an online course [8, 17, 18].

1.3.3 Preference for online learning

Students’ preference for online learning reflects their reliance on the ELS for the duration of their course [4, 27]. Thus, it is important that they have attributes that are suitable for blended or online learning [29]. These attributes include being highly motivated and self-confident [30], engaging in self-directed learning [17], and having an ability to interact with their peers using the ELS [26].


2. Context of the study

Students at a Caribbean university rely on an ELS to access their course work; it also serves as the central hub for the submission and management of assignments. Generally, students use their own laptops or mobile devices, but desktop computers are also available in five computer laboratories. There is also stable network connectivity on the campus that students use to access the ELS. Classes are mostly scheduled between 8 am and 9 pm during the week. In recent years, students’ dependence on the ELS has escalated, partly because of instructor training in the use of the ELS, but more so because of the ever-increasing number of working students who find it difficult to physically attend face-to-face lectures during the day. However, final examinations are traditionally conducted at the end of each semester in a face-to-face environment on the campus.

A second-year core management course is offered each semester in a blended format. Course topics are revealed weekly via the ELS, supported by tutorial sessions in the computer laboratories. At the beginning of the January 2020 semester, as part of the orientation activities, 258 students were invited to complete an anonymous survey. It sought to determine their level of e-readiness regarding their technical preparedness, study skills and motivation for using the ELS. However, six weeks later, at mid-semester, the SARS-CoV-2 (COVID-19) pandemic had reached the region and the campus community, resulting in a 24-hour curfew and country-wide lockdown. This halted face-to-face teaching, forcing some students to immediately return to their home country. Final examinations were redefined as final assessments and placed online via the ELS. With students off campus and dispersed across the region, a second anonymous survey was posted at week seven via the ELS to garner additional information in preparation for their final online assessment.

Traditionally, students complete their examinations in designated rooms under supervision. Since students were now completing their final assessment in an online environment, the integrity of the process required a different approach. Therefore, the format of the online assessment and its scheduling were major considerations in catering to courses with over 150 students.


3. Survey instruments and exam creation

The two anonymous surveys were created using tools in the ELS. Purposive sampling was used since, according to [31], only the students in this course would have been able to provide the desired information that affected their assessments. The instructor used the information to make informed decisions that supported these students’ needs so that they could successfully complete the topics and assessments for the course. The responses were used to determine collectively whether students still had access to stable network connectivity, and whether their access to technology and the Internet, their study habits, or their predisposition for working online had any influence on their final online assessment.

The first survey comprised 18 items that captured students’ level of e-readiness, while the second, five-item survey was created to obtain information such as their location off campus and network connectivity. Both surveys were reviewed by two experienced instructors and pilot-tested before being released to students. Both survey instruments also captured demographic data such as students’ gender, age range, and year level.

3.1 Survey at the start of semester

This 18-item e-readiness survey was adapted from [29] and used as a framework for evaluating the students’ level of e-readiness. It captured responses in three categories and used a five-point Likert-type scale, ranging from “1 = strongly disagree” to “5 = strongly agree”. The three categories are described in more detail below:

The technical readiness (TR) category captured responses on students’ technology literacy and device setup. The items asked students whether their devices included the specific software applications required for the course, as well as whether they had a dedicated network connection, access to the Internet, or knew how to contact the IT Services help desk;

The study habits (SH) category captured responses on students’ study environment, including whether they had a private space in which to focus on their course work.

The online learning preference (OL) category captured responses that related to students’ routine and preferences. Items required responses on students’ ability to study alone and meet assignment deadlines, as well as their preference for the structure of a classroom environment.

3.2 Survey at midsemester

The second survey comprised five items that asked students about their geographical location, the technology in their possession, the stability of the network connectivity that would allow them to log on to the ELS from their location, as well as their perception of their study and time management skills.

3.3 Online assessment creation

Due to the large number of students in the course, an online quiz format was thought to be the best option, for which higher-order multiple-choice, short-answer and essay items were created. The online assessment was worth 60 marks and comprised two sections. Section one contained 15 multiple-choice items that addressed the eight topics taught in the course. The items were randomly selected from a question bank of about 200 questions that were categorised by topic. Each student was therefore shown one or two items randomly selected from each topic.

Section two was based on a case study, for which three questions were randomly selected from a larger group of questions. To do this, about 80 short-answer and essay-type items were created, based on the eight topics, ensuring the same or a similar level of difficulty and mark allocation. A set of questions, each worth 15 marks, was then designed from these groups of items.
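The selection logic described above can be sketched briefly. The following Python snippet is purely illustrative (the function name, the bank structure and the counts are assumptions, not the ELS’s actual quiz engine): it draws one or two multiple-choice items per topic until 15 are reached, plus three case-study questions from the larger pool.

```python
import random

def build_assessment(mc_bank, case_bank, mc_total=15, case_total=3, seed=None):
    """Hypothetical sketch of the item selection described above.

    mc_bank: dict mapping each of the eight topics to a list of
             multiple-choice item IDs (about 200 items in total).
    case_bank: list of about 80 short-answer/essay item IDs on the case study.
    """
    rng = random.Random(seed)
    topics = list(mc_bank)
    # Every topic contributes at least one item ...
    picks = {t: 1 for t in topics}
    # ... and a random subset of topics contributes a second item,
    # so each student sees one or two items per topic.
    for t in rng.sample(topics, mc_total - len(topics)):
        picks[t] += 1
    mc_items = [item for t in topics for item in rng.sample(mc_bank[t], picks[t])]
    case_items = rng.sample(case_bank, case_total)
    return mc_items, case_items
```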


4. Methodology

4.1 Surveys

Both surveys were created in the ELS and set to receive anonymous responses, so no identifying information for any student was captured. As each survey was completed, it was assigned a uniquely generated ELS number which, on download, provided the ability to merge the raw data from both surveys for statistical analysis. The first survey was distributed during the first week of registration for the course and remained open for two weeks until the add/drop course deadline. Its purpose was to obtain as many responses as possible during the early weeks of the course. The second survey was distributed at week seven after the lockdown and remained open for three weeks. Completed responses from both surveys were selected for the final data set, which was then cleaned and analysed using the Statistical Package for the Social Sciences (SPSS) version 22.0.

The characteristics of an e-ready student were identified by the items that had the highest mean in the technical readiness, study habits, and online learning preference categories. Items with the lowest mean in the same categories denoted challenges for students who would not be adequately prepared for the ELS [5, 17, 29]. To determine the overall level of e-readiness of this cohort, the average score across all three e-readiness scales was calculated. The cohort was rated as satisfactory if this average score was at least four out of a maximum of five points [29]. Furthermore, each student’s level of e-readiness was also calculated and reported as satisfactory if that student’s average rating in each of the three categories was at least four out of a maximum of five points [29].
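As a minimal sketch of this scoring rule (illustrative Python, not the SPSS procedure used in the study), assume each student’s ratings have already been averaged within the three categories:

```python
import statistics

E_READY_THRESHOLD = 4.0  # minimum category average, out of 5 [29]

def is_e_ready(tr, sh, ol, threshold=E_READY_THRESHOLD):
    """A student is e-ready only if every category average meets the threshold."""
    return all(score >= threshold for score in (tr, sh, ol))

def cohort_summary(students):
    """students: list of (TR, SH, OL) category means, one tuple per student."""
    overall = statistics.mean(statistics.mean(s) for s in students)  # cohort average
    ready = sum(is_e_ready(*s) for s in students)                    # e-ready count
    return overall, ready, len(students) - ready

# Example with made-up ratings:
# cohort_summary([(4.8, 4.5, 4.2), (4.1, 3.6, 3.4)]) -> (4.1, 1, 1)
```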

This study used a quantitative approach, with students’ e-readiness as an independent variable. Chi-square tests were used to determine whether there was a significant difference between the two e-readiness groups in their readiness characteristics and their access to computers. A set of independent samples t-tests was used to determine whether there were any significant differences between the two e-readiness groups in their demographics, access to stable connectivity, and ability to manage study time. A paired samples t-test was used to determine whether there was a significant difference in students’ access to a private place and adequate uninterrupted time before and after the lockdown.
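The analyses themselves were run in SPSS 22.0. Purely as an illustration of the three kinds of tests named above, the sketch below shows how they could be reproduced with Python’s scipy.stats; the arrays are placeholders, not the study data.

```python
import numpy as np
from scipy import stats

# Placeholder scores for the two e-readiness groups (not the study data).
e_ready = np.array([4.7, 4.9, 4.6, 4.8, 4.5])
not_ready = np.array([3.9, 4.1, 3.5, 4.3, 3.8, 4.0])

# Independent samples t-test; equal_var=False gives the Welch version,
# which matches the fractional degrees of freedom reported in Table 4.
t_ind, p_ind = stats.ttest_ind(e_ready, not_ready, equal_var=False)

# Chi-square test of independence on a 2x2 contingency table,
# e.g. e-readiness group by access to a laptop.
contingency = np.array([[60, 18],
                        [110, 40]])
chi2, p_chi, dof, expected = stats.chi2_contingency(contingency)

# Paired samples t-test, e.g. ratings of private study space
# before versus after the lockdown for the same students.
before = np.array([5, 5, 4, 5, 4, 5])
after = np.array([4, 5, 3, 4, 4, 4])
t_pair, p_pair = stats.ttest_rel(before, after)
```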

4.2 Final online assessment

The final online assessment, comprising randomly selected multiple-choice, short-answer and essay items, was duplicated four times. Students were also randomly placed into four groups, each allocated to one of the assessments. Figure 1 shows the layout of the final assessments on the course page. Each assessment was visible for 24 hours and was set so that students could only see the assessment to which they were assigned, from midnight to midnight on the following day. However, students were given a duration of one and a half hours to attempt all questions once they accessed the assessment, and were restricted to one attempt. Once a student started the assessment, a timer started a countdown. The format was also set to be sequential in progression, so that a previously completed item could not be revisited. This arrangement and duration were intended to deter cheating and reduce any chance to search for answers.

Figure 1.

Screenshot of the final assessments, showing how they were set up to give each student the impression of accessing the same assessment.
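A minimal sketch of this kind of allocation and scheduling is shown below, assuming (for illustration only) that each duplicated version opened at midnight on a successive day; the names and dates are hypothetical and not taken from the ELS configuration.

```python
import random
from datetime import datetime, timedelta

def allocate_assessments(student_ids, first_opening, n_versions=4, seed=42):
    """Shuffle students into n_versions groups, each tied to one duplicated
    assessment that is visible for a single 24-hour window, with a 90-minute
    attempt limit and a single attempt allowed."""
    rng = random.Random(seed)
    shuffled = rng.sample(student_ids, len(student_ids))
    allocation = {}
    for i, student in enumerate(shuffled):
        version = i % n_versions
        opens = first_opening + timedelta(days=version)   # midnight opening
        allocation[student] = {
            "version": version + 1,
            "opens": opens,
            "closes": opens + timedelta(hours=24),
            "duration_minutes": 90,
            "attempts_allowed": 1,
        }
    return allocation

# Example (hypothetical date):
# allocation = allocate_assessments(list(range(1, 229)), datetime(2020, 5, 4))
```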

In order to understand whether there was any difference between students’ level of e-readiness and the time they started the exam, the time stamps in the assessment logs were first analysed. Any log that showed a start time between midnight and 8 am was assigned a value of one; a value of two was assigned to the time frame between 8:01 am and 10 am, a value of three to the time frame between 10:01 am and 12 noon, and so on, with the value nine assigned to the 10:01 pm to midnight time frame. Smaller values therefore reflected earlier times that students started the examination, while higher values indicated a later start time. A set of independent samples t-tests was used to determine whether there were any significant differences between the two e-readiness groups in the time of day that they started their assessment, as well as in their final assessment grades.
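A minimal sketch of this time-of-day coding (an illustrative helper, not part of the study’s procedure):

```python
import math
from datetime import time

def start_time_bin(start: time) -> int:
    """Map an assessment start time to the 1-9 codes described above:
    midnight-8 am -> 1, 8:01-10 am -> 2, 10:01 am-12 noon -> 3, ...,
    10:01 pm-midnight -> 9."""
    hour = start.hour + start.minute / 60
    if hour <= 8:
        return 1
    return min(9, 1 + math.ceil((hour - 8) / 2))

# Examples: start_time_bin(time(7, 30)) -> 1, time(9, 15) -> 2, time(23, 50) -> 9
```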


5. Results

The surveys were distributed to 258 students, with 252 (97.7%) completed responses received from the first survey and 233 (90.3%) completed responses from the second survey. This resulted in 228 (88.4%) completed responses from students who responded to both surveys. Table 1 shows that around three-quarters of the cohort were female, about two-thirds were in the second year of their degree programme, and most were in the 29 years or younger age range.

Cronbach’s alpha was used to test the internal consistency of the e-readiness items, using 0.7 as an acceptable value for a reliable construct [28, 31]. The instrument was deemed reliable since the alpha coefficient for the composite scale was 0.92, while the three scales ranged between 0.71 and 0.86 (Table 2). The three variables were positively and significantly correlated with each other (p < 0.01). All constructs had strong correlations of above .60 with each other, ranging from .635 to .773.
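For reference, Cronbach’s alpha for a k-item scale can be sketched as follows (illustrative Python; the study used SPSS, and the matrix below would hold the respondents-by-items Likert ratings):

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """ratings: rows = respondents, columns = items of one scale."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                         # number of items in the scale
    item_vars = ratings.var(axis=0, ddof=1)      # variance of each item
    total_var = ratings.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example with made-up ratings for a three-item scale:
# cronbach_alpha(np.array([[5, 4, 5], [4, 4, 3], [3, 2, 3], [5, 5, 4]]))
```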

Pearson correlation analysis was used to examine whether any association existed between the students’ level of e-readiness and demographic variables. None existed for gender (r = −.019, p = .775), age range (r = .303, p = .069) or year level (r = −.024, p = .714).

5.1 Characteristics of student e-readiness

The highest means of the three e-readiness items defined an e-ready student as one who routinely communicates with persons using electronic technologies, has access to the Internet for substantial periods of time, perhaps 45 minutes or so, at least 3 times a week, and is eager to try new technologies and applications. In contrast, students who did not meet the threshold scores for e-readiness had characteristics that involved lack of access to software applications required for the course, and a preference for attending a traditional class on the campus, possibly in order to receive immediate verbal feedback during this type of instruction. A comparison of means for the characteristics that depict a student who is deemed to be e-ready or not e-ready is shown in Table 3.

5.2 Students’ level of e-readiness

The average of the three categories for the cohort was calculated as 4.0 out of a maximum of 5 (80%), indicating that the cohort had met the minimum requirement to be deemed prepared for engaging with the ELS. The high score for the technical readiness category, however, could have offset the lower scores in the study habits and learning predisposition categories. To derive a better understanding of how these students had met or surpassed the required minimum score, each student’s level of e-readiness was then calculated. A student was considered to be e-ready if four or more points were achieved in each of the three e-readiness scales. The results showed that only 78 (34.2%) students were deemed to be e-ready, while 150 (65.8%) exhibited deficiencies in one or more categories.

Three independent samples t-tests were run to determine whether there was a significant difference between students who were deemed e-ready and those not yet e-ready in terms of their technical readiness, study habits, and online learning predisposition (Table 4). The results showed that for technical readiness, there was a significant difference between students who were deemed e-ready (M = 4.76, SD = 0.263) and those who were not yet e-ready (M = 4.02, SD = 0.866), t(194.71) = −9.69, p < 0.05. Thus, students who were deemed to be e-ready had significantly higher technical readiness than those who were not. For their online study habits, there was also a significant difference between e-ready students (M = 4.58, SD = 0.322) and those who were deemed not yet e-ready (M = 3.64, SD = 0.830), t(213.28) = −12.13, p < 0.05. That is, e-ready students also had significantly better online study habits than those who were not. The results for their online learning predisposition found a significant difference between e-ready students (M = 4.49, SD = 0.345) and those who were deemed not yet e-ready (M = 3.39, SD = 0.687), t(225.81) = −13.71, p < 0.05. Again, e-ready students had significantly higher preferences for online learning than those who were not yet e-ready.

5.2.1 Further analysis of non-e-ready students

This section focuses on the 65.8% (150) of the students who were not classified as e-ready. Table 5 examines the extent of the deficiencies exhibited by these students across the three categories. Of the 31.3% (47) in this sub-group who experienced technical challenges, 70.2% (33) were female and 89.4% (42) were 29 years old or younger. For the 58.7% (88) who exhibited unsatisfactory study habits, 79.5% (70) were female, with 82.9% (73) in the 29 years or younger age group. The highest number of students, 84.7% (127), showed deficiencies in the online learning predisposition category, with a similar trend of 77.1% (98) female and 81.1% (103) in the 29 years or younger age group. In each category, over 70% were female and over 80% were 29 years or younger [32].

(n = 228)

Category       Variable       N (%)
Gender         Male           53 (23.2)
               Female         175 (76.8)
Age range      29 or under    185 (81.1)
               30–39          30 (13.2)
               40–49          7 (3.10)
               50–59          6 (2.60)
Year level     First Year     48 (21.1)
               Second Year    150 (65.8)
               Third Year     30 (13.2)
Home country   Local          215 (94.3)
               In Region      13 (5.70)

Table 1.

Demographics of respondents.

E-readiness category              Reliability (α)   M (SD)
Technical readiness (TR)          0.86              4.27 (0.80)
Study habits (SH)                 0.71              3.96 (0.83)
Online learning preference (OL)   0.72              3.77 (0.79)
Average                           0.92              4.00 (0.72)

Table 2.

Reliability of the three e-readiness categories.

E-readiness characteristics and M (SD)

E-ready:
  • Has access to the Internet for substantial periods of time, perhaps 45 minutes or so, at least 3 times a week (TR): 4.54 (1.03)
  • Routinely communicates with others using electronic technology (SH): 4.50 (1.02)
  • Is eager to try new technologies and applications (OL): 4.18 (1.11)

Not e-ready:
  • Does not have access to software applications required for the course (TR): 4.00 (1.18)
  • Prefers to come to campus to attend a traditional class (SH): 3.51 (1.26)
  • Is uncomfortable waiting for written feedback and prefers receiving immediate verbal feedback (OL): 3.41 (1.16)

Table 3.

Characteristics of students who are e-ready or not yet e-ready.

Variable                         E-readiness benchmark   N     M      SD      t        df       p
Technical readiness              Not yet e-ready         150   4.02   0.866   −9.69    194.71   .000
                                 E-ready                 78    4.76   0.263
Study habits                     Not yet e-ready         150   3.64   0.830   −12.13   213.28   .000
                                 E-ready                 78    4.58   0.322
Preference for online learning   Not yet e-ready         150   3.39   0.687   −13.17   225.81   .000
                                 E-ready                 78    4.49   0.345

Table 4.

Differences between students who were deemed e-ready and those who were not yet e-ready in relation to technical readiness, study habits and learning preferences.

Number of students with deficiency in...   Total not e-ready (n = 150)   Age 29 or younger   Gender (female)
Technical readiness                        47 (31.3%)                    42/47 (89.4%)       33/47 (70.2%)
Study habits                               88 (58.7%)                    73/88 (82.9%)       70/88 (79.5%)
Preference for online learning             127 (84.7%)                   103/127 (81.1%)     98/127 (77.1%)

Table 5.

Analysis of students who were not deemed to be e-ready (n = 150).

Further analysis of the 150 students who were not deemed to be e-ready revealed that while 48% (72) were deficient in only one category, 29.3% (44) had deficiencies in two categories, and 22.7% (34) were found to be deficient in all three categories. These details are displayed in Table 6.

E-readiness deficiencies        Students   %
Deficiency in one category      72         48.0
Deficiency in two categories    44         29.3
Deficiency in all categories    34         22.7
Total                           150        100.0

Table 6.

Number of students with deficiencies in one or more categories (i.e. technical readiness, study habits, or online learning predisposition) (n = 150).

5.3 Mid-semester shutdown

With the arrival of COVID-19 in the country in mid-March 2020, and the subsequent shift to remote teaching in order to complete courses and bring the semester to an end, the second anonymous survey garnered responses from students on whether they might have been affected by the digital divide, changes in study habits, or interruptions in the move to fully online learning.

5.3.1 Devices and network connectivity

Students were asked to identify the type(s) of computer or mobile device that they used to access the ELS from their geographical location, as well as the stability of their Internet connectivity. Most students had remained in the country, while 13 (5.7%) had travelled back to their home country. Table 7 shows that while most students had a laptop computer, 10 had access to only a mobile device. Those 10 students indicated that they only had an iPad or mobile phone in their possession. Nine (6.0%) of these students were deemed not to be e-ready.

                                                    E-ready      Not e-ready   Total
Individually                                        78 (34.2%)   150 (65.8%)   228
Location after shutdown
  In campus country                                 72 (92.3%)   143 (95.3%)   215 (94.2%)
  Outside of campus country but in the Caribbean    6 (7.7%)     7 (4.7%)      13 (5.70%)
Access to one set of technology
  Laptop                                            24 (30.7%)   63 (42.0%)    87 (38.1%)
  Desktop                                           4 (5.1%)     2 (1.3%)      6 (2.7%)
  Mobile device (iPad, phone)                       1 (1.3%)     9 (6.0%)      10 (4.4%)
Stability of network
  Always                                            33 (42.3%)   53 (35.3%)    86 (37.7%)
  Sometimes                                         22 (28.2%)   45 (30.0%)    67 (29.4%)
  Rarely                                            1 (1.3%)     4 (2.7%)      5 (2.2%)
  Depend on Wi-Fi only                              22 (28.2%)   48 (32.0%)    70 (30.7%)

Table 7.

Students’ location, access to technology and stable connectivity.

A chi-square test for independence was run to determine whether there was a significant difference between students who were deemed to be e-ready and those not yet e-ready in relation to their access to a desktop computer, laptop PC or Mac, or other mobile device. The results indicated that there was no significant difference in the proportion of students who were e-ready and those who were not yet e-ready with regard to their access to a laptop or Mac, χ2(1, N = 227) = .549, p > 0.05, access to a desktop, χ2(1, N = 227) = .418, p > 0.05, or access to another mobile device, χ2(1, N = 227) = .816, p > 0.05. This suggests that e-ready students did not differ significantly from those who were not yet e-ready in their access to online learning devices.

An independent samples t-test was run to determine whether there was a significant difference between students who were deemed e-ready and those not yet e-ready in terms of the stability of their Internet connection. A Likert scale from 1 to 4 indicated whether the connection was always, sometimes, or rarely stable, or whether there was dependence on Wi-Fi. The test found that there was no significant difference between students who were deemed e-ready (M = 2.85, SD = 1.25) and those who were not yet e-ready (M = 2.69, SD = 1.25), t(226) = −0.912, p > 0.05 (Table 8). Therefore, students who were deemed e-ready had as stable a connection as those who were not yet e-ready.

Variable              E-readiness benchmark   N     M      SD     t       df    p
Stable connectivity   E-ready                 78    2.85   1.25   −.912   226   .363
                      Not yet e-ready         150   2.69   1.25

Table 8.

Independent samples t-test determined that there was no significant difference between students who were deemed e-ready and those who were not yet e-ready regarding the stability of their internet connection.

5.3.2 Time management

Students were now dispersed from the campus environs, dependent on their own technology and access to consistent network connectivity. Based on the significant number of students who had poor study habits, and concerns about online learning, an independent samples t-test was run to determine whether there was a significant difference between students who were deemed e-ready and those not yet e-ready in terms of their ability to manage their study time. A Likert scale was used, ranging from 1, indicating that they believed they were able to successfully manage their study time, to 4, indicating an inability to manage study time at all. The test found that there was a significant difference between students who were deemed e-ready (M = 3.40, SD = 0.706) and those who were not yet e-ready (M = 3.18, SD = 0.650), t(226) = −2.378, p < 0.05 (Table 9). Therefore, students who were deemed e-ready had significantly better time management skills than those who were not yet e-ready.

Variable          E-readiness benchmark   N     M      SD      t        df    p
Time management   E-ready                 78    3.40   0.706   −2.378   226   .018
                  Not yet e-ready         150   3.18   0.649

Table 9.

Independent samples t-test determined students who were deemed e-ready had significantly better time management skills than those who were not yet e-ready.

5.3.3 Private space, and adequate time for studying

Based on the previous results, it was important to further assess the extent to which the rhythm of students’ semester was disrupted by the campus lockdown and the subsequent move to online learning via the ELS. Therefore, having a private place at home or work, and having adequate uninterrupted time to study, were assessed for both e-readiness groups of students.

5.3.3.1 E-ready students

A paired samples t-test was run to determine if there was a significant difference between e-ready students having a private place at home or work they could use for extended periods before and after the lockdown. The results revealed that students who were e-ready had access to such a place significantly more so before the lockdown (M = 4.85, SD = 0.40), than after (M = 4.26, SD = 0.95), (t (77) = 5.082, p < 0.05).

A paired samples t-test was also run to determine if there was significant difference between e-ready students having adequate time that would be uninterrupted in which they could work on their online courses before and after the lockdown. The results revealed that students had adequate uninterrupted time significantly more so before the lockdown (M = 4.71, SD = 0.51), than after (M = 4.32, SD = 0.86), (t (77) = 3.243, p < 0.05). Thus, these results indicate that students’ access to these conditions significantly decreased after the lockdown. Table 10 shows the results of these two conditions.

E-ready                 N    M      SD      t       df   p
Private place before    78   4.85   0.397   5.082   77   .000
Private place after          4.26   0.946
Study-time before       78   4.71   0.512   3.243   77   .002
Study-time after             4.32   0.860

Table 10.

A paired samples t-test showed that there was less privacy to study and also less time to study after the COVID-19 lockdown.

5.3.3.2 Students not deemed to be e-ready

A paired samples t-test was run to determine if there was significant difference between students who were not yet e-ready having a private place at home or work that they could use for extended periods before and after the lockdown. The results revealed that these students had similar access to such a place before the lockdown (M = 3.93, SD = 1.28), and after (M = 3.78, SD = 1.00), as there was no significant difference (t (149) = 1.271, p > 0.05). Therefore, students indicated that their access to such an environment, or lack thereof, did not change.

A paired samples t-test was run to determine if there was significant difference between students who were not yet e-ready having adequate time that would be uninterrupted in which they could work on their online courses before and after the lockdown. The results revealed that students had less uninterrupted time to work before the lockdown (M = 3.57, SD = 1.24), than after (M = 4.06, SD = 0.84), (t (149) = −4.368, p < 0.05). Table 11 shows the results of the t-test.

Not e-ready             N     M      SD      t        df    p
Private place before    150   3.93   1.278   1.271    149   .206
Private place after           3.78   1.001
Study-time before       150   3.57   1.239   −4.368   149   .000
Study-time after              4.06   0.837

Table 11.

A paired samples t-test showed that there was no change in access to a private place to study, but more time to study, after the lockdown.

5.4 Preparing for and taking the final online assessment

An independent samples t-test was then run to determine if there was a significant difference between students who were deemed e-ready and those not yet e-ready, regarding the time they started the final online assessment. The test found that there was a significant difference between students who were deemed e-ready (M = 4.38, SD = 2.17), (t (225) = 2.153, p < 0.05) and those who were not yet e-ready (M = 5.01, SD = 2.05) (see Table 12). Therefore, e-ready students started their online assessment earlier than those who were not yet e-ready.

Variable                    E-readiness benchmark   N     M      SD      t       df    p
Time student started exam   E-ready                 78    4.38   2.170   2.153   225   .032
                            Not yet e-ready         150   5.01   2.047

Table 12.

An independent samples t-test determined that e-ready students started their online examination earlier than those who were not yet e-ready.

5.5 Final course results

An independent samples t-test was run to determine if there was a significant difference between e-ready students and those not yet e-ready in relation to their final grade. The test found that there was no significant difference between e-ready students (M = 39.41, SD = 6.10), (t (225) = −0.308, p > 0.05) and those who were not yet e-ready (M = 39.14, SD = 6.33) (Table 13). Therefore, both groups of students had similar final grades.

Variable      E-readiness benchmark   N     M       SD      t       df    p
Final grade   E-ready                 78    39.41   6.100   −.308   225   .758
              Not yet e-ready         150   39.14   6.332

Table 13.

Independent samples t-test to determine whether there was a significant difference between e-ready students and those not yet e-ready in relation to their final grade.

The final assessment results for the cohort are shown in Table 14. A slightly higher proportion of e-ready students scored in the A- to A+ range, while students who were not deemed to be e-ready scored slightly more often in the B- to B+ and C to C+ ranges. Three students who were not deemed to be e-ready failed or were absent for the final examination.

Grade      E-ready N (%)   Not e-ready N (%)   Total N (%)
A- to A+   33 (42.3)       53 (35.3)           86 (37.7)
B- to B+   39 (50.0)       77 (51.3)           116 (50.9)
C to C+    6 (7.7)         17 (11.3)           23 (10.1)
Failed     0 (0)           2 (1.3)             2 (0.9)
Absent     0 (0)           1 (0.7)             1 (0.4)
Total      78 (34.2)       150 (65.8)          228 (100)

Table 14.

Comparison of examination results for students who were e-ready and not e-ready.


6. Discussion

The demographic data of this cohort were representative of other studies, with the majority of students being female, in the 29 years or younger age range, and at level 2 of their studies [4, 5, 17, 27]. Analyses of student characteristics and the level of e-readiness of this cohort showed no significant differences between demographics such as age or gender and any of the e-readiness variables, as reported in other research [17, 18, 27].

6.1 Characteristics of student e-readiness

The characteristics of e-ready students identified in this study were similar to those in other studies, especially the natural inclination towards adapting to new technologies [4, 5, 27]. Characteristics of students who did not meet the criteria for e-readiness were also similar to those of students who preferred the face-to-face component of blended courses [21]. Students who experience technology challenges or require immediate verbal feedback highlight a resistance towards changing their study habits and learning preferences in order to interact with an ELS [4, 5]. Students who can easily access help desk support in face-to-face or blended courses may be reluctant to switch to an online format if the course places high value on course assignments and outcomes. The dependence on technology for assistance and support may negatively impact their preference for an online course [11].

6.2 Students’ level of e-readiness

The results revealed that about a third of the students were deemed to be e-ready, while about two-thirds showed deficiencies in one or more e-readiness categories. Other studies reported comparable results with similar cohorts, especially where e-ready students were found to be technologically savvy and had acquired some computer literacy skills [4, 5, 17]. However, for those with deficiencies, challenges still exist in adapting their learning lifestyle and study habits towards working with an ELS [5, 17].

Providing students with an orientation in the use of the ELS might still be useful, in spite of their general lack of interest in what may appear to be another course or workshop [5, 22, 23]. A compulsory orientation to the ELS prior to enrolment in their courses could give students an opportunity to become more comfortable with the ELS. Hands-on exercises that involve interacting with the technology tools they will be using in their course, along with time to resolve any potential technology barriers, can be beneficial without the added anxiety associated with coursework or grades. In this study, while an orientation activity was available for this cohort of students as part of the introductory tasks, it did not specifically focus on training for the ELS. As noted by [17], since most students were in the second year of their studies, it could have been assumed that they would have had the requisite knowledge or expertise at this level of their studies.

Further analyses of students’ level of e-readiness or lack thereof, showed that e-ready students had significantly higher technical readiness, significantly better online study habits, and significantly higher preferences for online learning than those who were not yet e-ready. This is supported by other studies that indicated that students who take responsibility for their learning and have good study habits are suited for working in the ELS [18].

In contrast, a drill-down of the data to evaluate the extent of the deficiencies exhibited by students on the three measures also supports studies in which students tended to rate highest on the technical readiness scales but lowest on the self-directed or learning predisposition scales [5, 17, 18]. One consideration is that students could be distracted by other online activities while learning online, thus affecting their study habits [4, 17].

In this study, of the 150 students who were not deemed to be e-ready, 48% (72) were deficient in only one category, while the 52% (78) who were deficient in two or three categories are a cause for concern. The sudden move to online learning might not have given these students time to adapt or to seek additional resources in order to become prepared for the change to online assessment. Although students are mostly exposed to and possess basic technological skills, significant challenges remain in adapting their lifestyle and learning to interacting with an ELS [32]. One such challenge that is frustrating for students is the lack of help desk assistance when facing technical problems while learning online [8].

Furthermore, 34 students (22.7%) were deficient in all three categories and would be at a distinct disadvantage based on their inability to effectively use the technology, combined with poor study habits and a preference for the face-to-face environment. These results are comparable to another study that reported 25.1% of its blended cohort and 22.3% of its online cohort with similar deficiencies in multiple categories [5]. Even though these numbers are small, all students should be given the resources and support for a fair chance at successfully completing their course of study. Students with technical deficiencies are encouraged to visit the campus IT service desk with software and hardware issues to receive technical assistance. They could also visit one of the computer laboratories on the campus to work on their assignments. However, with a stay-at-home order, this would have compounded the situation for these students.

The other two deficiencies - study habits and online learning predisposition categories - were beyond the expertise of the course facilitator to directly support the students’ needs. However, in an effort to provide guidance to these students, hyperlinks and information on the university’s student services were shared during scheduled weekly lectures and posted on the ELS. These services include useful information and scheduled appointments to discuss study habits and academic support, whether in an online or face-to-face setting.

6.3 Mid-semester shutdown

6.3.1 Devices and network connectivity

Regarding the type(s) of computer or mobile device that students used to access the ELS from their geographical location, as well as the stability of their Internet connectivity, most students had a laptop computer; however, it was concerning that of the 10 students who had access to only a mobile device, nine (6.0%) were not deemed to be e-ready. Those students indicated that they only had an iPad or mobile phone in their possession. The fact that 10 (4.4%) students relied on an iPad or mobile phone, while 70 (30.7%) students depended on Wi-Fi, was a cause for concern regarding their continued access and ability to manipulate content in the ELS for the final online assessment. While these numbers were small, it was imperative that those caught in the digital divide were also given a fair chance to complete their courses. Fortunately, some internet service providers quickly offered to assist students by delivering laptops to those who required them for the duration of the assessment period [33, 34].

Based on these contributions, the chi-square test determined that there was no significant difference in the proportion of e-ready students and not yet e-ready students regarding access to a laptop PC or Mac, or another mobile device, and the t-test found no significant difference in the stability of their internet connection. That is, e-ready students did not differ significantly from those who were not yet e-ready in their access to online learning devices. Also, students who were deemed e-ready had as stable a connection as those who were not yet e-ready. These results differ from [17], who reported that 80% of their respondents had challenges with poor connectivity.

6.3.2 Time management

E-ready students were found to have significantly better time management skills than those who were not yet e-ready. These results seem realistic, since students who have good study skills are reported to be more organised and to manage their time better [26]. For those students who were not deemed to be e-ready, the results are supported by [17], who noted that a lack of self-directed learning or motivation for learning was a major challenge for this category of student, since it is reflected in poor study habits and therefore poor time management skills. In this study, it is unknown whether any students had sought assistance from the Office of Student Services before the curfew. Nevertheless, one could argue that they would probably not have had enough time to successfully follow through with any interventions before the disruption. Moreover, students employed in security, medical and other essential services would most likely have seen an increase in their workload. This could have negatively impacted their ability to successfully manage their study time.

6.3.3 Private space, and adequate time for studying

The results indicated that students who were deemed to be e-ready had less privacy and less time to study after the lockdown. These results highlight the impact of the COVID-19 pandemic on this group of e-ready students. A similar study reported that this category of student typically has high expectations of themselves regarding their study schedules and can manage their time well [17]. Additionally, as noted by [27], more than 60% of the students in their research studied at home during COVID-19, compared to 94.3% of the students in this study. E-ready students, who are seemingly disciplined, could previously have used a private space such as the library on campus or one of the computer laboratories to spend time on their assignments. However, with the lockdown in place, this would have resulted in restricted access to the campus library, possible distractions from other family members, or having to reduce time spent online in order to allow another family member to use the computer or laptop. These factors could have disrupted an otherwise scheduled study routine.

For students who were not deemed to be e-ready, the results showed that while access to a private place for study remained the same before and after the lockdown, they appeared to have more time to study after the lockdown. This seems to have worked to the benefit of this group of students, since the country was under lockdown for over four weeks. While the lack of privacy remained during the lockdown, the importance of studying for final assessments, or the confinement to one location, could have created some personal and perhaps family structure around the time dedicated to studying.

6.4 Preparing for and taking the final online assessment

The results showed that e-ready students started their final online assessment earlier than those who were not yet e-ready. One could suggest that students who were not yet e-ready may still have lacked confidence with working in the online environment, which could have contributed to their delay in starting the online assessment. They could also have used the available time to study further, thus waiting until the last moment to take the online assessment. In contrast, students who were e-ready could have had the confidence to complete the assessment early in order to study for other assessments.

6.5 Final course results

This study found that both groups of students had similar final grades. This result is supported by [9], who reported no significant differences in final scores, and by [16], who showed that the format of instructional delivery may not affect students’ learning outcomes. However, as suggested by [9, 10, 14], the blended learning activities provided in this course prior to the lockdown may have had some influence on the students’ final marks.

Based on these final results, one could suggest that the course content was accessible to all students. Additionally, the scheduling of the final assessment could be considered fair, in that students were given the opportunity to select when to take the assessment and were allowed a set duration to complete the required questions. Moreover, students who were not deemed e-ready could have benefitted from the additional time they took before starting the assessment, while those who were e-ready could have suffered from the disruption to their study routine. Nevertheless, most of the e-ready students performed slightly better in the A- to A+ range. Students who were not deemed e-ready performed slightly better in the B- to B+ and C to C+ ranges. Three students who were not deemed e-ready failed or were absent for the final assessment.


7. Conclusions

This study has shown that only about a third of this cohort were deemed e-ready. They displayed significantly higher technical readiness, significantly better online study habits, and a significantly stronger preference for online learning than those who were not yet e-ready. As confirmed by [4], it should not be assumed that all students have access to a laptop and a quiet study space. Those who relied on visiting the library and using the campus computer labs could have been especially disadvantaged if they lacked access to technology and adequate space and time to study while under lockdown. Fortunately, both categories of students had about the same access to online learning devices and stable connectivity. Further, private companies and certain university services had arranged for students on the wrong side of the digital divide to receive laptops for the duration of the lockdown.

For e-ready students, the impact of the lockdown was felt most by a disruption in their study routine. This is supported by [13, 26] who found that time management was a necessary factor in positively influencing a student’s final grade. The findings in this study also revealed that e-ready students had less privacy and less time to study after the lockdown. However, for those who were not yet e-ready, while there was no change in their private space for study, they found more time to study after the lockdown.

It may seem that e-ready students are more disciplined and would therefore spend more time in a quiet space with little to no disruption. Students who are not yet e-ready, however, are known to prefer a classroom environment. The scheduled class time each week could have provided a welcome routine and enforced some time management for the duration of the lecture. The lockdown could then have mimicked some of that environment for these students, as the home now became both a place and a space for learning. The findings of [13, 17] suggest that, in this case, the forced time at home to study could have been a silver lining for this category of student. One can only wonder to what extent this change affected their study routine and time management skills, given other influences beyond their control.

The results showed that 31% of the students had technical challenges, over 60% had issues with study habits, and over 80% had weaknesses with working in the online environment. Further analyses are required to determine whether those who were borderline e-ready could have slipped below the benchmark after the lockdown. Perhaps a reassessment of students’ level of e-readiness after the lockdown could have highlighted the plight of those on that e-readiness boundary; a hypothetical sketch of such a follow-up is given below.
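
To make the suggested follow-up concrete, the sketch below flags students whose pre-lockdown readiness scores sit close to the e-readiness benchmark, so that they could be re-surveyed after the disruption. The benchmark value, margin, and score fields are assumptions for illustration only and do not reflect the chapter's actual scoring instrument.

```python
# Hypothetical sketch: flag "borderline" e-ready students for re-assessment.
# BENCHMARK and MARGIN are assumed values, not the study's published cut-offs.
BENCHMARK = 3.5   # assumed e-readiness cut-off on a 5-point scale
MARGIN = 0.25     # students within this distance of the cut-off are flagged

students = [
    {"id": "s01", "pre_lockdown_score": 3.6},
    {"id": "s02", "pre_lockdown_score": 4.2},
    {"id": "s03", "pre_lockdown_score": 3.4},
]

# Students near the benchmark are the ones most likely to have slipped
# below it once the lockdown disrupted their study routines.
borderline = [s["id"] for s in students
              if abs(s["pre_lockdown_score"] - BENCHMARK) <= MARGIN]
print(borderline)  # candidates for a post-lockdown re-survey
```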

The results also revealed that students’ level of e-readiness has some influence on when they take their online assessment via an ELS. It also affects their final assessment results through their self-discipline, study routine, and study habits, as supported by other studies [26, 29]. Similar to [10], e-ready students had slightly better exam grades: in this study, they performed slightly better in the A- to A+ range, while those not yet e-ready performed slightly better in the B- to B+ and C to C+ ranges.


8. Recommendations

While the Office of Student Services remained available for academic support through online consultations during the lockdown, its extensive services seem not to be known to the wider student body. If the e-readiness evaluation is implemented, students who are deemed not yet e-ready could benefit from these resources from the outset. At the beginning of each semester, assistance with improving study habits, or perhaps guidance on building a study schedule, could be provided to give them a good chance of excelling during the semester. For students in this course, a recommendation to visit student services during the early weeks of the semester could have had a positive impact on those who were not deemed e-ready. However, it is not known whether any students made appointments and, more importantly, if they did, whether they followed through with their sessions and changed any practices.

It is further recommended that, during the initial weeks of every semester, officers of student services schedule visits or join courses, whether face-to-face or online, to introduce students to the various types of academic support and to share information and quick tips. Instructors could also place links and ‘information bits’ on course pages so that students are aware of the range of support services. Furthermore, supported by the findings of this study and those of [12, 19], it is strongly recommended that, prior to the start of their courses, students complete an ELS orientation that includes an e-readiness assessment, even if it is a self-evaluation. This assessment, also supported by [21, 22], would provide an initial baseline and highlight any e-readiness deficiencies that require additional support or ELS training. Through re-evaluation in subsequent semesters, students could monitor changes from previous semesters and pinpoint the type of assistance they may need before the semester begins. It is hoped that students would develop a different kind of self-awareness in evaluating their own proficiency when working via the ELS. The end result could be increased student satisfaction and an overall improvement in student outcomes, which is ultimately important for university administrators, instructors, and indeed the students themselves.

Thanks

Sincere appreciation and admiration to the students in this course and across the globe who persisted in the midst of the COVID-19 pandemic to complete their online assessments.

References

1. Weeden KA, Cornwell B. The small-world network of college classes: Implications for epidemic spread on a university campus. Sociological Science. 2020;7:222-241
2. Seaman JE, Allen IE, Seaman J. Grade increase: Tracking distance education in the United States. Babson Park, MA: Babson Survey Research Group, LLC; 2018
3. Rashid S, Yadav SS. Impact of Covid-19 pandemic on higher education and research. Indian Journal of Human Development. 2020. DOI: https://doi.org/10.1177/0973703020946700
4. Adams D, Sumintono B, Mohamed A, Noor NSM. E-learning readiness among students of diverse backgrounds in a leading Malaysian higher education institution. Malaysian Journal of Learning and Instruction. 2018;15(2):227-256
5. Gay G. Fixing the 'Ready' in e-learning readiness. In: Sinecen M, editor. Trends in E-learning. London: InTech Open; 2018. p. 22. DOI: 10.5772/intechopen.74287
6. Chang C-L, Fang M. E-Learning and online instructions of higher education during the 2019 novel coronavirus diseases (COVID-19) epidemic. ICCASIT 2020. Journal of Physics: Conference Series: IOP Publishing; 2020
7. Lockman AS, Schirmer BR. Online instruction in higher education: Promising, research-based, and evidence-based practices. Journal of Education and e-Learning Research. 2020;7(2):130-152
8. Young S, Duncan HE. Online and face-to-face teaching: How do student ratings differ? MERLOT Journal of Online Learning and Teaching. 2014;10(1):70-79
9. Burns K, Duncan M, Sweeney DC, North JW. A longitudinal comparison of course delivery modes of an introductory information systems course and their impact on a subsequent information systems course. MERLOT Journal of Online Learning and Teaching. 2013;9(4):453-467
10. Means B, Toyama Y, Murphy R, Bakia M, Jones K. Evaluation of evidence-based practices in online learning: A meta-analysis and review of online-learning studies. 2010 [cited 2014 Nov 12]. Available from: https://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf
11. Keskin S, Yurdugül H. Factors affecting students’ preferences for online and blended learning: Motivational vs. cognitive. European Journal of Open, Distance and E-Learning. 2019. DOI: 10.2478/eurodl-2019-0011
12. Parkes M, Stein S, Reading C. Student preparedness for university e-learning environments. The Internet and Higher Education. 2015;25(1):1-10
13. Broadbent J. Comparing online and blended learner's self-regulated learning strategies and academic performance. The Internet and Higher Education. 2017;33:24-32. DOI: https://doi.org/10.1016/j.iheduc.2017.01.004
14. López-Pérez MV, Pérez-López MC, Rodríguez-Ariza L. Blended learning in higher education: Students’ perceptions and their relation to outcomes. Computers & Education. 2011;56:818-826
15. Murphy MPA. COVID-19 and emergency eLearning: Consequences of the securitization of higher education for post-pandemic pedagogy. Contemporary Security Policy. 2020;41(3):492-505. DOI: 10.1080/13523260.2020.1761749
16. Lim DH, Morris ML. Learner and instructional factors influencing learning outcomes within a blended learning environment. Educational Technology & Society. 2009;12(4):282-293
17. Chung E, Noor NM, Mathew VN. Are you ready? An assessment of online learning readiness among university students. International Journal of Academic Research in Progressive Education and Development. 2020;9(1):301-317. DOI: 10.6007/IJARPED/v9-i1/7128
18. Rasouli A, Rahbania Z, Attaran M. Students’ readiness for E-learning application in higher education. Malaysian Online Journal of Educational Technology. 2016;4(3):51-64
19. Wargadinata W, Maimunah I, Dewi E, Rofiq Z. Student’s responses on learning in the early COVID-19 pandemic. Journal of Education and Teacher Training. 2020;5(1):141-153
20. Garrison D, Anderson T. E-learning in the 21st century: A framework for research and practice. London: Routledge/Falmer; 2003
21. Stoebe A, Grebing R. The effect of new student orientations on the retention of online students. Online Journal of Distance Learning Administration. 2020;23(2)
22. Taylor JM, Dunn M, Winn SK. Innovative orientation leads to improved success in online courses. Online Learning. 2015;19(4):1-9
23. Jones KR. Developing and implementing a mandatory online student orientation. Journal of Asynchronous Learning Networks. 2013;17(1):43-45
24. Hashim H, Tasir Z, editors. E-learning readiness: A literature review. 2014 International Conference on Teaching and Learning in Computing and Engineering (LaTiCE); 2014; Kuching: IEEE
25. Hung W-H, Chang L-M, Lin C-P, Hsiao C-H. E-readiness of website acceptance and implementation in SMEs. Computers in Human Behavior. 2014;40:44-55
26. Kaymak ZD, Horzum MB. Relationship between online learning readiness and structure and interaction of online learning students. Kuram ve Uygulamada Egitim Bilimleri. 2013;13(3):1792-1797
27. Chung E, Subramaniam G, Dass LC. Online learning readiness among university students in Malaysia amidst Covid-19. Asian Journal of University Education. 2020;16(2):46-58. DOI: https://doi.org/10.24191/ajue.v16i2.10294
28. Hung ML, Chou C, Chen CH, Own ZY. Learner readiness for online learning: Scale development and student perceptions. Computers & Education. 2010;55(3):1080-1090
29. Holsapple CW, Lee-Post A. Defining, assessing, and promoting e-learning success: An information systems perspective. Decision Sciences Journal of Innovative Education. 2006;4(1):67-85
30. Baturay M, Yukselturk E. The role of online education preferences on student achievement. Turkish Online Journal of Distance Education. 2015;16(3):1-10
31. Sekaran U. Research methods for business: A skill-building approach. New York: John Wiley & Sons; 2003
32. Mandernach BJ, Mason T, Forrest KD, Hackathorn J. Faculty views on the appropriateness of teaching undergraduate psychology courses online. Teaching of Psychology. 2012;39(3):203-208. DOI: 10.1177/0098628312450437
33. Flow providing connections for students. Barbados Today [Internet]. May 1, 2020 [cited 2020 October 19]. Available from: https://barbadostoday.bb/2020/05/01/flow-providing-connections-for-students/
34. UWI Mona to distribute over 500 tablets to students in need. Loop News [Internet]. 27 April, 2020. Available from: https://www.loopjamaica.com/content/uwi-mona-distribute-over-500-tablets-needy-students
