Open access peer-reviewed chapter

Fixing the ‘Ready’ in E-Learning Readiness

Written By

Glenda H. E. Gay

Submitted: 09 October 2017 Reviewed: 23 January 2018 Published: 01 August 2018

DOI: 10.5772/intechopen.74287

From the Edited Volume

Trends in E-learning

Edited by Mahmut Sinecen


Abstract

Evaluating the effectiveness of e-learning systems (ELSs) for course delivery can be achieved by measuring the user’s level of readiness for the ELS. While e-learning readiness (e-readiness) is well researched using several models, studies generally provide recommendations for the institution or instructor. However, many students are not adequately equipped to use the ELS. This chapter focuses on assisting students in online and face-to-face courses who have e-readiness challenges when accessing an ELS throughout a semester. A survey captures responses on their technological, lifestyle and learning preparedness for the ELS to produce an e-readiness score. A modified DeLone and McLean model evaluates the impact of their level of e-readiness during their use of the ELS. Identifying where and when students have difficulties, pinpointing their deficits or recommending the more appropriate modality could help students achieve a positive course outcome.

Keywords

  • e-learning readiness
  • e-learning systems
  • student preparedness
  • blended course
  • online course

1. Introduction

Electronic learning (e-learning) has become an essential feature in the delivery of distance education. Its effectiveness relies on a stable network with specific software, a repository for managing the delivery of content, and a good social environment created by the online interaction among students [1]. This interaction at any time and from anywhere has resulted in extensive integration of e-learning systems (ELSs) in most universities [2]. With the major shift in how students learn, and therefore how they are taught, there is an increasing need to understand what contributes to student satisfaction when using an ELS. These systems can be used to enhance students’ learning in a classroom setting by incorporating online resources including discussion boards, quizzes, chat sessions and assignment tracking [3]. However, instructors tend to be unaware of the level of their students’ social, communication, and technological competencies that are necessary for ELSs [4]. Indeed, the strength of the connection among students as they interact socially and academically during their courses influences retention rates [2].

ELSs such as Blackboard and Moodle facilitate instruction in courses that offer face-to-face, blended, or online delivery to students. Blended learning emphasizes the central role of the ELS, enabling access and flexibility while reducing face-to-face contact hours, whereas online learning relies solely on the ELS with no face-to-face contact hours [5]. Studying via a blended or online modality has its benefits. Student interaction in the ELS, involving group and individual projects, discussions, and assignments, was reported to be the most effective learning activity in both modes of study [6]. Furthermore, students who initially enrolled in either mode were better prepared and performed significantly better in subsequent courses of the same modality [7, 8]. Students who are able to understand course materials in an online format and interact with an ELS could be well suited to a blended course where there is less need to meet as often with the instructor [9, 10], or to a fully online course if they are comfortable working independently [9]. Researchers who examined differences in learning outcomes for students in either mode concluded that there was little effect on their learning or application of learning [6], little to no significant difference among students regarding their final grades [9], and no significant differences among demographics such as age, gender, area of residence, and academic class level [8]. Regardless of the mode of delivery, it is the students’ experiences, expectations and perceptions of the ELS and its tools that influence successful e-learning outcomes [7].

Even though students may be reasonably prepared to deal with the technology of e-learning, major weaknesses were reported in lifestyle and learning preparedness regarding the quality of academic work required, including synthesizing ideas and working with others [11, 12]. While the network infrastructure, hardware and software address the technical side of an ELS, students’ interaction in a course provides the necessary non-technical aspects of the e-learning experience. Therefore, a core requirement in assessing the effectiveness of a course is students’ level of preparedness at course orientation, and engagement during course delivery [13, 14]. E-learning readiness (e-readiness) encompasses the seamless nature of students’ technological, lifestyle, and organisational preparation for the ELS, and is characterized by their competence in “using” an ELS and its technology tools [14]. In 2015, the World Economic Forum assessed 143 countries on their state of e-readiness regarding ICT infrastructure, affordability of ICTs, and capacity of the population to make effective use of ICTs. However, these e-readiness ranks did not cover tertiary level education. As developing countries aim to create effective knowledge economies and enhance lifelong learning, additional research is necessary to evaluate the success of students in both developed and developing countries who are enrolled in tertiary level courses that incorporate online components [15, 16]. In an educational context, therefore, e-readiness is defined as the capability of e-learning users to adapt to a new learning environment, use new technologies, and be involved in self-directed learning [14]. However, students who have returned to further their education at tertiary level may be doing so for the first time using an unfamiliar interface such as an ELS [17]. An e-ready student should be capable of efficiently and effectively applying the essential technology tools in an ELS in order to satisfactorily interact with the content and engage other students [13, 18]. Additional reports of underutilization of ELSs by students in developing countries also identify a need to understand why this occurs and how it can be addressed [17, 19].

While research on e-readiness using several models is not new, much of it is still limited [13] and has not been able to keep pace with new technologies that support the social and interactive nature of e-learning [20, 21]. Few studies using e-readiness factors have developed benchmarks for the e-readiness of students. One study assessed the preparedness of students for a range of e-learning competencies and identified an overall ‘low level’ of perceived student preparedness [12]. This was attributed to students having little or no exposure to ELSs prior to their university studies. Another study that evaluated students’ use of technology prior to taking an online course showed that they had ‘less than average’ training in technology requirements [14]. Further examination of students’ e-readiness with course structure and interaction in the ELS found that ‘higher levels’ of e-readiness were positively associated with increased student interaction and less reliance on structure and guidance in an online course [20]. Some researchers sought to categorize an e-ready student as having a high rating in three readiness scales: technical competence, lifestyle aptitude, and learning preference [22]. Their study, however, did not generate scores that could be used as an initial benchmark or self-evaluation for students.

A relationship was found between e-readiness and factors such as students’ self-directed learning and self-motivation [20], as well as their use of technology [23]. In reality, many students have experienced traditional classroom environments for most of their primary and secondary education, but may not have sufficient experience with ELSs as they pursue higher education [17, 24]. Some researchers therefore proposed that students be assisted in more accurately gauging their readiness for online learning before they start a course [4]. Others believe that as students acquire new skills and knowledge during a course to enhance their learning, their level of e-readiness could still affect their interaction, resulting in positive or negative outcomes at course completion [20]. Nevertheless, readily available access to course materials in an ELS does not always guarantee successful outcomes, since social and learning readiness factors also contribute to the overall measure of a student’s capabilities in an online environment [4]. Therefore, preparing students with suitable techniques and enhancing their ability to integrate socially and academically during their online courses could reduce the risk of attrition [3, 25]. This preparation should be more than a pre-course familiarization exercise on using the ELS for first-time students [26]. The use of an e-readiness assessment instrument to identify students’ strengths and challenges could also be an important tool in improving success rates in higher education [24]. Continued evaluation of students’ readiness skills prior to, and during, a course of study is necessary in order to address any challenges that could prevent their successful course completion [4].

This chapter therefore focuses on identifying the characteristics of students who register for blended or online courses but may be unprepared for studying using an ELS. It also determines to what extent students’ level of preparedness influences their experience at the start of the course and during course delivery. The findings could be used to compare these characteristics among various groups of students in order to implement measures for improving their success.


2. Research framework

An e-readiness framework was first proposed in the late 1990s to evaluate the depth and breadth of the digital divide between developed and developing countries [27]. The framework used for this study was adapted from two previously tested models [22, 28]. This model tests the effect of e-readiness (a) at course orientation when the student first accesses the course and evaluates the quality of the ELS, initial course materials, and support services, (b) during course delivery when students regularly use and become satisfied with the ELS, and (c) at course completion where the students’ experiences influence their decision about registering for another blended or online course. Figure 1 illustrates the proposed framework comprising student e-readiness and the six contributing factors at the course orientation, delivery and completion phases.

Figure 1.

Conceptual framework for student e-readiness at course orientation, delivery and completion.

2.1. Student e-readiness

E-readiness evaluates students’ level of preparedness for an ELS through their technical competence, lifestyle aptitude, and learning preference [22]. These factors are described in more detail below:

  • Technical competence: This factor evaluates students’ tendency to comfortably use new technologies [18]. Students who are technically skilled, are able to easily access the Internet, maintain a dedicated network connection, and possess a level of competence when using essential technology tools required for the ELS are considered to be e-ready [22]. While technical competencies are necessary for successful learning experiences when using the ELS, students should still be evaluated to address any perceived disconnect between institutional expectations for technology use and students’ technology practices [24].

  • Lifestyle aptitude: This factor assesses students’ study habits and communication patterns when using the ELS [22]. This includes whether they are able to devote uninterrupted time to assignments and activities in the ELS, or post messages to other students or the instructor via the ELS. Students’ course participation is also based on their comfort level, and the ability to understand course material without the face-to-face interaction with an instructor [22].

  • Learning preferences: This factor detects students’ values and learning styles that are suitable for a blended or online modality [22]. Positive benefits from the ELS experience and successful learning outcomes are produced when students are highly motivated and self-confident [26], self-directed, and interactive with other students in the ELS [20]. Students are therefore more likely to be successful when they are committed to or possess a high level of interest in completing the course [21, 26]. In contrast, students who lack adequate technology skills and prefer face-to-face interaction may be accustomed to that style of learning from before tertiary level and could have challenges with working in the ELS [12]. Moreover, students who have challenges with reading, comprehension, essays and other higher education writing skills are more prone to underachieving in courses with an online component that provide mostly written text and instructions [12, 21].

The following sections explain the factors associated with the three phases of a course, namely course orientation, course delivery and course completion.

2.2. Course orientation

At this stage, students access the blended or online course for the first time. It is an important phase since their initial active involvement during these early weeks can influence their persistence in the course [29]. This phase therefore focuses on the interaction between students and the quality of the ELS, information provided in the ELS, and support services:

  • ELS quality: The quality of the ELS is measured by its stability, ease of use, and responsiveness to students who may not persist if they experience technical problems at this early stage [15, 22, 28]. ELS quality can therefore be hampered by inconsistent connectivity, system crashes, insufficient bandwidth, infrastructure or software maintenance, and accessibility issues [30].

  • Service quality: Students could become frustrated if assistance is not available when problems arise with the ELS, or if they do not know how to contact technical support [30]. Timely and effective assistance could include an online ‘help desk’, ‘frequently asked questions’ forum, and email support [31].

  • Information quality: Once students have accessed the ELS, they are exposed to course content and other information. Poorly designed course materials could also affect their enthusiasm that is necessary for early engagement [13, 27]. Instructional material should therefore be clear, up to date, written at a level that is easily understood, and formatted to cater to different learning styles [21, 26].

2.3. Course delivery

This active phase evaluates the students’ actual use of and satisfaction with the ELS during the course.

  • ELS use: Students’ perceptions of how regularly and consistently they access the ELS are evaluated [10]. Students depend on the ELS for their class materials and to submit assignments. ELS use is therefore determined by whether the ELS adds value to their learning experience [9, 31].

  • Student satisfaction: This factor evaluates students’ interaction and experiences as they use the ELS. Consistent interactivity, commitment, and increasing familiarity with the ELS during the course could influence student satisfaction, which subsequently increases ELS use resulting in academic success [7, 14, 21, 26]. However, student satisfaction could also decrease if information quality decreases, resulting from inadequate study materials, assignment instructions, or even out-of-date, confusing, or unimportant notices [6].

2.4. Course completion

As students evaluate their ELS experiences at course completion, these benefits could influence their overall satisfaction with the ELS and determine whether they will consider taking another blended or online course. Therefore, having a positive experience at course orientation, and enhancing these experiences during the course could increase their confidence and intention to register for more courses in that modality [22, 26]. Online students could benefit from becoming empowered with enhanced online skills, but could also be discouraged by dependence on the technology and feelings of isolation [22].


3. Materials and methods

3.1. Context and participants

The University of the West Indies is a multi-site institution affiliated with 17 countries in the region. There are three main campuses, with a fourth campus that caters specifically to online students who rely on the ELS. Online students from non-campus territories who do not own a computer, lack Internet access, or have inadequate facilities at home may visit their country site to access resources including microcomputer laboratories and libraries. With the introduction of ELSs, instructors on the main campuses have been trained to offer blended courses (classroom-based or computer laboratory-based with online components). Students in either cohort can complete a three-year undergraduate degree with full-time registration.

This study analysed students pursuing the same second-year undergraduate core management course at the university, delivered as a blended course through one of the main campuses and as an online course through the online campus. Both courses had the same instructor and four teaching assistants. All course content, course assignments, and proctored course exams were developed by the instructor using the same criteria and standards. The course content was identical and located on a Moodle-based ELS in both courses, comprising digital-learning materials including videos, PDF slides and laboratory exercises. The course assessments were uploaded to each ELS for grading. Once a week, tutorial assistance was provided in computer laboratories on the campus for students in the blended course, while live online sessions and supplementary videos were provided for the online students. No compulsory ELS training was provided for students in either cohort.

3.2. Survey instrument

A web-based survey instrument was used to capture responses from the students in each course. The items were reviewed by four experienced instructors and pilot-tested before posting in the student forum of each course. The instrument was posted in the ELS of both courses and was set to allow only one submission from each student. Responses were captured over two consecutive semesters. The instrument comprised an e-readiness section, an ELS section, and a section to capture demographics:

  • The 18-item e-readiness section captured responses on three factors: students’ technical competence, lifestyle aptitude, and learning preference when studying. The technical competence items captured responses on computer knowledge and technical literacy, such as whether students knew how to use software applications such as a word processor, had access to a printer, the Internet, a dedicated network connection, or knew how to contact the ELS’ help desk. The lifestyle aptitude items captured responses on whether students had a place that could be used uninterrupted for extended periods to study, routinely communicated with other students using electronic technologies such as e-mail, and had either persons or resources nearby who could assist with any technical problems. The learning preference items captured responses on students’ self-motivation, eagerness to use new software applications, preference for face-to-face or online courses, and preference for written or verbal feedback. All items used a five-point Likert-type scale, ranging from ‘1 = strongly-disagree’ to ‘5 = strongly-agree’.

  • The ELS section comprised 30 items that captured responses on six factors from students’ perceptions during the phases of the course. At course orientation, perceptions of the quality of the ELS, the information provided on the ELS, and the quality of support services were obtained. During course delivery, students’ perceptions of their use of and satisfaction with the ELS were obtained, and at course completion their positive or negative experiences of the ELS were obtained. All items used a five-point Likert-type scale, ranging from ‘1 = strongly-disagree’ to ‘5 = strongly-agree’.

  • Demographic data of the respondents was also captured, including gender, age range, the number of courses registered for the semester, and whether the student resided on a campus territory.

3.3. Procedure

Incomplete responses were removed before analysing the data using SPSS version 22.0. The means of each factor were calculated. The attributes of an e-ready online student were then highlighted using the items with the highest mean in the technical competence, lifestyle aptitude, and learning preference scales, while items with the lowest mean were identified as possible challenges. A mean of at least four of a maximum of five points was used as an indication of an acceptable level of e-readiness [22]. Therefore, a student was categorized as e-ready if four or more points in each of the three e-readiness scales were achieved, while the level of e-readiness for a cohort was calculated using the aggregate score of all three e-readiness scales.
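
As a concrete illustration of this scoring rule, the sketch below (written in Python rather than SPSS, with hypothetical column names and an assumed even split of the 18 items across the three scales) computes each student’s scale means and flags those who reach the four-point threshold on every scale; the cohort-level score is then the aggregate mean across the three scales.

```python
import pandas as pd

# Hypothetical grouping of the 18 e-readiness items into the three scales;
# the real item-to-scale mapping follows the survey instrument described above.
scales = {
    "TC": [f"tc{i}" for i in range(1, 7)],  # technical competence
    "LA": [f"la{i}" for i in range(1, 7)],  # lifestyle aptitude
    "LP": [f"lp{i}" for i in range(1, 7)],  # learning preference
}

def score_e_readiness(responses: pd.DataFrame) -> pd.DataFrame:
    """Compute per-student scale means and an e-ready flag.

    A student is classified as e-ready when the mean of every scale is at
    least 4 of a maximum of 5 points, mirroring the rule described above.
    """
    scored = pd.DataFrame(index=responses.index)
    for scale, items in scales.items():
        scored[scale] = responses[items].mean(axis=1)
    scored["e_ready"] = (scored[list(scales)] >= 4).all(axis=1)
    return scored

# Toy example with two students (Likert responses coded 1-5; values illustrative only).
toy = pd.DataFrame({col: [5, 3] for items in scales.values() for col in items})
scored = score_e_readiness(toy)
print(scored)                                    # per-student scale means and e-ready flag
print(scored[["TC", "LA", "LP"]].mean().mean())  # cohort-level aggregate e-readiness score
```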

Linear regression was used to test the effect of students’ level of e-readiness on ELS quality, information quality, and service quality during course orientation. Multiple regression was used to test the combined effect of e-readiness and the orientation factors on ELS use and satisfaction during course delivery, as well as on perceived benefits at course completion. An independent samples t-test was used to determine whether there were any significant differences among the demographic features of the students.
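
A minimal sketch of this analysis pipeline, written here with the statsmodels and SciPy libraries instead of SPSS (all data-frame and column names are hypothetical), could look as follows.

```python
import pandas as pd
import statsmodels.api as sm
from scipy import stats

def orientation_regressions(df: pd.DataFrame) -> dict:
    """Simple linear regression of each orientation-phase factor on e-readiness."""
    fits = {}
    for outcome in ["els_quality", "info_quality", "service_quality"]:
        X = sm.add_constant(df[["e_readiness"]])
        fits[outcome] = sm.OLS(df[outcome], X).fit()  # inspect .params and .rsquared
    return fits

def delivery_regression(df: pd.DataFrame, outcome: str = "els_use"):
    """Multiple regression of ELS use (or satisfaction) on e-readiness and
    the three orientation-phase quality factors."""
    predictors = ["e_readiness", "els_quality", "info_quality", "service_quality"]
    X = sm.add_constant(df[predictors])
    return sm.OLS(df[outcome], X).fit()  # inspect .fvalue and .rsquared

def demographic_t_test(df: pd.DataFrame, group_col: str, outcome: str):
    """Independent-samples t-test between two demographic groups."""
    groups = [g[outcome].dropna() for _, g in df.groupby(group_col)]
    return stats.ttest_ind(groups[0], groups[1], equal_var=False)  # Welch variant
```

The unequal-variance (Welch) form of the t-test is assumed here because the fractional degrees of freedom reported later in the results (df = 270.57) are typical of that correction.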


4. Results and discussion

A total of 963 completed responses were analysed, comprising 539 from the blended course and 424 from the online course. The demographic data for both cohorts revealed that the median age range of respondents was 29 years or younger, and the majority were female and in their second year of the programme. In keeping with the course registration patterns, students in the blended cohort registered for five courses, while those in the online cohort registered for three courses. Although over 90% of the students in the blended course lived in the country of the campus, about 56% of the students in the online course also lived in a country where a main campus was located. The demographic results of the students are presented in Table 1.

Table 1.

Demographic features of students in blended and online courses.

The internal consistency of the survey items was tested using Cronbach’s alpha, a reliability coefficient that indicates how well the items in a set are positively correlated with one another. The alpha coefficient for each scale was greater than 0.70, confirming that the survey instrument was reliable. Pearson chi-square tests were used to examine the strength of the association between the students’ level of e-readiness and demographic variables. There was no significant association between level of e-readiness and demographic variables for either cohort, such as gender for the blended cohort (χ²(1) = 2.519, p = 0.112) or the online cohort (χ²(1) = 0.017, p = 0.897); age range for the blended cohort (χ²(1) = 3.614, p = 0.306) or the online cohort (χ²(1) = 1.522, p = 0.677); and territory of residence for the blended cohort (χ²(1) = 2.117, p = 0.146) or the online cohort (χ²(1) = 0.817, p = 0.366). These results support other published research reporting no significant differences among demographics such as age, gender, or territory of residence [8].
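
For reference, the Cronbach’s alpha values reported above can be computed directly from the item variances; a short Python sketch of the standard formula (item column names hypothetical) is shown below.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for one scale (rows = respondents, columns = items).

    alpha = k / (k - 1) * (1 - sum of item variances / variance of total score)
    """
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Scales with alpha above 0.70 are conventionally treated as reliable,
# which is the threshold applied to each scale in this study.
```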

4.1. Characteristics for student e-readiness

The highest means of the three e-readiness factors suggest that whether in a blended or online course, an e-ready student routinely communicates with persons using electronic technologies, and is a self-motivated and independent learner. Other research also reported that highly motivated and self-confident students could produce better e-learning outcomes [21, 22].

Items with high scores further characterized the e-ready students in each cohort. An e-ready student in the blended course should also have access to the Internet for substantial periods of time, perhaps 45 minutes or so, at least 3 times a week. This seems reasonable as these students would also need to spend time watching lecture videos, downloading course materials or submitting assignments that supplement their face-to-face classes. Both cohorts also showed much higher means in the technical competence scale than in the lifestyle aptitude and learning preference scales. This is plausible since students at the university have access to a dedicated network connection, printers and various software applications, either personally or in the computer laboratories on campus.

In comparison, an e-ready student in the online course should also be eager to try new technology tools, and be able to receive emails sent to the online campus email address. This would be expected since online students are exposed to new technology tools in the ELS due to the various methods in which instructors may present their course and tutorial materials. Using the online campus’ email is the primary method of contacting instructors and administrative staff. These attributes align with other research that identifies online students as having a tendency towards adapting to new technologies [14, 18].

Previous research reported that although most students have been exposed to and possess basic technological skills (computer and Internet literacy), significant challenges remain in adapting their lifestyle and learning to interacting with an ELS [21]. This was shown by the lowest means in the items which identified students who were classified as not e-ready. Students in both cohorts were unable to access support services, and preferred immediate verbal feedback compared to written feedback. If students have technical issues with the ELS that are not resolved quickly, they may be hindered from progressing in the course. Furthermore, students in the blended course preferred to attend face-to-face classes on campus. It could be that they were more comfortable with and expected a traditional course structure, or had difficulty adapting to a new modality of learning. Students in the online course had challenges with finding persons and/or resources nearby to assist with any hardware or software problems. This could cause further frustration if these students have difficulty in accessing support services along with a lack of technical assistance from persons nearby. Apart from delays in keeping up with the course work or failure to submit assignments on time, these challenges could also impact their final grades. A summary of the highest and lowest means and standard deviations of items are presented in Table 2.

Item | Blended M (SD) | Online M (SD)
An e-ready learner…
(LP) Is a self-motivated and independent learner | 4.04 (0.86) | 4.20 (0.80)
(LA) Routinely communicates with others using electronic technology | 4.26 (0.84) | 4.12 (0.82)
(TC) Has access to the Internet for substantial periods of time, perhaps 45 minutes or so, at least 3 times a week | 4.25 (0.88) | n/a
(TC) Receives emails sent to my Open Campus email address even though it may not be my primary account | n/a | 4.30 (0.69)
(LP) Is eager to try new technology or software applications | n/a | 4.20 (0.82)
A learner who is not e-ready…
(TC) Does not know how to access the online help desk | 3.74 (1.03) | 4.06 (0.80)
(LP) Is not comfortable giving written feedback; prefers giving immediate verbal feedback | 3.17 (1.05) | 3.34 (1.12)
(LA) Does not need flexibility; prefers to come to campus to attend a traditional class | 3.36 (1.22) | n/a
(LA) Does not have persons or resources nearby to help with software/hardware problems | n/a | 3.55 (0.61)

Table 2.

Characteristics of an e-ready student based on the highest and lowest item means (M) and standard deviations (SD). Values marked n/a were not among the items reported for that cohort.

TC, technical competence; LA, lifestyle aptitude; LP, learning preference.

4.2. Level of student e-readiness

Table 3 presents the results and ratings of students’ e-readiness for the three factors. The overall score for the cohort in the blended course was 3.88 (77.5%), and 4.01 (80.2%) in the online course. However, when each student’s level of e-readiness was calculated, only 124 (33.0%) in the blended cohort and 146 (34.4%) in the online cohort were deemed to be e-ready. Students in the blended cohort were seemingly not prepared for engaging in an ELS, while those in the online cohort were minimally ready. This is in stark contrast with research on online instructors where the cohort was 91% prepared and 73% individually e-ready [32]. These instructors completed mandatory training for the ELS, and were not allowed to facilitate in the ELS if they were not deemed to be e-ready. For students, no mandatory training exists, and they were not counselled on expectations prior to registering for blended or online courses. Nevertheless, while training of students for the ELS is highly recommended, there are mixed outcomes resulting from students’ general lack of interest in completing ELS training [23].

Factor | Reliability (Blended) | Reliability (Online) | Blended M (SD) | Online M (SD)
TC | 0.96 | 0.89 | 4.25 (0.56) | 4.33 (0.49)
LA | 0.71 | 0.72 | 3.75 (0.68) | 3.85 (0.63)
LP | 0.72 | 0.74 | 3.69 (0.60) | 3.86 (0.61)
Overall | 0.90 | 0.89 | 3.88 (0.50) | 4.01 (0.46)

Table 3.

Cronbach’s alpha, means (M) and standard deviation (SD) of e-readiness factors.

TC, technical competence; LA, lifestyle aptitude; LP, learning preference.

All items were measured via a 5-point Likert scale: 1 = strongly-disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly-agree.

Further analysis was conducted on the 415 (77.0%) students in the blended cohort and 278 (65.5%) in the online cohort who were not deemed to be e-ready. There were 104 (25.1%) students in the blended cohort and 62 (22.3%) in the online cohort who were deficient in all three scales. This group of students could be at a distinct disadvantage for studying via an ELS since they seem to be more suited to a traditional classroom environment. Introducing online components in traditional courses may therefore hinder students who require face-to-face interaction in a traditional classroom environment. Also, while not all students in an online course are able to adapt to working in an ELS, some mandatory orientation or support during their studies could still improve their chances for successful completion of the course.

4.3. E-readiness during the course

Table 4 presents the means and standard deviations in each cohort for the factors at course orientation, course delivery and course completion.

Factor | Reliability (BL) | Reliability (OL) | BL M (SD) | OL M (SD)
Course orientation
ELS quality (EQ) | 0.76 | 0.85 | 3.72 (0.52) | 3.93 (0.56)
Information quality (IQ) | 0.82 | 0.89 | 3.79 (0.50) | 3.86 (0.56)
Service quality (SQ) | 0.76 | 0.76 | 3.35 (0.69) | 3.70 (0.65)
Course delivery
ELS use (EU) | 0.75 | 0.75 | 4.11 (0.76) | 3.86 (0.81)
Student satisfaction (SS) | 0.86 | 0.89 | 3.89 (0.67) | 3.98 (0.64)
Course outcome
Net benefits (NB) | 0.62 | 0.75 | 3.58 (0.45) | 3.65 (0.42)

Table 4.

Cronbach’s alpha, means and standard deviation (SD) for ELS quality (EQ), information quality (IQ), service quality (SQ), ELS use (EU), student satisfaction (SS), and net benefits (NB).

BL, blended course; OL, online course.

All items were measured via a 5-point Likert scale: 1 = strongly-disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly-agree.

4.4. E-readiness at course orientation

During course orientation, students in the blended course found that the ELS was easy to use (M = 4.11, SD = 0.69), provided information relevant to their learning (M = 4.12, SD = 0.59), and that the support specialists were helpful (M = 3.40, SD = 0.79). The students in the online cohort found that the ELS was always available (M = 4.11, SD = 0.76), provided information relevant to their learning (M = 4.11, SD = 0.58), and that the support specialists provided adequate assistance and explanations for their issues (M = 3.75, SD = 0.81).

The concerns from the blended cohort revealed that the ELS lacked attractive features (M = 3.16, SD = 0.85), while it was not responsive enough (M = 3.66, SD = 0.82) for the students in the online course. However, students in both courses complained that the ELS contained insufficient information (M = 3.64, SD = 0.77 for blended, and M = 3.71, SD = 0.82 for online), and that the support specialists were unavailable when they had a technical problem (M = 3.26, SD = 0.85 for blended, and M = 3.66, SD = 0.81 for online).

The most important focus at the start of a course should be reducing early frustrations with the ELS. Introducing more shifts or additional support staff during the first few weeks of the semester could alleviate these problems. Students need timely assistance during this phase to have passwords reset and other log-in issues resolved quickly, so that they can focus on interacting with peers and becoming acquainted with the ELS. According to Chyung [29], students’ active participation in the first few weeks of an online course makes course completion more likely.

Further evaluation of students’ level of e-readiness at course orientation confirmed that for students in the blended course, the level of e-readiness was indeed a predictor of ELS quality (β = 0.37, p < 0.001), information quality (β = 0.31, p < 0.001), and service quality (β = 0.22, p < 0.001). E-readiness accounted for 13.9% of the variance in ELS quality, 9.8% of the variance in information quality, and 4.6% of the variance in service quality. For students in the online cohort, the level of e-readiness was also a predictor of ELS quality (β = 0.41, p < 0.001), information quality (β = 0.43, p < 0.001), and service quality (β = 0.37, p < 0.001). E-readiness accounted for 17.1% of the variance in ELS quality, 18.2% of the variance in information quality, and 13.5% of the variance in service quality. Higher levels of e-readiness seem to be necessary for students in the online course, so that they could quickly and efficiently access the ELS to start gathering information, complete orientation activities, and start interacting with their peers.
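
In equation form, each orientation-phase analysis is a simple regression of one quality factor on the e-readiness score. If the reported β values are standardized coefficients (an assumption, though consistent with the figures above, e.g. 0.37² ≈ 0.14, in line with the 13.9% reported for ELS quality in the blended cohort), the variance explained is simply the squared coefficient:

```latex
\text{Quality}_i = \beta_0 + \beta_1\,\text{EReadiness}_i + \varepsilon_i,
\qquad R^2 = \beta_1^{2} \quad \text{(single standardized predictor)}
```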

4.5. E-readiness during course delivery

Students were asked about their use of and satisfaction with the ELS. The highest means for the items indicated that students in both courses frequently used the ELS (M = 4.21, SD = 0.86 for blended; M = 3.96, SD = 0.87 for online), and were pleased with the experience of using the ELS (M = 3.91, SD = 0.71 for blended; M = 3.99, SD = 0.72 for online). Once there are no major setbacks during course orientation, students’ satisfaction with the ELS increases, since they are able to easily access course materials, interact, and become comfortable with the ELS during this second phase. Details on the effect of students’ level of e-readiness on other items during course delivery are explained in the following sections.

  • ELS use: The model accounted for 16.0% of the variance in ELS use for the blended cohort (F(4, 534) = 25.75, p < 0.001, R2 = 0.16), where the level of e-readiness (17.6%), the quality of the ELS (18.2%), and the information posted in the ELS (15.6%) influenced ELS use. However, for students in the online cohort the model accounted for 12.0% of the variance, where only e-readiness (11.4%) and the ELS’ quality (18.2%) influenced ELS use (F(4, 419) = 14.66, p < 0.001, R2 = 0.12). For students in both cohorts, it appeared that the quality of the ELS was more important when interacting in the ELS than their level of e-readiness. One can appreciate that frequently accessing the ELS during this active phase is the primary focus for students, who expect the ELS to be available and easy to use in order to complete their course assignments. While students in the online course depend on the ELS for all components of their courses, those in the blended cohort may use the ELS at intervals since they still have a face-to-face component. Nevertheless, students would not want to be disappointed by quality concerns when uploading assignments to meet deadlines.

  • For students in the online course, information quality (p = 0.106), along with service quality in both courses (p = 0.981 for blended; p = 0.781 for online), had no influence on ELS use. One could also envision that by now students would have become familiar with the rhythm of the course, along with the provision of up-to-date course-related materials. This could reduce the number of times a student accesses the ELS, as it minimizes the need to repeatedly check the ELS for course updates.

  • Student satisfaction: The model accounted for 47.8% of the variance in student satisfaction for the blended cohort (F(4, 534) = 122.171, p < 0.001, R2 = 0.478), and 50.1% of the variance for the online cohort (F(4, 419) = 105.278, p < 0.001, R2 = 0.501). This may imply that the more prepared students are for the ELS, the more satisfied and confident they are when interacting with it.

Information quality (30.5% for blended; 33.4% for online) was the largest contributor to student satisfaction in both cohorts, followed by ELS quality (30.9% for blended; 21.5% for online) and service quality (17.6% for blended; 15.9% for online). Studies support the strong link between information quality and user satisfaction [33], since the very nature of online learning mandates that instructional materials be clear, easily understood and accessible by students [11]. The low percentages for support services may suggest that satisfied students rarely require help desk services when their technology issues are minimized.

The level of e-readiness contributed 20.6% to students’ satisfaction with the ELS only for the online cohort. This implies that high levels of e-readiness enhance their competence in knowing how to quickly navigate and interact constantly in the ELS.
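
For clarity, the course-delivery models discussed above are multiple regressions of the following form, which is why the F statistics carry four numerator degrees of freedom; student satisfaction is modelled analogously with satisfaction as the outcome:

```latex
\text{ELSUse}_i = \beta_0 + \beta_1\,\text{EReadiness}_i + \beta_2\,\text{ELSQuality}_i
               + \beta_3\,\text{InfoQuality}_i + \beta_4\,\text{ServiceQuality}_i + \varepsilon_i
```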

4.6. E-readiness at course completion

At the end of the course, students in both courses assessed the benefits obtained from their experiences with the ELS. The students in the blended cohort indicated that they saved time by using the ELS (M = 4.10, SD = 0.70). This provides them with the opportunity to upload course assignments and review course materials using the ELS without having to travel to a physical campus to deliver a printed assignment. Those in the online course expressed that the ELS contributed to their academic success (M = 4.23, SD = 0.66), which suggests that the ELS provides a meaningful avenue for them to further their academic studies while staying fully employed or remaining in their home country.

The challenge for the students in the blended course was feelings of isolation (M = 3.71, SD = 0.70), while those in the online course reported a lack of contact with others (M = 3.48, SD = 0.99). These results also support findings that students who take online courses, especially for the first time, tend to feel lonely and socially isolated, mainly because they are not familiar with the social interaction of the ELS environment [4].

At the end of the blended and online courses, the model accounted for 40.8% and 35.1% of the variance in net benefits (F(3, 535) = 122.986, p < 0.001, R2 = 0.408, and F(3, 420) = 75.688, p < 0.001, R2 = 0.351), respectively. Student satisfaction was the largest contributor in both courses (47.2% for the blended course and 39.9% for the online course). As the courses come to an end, there is less use of the ELS but overall satisfaction with the experience, which is consistent with research that found user satisfaction to be the most significant contributor to ELS success [14].
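
The course-completion model follows the same pattern with three predictors (hence F statistics with three numerator degrees of freedom); judging from the contributions reported here and below, these appear to be student satisfaction, ELS use, and e-readiness:

```latex
\text{NetBenefits}_i = \beta_0 + \beta_1\,\text{Satisfaction}_i
                     + \beta_2\,\text{ELSUse}_i + \beta_3\,\text{EReadiness}_i + \varepsilon_i
```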

For students in the blended cohort, using the ELS contributed 19.3% of the variance for the net benefits of taking the course, followed by 14.6% for their level of e-readiness. Having the convenience of submitting course assignments via the ELS could be more beneficial for students. Conversely, for the online cohort, their level of e-readiness contributed 19.5%, with 13.3% for using the ELS. This could imply that high levels of e-readiness are most beneficial to these students who depend on the ELS.

An independent samples t-test was conducted to examine whether there was a significant difference, based on net benefits at course completion, between students in either course who were e-ready and those who were not. The test revealed a statistically significant difference for students in the online course (t = −6.95, df = 270.57, p < 0.001). Those who had higher levels of e-readiness (M = 3.89, SD = 0.53) seemed to have benefited more by the end of the course than those who were not e-ready (M = 3.52, SD = 0.48).

The overall influence of students’ e-readiness showed a positive impact on ELS quality, information quality, and service quality for both cohorts during course orientation. This influence was greater for the students in the online course but steadily diminished during course delivery to the end of the course. In contrast, the influence steadily increased for the blended cohort, where student satisfaction was strongest during course delivery through to course completion. The positive benefits of having high levels of e-readiness saved time for the students in the blended cohort and contributed to the online cohort’s academic success. The negative outcomes were feelings of isolation for the blended cohort and a lack of contact with others for the online cohort. It may appear that the students in the blended course benefit from having higher levels of e-readiness to help them through their course. Figure 2 shows the impact on each factor at each stage, with the strongest influence of the two cohorts shown in bold.

Figure 2.

Summary results of influence of students’ readiness at course orientation, during course delivery, and at course completion. Only the results of the strongest impact of the two cohorts (blended; online) at each stage are shown. None indicates no influence for the factor.


5. Conclusions and future research

The framework seems to effectively evaluate levels of e-readiness throughout a blended or online course. The results suggest that students’ level of e-readiness contributes to their academic success via different pathways. While students often have a choice between a blended or online course, it should not be assumed that the provision of an ELS could replace the traditional classroom. Evaluating their suitability for different modalities should be paramount, so that they are given the best options, and advice, to make an informed decision. One can only wonder how the experiences of the 166 students who were deficient in all three scales could have been improved with pre-course screening and ELS training. These methods should be embedded in courses with an online component, where students can be assessed and, if necessary, trained in the technical and social skills required for the ELS. Failure to do so could result in missed opportunities for improving expectations in an online environment and unnecessary increases in attrition rates. Another recommendation for fixing the ‘ready’ in e-readiness would be to increase the visibility and accessibility of help desk services for students in both cohorts, possibly by embedding email contacts and online chat facilities on the ELS log-in screen and on every course page. While it may initially seem trivial to do so, it could improve retention rates among those who would otherwise become frustrated with the ELS and drop the course.

There were some limitations to the study. First, students in the region who still pay high ‘per minute’ fees, use ‘dial-up’ connections to access the Internet, do not own a computer, or lack reliable Internet access are less likely to spend extra time completing an online survey. Moreover, students who were not categorized as e-ready and were still unfamiliar with the ELS could have contributed to under-reporting of responses if they were not keen on completing a non-essential task such as an online survey. The findings could be used as a benchmark for comparisons of levels and characteristics of e-readiness in other blended and online courses. However, tracking students’ levels of e-readiness, whether categorized as e-ready or not, in subsequent courses through to graduation would be a most useful study for university administrations and instructors in an effort to understand and use key indicators to reduce attrition rates.

References

  1. Allen IE, Seaman J. Grade Change: Tracking Online Education in the United States. 2014 [cited 2016 April 2]. Available from: http://www.onlinelearningsurvey.com/reports/gradechange.pdf
  2. Vanslambrouck S, Zhu C, Lombaerts K, Philipsen B, Tondeur J. Students' motivation and subjective task value of participating in online and blended learning environments. The Internet and Higher Education. 2017;36:33-40
  3. van Rooij SW, Zirkle K. Balancing pedagogy, student readiness and accessibility: A case study in collaborative online course development. The Internet and Higher Education. 2016;28:1-7
  4. Yu Y, Richardson JC. An exploratory factor analysis and reliability analysis of the student online learning readiness (SOLR) instrument. Online Learning. 2015;19(5):22 [cited 2016 March 31]
  5. Lim DH, Morris ML, Kupritz VW. Online vs. blended learning: Differences in instructional outcomes and learner satisfaction. Journal of Asynchronous Learning Networks. 2007;11(2):27-42
  6. Lim HL. Community of inquiry in an online undergraduate information technology course. Journal of Information Technology Education. 2007;6:153-168
  7. Artino AR Jr. Online or face-to-face learning? Exploring the personal factors that predict students' choice of instructional format. The Internet and Higher Education. 2010;13(4):272-276
  8. Burns K, Duncan M, Sweeney DC, North JW. A longitudinal comparison of course delivery modes of an introductory information systems course and their impact on a subsequent information systems course. MERLOT Journal of Online Learning and Teaching. 2013;9(4):453-467
  9. Wagner SC, Garippo SJ, Lovaas P. A longitudinal comparison of online versus traditional instruction. MERLOT Journal of Online Learning and Teaching. 2011;7(1):68-73
  10. Cole MT, Shelley DJ, Swartz LB. Online instruction, e-learning, and student satisfaction: A three year study. The International Review of Research in Open and Distance Learning. 2014;15(6):111-131
  11. Mandernach BJ, Mason T, Forrest KD, Hackathorn J. Faculty views on the appropriateness of teaching undergraduate psychology courses online. Teaching of Psychology. 2012;39(3):203-208
  12. Parkes M, Stein S, Reading C. Student preparedness for university e-learning environments. The Internet and Higher Education. 2015;25(1):1-10
  13. Darab B, Montazer GA. An eclectic model for assessing e-learning readiness in the Iranian universities. Computers & Education. 2011;56(3):900-910
  14. Hashim H, Tasir Z. E-learning readiness: A literature review. In: 2014 International Conference on Teaching and Learning in Computing and Engineering (LaTiCE). Kuching: IEEE; 2014
  15. Bhuasiri W, Xaymoungkhou O, Zo H, Rho JJ, Ciganek AP. Critical success factors for e-learning in developing countries: A comparative analysis between ICT experts and faculty. Computers & Education. 2012;58(2):843-855
  16. World Economic Forum. The Global Information Technology Report 2015. Geneva: Johnson Cornell University; 2015
  17. Warrican SJ, Leacock CJ, Thompson BP, Alleyne ML. Predictors of student success in an online learning environment in the English-speaking Caribbean: Evidence from the University of the West Indies Open Campus. Open Praxis. 2014;6(4):331-346
  18. van Zyl JM, Els CJ, Blignaut AS. Development of ODL in a newly industrialised country according to face-to-face contact, ICT, and e-readiness. The International Review of Research in Open and Distance Learning. 2013;14(1):84-105
  19. Enightoola K, Fraser S, Brunton T. Exploring the community of inquiry model: Students' attitudes towards e-learning. Caribbean Teaching Scholar. 2014;4(2):81-98
  20. Kaymak ZD, Horzum MB. Relationship between online learning readiness and structure and interaction of online learning students. Kuram ve Uygulamada Egitim Bilimleri. 2013;13(3):1792-1797
  21. Mandernach BJ, Donnelli E, Dailey-Hebert A. Learner attribute research juxtaposed with online instructor experience: Predictors of success in the accelerated, online classroom. Journal of Educators Online. 2006;3(2):17
  22. Holsapple CW, Lee-Post A. Defining, assessing, and promoting e-learning success: An information systems perspective. Decision Sciences Journal of Innovative Education. 2006;4(1):67-85
  23. Emelyanova N, Voronina E. Introducing a learning management system at a Russian university: Students' and teachers' perceptions. The International Review of Research in Open and Distance Learning. 2014;15(1):273-289
  24. Watkins R, Leigh D, Triner D. Assessing readiness for e-learning. Performance Improvement Quarterly. 2004;17(4):66-79
  25. Tinto V. Reflections on student retention and persistence: Moving to a theory of institutional action on behalf of student development. Studies in Learning, Evaluation, Innovation and Development. 2005;2(3):89-97
  26. Baturay M, Yukselturk E. The role of online education preferences on student achievement. Turkish Online Journal of Distance Education. 2015;16(3):1-10
  27. Mutala SM, van Brakel P. An evaluation of e-readiness assessment tools with respect to information access: Towards an integrated information rich tool. International Journal of Information Management. 2006;26(3):212-223
  28. DeLone WH, McLean ER. The DeLone and McLean model of information systems success: A ten-year update. Journal of Management Information Systems. 2003;19(4):9-30
  29. Chyung S. Systematic and systemic approaches to reducing attrition rates in online higher education. The American Journal of Distance Education. 2001;15(3):36-49
  30. Al-Busaidi KA. An empirical investigation linking learners' adoption of blended learning to their intention of full e-learning. Behaviour & Information Technology. 2013;32(11):1168-1176
  31. Klobas JE, McGill TJ. The role of involvement in learning management system success. Journal of Computing in Higher Education. 2010;22(2):114-134
  32. Gay GHE. An assessment of online instructor e-learning readiness before, during, and after course delivery. Journal of Computing in Higher Education. 2016:1-22
  33. Petter S, McLean ER. A meta-analytic assessment of the DeLone and McLean IS success model: An examination of IS success at the individual level. Information & Management. 2009;46(3):159-166
