Open access peer-reviewed chapter

Extending Health Information System Evaluation with an Importance‐Performance Map Analysis

Written By

Mohd Idzwan Mohd Salleh, Rosni Abdullah and Nasriah Zakaria

Submitted: 12 October 2016 Reviewed: 24 February 2017 Published: 23 August 2017

DOI: 10.5772/68122

From the Edited Volume

Advances in Health Management

Edited by Ubaldo Comite


Abstract

Evaluation of a health information system (HIS) is necessary for determining its effective use and for enhancing the productivity of medical practitioners. However, current system evaluation toolkits do not recommend specific areas for further improvement. The objective of this chapter was to identify the constructs and attributes that are the most suitable candidates for managerial intervention by applying partial least squares structural equation modeling. A quantitative survey was adapted from past studies, together with newly created items, representing system quality, records quality, service quality, and knowledge quality as the predictors and effective use and user performance as the outcomes. When the findings were extended with an importance‐performance map analysis, two system quality attributes (workflows fit and work styles fit) and all knowledge quality attributes exhibited a higher importance rank for managerial action. The chapter also provides valuable recommendations for policy and decision‐makers at the managerial level on how to apply the proposed system evaluation method to produce more efficient strategic plans for further system upgrades and new implementations at health facilities.

Keywords

  • summative evaluation
  • health information system
  • effectiveness
  • partial least squares structural equation modeling
  • importance‐performance map analysis

1. Introduction

The widespread implementation and adoption of health information systems (HISs) around the world are believed to improve access to and use of health data, ensuring high quality of care and health system efficiency and fostering clinical research [1, 2]. Accelerating HIS implementation will further enhance electronic sharing of health information across different clinical settings [3], which eventually generates quality benefits and minimizes medical costs by avoiding unnecessary clinical trials, examinations, and treatments [4]. Therefore, the management and presentation of HISs are vital to accelerating patient care and its continuity across health institutions [5]. The success of system implementation relies upon high‐quality information outputs from HISs, which various health practitioners require to make timely and accurate clinical decisions [6]. Besides enabling care continuity, HIS is regarded as the richest source of clinical evidence to support continuous communication among individual clinicians and surgical teams [7]. The use of HIS is not only capable of reducing human errors [8] but also contributes to increased adherence to clinical guidelines and deterrence of medical errors [9, 10], thereby delivering greater patient safety and better medication management [11].

In Malaysia, the costs of customized HISs are fully funded by the government in an effort to maintain a high standard of patient care [12, 13]. All new public hospitals should be equipped with HISs designed by multiple vendors contracted by the government. Although investment in IS can improve health services, it also incurs further costs for maintenance, hardware replacement, end‐user training, and system upgrades [14, 15]. Increasing medical costs [16] and substantial budget cuts among local hospitals have created demand for a comprehensive evaluation of HIS that identifies strengths and weaknesses for further improvement. In reality, the effectiveness of HIS adoption among implementing government hospitals had never been assessed since its first kick‐off at Selayang Hospital in 1999. Hospitals with HISs repeatedly distribute user satisfaction surveys without concentrating on significant success factors and the impact on the performance of health personnel. They conduct these surveys to satisfy auditing needs, but the results remain insufficient for recommending which attributes are critical for improving system use and user productivity. As a consequence, the government hospitals are still incapable of choosing the right HIS and vendors, or even of assessing system performance after implementation [17]. A systematic IS evaluation will not only promote efficient use and medical cost savings but also help cope with the unresolved issues of clinicians’ heavy workloads and the shortage of specialists in this multi‐racial country [14, 15].

Identifying the needs of the health workforce and acknowledging the characteristics of HIS are essential to practitioner productivity and must be emphasized in any evaluation study [18, 19]. For that reason, recognizing the main attributes of HIS can improve health practitioners’ performance in their daily use. Strategies to upgrade an HIS cannot proceed in the absence of in‐depth knowledge about which HIS characteristics are the most significant predictors of user productivity. Otherwise, money will be wasted on system upgrades made without a careful understanding of their potential impact on, or benefit to, user performance, thereby introducing dissatisfaction and risks of system failure [18].

Unfortunately, there is little prior HIS research measuring the influence of IS attributes on the satisfaction and productivity of medical practitioners [20, 21]. Moreover, previous evaluation work did not fully assess the importance and performance of multiple HIS attributes, especially by ranking the attributes of high importance for managerial attention. Only two recent studies have attempted to prioritize different HIS quality measures, and both used small samples acquired in one public hospital [18, 19]. Furthermore, much of the current scholarly literature still examines HIS use and user satisfaction while ignoring the core success drivers that predict user and organizational impacts. By contrast, there are many empirical studies on HIS evaluation concerning the effects of system quality, information quality, service quality, usage, user satisfaction, and net benefits in developed and developing countries [22, 23], but none of them addresses the critical quality or success factors that require a managerial response. Most studies only present significant results without recommending specific measures or indicators that would guide hospitals in prioritizing the most important indicators for improving effective use and health personnel productivity.


2. Conceptual foundation

The DeLone and McLean IS success models (DMISMs) are the most prominent theoretical frameworks adopted by IS researchers over the past two decades for IS evaluation, including in the health‐care domain [22, 23]. The models embrace system quality, information quality, service quality, actual use, and user satisfaction to predict individual impact, organizational impact, and net benefits [24, 25]. In our empirical study, the traditional DMISMs are extended to incorporate knowledge quality and effective use in predicting individual performance based on the perceptions of medical practitioners as HIS users.

2.1. Effective use and user performance

Effective use and user performance are the two outcome constructs measured in our evaluation study. Whereas actual system use denotes the extent or frequency of HIS usage [26], effective use refers to the outcomes of HIS usage that allow medical practitioners to complete their clinical tasks easily, without misdiagnosis or inaccurate medication. Because the use of HIS is mandatory, actual use remains unreliable for assessing IS success [27, 28].

Previous research indicated that user satisfaction had a strong relationship with system quality, information quality, and individual impact [29, 30]. This construct is largely composed of system quality and individual impact measures [31] and has been shown to add little explanatory power [32]. Consequently, user satisfaction is omitted as an outcome construct in this study.

On the other hand, individual impact is the outcome generated by IS workers from their applied IT knowledge, skills, and experiences [33]. Likewise, user performance in this study refers to the degree to which practitioners gain benefits from the effective use of HIS, considering patient care and safety, work productivity, and performance scores.

2.2. Predictors of health information system evaluation

System quality comprises the attributes or characteristics of HIS, including functionality, features, interface design, and performance, that facilitate ease of clinical task completion [34]. With regard to past empirical studies on the most important predictors of HIS quality [19, 22, 23, 35], we limit the scope of this predictor to four measures, namely adequate IT infrastructure, system interoperability, perceived security concerns, and system compatibility.

In the conventional DMISMs, information quality describes how usable, meaningful, and understandable the content and format of IS outputs are [24, 36]. Clinicians can deliver the right care depending on the quality of information produced by the HIS [37]. For that reason, successful adoption of HIS is determined by the quality of the records it produces [7]. We specify the generic term information quality as records quality, based on the timely access, consistency, standardized format, accuracy, duplication prevention, and completeness of patient notes, reports, prescriptions, images, laboratory test results, and discharge summaries.

In general, service quality concerns the type of IS support delivered by the responsible IS providers or personnel [38]. In this study, we measure the service quality construct with quick assistance, problem‐solving capability, follow‐up service, and adequate training.

Advancing interoperable HISs will not only create, store, and manage data and information but also knowledge [12, 35]. The aim of HIS adoption in most hospitals is to acquire, classify, store, access, and simplify the use of knowledge from an HIS repository of patient health information for supporting clinical decision‐making, actions, and problem solving [39, 40]. Besides, HISs can be utilized to promote knowledge management activities in a health organization through medical research and education [41]. In essence, medical knowledge is classified into two types: tacit and explicit. Tacit knowledge is gathered through the professional practices and experiences of medical practitioners, while explicit knowledge is generally embedded in and presented in the forms of electronic health records (EHRs), electronic medical records (EMRs), clinicians’ workflows, clinical guidelines, and protocols [42, 43]. HIS also integrates the clinical decision support system (CDSS) and computerized provider order entry (CPOE) as knowledge tools that hold medical knowledge [39, 42–44]. It should be noted that the wide adoption of HIS worldwide is due not only to EHRs but also to their integration with CDSS and CPOE to raise the quality of patient care [45]. Hence, the quality of knowledge must be included in any HIS evaluation [12, 41]. As a new predictor measured in this study, knowledge quality is defined as the degree to which medical practitioners believe that using HIS will increase their medical knowledge and competencies [41], which they then practice to deliver the best patient care.


3. Empirical example

Our study bridges this knowledge gap with current empirical evidence from the local health system, determining the importance and performance of several effectiveness factors for immediate managerial action, with the effective use of HISs and medical practitioners’ performance as the measured outcomes. The research design employed a quantitative method, distributing a survey questionnaire to four groups of health personnel in three different government hospitals running different HISs. By utilizing the importance‐performance map analysis (IPMA) feature in partial least squares structural equation modeling (PLS‐SEM), the expected outcomes establish the most critical quality attributes for improving effective use and user performance.

Ethics approval was obtained from the Medical Research and Ethics Committee Malaysia, as the study collected human subject responses from various clinical professionals. Subsequently, the data were gathered from three hospitals situated in different states and using different HIS packages. These hospitals each had more than 1000 health personnel and more than 500 beds for patients. Specifically, Kedah Hospital used the iSOFT system, Pahang Hospital used the F1S1C1EN® system, and Johor Hospital used the Cerner system. Connected via the centralized and secured 1Gov*Net network, all HISs are integrated with various clinical modules, including patient management, laboratory, radiology, pharmacy, picture archiving and communication, nursing, and operating theater management. The implemented systems are currently in the operation and maintenance phase, and the contract is renewed every 3 years. The government did not standardize on a single HIS package across its administered hospitals in order to avoid a monopoly by a sole vendor, which would project a negative image to the public.

Adapted from past surveys [36, 38, 41, 46–50] with 19 new items, anchored on seven‐point Likert scales from 1 (strongly disagree) to 7 (strongly agree), the questionnaire draft was shown to be valid and reliable after pretesting with key HIS experts and pilot testing among 100 end users using exploratory factor analysis in the Statistical Package for the Social Sciences (SPSS) software. The field survey data contained 888 samples from specialists, medical officers, assistant medical officers, and nursing staff, collected by means of a convenience sampling technique. Overall, 353 respondents participated from Kedah Hospital, 213 from Pahang Hospital, and 322 from Johor Hospital. Specifically, 71 were specialists, 96 were assistant medical officers, 328 were medical officers, and 393 were nurses. More than 70% of respondents were female, because recruitment across the clinical professions was imbalanced and nurses were predominantly female, while 64% of the total sample was aged between 25 and 35 years. About 53% of the respondents (assistant medical officers and nurses) held Diploma qualifications in medicine and nursing, respectively, whereas the remaining 47% (medical officers and specialists) held Bachelor’s, Master’s, or PhD degrees in medicine.

The collected data were subjected to confirmatory factor analysis using the SmartPLS software. In this study, the system quality characteristics, namely adequate IT infrastructure, system interoperability, perceived security concerns, and system compatibility, are modeled as formative measures. The formative model exhibited no collinearity issues for any measuring indicator, and all indicator weights were significant at the 1% level. Then, in the reflective model, all question items satisfied the required outer loadings, composite reliability (CR), and average variance extracted (AVE) scores above the suggested thresholds [51, 52], confirming convergent validity. However, one attribute of knowledge quality (knowqual_4) was deleted because its factor loading fell below 0.70.

Discriminant validity was then assessed using the Fornell and Larcker [53] criterion and the cross‐loading method. Every construct’s average variance extracted exceeds 0.50, satisfying the required criterion [53, 54], and every indicator’s loading on its own construct is higher than its cross‐loadings on the other constructs [55].
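
For readers who want to reproduce these reliability checks, the following minimal sketch (not the authors' code; the loadings and the 0.62 correlation are hypothetical values for illustration) computes composite reliability, AVE, and a Fornell‐Larcker comparison from standardized outer loadings in Python:

import numpy as np

def cr_and_ave(loadings):
    """Composite reliability (CR) and average variance extracted (AVE)
    from the standardized outer loadings of a reflective construct."""
    loadings = np.asarray(loadings, dtype=float)
    error_var = 1.0 - loadings ** 2                     # indicator error variances
    cr = loadings.sum() ** 2 / (loadings.sum() ** 2 + error_var.sum())
    ave = np.mean(loadings ** 2)
    return cr, ave

# Hypothetical loadings for one reflective construct (illustration only).
cr, ave = cr_and_ave([0.78, 0.81, 0.74, 0.83, 0.76, 0.79])
print(f"CR = {cr:.3f}, AVE = {ave:.3f}")                # expect CR > 0.70 and AVE > 0.50

# Fornell-Larcker criterion: the square root of a construct's AVE should exceed
# its correlation with every other construct (0.62 is a hypothetical correlation).
print("Discriminant validity holds:", np.sqrt(ave) > 0.62)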

The next step was to evaluate the path model. After running a complete bootstrapping test with 5000 subsamples and the "no sign changes" option, the PLS results in Table 1 demonstrate that the observed path coefficients were statistically significant at either the 0.05 or the 0.01 level and had positive effects on the outcome (target) constructs, except for the service quality-effective use relationship. User performance was the outcome with the largest share of variance explained by the quality predictors and effective use. More importantly, knowledge quality, as a new predictor, became the strongest predictor of user performance at the 1% level of significance. This construct also had a large effect size relative to the other predictors, which justifies measuring knowledge quality in future system evaluation studies.

Predictor            Effective use (R² = 0.260)    User performance (R² = 0.640)
System quality       0.320 (6.025***)              0.122 (3.127***)
Records quality      0.103 (2.115**)               0.137 (3.515***)
Service quality      0.047 (1.244)                 0.139 (4.632***)
Knowledge quality    0.121 (2.520**)               0.489 (12.464***)
Effective use        -                             0.104 (4.170***)

Table 1.

Path coefficients (t‐values in parentheses).

Significance level: ***p < 0.01; **p < 0.05.
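
A brief illustration of the bootstrapping step may help readers replicating it outside SmartPLS: the sketch below uses simulated data (it is not the SmartPLS routine) to show how resampling cases with replacement yields a bootstrap standard error and t‐value for a single path coefficient.

import numpy as np

rng = np.random.default_rng(42)
n = 888                                         # sample size used in the study
x = rng.normal(size=n)                          # stand-in latent predictor scores
y = 0.3 * x + rng.normal(scale=0.9, size=n)     # stand-in latent outcome scores

def path_coefficient(x, y):
    """Standardized simple-regression coefficient (here, the correlation)."""
    return np.corrcoef(x, y)[0, 1]

original = path_coefficient(x, y)
boot = np.empty(5000)                           # 5000 bootstrap subsamples
for b in range(5000):
    idx = rng.integers(0, n, size=n)            # resample cases with replacement
    boot[b] = path_coefficient(x[idx], y[idx])

t_value = original / boot.std(ddof=1)           # coefficient / bootstrap standard error
print(f"path = {original:.3f}, t = {t_value:.2f}")  # roughly |t| > 2.58 implies p < 0.01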


The path coefficient scores for each latent construct were then subjected to further assessment with the importance‐performance map analysis. IPMA in PLS‐SEM adopts the traditional IPA method to rank both the critical constructs and their measured indicators by importance and performance for managerial intervention [51, 56]. Moreover, PLS‐SEM allows researchers to model higher‐order constructs and their individual indicators simultaneously when calculating attribute importance scores, which helps to reduce the collinearity issues that would arise between attribute items in a simple regression analysis [57]. By incorporating IPMA, the study results contribute valuable practical implications for decision‐makers and administrators. IPMA extends the PLS‐SEM path coefficient results by contrasting each construct’s total effect on a target construct, representing its importance, with its average latent variable score, representing its performance.

In a graphical representation, IPMA plots the (unstandardized) total effects on the horizontal axis against the latent construct scores, rescaled to a range of 0-100, on the vertical axis; constructs falling toward the lower‐right area of the diagram deserve the most attention [58]. The key objective of this analysis is to improve the performance of constructs with greater importance (a strong total effect) but lower performance (a small construct score) in predicting one or more target constructs [51, 55]. Hence, the subsequent analysis applies IPMA to highlight which latent constructs and which of their manifest attributes require remedial attention from decision‐makers and hospital administrators.
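
To make these mechanics concrete, the minimal sketch below pairs importance and performance values at the construct level; the total effects and latent means are hypothetical placeholders, not the study’s estimates.

import numpy as np

def rescale_0_100(scores, scale_min=1, scale_max=7):
    """Rescale latent variable scores measured on a 1-7 Likert metric to 0-100."""
    scores = np.asarray(scores, dtype=float)
    return (scores - scale_min) / (scale_max - scale_min) * 100

# Hypothetical unstandardized total effects on the target construct (importance)
# and hypothetical mean latent scores on the 1-7 metric (performance).
constructs = ["System quality", "Records quality", "Service quality", "Knowledge quality"]
importance = np.array([0.33, 0.15, 0.09, 0.19])
performance = rescale_0_100(np.array([5.8, 5.9, 5.6, 5.4]))

# Priority goes to constructs with high importance but comparatively low performance.
priority = sorted(zip(constructs, importance, performance),
                  key=lambda c: (-c[1], c[2]))
for name, imp, perf in priority:
    print(f"{name:18s} importance = {imp:.2f}  performance = {perf:5.1f}")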

The IPMA diagram in Figure 1 shows that system quality has the strongest total effect on the outcome construct. Consequently, knowledge quality, records quality, and service quality should be improved to increase the effective use of HISs.

Figure 1.

IPMA for effective use at construct level.

When user performance is selected as the target construct, as displayed in Figure 2, knowledge quality has the highest importance among the predictors. System quality, records quality, service quality, and effective use deserve critical managerial attention to enhance the performance of medical practitioners. No construct underperforms with a score below 50%.

Figure 2.

IPMA for user performance at construct level.

As this construct‐level analysis does not reveal which specific attributes require further improvement, a subsequent analysis was conducted on the individual measurement items of each latent construct. In Figure 3, syscom_1 (workflows fit) and syscom_2 (work styles fit) should be maintained for the continued effective use of HISs. By contrast, the other quality attributes that fall into the low‐performance area must be stressed for managerial action. For example, the attribute secc_4 (secure and safe) has average importance for effective use while offering room for improving its performance. IT departments can focus on offering hands‐on training to educate HIS users about securing their access when using the systems [18]. In addition, a user access control policy should be enforced and applied across the government hospitals with HISs to prevent unauthorized access to, and misuse of, patient health information by doctors not responsible for the patient. Unfortunately, the secc_1 (unauthorized access) attribute was removed from the analysis because of a negative outer weight in the measurement model assessment, as suggested by Ringle and Sarstedt [58].

Figure 3.

IPMA for effective use at indicator level.

Next, in Figure 4, while knowledge quality should be retained to sustain greater user performance, all effective use, service quality, system quality, and records quality attributes demand urgent intervention. For instance, the indicator effuse_2 (misdiagnosis prevention) should receive particular attention by promoting HIS adoption across the country so that misdiagnosis is averted through timely and full access to every patient’s comprehensive EHR. As a result, the importance of effective use increases, which in turn improves the user performance outcome. Interestingly, no attribute falls into the bottom zone, signifying that all measurement items for every predictor achieved a performance score above 60% in the diagram.

Figure 4.

IPMA for user performance at indicator level.

More specifically, Table 2 lists the importance and performance scores for every predictor attribute together with its discrepancy, calculated by subtracting the importance value from the performance value [59, 60]; a small worked example of this calculation follows Table 2. In doing so, the performance score (expressed as a percentage) of each attribute is first rescaled to a three‐decimal value before computation. The results confirmed that the attribute secc_3 (robust security control), which had the largest discrepancy for effective use, warranted immediate managerial intervention, particularly as respondents expressed concerns over the lack of security control in HISs. In the preceding IPMA diagram, this attribute had the lowest total effect (importance) score. Again, a proper security policy must be in place to limit the access level by specific clinical roles. Regular monitoring and reporting of access activities can be further improved with an audit trail feature. On‐site training can emphasize instructing users to change passwords frequently, using a combination of numbers, letters, and symbols, and to secure their accounts by routinely logging off after using the systems. By contrast, the attribute adin_2 (adequate computers) had the highest discrepancy for the user performance outcome, indicating a demand for more computers on which to use HISs. To cope with the tight budgets most hospitals face and the increasing number of doctors, hospitals may consider providing grants to purchase high‐performance desktop and laptop computers at low cost from their contracted system vendors.

Target construct: effective use
Attribute (question item) Performance Importance Discrepancy
Faster network (adin_1) 0.074 0.024 0.050
Adequate computers (adin_2) 0.072 0.014 0.058
Learning of knowledge (knowqual_1) 0.066 0.019 0.047
Researching of knowledge (knowqual_2) 0.067 0.018 0.049
Applying of knowledge (knowqual_3) 0.067 0.021 0.046
Decision‐making capability (knowqual_5) 0.067 0.022 0.045
Problem‐solving capability (knowqual_6) 0.066 0.026 0.040
Complete medical source (knowqual_7) 0.067 0.019 0.048
Timely access (recqual_1) 0.070 0.019 0.051
Records consistency (recqual_2) 0.073 0.013 0.060
Standardized format (recqual_3) 0.074 0.020 0.054
Records accuracy (recqual_4) 0.064 0.018 0.046
Repeated tests prevention (recqual_5) 0.063 0.016 0.047
Records completeness (recqual_6) 0.073 0.020 0.053
Data protection (secc_2) 0.065 0.029 0.036
Robust security control (secc_3) 0.066 0.005 0.061
Secure and safe (secc_4) 0.066 0.036 0.030
Quick assistance (servqual_1) 0.067 0.010 0.057
Problem solver (servqual_2) 0.069 0.011 0.058
Follow‐up service (servqual_3) 0.066 0.012 0.054
Adequate training (servqual_4) 0.068 0.011 0.057
Workflows fit (syscom_1) 0.068 0.066 0.002
Work styles fit (syscom_2) 0.068 0.063 0.005
Clinical practices fit (syscom_3) 0.067 0.028 0.039
Patient needs fit (syscom_4) 0.069 0.047 0.022
Interoperable systems (sysi_1) 0.070 0.012 0.058
Treatment cost reduction (sysi_2) 0.070 0.032 0.038
Coordinated care (sysi_3) 0.075 0.024 0.051
Target construct: user performance
Attribute (question item) Performance Importance Discrepancy
Faster network (adin_1) 0.074 0.012 0.062
Adequate computers (adin_2) 0.072 0.007 0.065
Ease of task completion (effuse_1) 0.074 0.032 0.042
Misdiagnosis prevention (effuse_2) 0.068 0.040 0.028
Right medication (effuse_3) 0.064 0.038 0.026
Learning of knowledge (knowqual_1) 0.066 0.084 ‐0.018
Researching of knowledge (knowqual_2) 0.067 0.080 ‐0.013
Applying of knowledge (knowqual_3) 0.067 0.091 ‐0.024
Decision‐making capability (knowqual_5) 0.067 0.094 ‐0.027
Problem‐solving capability (knowqual_6) 0.066 0.112 ‐0.046
Complete medical source (knowqual_7) 0.067 0.083 ‐0.016
Timely access (recqual_1) 0.070 0.029 0.041
Records consistency (recqual_2) 0.073 0.020 0.053
Standardized format (recqual_3) 0.074 0.030 0.044
Records accuracy (recqual_4) 0.064 0.026 0.038
Repeated tests prevention (recqual_5) 0.063 0.024 0.039
Records completeness (recqual_6) 0.073 0.031 0.042
Data protection (secc_2) 0.065 0.015 0.050
Robust security control (secc_3) 0.066 0.003 0.063
Secure and safe (secc_4) 0.066 0.018 0.048
Quick assistance (servqual_1) 0.067 0.031 0.036
Problem solver (servqual_2) 0.069 0.035 0.034
Follow‐up service (servqual_3) 0.066 0.039 0.027
Adequate training (servqual_4) 0.068 0.035 0.033
Workflows fit (syscom_1) 0.068 0.034 0.034
Work styles fit (syscom_2) 0.068 0.032 0.036
Clinical practices fit (syscom_3) 0.067 0.014 0.053
Patient needs fit (syscom_4) 0.069 0.024 0.045
Interoperable systems (sysi_1) 0.070 0.006 0.064
Treatment cost reduction (sysi_2) 0.070 0.016 0.054
Coordinated care (sysi_3) 0.075 0.012 0.063

Table 2.

Performance, importance, and discrepancy scores for individual attributes.
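
As a worked illustration of the discrepancy column, the short sketch below recomputes the gap (performance minus importance) for a few attributes, using the values reported in Table 2 for the effective‐use target; the largest gaps are the ones the chapter flags for managerial intervention.

# Values taken from Table 2 (effective-use target): attribute -> (performance, importance).
attributes = {
    "Robust security control (secc_3)": (0.066, 0.005),
    "Adequate computers (adin_2)":      (0.072, 0.014),
    "Workflows fit (syscom_1)":         (0.068, 0.066),
    "Work styles fit (syscom_2)":       (0.068, 0.063),
}

# Discrepancy = performance minus importance, rounded to three decimal places.
discrepancy = {name: round(perf - imp, 3) for name, (perf, imp) in attributes.items()}
for name, gap in sorted(discrepancy.items(), key=lambda kv: -kv[1]):
    print(f"{name:35s} discrepancy = {gap:.3f}")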


4. Recommendations for improving system effectiveness at minimal cost

4.1. HIS scorecard

Unfortunately, the Ministry and hospitals in Malaysia have not performed strategic planning for the design, implementation, and upgrade of the HISs. In fact, the future direction of the Ministry is to develop its own HIS product for extension to other hospitals. At present, they are focused only on delivering maintenance services and operational support to existing HISs to ensure uninterrupted hospital services. These services will continue until a new in‐house system is fully designed and deployed in all IT hospitals. So far, the selected vendor has initiated the plan for HIS development and implementation, while the Ministry has been the sole licensed user of the product.

To address these gaps through proper strategic planning toward the objectives of effective use and enhanced user performance, the balanced scorecard (BSC) framework is extended into an HIS scorecard (Figure 5) on the basis of the empirical study results; it is highly recommended for the Ministry and IT hospitals. The scorecard is designed by extracting the key results from the IPMA, based on the importance scores of the estimated constructs at the indicator level of the analysis. With this scorecard, the respective parties can focus on developing concrete goals and strategies from validated evidence‐based findings for the planning and evaluation of the system implementation, rather than initiating a new BSC template. More importantly, it can serve two central purposes:

Figure 5.

HIS scorecard.

  • As a metric for policymakers at the Ministry level that facilitates effective decisions concerning expenditure on HISs in new hospitals or upgrades of current ones. In this regard, the team implementing HIS must define specific, measurable, achievable, realistic, and time‐framed (SMART) actions in order to achieve high effectiveness against the predefined system quality, records quality, service quality, knowledge quality, and effective use indicators. After all actions for each strategy have been undertaken, the hospital management, with the assistance of the implementing team, will present the completed scorecard to the Board of Directors of the Ministry during the annual strategic plan meeting. Thus, the HIS scorecard can be a significant measurable indicator to guide the strategic direction and objectives of national health technology investments now and in the future.

  • As a performance measurement for auditors assessing whether the implemented HIS in a single IT hospital is effective. Specifically, it serves as a checklist that determines whether the previous actionable plans were well executed. The next section further explains how to execute a simple evaluation survey using a concise guideline.

Consequently, transforming the study findings into a measurable scorecard will empower hospital administrators and decision‐makers, facilitating a thorough understanding of how the performance of HISs positively influences strategic decision‐making through systematic monitoring and increased effective use. In turn, it may contribute to sound governance through increased quality of patient care and facilitate the efficient and prudent use of government budgets.

4.2. Concise HIS effectiveness guideline

To acquire the inputs for every indicator in the scorecard, we have developed a simple way to evaluate the effectiveness of an HIS by proposing the “Easy Guide to Efficiently Evaluate Your HIS” in the form of a flowchart diagram (see Figure 6) for practitioners. The steps are described as follows:

Figure 6.

Easy guide to efficiently evaluate your HIS.

  • Collect the surveys using a validated questionnaire (see Appendix A). The evaluation can be performed either by manual distribution of paper forms during medical education programs held by clinical departments, which tends to yield better responses, or electronically. Before that, a memo written and signed by the hospital director should be circulated to all departments explaining the purpose, significance, and implications of the survey.

  • When using paper‐based surveys, the acquired responses must be entered into SPSS software after data collection is completed.

  • Import the Excel file of the dataset into SPSS software and check for outliers, unengaged responses, and normality. Fix those problems and save the cleaned dataset in CSV format (a data‐screening sketch follows this list).

  • Import the converted dataset into SmartPLS software and start the algorithm and bootstrap routine procedures.

  • Examine the final results report for the significance of the path coefficients. If more than 50% of the estimated hypotheses are negative and non‐significant, execute IPMA for the target constructs. If all effects are significant and positive, perform IPMA as well, observe the endogenous constructs with high performance, and improve the constructs’ scores through their indicators. For a non‐significant relationship such as Service Quality ‐> Effective Use, for instance, the HIS implementation team must continually improve its quick assistance to users facing problems with the system or computers, especially through online or telephone helpdesk support. If an indicator’s total effect score is similar to those of other indicators within the same construct, refer to the lowest performance score among those indicators and act on it immediately.
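
As a companion to the data‐screening step above, the following minimal sketch flags out‐of‐range values and unengaged (straight‐lining) respondents before exporting a cleaned CSV for SmartPLS. The file names are hypothetical, the item prefixes follow the questionnaire in Appendix A, and the script is our own illustration rather than the authors' procedure.

import pandas as pd

# Hypothetical file of Likert responses (1-7), one column per question item.
df = pd.read_excel("his_survey_responses.xlsx")
items = [c for c in df.columns if c.split("_")[0] in
         {"effuse", "adin", "sysi", "secc", "syscom",
          "recqual", "servqual", "knowqual", "hcperf"}]

# Unengaged responses: zero variance across all items (the same answer everywhere).
unengaged = df[items].std(axis=1) == 0

# Simple outlier screen: any item value outside the valid 1-7 Likert range.
out_of_range = ~df[items].apply(lambda col: col.between(1, 7)).all(axis=1)

clean = df[~(unengaged | out_of_range)]
clean.to_csv("his_survey_clean.csv", index=False)      # ready for import into SmartPLS
print(f"kept {len(clean)} of {len(df)} responses")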

Hence, the “Easy Guide to Efficiently Evaluate Your HIS” allows a hospital to assess system effectiveness efficiently, not only at the individual but also at the organizational level, through the responsible IT department in cooperation with clinical research center staff. By applying this clear guideline, HIS performance measurement becomes more precise, which contributes to the effectiveness of subsequent decision‐making by HIS users, stakeholders, and policymakers, supports the reputation of a successful implementation, reduces costs for future upgrades, and sustains effective use and user performance. The guideline can serve as a practical evaluation tool, executable at very minimal cost, for a comprehensive HIS evaluation survey at the national level.


5. Conclusions

The chapter endeavors to identify the areas of HIS adoption in which focused effort would yield the most benefit in terms of effective use and user performance. To address the present gaps, the study surveyed system users at three Malaysian government hospitals using three different HIS packages during postimplementation. Because the significance scores alone did not clearly indicate which constructs and indicators required operational improvement, the results were extended with IPMA to rank the candidate constructs and attributes and to highlight the most critical areas for specific responses [58]. As a result, system quality should be maintained for continued effective use, and knowledge quality for enhanced user performance. Specifically, effective use must be sustained by improving the design of HISs to fit clinicians’ workflows. Then, the CDSS and CPOE must be regularly updated with the latest features to accelerate patient care with correct diagnoses and medications, thereby ensuring that user performance does not decline. These additional findings also recommend urgent action by the hospitals concerning the lack of security control and the insufficient number of available computers.

For managerial implications, the extended findings are useful for decision‐makers at the government level when allocating proper budgets during strategic planning, with the HIS scorecard as a tool for further system upgrades and new implementations at other health facilities. The “Easy Guide to Efficiently Evaluate Your HIS” can serve as a standardized guideline for performing the system effectiveness evaluation survey among IT hospitals. As the performance scores of the measured attributes for all systems did not fall below 50%, the surveyed hospitals should promote the benefits of interoperable systems across their settings so that user performance increases substantially. Constructs with high performance but low importance yield relevant prescriptions for courses of action that IT departments and system vendors can revisit and fix promptly to avert user dissatisfaction and low productivity. Finally, the hospitals can concentrate on selected quality criteria and their measuring indicators for these purposes, so that more spending can be directed to upgrading other health facilities for patient care.

To the best of our knowledge, this study is the first summative evaluation of a country’s HISs to utilize IPMA in a clinical setting. To produce a complete HIS evaluation before and after implementation, future health informatics researchers are highly recommended to include IPMA [18] along with the new predictor of knowledge quality and improved effective use measures. This technique should also increase health workers’ engagement in HIS evaluation surveys by prompting them to identify what they believe to be the most important attributes of system effectiveness and by ranking those attributes by importance score in a clear map representation. The technique can be extrapolated and applied to other organizations or countries with extremely tight budgets while offering efficient resource consumption. In pursuit of minimal health expenditure, IPMA can be further explored for its potential to deliver cost savings by prioritizing health‐care spending in both developed and developing nations.


Acknowledgments

The authors would like to thank the Director of Health Malaysia for the permission to publish this book chapter. Special appreciation goes to Kedah, Pahang, and Johor Hospitals for their participation in this research. The study received no funding support.

Appendix A. Questionnaire items

  1. effuse_1: HIS enables me to complete my tasks successfully in a few easy steps.

  2. effuse_2: HIS allows me to prevent misdiagnosis.

  3. effuse_3: HIS allows me to provide the right medications to patients.

  4. adin_1: Faster network access is critical for me to use HIS.

  5. adin_2: Adequate computer hardware is critical for me to use HIS.

  6. sysi_1: I only need to enter and save data once, then use the system with multiple HIS modules.

  7. sysi_2: The cost for patient’s treatment is reduced with the use of HIS.

  8. sysi_3: The connection between different HISs is critical to enable coordinated patient care.

  9. secc_1: I believe my HIS does not allow unauthorized access.

  10. secc_2: I believe my HIS protects patient’s information.

  11. secc_3: I believe my HIS has a robust security control.

  12. secc_4: I feel secure and safe using HIS.

  13. syscom_1: HIS fits my workflows.

  14. syscom_2: HIS fits the way I work and my work styles.

  15. syscom_3: HIS fits my clinical practices.

  16. syscom_4: HIS fits my patients’ needs.

  17. recqual_1: Access to HIS contents is timely.

  18. recqual_2: HIS contents are consistent when viewing from other computers.

  19. recqual_3: HIS contents are available in a standardized format.

  20. recqual_4: HIS contents are accurate.

  21. recqual_5: HIS contents avoid duplication of diagnostic tests.

  22. recqual_6: HIS contents are complete.

  23. servqual_1: IT support staff/vendor provides quick assistance when I face problems with HIS.

  24. servqual_2: IT support staff/vendor is always able to solve my problems with HIS.

  25. servqual_3: IT support staff/vendor provides follow‐up service to HIS users like me.

  26. servqual_4: IT support staff/vendor provides adequate training for me to use HIS.

  27. knowqual_1: HIS is useful for learning new medical knowledge.

  28. knowqual_2: HIS is useful when researching or creating new medical knowledge.

  29. knowqual_3: HIS is helpful when applying medical knowledge to my tasks.

  30. knowqual_4: HIS helps me share my medical knowledge with others.

  31. knowqual_5: HIS provides knowledge that increases my ability to make clinical decisions.

  32. knowqual_6: HIS provides knowledge that improves my ability to solve clinical problems.

  33. knowqual_7: HIS provides a complete medical source that I can refer to for more information.

  34. hcperf_1: HIS increases my time with patients.

  35. hcperf_2: HIS enhances the safety of patient care.

  36. hcperf_3: HIS increases my work productivity.

  37. hcperf_4: HIS increases my chance of obtaining better annual performance marks.

References

  1. 1. Gheorghiu B, Hagens S. Measuring interoperable EHR adoption and maturity: A Canadian example. BMC Medical Informatics and Decision Making [Internet]. 2016;16(1):1-7. Available from: http://dx.doi.org/10.1186/s12911‐016‐0247‐x.
  2. 2. Strudwick G, Booth R, Mistry K. Can social cognitive theories help us understand nurses’ use of electronic health records?. CIN: Computers, Informatics, Nursing [Internet]. 2016 [cited 2016 Feb 16];34(4):169-74. Available from: http://europepmc.org/abstract/med/26844529.
  3. 3. Wen K, Kreps G, Zhu F, Miller S. Consumers’ perceptions about and use of the Internet for personal health records and health information exchange: analysis of the 2007 Health Information. Journal of medical Internet research [Internet]. 2010 [cited 2016 Feb 16]; Available from: http://www.jmir.org/2010/4/e73/?trendmd‐shared=1.
  4. 4. Lammers E, Adler‐Milstein J, Kocher K. Does health information exchange reduce redundant imaging? Evidence from emergency departments. Medical care [Internet]. 2014 [cited 2016 Feb 16]; Available from: http://journals.lww.com/lww‐medicalcare/Abstract/2014/03000/Does_Health_Information_Exchange_Reduce_Redundant.7.aspx.
  5. 5. Jensen LGL, Bossen C. Factors affecting physicians’ use of a dedicated overview interface in an electronic health record: The importance of standard information and standard. International Journal of Medical Informatics [Internet]. 2016 [cited 2016 Apr 5];87:44-53. Available from: http://www.sciencedirect.com/science/article/pii/S1386505615300769.
  6. 6. Häyrinen K, Saranto K, Nykänen P. Definition, structure, content, use and impacts of electronic health records: A review of the research literature. International Journal of Medical Informatics [Internet]. 2008 May [cited 2014 Jul 10];77(5):291-304. Available from: http://www.ncbi.nlm.nih.gov/pubmed/17951106.
  7. 7. Ghazisaeedi M, Mohammadzadeh N, Safdari R. Electronic health record (EHR) as a vehicle for successful health care best practice. Medical Archives [Internet]. 2014 [cited 2016 Feb 25];68(6):419. Available from: http://www.scopemed.org/?mno=175185.
  8. 8. El‐Kareh R, Hasan O, Schiff GD. Use of health information technology to reduce diagnostic errors. BMJ Quality & Safety [Internet]. 2013 Jul;22 Suppl 2(August):ii40–ii51. Available from: http://www.ncbi.nlm.nih.gov/pubmed/23852973%5Cnhttp://qualitysafety.bmj.com/content/22/Suppl_2/ii40.full.pdf.
  9. 9. Reis S, Sagi D, Eisenberg O, Kuchnir Y. The impact of residents’ training in Electronic Medical Record (EMR) use on their competence: report of a pragmatic trial. Patient education and … [Internet]. 2013 [cited 2016 Jan 29]; Available from: http://www.sciencedirect.com/science/article/pii/S0738399113003145.
  10. 10. Dhumal P. Financial model for investment recovery period in electronic health records implementations. International Journal of Economics and Business Research [Internet]. 2014 [cited 2016 Jan 30];9(1):65-79. Available from: http://www.inderscienceonline.com/doi/abs/10.1504/IJEBR.2015.066015.
  11. 11. Gagnon MP, Ghandour EK, Talla PK, Simonyan D, Godin G, Labrecque M, et al. Electronic health record acceptance by physicians: Testing an integrated theoretical model. Journal of Biomedical Informatics [Internet]. Academic Press Inc.; 2014 Apr [cited 2014 Oct 2];48:17-27. Available from: http://www.ncbi.nlm.nih.gov/pubmed/24184678.
  12. 12. Salleh MIM, Abdullah R, Zakaria N. Validating electronic health records system effectiveness questionnaire using partial least squares‐structural equation modeling. Australian Journal of Basic and Applied Sciences [Internet]. 2015;9(25):87-95. Available from: http://www.ajbasweb.com/old/ajbas/2015/Special IPN Langkawi (Aug)/87‐95.pdf.
  13. 13. Hassan R, Tajuddin MZM. Implementation of total hospital information system (THIS) in Malaysian public hospitals: Challenges and future prospects. International Journal of Business and Social Research [Internet]. 2012 [cited 2016 Feb 24];2:33-41. Available from: http://thejournalofbusiness.org/index.php/site/article/view/189.
  14. 14. Bahagian Perancangan dan Pembangunan KKM. Laporan Kajian Separuh Penggal, Pelan Strategik KKM 2011-2015. Putrajaya; 2014.
  15. 15. Bahagian Perancangan dan Pembangunan KKM. Country Health Plan 10th Malaysia Plan 2011-2015. Putrajaya; 2011.
  16. 16. Lee HW, Ramayah T, Zakaria N. External factors in hospital information system (HIS) adoption model: A case on Malaysia. Journal of Medical Systems [Internet]. 2012 [cited 2016 Feb 24];36(4):2129-40. Available from: http://link.springer.com/article/10.1007/s10916‐011‐9675‐4.
  17. 17. Khalifa M. Organizational, financial and regulatory challenges of implementing hospital information systems in Saudi Arabia. Journal of Health Informatics in Developing Countries. 2016;10(1):30-45.
  18. 18. Cohen JF, Coleman E, Kangethe MJ. An importance‐performance analysis of hospital information system attributes: A nurses’ perspective. International Journal of Medical Informatics [Internet]. Elsevier Ireland Ltd; 2015;86:82-90. Available from: http://www.sciencedirect.com/science/article/pii/S1386505615300526.
  19. 19. Salleh MIM, Zakaria N, Abdullah R. The influence of system quality characteristics on health care providers’ performance: Empirical evidence from Malaysia. Journal of Infection and Public Health [Internet]. King Saud Bin Abdulaziz University for Health Sciences; 2016;(September):1-10. Available from: http://dx.doi.org/10.1016/j.jiph.2016.09.002.
  20. 20. Booth RG. Examining the functionality of the DeLone and McLean information system success model as a framework for synthesis in nursing information and communication technology research. CIN: Computers, Informatics, Nursing [Internet]. 2012 [cited 2016 Oct 19];30(6):330-45. Available from: http://journals.lww.com/cinjournal/Abstract/2012/06000/Examining_the_Functionality_of_the_DeLone_and.9.aspx.
  21. 21. Holden RJ, Karsh BT. The technology acceptance model: Its past and its future in health care. Journal of Biomedical Informatics [Internet]. 2010 [cited 2016 Apr 2];43(1):159-72. Available from: http://www.sciencedirect.com/science/article/pii/S1532046409000963.
  22. 22. Nguyen L, Bellucci E, Nguyen LT. Electronic health records implementation: An evaluation of information system impact and contingency factors. International Journal of Medical Informatics [Internet]. Elsevier Ireland Ltd; 2014 Jul 22 [cited 2014 Sep 8];83(11):779-96. Available from: http://www.ncbi.nlm.nih.gov/pubmed/25085286.
  23. 23. Bossen C, Jensen LG, Udsen FW. Evaluation of a comprehensive EHR based on the DeLone and McLean model for IS success: Approach, results, and success factors. International Journal of Medical Informatics [Internet]. Elsevier Ireland Ltd; 2013 [cited 2016 Feb 24];82(10):940-53. Available from: http://www.sciencedirect.com/science/article/pii/S1386505613001287.
  24. 24. DeLone W, McLean E. Information systems success: The quest for the dependent variable. Information Systems Research [Internet]. 1992 [cited 2015 Jul 23];3(1):60-95. Available from: http://pubsonline.informs.org/doi/abs/10.1287/isre.3.1.60.
  25. 25. DeLone WH, McLean ER. Information systems success revisited. Proceedings of the 35th Annual Hawaii International Conference on System Sciences [Internet]. 2002. p. 2966-76. Available from: http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=994345.
  26. 26. Yusof MM, Paul RJ, Stergioulas LK. Towards a framework for health information systems evaluation. Proceedings of the 39th Annual Hawaii International Conference on System Sciences. IEEE; 2006. p. 1-10.
  27. 27. Li F. A framework for examining relationships among electronic health record (EHR) system design, implementation, physicians’ work impact [Internet]. University of Southern California; 2014 [cited 2015 Apr 26]. Available from: http://gradworks.umi.com/36/28/3628222.html.
  28. 28. Abdullah ZS. Hospital information systems implementation framework: Critical success factors for Malaysian public hospitals [Internet]. Curtin University; 2013 [cited 2015 Apr 26]. Available from: http://espace.library.curtin.edu.au/cgi‐bin/espace.pdf?file=/2013/09/20/file_1/192723.
  29. 29. McGill T, Hobbs V, Klobas J. User developed applications and information systems success: A test of DeLone and McLean’s model. Information Resources Management Journal [Internet]. 2003 [cited 2016 Feb 24];16(1):24-45. Available from: http://www.igi‐global.com/article/information‐resources‐management‐journal‐irmj/1235.
  30. 30. Negash S, Ryan T, Igbaria M. Quality and effectiveness in web‐based customer support systems. Information & Management [Internet]. 2003 [cited 2016 Feb 24];40(8):757-68. Available from: http://www.sciencedirect.com/science/article/pii/S0378720602001015.
  31. 31. Sedera D, Tan F. User satisfaction: An overarching measure of enterprise system success. Pacific Asia Conference on Information Systems [Internet]. Bangkok, Thailand; 2005 [cited 2016 Feb 24]. p. 963-76. Available from: http://eprints.qut.edu.au/17079/.
  32. 32. Sedera D, Gable G. A factor and structural equation analysis of the enterprise systems success measurement model. Twenty‐Fifth International Conference on Information Systems Proceedings [Internet]. 2004 [cited 2016 Feb 24]. p. 449-64. Available from: http://aisel.aisnet.org/cgi/viewcontent.cgi?article=1124&context=icis2004.
  33. 33. Chang C‐S, Chen S‐Y, Lan Y‐T. Motivating medical information system performance by system quality, service quality, and job satisfaction for evidence‐based practice. BMC Medical Informatics and Decision Making [Internet]. 2012 [cited 2015 Jul 23];12(1):1-12. Available from: http://www.biomedcentral.com/1472‐6947/12/135?utm_source=twitterfeed&utm_medium=twitter.
  34. 34. Yusof MM, Kuljis J, Papazafeiropoulou, A. Stergioulas LK. An evaluation framework for health information systems: Human, organization and technology‐fit factors (HOT‐fit). International Journal of Medical Informatics [Internet]. 2008 [cited 2016 Mar 9];77(6):386-98. Available from: http://www.sciencedirect.com/science/article/pii/S1386505607001608.
  35. 35. Salleh MIM, Abdullah R, Zakaria N. Electronic health records’ system characteristics, use, and effectiveness: A proposed theoretical framework. In: Soliman KS, editor. Proceedings of The 24th International Business Information Management Association Conference. Milan, Italy: International Business Information Management Association (IBIMA); 2014. p. 1669-81.
  36. 36. Gable GG, Sedera D, Chan T. Re‐conceptualizing information system success: The IS‐impact measurement model. Journal of the Association for Information Systems [Internet]. 2008 [cited 2016 Feb 24];9(7):377-408. Available from: http://search.proquest.com/openview/5aea21bcb44768af52392e119307e085/1?pq‐origsite=gscholar.
  37. 37. Kimiafar K, Sadoughi F, Sheikhtaheri A, Sarbaz M. Prioritizing factors influencing nurses’ satisfaction with hospital information systems: A fuzzy analytic hierarchy process approach. CIN: Computers, Informatics, Nursing [Internet]. 2014 [cited 2016 Mar 6];32(4):174-81. Available from: https://www.researchgate.net/profile/Farahnaz_Sadoughi/publication/259956372_Prioritizing_Factors_Influencing_Nurses’_Satisfaction_With_Hospital_Information_Systems_A_Fuzzy_Analytic_Hierarchy_Process_Approach/links/0f3175382a5644c27e000000.pdf.
  38. 38. DeLone WH, McLean ER. The DeLone and McLean model of information systems success: A ten‐year update. Journal of Management Information Systems [Internet]. 2003 [cited 2015 Jul 23];19(4):9-30. Available from: http://www.tandfonline.com/doi/abs/10.1080/07421222.2003.11045748.
  39. 39. Lin CH, Wei A, Yang H, Pittayachawan S, Vogel D, Wickramasinghe N. Inquiring knowledge management systems—A Chinese medicine perspective. Proceedings of the Annual Hawaii International Conference on System Sciences. Kauai, HI: IEEE; 2015. p. 3682-90.
  40. 40. Lin, C., Yang, W., Pittayachawan, S., & Wickramasinghe N. Using IS/IT to support the delivery of Chinese medicine: The design of a Chinese medicine clinic system. 24th Australasian Conference on Information Systems [Internet]. Melbourne, Australia: RMIT University; 2013 [cited 2016 Apr 6]. p. 1-10. Available from: https://researchbank.rmit.edu.au/view/rmit:22820.
  41. 41. Chang I‐C, Li Y‐C, Wu T‐Y, Yen DC. Electronic medical record quality and its impact on user satisfaction—Healthcare providers’ point of view. Government Information Quarterly [Internet]. Elsevier Inc.; 2012 Apr [cited 2014 Sep 22];29(2):235-42. Available from: http://linkinghub.elsevier.com/retrieve/pii/S0740624X12000068.
  42. 42. Aldekhail MS. Effect of knowledge management based error mitigation techniques in medical and diagnostic applications. International Conference on Convergence and Security (ICITCS). Beijing: IEEE; 2014. p. 1-6.
  43. 43. Bordoloi P, Islam N. Knowledge management practices and healthcare delivery: A contingency framework. Electronic Journal of Knowledge Management. 2012;10(2):110-20.
  44. 44. Tsai J, Hung S. Determinants of knowledge management system adoption in healthcare. Journal of Organizational Computing and Electronic Commerce [Internet]. 2016 [cited 2016 Jun 16];2-58. Available from: http://www.tandfonline.com/doi/abs/10.1080/10919392.2016.1194062.
  45. 45. Huang YH, Gramopadhye AK. Recommendations for health information technology implementation in rural hospitals. International Journal of Health Care Quality Assurance. 2016;29(4):1-22.
  46. 46. Gray C. Electronic health record systems in a centralized computing services environment: Critical success factors for implementation [Internet]. Robert Morris University; 2014 [cited 2015 Jul 23]. Available from: http://gradworks.umi.com/36/28/3628910.html.
  47. 47. Mansoor MME, Majeed R. Achieving interoperability among healthcare organizations [Internet]. Blekinge Institute of Technology; 2010 [cited 2015 Jul 23]. Available from: http://www.diva‐portal.org/smash/record.jsf?pid=diva2:831672.
  48. 48. Yousafzai S, Pallister J, Foxall G. Multi‐dimensional role of trust in Internet banking adoption. The Service Industries Journal [Internet]. 2009 May [cited 2014 Oct 22];29(5):591-605. Available from: http://www.tandfonline.com/doi/abs/10.1080/02642060902719958.
  49. 49. Tulu B, Burkhard R, Horan T. Information systems and health care xiv: Continuing use of medical information systems by medical professionals: Empirical evaluation of a work system model. Communications of the Association for Information Systems [Internet]. 2006 [cited 2015 Jul 23];18(1):641-56. Available from: http://aisel.aisnet.org/cgi/viewcontent.cgi?article=3102&context=cais.
  50. 50. Wu J‐HJ, Wang Y‐MY. Measuring KMS success: A respecification of the DeLone and McLean’s model. Information & Management [Internet]. 2006 Sep [cited 2014 Jul 28];43(6):728-39. Available from: http://linkinghub.elsevier.com/retrieve/pii/S0378720606000498.
  51. 51. Hair JF, Hult GTM, Ringle C, Sarstedt M. A primer on partial least squares structural equation modeling (PLS‐SEM) [Internet]. Thousand Oaks, CA: SAGE; 2014 [cited 2015 Aug 19]. 328 p. Available from: https://books.google.com/books?hl=en&lr=&id=TjzABAAAQBAJ&oi=fnd&pg=PR1&dq=A+primer+on+partial+least+squares+structural+equation+modeling+(PLS‐SEM)&ots=hy2q9NQfnG&sig=SOp8hQyxw7Mk9cX1wtt92‐PXZNY.
  52. 52. Hair JF, Ringle CM, Sarstedt M. PLS‐SEM: Indeed a silver bullet. The Journal of Marketing Theory and Practice [Internet]. 2011 [cited 2015 Aug 17];19(2):139-52. Available from: http://www.tandfonline.com/doi/abs/10.2753/MTP1069‐6679190202.
  53. 53. Fornell C, Larcker DF. Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research [Internet]. 1981 [cited 2015 Jul 23];18:39-50. Available from: http://www.jstor.org/stable/3151312.
  54. 54. Chin WW. How to write up and report PLS analyses. Handbook of Partial Least Squares [Internet]. Springer Berlin Heidelberg; 2010 [cited 2015 Jul 23]. p. 655-90. Available from: http://link.springer.com/chapter/10.1007/978‐3‐540‐32827‐8_29.
  55. 55. Hair JF, Hult GTM, Ringle CM, Sarstedt M. A primer on partial least squares structural equation modeling (PLS‐SEM). 2nd ed. Thousand Oaks, CA: SAGE; 2017. 361 p.
  56. 56. Hock C, Ringle CM, Sarstedt M. Management of multi‐purpose stadiums: Importance and performance measurement of service interfaces. International Journal of Services Technology and Management. 2010;14(2-3):188-207.
  57. 57. Gustafsson A, Johnson MD. Determining attribute importance in a service satisfaction model. Journal of Service Research [Internet]. 2004 [cited 2016 Nov 3];7(2):124-41. Available from: http://jsr.sagepub.com/content/7/2/124.short.
  58. 58. Ringle CM, Sarstedt M. Gain more insight from your PLS‐SEM results: The importance‐performance map analysis. Industrial Management & Data Systems. 2016;116(9):1865-86.
  59. 59. Abalo J, Varela J, Manzano V. Importance values for importance‐performance analysis: A formula for spreading out values derived from preference rankings. Journal of Business Research. 2007;60(2):115-21.
  60. 60. Sethna BN. Extensions and testing of importance‐performance analysis. Business Economics [Internet]. 1982 [cited 2016 Nov 3];28-31. Available from: http://www.jstor.org/stable/23482618.
