Open access

The Telemedicine Service Maturity Model: A Framework for the Measurement and Improvement of Telemedicine Services

Written By

Liezl van Dyk and Cornelius S.L. Schutte

Submitted: 02 May 2012 Published: 13 December 2013

DOI: 10.5772/56116

From the Edited Volume


Edited by Ramesh Madhavan and Shahram Khalid


1. Introduction

General consensus exists that information and communication technology has the potential to enhance health systems, and many good examples of such applications exist all over the world. Unfortunately, with respect to eHealth and telemedicine, there are also examples of disillusionment and scepticism. Many studies acknowledge the importance and challenge of finding models suitable for use in the facilitation, evaluation, and measurement of the success rate of eHealth and telemedicine projects. These measures are vital for facilitating the success, sustainability and optimisation of telemedicine services [1-4].

1.1. Purpose

The purpose of this chapter is to present the TeleMedicine Service Maturity Model (TMSMM), a maturity model, including capability statements, that can be used to measure and manage the maturity of any telemedicine process.

1.2. Why a maturity model?

In many respects, telemedicine projects have experienced problems similar to those of US military software projects implemented during the 1980s. These projects, involving software contractors, ran over budget and were completed far later than planned, if at all. In order to address this, the US Defense Software Engineering Institute (SEI) developed a process maturity framework to aid in the capability evaluation of software contractors, to be used as part of the contract awarding process [9]. This model, which was originally based on Crosby's Quality Management Maturity Grid [5], became the Capability Maturity Model (CMM) and later the Capability Maturity Model Integration (CMMI). The CMM and CMMI also serve as compliance standards [6, 7].

The CMM was the inspiration for dozens of other maturity models, developed and applied in various domains and contexts. With a maturity model, the maturity (i.e. competency, capability, level of sophistication) of a selected domain is assessed against a more or less comprehensive set of criteria [6]. Maturity models, firstly, provide a way of measuring the status quo by means of maturity level indicators. Secondly, they facilitate an improvement process that best suits the enterprise, while remaining within the prescribed best-practice parameters of the particular domain [8].

Most definitions of maturity combine an evolutionary or experiential element with the adoption of 'good' (or appropriate) practice [9]. It is proposed that a telemedicine service maturity model will address the need for a framework that can be used to measure and grow the maturity of existing and prospective telemedicine services. This model should be useful for self-assessment as well as benchmarking, to guide a telemedicine service or project towards the identification and adoption of best practices.

1.3. Maturity models within the context of health systems

As a result of a meta-analysis of maturity models, Fraser et al. [9] divided maturity models into three groups:

  • CMM-like models

  • Hybrids and Likert-like questionnaires

  • Maturity grids

This categorization has been adopted by a number of subsequent authors [10, 11] and is also of significance to this study:

1.3.1. CMM-like models

CMM-like models [9] are based upon a more formal architecture, specifying a number of goals and key practices required to reach a predefined level of sophistication [11]. Many CMM-like models follow a standard format, are internationally recognized and are also used for certification purposes. Within the context of health systems, the British National Health Service Infrastructure Maturity Model (NIMM) is possibly the only maturity model that fits this description.

1.3.2. Hybrids and Likert-like scales

Hybrids and Likert-like questionnaires are comparable with maturity grids, but the focus is more on scoring specific statements of "good practice" than on describing overall levels of maturity [11].

Technology readiness, system readiness and organizational readiness instruments are typical examples of this type of maturity assessment [10]. A few readiness instruments have already been developed and are in use within the context of telemedicine and eHealth. Jennett et al. [12] specifically refer to eHealth readiness when arguing that time, money and energy can be saved if the status quo of an eHealth/telemedicine system context is determined before implementation.

Legare et al. [13] identified six different assessment tools which use Likert-scale questionnaires to measure e-readiness within a certain health context. The first of these tools was developed in 1996: the Organizational Information Technology/Systems Innovation Readiness Scale supports, within the context of telehealth, the evaluation, diagnosis and treatment selection for different steps in patient care. The second, third and fourth tools mentioned by [13] followed on one another and are focused on home-based telehealth applications. The most recent two tools, namely the eHealth Readiness Assessment Toolset for Healthcare Institutions in Developing Countries [3] and a generic telehealth readiness assessment tool [12], were specifically considered during the development of the TMSMM.

1.3.3. Maturity grid

Maturity grids are typically descriptive frameworks used for self-assessment purposes. With a maturity grid, a number of levels of maturity are described in a simple, textual manner, normally not exceeding a few pages of text [9, 10].

In contrast to CMM-like models, the purpose of maturity grids is not to provide a means of certification. Companies often follow a number of approaches in parallel and maturity grid assessment may be used as a stand-alone assessment, or as a subset of a broader improvement initiative [10, 11]. Furthermore, a typical maturity grid allows for the visualization of maturity levels, which is not necessarily the case for CMM-like models.

According to the classification presented in this section, the Telemedicine Service Maturity Model (TMSMM) presented in this chapter is considered to be a maturity grid.

Most of the maturity models developed within the context of health systems fit this description. Examples include the Quintegra Maturity Model for Electronic Healthcare [14], the Healthcare Information Management Systems Society (HIMSS) Maturity Model for Electronic Medical Records, the Health Industry Insights Maturity Model for Information Systems and Technology (IST) development in hospitals [15], as well as the Picture Archiving Communication Systems (PACS) maturity model [16].

None of the frameworks mentioned in this section is an "off-the-shelf" answer to the need for a framework to guide the implementation and management of telemedicine services. However, all of the models mentioned in this section provided input to the development of the new maturity model presented in this chapter.


2. Methodology

The scientific approach followed in the development of the TMSMM (refer to Figure 1) resembles the procedure suggested in [17]. This procedure takes into account the iterative nature of the maturity model development process, as well as the need to combine theoretical and empirical research, as recommended by other respected scholars of maturity models [6, 9, 18, 19].

Figure 1.

Scientific approach to the development of the TMSMM

The conceptual telemedicine service maturity model (TMSMM) was developed by following an iterative process involving telemedicine practitioners from five different South African provincial departments of health (DoH). The involvement of these stakeholders ensured the contribution of their domain knowledge to solution creation and, in turn, contributed to the validity of the model. As a consequence, greater user acceptance was achieved [20].

Between June 2011 and December 2011, a series of workshops was held throughout South Africa. Representatives included healthcare workers (e.g. specialists, radiologists, radiographers and nurses), as well as persons responsible for the development, implementation and maintenance of hospital information and communication technology (ICT).

The first day of these workshops was aimed at educating representatives about current state-of-the-art telemedicine. The second day of each workshop was an interactive session in which groups of delegates engaged with the maturity model. At the end of each session, workshop delegates were asked to reflect upon and provide feedback on the effectiveness of the maturity model in terms of its characteristics. This feedback, together with the thoughts of other researchers from the literature, was used to revise key design features, as well as the maturity model itself [21-23].

Upon finalization of the conceptual model, a top-down approach was followed to develop the descriptive model [6]: An overarching maturity scale was defined, after which domain-specific maturity scales were developed in alignment with these. A first iteration of TMSMM descriptors was defined, using as inputs firstly, the insights gained in the development of the conceptual model and secondly, frameworks and theories for the management and evaluation of telemedicine services. The descriptors were refined, based on stakeholder input.


3. The Telemedicine Service Maturity Model (TMSMM)

The conceptual TMSMM is shown in Figure 2, which also provides a framework for the remainder of this chapter.

Figure 2.

Conceptual model of the TMSMM

3.1. Design of the TMSMM

The TMSMM is designed along three dimensions. The intersection of each pair of these dimensions forms a matrix, each with a specific significance and function.

3.1.1. Three dimensions

Firstly, five domains are defined (the domain dimension), which provide a holistic view of all factors that impact the implementation of telemedicine services. Secondly, the telemedicine service dimension represents five micro-level processes, one meso-level process, and one macro-level process per domain. The third dimension is the maturity scale, which provides yardsticks for maturity measurement. The three primary colours are deliberately selected to represent these three dimensions. Each of the three dimensions is briefly described in section 3.2 (telemedicine service, yellow), section 3.3 (domain, red) and section 3.5 (maturity scale, blue).

3.1.2. Three matrices

The orange matrix (refer to section 3.4) is the intersection of the telemedicine service dimension (yellow) and the domain dimension (red) and provides a framework according to which all aspects of the telemedicine service under consideration are described. The domain-specific maturity indicators are outlined on the purple matrix (refer to section 3.6), which is the intersection of the domain dimension (red) and the maturity scale dimension (blue). The maturity capability statements (green) in section 4 show the maturity scale (blue) descriptors per telemedicine service process (yellow) for each respective domain.

3.2. The telemedicine service dimension

Telemedicine is, by definition, the delivery of healthcare services (medicine) over a distance (tele). This dimension includes all micro-level, meso-level and macro-level processes required to make this happen.

3.2.1. Micro-level telemedicine service

The micro-level telemedicine service is broken up into five generic processes, which are applicable to any telemedicine service. All five of these processes need to be effective for the telemedicine service to be successful: (1) patient data is captured and (2) transmitted to where (3) the data is analysed and converted into useful information (a diagnosis). This useful information is then (4) transmitted back so that it can be (5) acted upon.
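The five-step pipeline can be sketched as a small data structure. The process names below and the aggregation rule (taking the minimum, since all five steps must be effective) are illustrative assumptions, not part of the published TMSMM:

```python
# Illustrative sketch only: the process labels and the min-aggregation
# rule are assumptions for illustration, not prescribed by the TMSMM.

MICRO_PROCESSES = [
    "capture data",        # (1)
    "transmit data",       # (2)
    "perform diagnosis",   # (3) analyse data into useful information
    "transmit feedback",   # (4)
    "react",               # (5)
]

def service_maturity(step_levels: dict) -> int:
    """Since all five steps must be effective, the service is assumed
    to be only as mature as its weakest step."""
    return min(step_levels[p] for p in MICRO_PROCESSES)

levels = {"capture data": 3, "transmit data": 2, "perform diagnosis": 4,
          "transmit feedback": 3, "react": 3}
print(service_maturity(levels))  # prints 2: the weakest step caps the service
```

Under this assumed rule, improving any single step above the weakest one does not raise the overall score, which mirrors the chapter's point that all five processes must be effective.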

3.2.2. Meso-level and macro-level telemedicine services

The systems engineering mindset dictates that complex systems (such as a health system) should be viewed as systems of systems. The meso- and macro-level telemedicine systems are situated higher up the health systems hierarchy. They do not relate exclusively to the telemedicine service, but they have a significant impact on its success. Hence, the maturity of these systems must be evaluated in conjunction with the maturity of the micro-level telemedicine process.

3.3. The domain dimension (red)

References are often made to the so-called "alphabet soup" of the business world, for example the 4Ps of marketing, the 5Ps of strategy, the 4Ps of healthcare, the 5Ss of lean manufacturing, the 4Ms, 5Ms, 6Ms and 7Ms of manufacturing, and the list continues. The value of these models (representations of the real world) lies in their simplicity and re-usability. Furthermore, the fact that these concept groupings are applied repeatedly and in a variety of contexts is indicative of their validity and generalizability.

The TMSMM must be simple in structure and language, so that it can be understood and used by a range of stakeholders. Therefore, the domains were named and organized in a way similar to other "alphabet-soup" frameworks. After each iteration of the development of the TMSMM, the domains were adjusted based on new insights gained. The final set of domains resembles the 5Ms of manufacturing, namely "man", "machine", "material", "method" and "money".

At first glance it may seem inappropriate to apply the 5Ms of manufacturing to the domains of the TMSMM. However, the generic description of a manufacturing process is well aligned with the telemedicine service: the telemedicine service entails the sourcing and acquisition of raw material (raw patient data and information) at the right place, at the right time, and according to the right specification. This information is then reworked into a useful product (a diagnosis, treatment prescription, etc.) which only has value if it reaches the customer (patient) at the right place, at the right time, and according to the right specification.

3.4. Framework for the description of the telemedicine service

Figure 3 (the orange matrix) provides a framework for the description of the telemedicine service, since it provides domain (red) indicators for each of the micro-, meso- and macro-level telemedicine service processes (yellow).

Figure 3.

Micro-, meso-, and macro-level telemedicine processes

3.5. The maturity scale

The maturity scale of the TMSMM is based on the generic level indicators of the Capability Maturity Model (CMM). This is done for two reasons. Firstly, most existing maturity models use a maturity scale which is either identical to, or strongly resembles, the CMM scale; this is indicative of the usefulness and validity of this scale. Secondly, it opens up the possibility that the TMSMM, which is categorized as a descriptive maturity grid, can be used in conjunction with comparable CMM-like maturity models, for example those developed for project management and innovation management.

The generic levels are described below:

  • Level 1: Ad hoc - The service is unpredictable, experimental, and poorly controlled

  • Level 2: Managed - The service is characterised by projects and is manageable

  • Level 3: Standard - The service is defined/confirmed as a standard business process

  • Level 4: Quantitatively managed - The service is quantitatively measured and controlled

  • Level 5: Optimising - Deliberate focus on continuous improvement
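The generic levels above can be encoded as an ordered enumeration, so that levels are directly comparable and usable as numeric scores in an assessment tool. This is a sketch, not part of the TMSMM specification:

```python
from enum import IntEnum

# The five generic CMM-derived levels of the TMSMM maturity scale.
# Encoding them as an IntEnum is an assumption for illustration.
class Maturity(IntEnum):
    AD_HOC = 1                  # unpredictable, experimental, poorly controlled
    MANAGED = 2                 # characterised by projects; manageable
    STANDARD = 3                # defined as a standard business process
    QUANTITATIVELY_MANAGED = 4  # quantitatively measured and controlled
    OPTIMISING = 5              # deliberate focus on continuous improvement

# Levels are ordered, so progress checks read naturally:
print(Maturity.STANDARD > Maturity.MANAGED)  # prints True
print(Maturity(5).name)                      # prints OPTIMISING
```

An ordered encoding like this also makes it straightforward to compute averages or minima across domains when the grid is used for benchmarking.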

3.6. Domain-specific maturity indicators

Domain-specific maturity indicators for the micro-level and for the meso- and macro-level telemedicine service processes are shown in Figure 4 and Figure 5 respectively. Each of these rows serves as the heading for each of the respective maturity grids, which are presented in the next section.

Figure 4.

Micro-level domain-specific maturity indicators

Figure 5.

Macro-level domain-specific maturity indicators


4. Maturity capability statements for the micro-level processes of each domain

If the three-dimensional conceptual model (Figure 2) is sliced up along the domain dimension, fifteen sets of maturity capability statements (green) are created, three for each domain: one each with respect to the micro-level, meso-level and macro-level processes. This section presents the complete sets of maturity capability statements with respect to the micro-level processes. Each of the five rows of Figure 4 represents the heading of one of the five respective sets of maturity capability statements presented in this section.

4.1. Users (Man-domain)

The capability statements for the micro-level services within the context of the man-domain are shown in Figure 6.

Figure 6.

Maturity capability statements for the micro-level processes of the “Man” domain

Telemedicine services have a wide range of users. Depending on how the service is set up, the telemedicine process can include patients as well as healthcare workers, such as medical specialists, nurses, radiologists, radiographers, midwives, primary care practitioners and counsellors, amongst others. On the left-hand side, examples are given of users (orange) for each of the steps within the micro-level telemedicine service. These examples are derived from a meta-analysis of telemedicine projects on which reports were published between 2007 and 2010 in the International Journal for Telemedicine and Telecare [24].

Patients are always the beneficiaries of telemedicine services and are often categorised as users. In home-based telemedicine services, it is common for patients to be equipped with appropriate technology enabling them to collect and transmit their healthcare data themselves. On the other hand, patients are not necessarily users of the system. For example, a nurse can use a telemedicine service to deliver appropriate care to a patient; in this case the nurse, and not the patient, would be categorised as the user of the telemedicine service. By the same token, the role of ICT technologists is imperative for the operation of a telemedicine service, but the ICT technologist is not necessarily a user of the system.

Many authors, for example, [25] and [26], agree that the involvement of a so-called champion is one of the contributing factors to the successful implementation of telemedicine services. A champion is a user from the community who takes the role of innovator and advocate. Both level 2 and level 3 imply that the user is appropriately trained and motivated to execute the task consistently and on an ongoing basis. A maturity level of 2 is assigned when this process is driven by a champion. A maturity level of 3 is allocated if every user is trained and motivated to execute the task consistently and on an ongoing basis.

Another determinant of the successful implementation of telemedicine is the integration of the telemedicine service with other business processes of the hospital or health system [1, 27]. For example, when the users are human resources of the hospital or health system, integration with the human resource business process is expected. The maturity level for the user-domain is scored at level 4 (quantitatively managed) if worker performance metrics exist for each of the steps of the telemedicine process and if these data are effectively included in performance management and in the work appraisal process. A maturity level of 5 indicates a deliberate attempt towards continuous professional development with respect to the use of the micro-level telemedicine service.
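The man-domain progression described in this subsection can be summarised as a simple decision rule. The boolean inputs below are hypothetical labels for the conditions named in the text, and the rule assumes each level builds on the previous one:

```python
def man_domain_level(champion_driven: bool,
                     all_users_trained: bool,
                     performance_metrics_used: bool,
                     continuous_development: bool) -> int:
    """Hypothetical scoring sketch for the micro-level man-domain:
    level 2 when a champion drives the process, level 3 when every
    user is trained and motivated, level 4 when worker performance
    metrics feed appraisal, level 5 when continuous professional
    development is deliberate. Checked top-down, so the highest
    satisfied level wins."""
    if continuous_development:
        return 5
    if performance_metrics_used:
        return 4
    if all_users_trained:
        return 3
    if champion_driven:
        return 2
    return 1  # ad hoc

print(man_domain_level(True, True, False, False))  # prints 3
```

In practice an assessor would score each of the five micro-level steps separately against such a rule; the flat boolean inputs here simply make the level thresholds explicit.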

4.2. Devices and software applications (Machine-domain)

The maturity capability statements for the micro-level machine domain are shown in Figure 7.

Figure 7.

Maturity capability statements for the micro-level processes of the “Machine” domain

Examples of devices and applications (orange) for each of the steps of the micro-level telemedicine service (yellow) are given. These examples are derived from a meta-analysis of telemedicine projects on which reports were published between 2007 and 2010 in the International Journal for Telemedicine and Telecare [24].

Admittedly, technology is developing at a rapid pace and it is likely that the current list of most-used telemedicine devices and software looks somewhat different to the one identified in this study. This does not influence the relevance of the TMSMM, since it is technology-independent: the purpose of the TMSMM is not to measure the complexity or technological advancement of the devices and software. Telemedicine is a service innovation [28] and not a technological innovation, and the technology most commonly used for telemedicine services is relatively simple. Hence, the TMSMM measures the maturity of the telemedicine service and as such considers the appropriateness, maintainability, availability and interoperability of the devices and software.

According to the layered telemedicine implementation model [27], the focus during the prototype phase is on technological feasibility. Technological feasibility includes aspects such as the reliability, availability, and maintainability (RAM) of the technology used. Until this is achieved, the maturity of the telemedicine service, in terms of devices and software, is pitched at level 1 (ad hoc).

For a managed telemedicine service (level 2), the telemedicine device and/or software must be regularly available and maintained, and user support must be available within the context in which it is to be used. This is typical of the pilot phase. In South Africa [21] and throughout the world [27] it seems that the initial enthusiasm of the 1990s has given way to more reflective views of the place of telemedicine in healthcare delivery. This is mostly due to the fact that the majority of pilot projects have failed to be sustained. Yunkap Kwankam, eHealth coordinator of the World Health Organisation, ironically described this situation as "suffering from pilotitis" [29].

A telemedicine device or software application is only considered to be standard (level 3) if it is standardised, interoperable and integrated with other systems in the healthcare system. Keeping track of standard software upgrades entails technology maintenance (level 3).

A maturity level of 4 is reached if systems are in place to monitor the reliability, availability and maintainability of the technology, whilst a maturity level of 5 indicates that hardware and software are continuously upgraded, where appropriate.

4.3. Electronic health records (Material-domain)

The telemedicine service converts raw data into useful information, similar to any manufacturing process, in which raw material is converted into a useful product. Hence, "material" is included as a domain in the final TMSMM, referring to electronic health records (micro-level), local EHR systems (meso-level) and national EHR systems (macro-level). The maturity capability statements of individual EHRs (micro-level) appear in Figure 8.

Figure 8.

Maturity capability statements for the micro-level processes of the “Material” domain

A maturity level of 3 is awarded to the “capture data”, “perform diagnosis” and “react” processes if the EHRs are compiled according to a standard data format. If no intentional record keeping protocol is in place, the maturity of these three processes is gauged at 1. A maturity level of 2 indicates a consistent and repeatable record keeping process, but without considering standard data formats.

It is important that personal data about patients are transmitted securely and that the privacy of the data is ensured. This includes the raw data used for diagnosis, as well as the prognosis and information sent back to the person who needs to react. If this is accomplished according to the encryption and decryption standards of the governing institution, the "transmit data" and "transmit feedback" processes are placed on maturity level 3. A maturity level of 1 indicates that data security protocols are not considered whatsoever. A maturity level of 2 is awarded if security protocols are considered, but not necessarily managed by the governing institution.

Level 4 pertains to the tracking of electronic health records, whilst level 5 indicates deliberate attempts to increase the quality of electronic health records.

4.4. Work protocols (Methods-domain)

Refer to Figure 9 for the maturity capability statements for the micro-level processes of the “methods” domain, namely work protocols.

Figure 9.

Maturity capability statements for the micro-level processes of the “Methods” domain.

Any healthcare service needs to be executed according to a certain set of well-defined protocols in order to ensure consistency, integrity and ethical conduct. In most cases these protocols were defined before ICT made telemedicine possible. In many cases the telemedicine service deviates significantly from the original way of working, and a new protocol for the telemedicine service does not exist. In this case, a maturity level of 1 is allocated. A maturity level of 2 indicates that the telemedicine process is executed consistently and repeatably, but that the protocol is not formally defined. Once it is formally defined, the maturity is gauged at 3. Performance control metrics enable the consistent measurement of the effectiveness of the protocol (maturity level 4). A maturity level of 5 is reached as soon as the causes of performance deviations are continuously identified and rectified.

4.5. Costs (Money-domain)

The money-domain considers maturity in terms of operational cost (micro-level), business model (meso-level) and health economics (macro-level). Figure 10 shows the maturity capability statements for the micro-level processes of the "money" domain.

Figure 10.

Maturity capability statements for the micro-level processes of the “Money” domain.

Maturity levels 1 and 2 apply when the operational costs are provided by external entities, either for purposes of research and development (level 1) or for philanthropic reasons by external donors (level 2). These funding modes are not sustainable. For financial sustainability, it is compulsory that the operational expenses are covered as part of the standard budgeting process of the governing organization (maturity level 3). Maturity level 4 concerns accountability: it must be possible to measure and report on the costs related to this process. On maturity level 5 the cost effectiveness of the process is continuously improved.


5. Maturity capability statements for the meso and macro level processes of each domain

The scope of this chapter does not allow for detailed maturity capability statements for meso- and macro-level processes. This section provides a brief discussion of each of these processes.

5.1. User communities (Man-domain)

Telemedicine services inevitably cut across epistemic communities, for example medical practitioners, engineers, patients or public health actors [29]. The users of each step of the telemedicine process are members of one of these communities. The health worker/patient community and society as a whole are added, respectively, as meso-level and macro-level telemedicine systems within the context of the user domain, since the maturity of the telemedicine service is dependent on the extent to which the service is accepted within these communities [27].

Maturity level 3 indicates that this activity is accepted as standard practice by the entire community: for example, if healthcare professionals are exposed to this telemedicine service as part of their training, if these practices are accepted by their professional societies, or if it is socially acceptable to use technology to communicate healthcare information.

Evidence-based practice is an interdisciplinary approach, which started in medicine as evidence-based medicine and spread across other fields over the past two decades [30]. Evidence based medicine aims for the ideal that healthcare professionals should make “conscientious, explicit, and judicious use of current best evidence in everyday practice.” As such, healthcare workers and other professionals are familiar with this concept.

Evidence-based practice dictates (1) that all practical decisions made should be based on research studies and (2) that these research studies are selected and interpreted according to some specific and quantitative norm. A maturity level of 4 applies if such research studies are successfully being executed within the context of the respective user community.

According to the World Health Organization, task shifting entails the reallocation of certain tasks from more-specialised to less-specialised health care workers across the board; for example, tasks are shifted from the physician to the non-professional health care worker [31, 32]. A maturity level of 5 indicates that the telemedicine service deliberately causes task shifts for an entire professional community.

5.2. Information and Communication Technology (ICT) infrastructure (Machine-domain)

ICT infrastructure refers to all of the hardware, software, networks, facilities, etc. that are required to develop, test, deliver, monitor, control or support applications and IT services [33].

The maturity of the ICT infrastructure depends on its availability, reliability and maintainability (maturity level 3) as well as the measurement thereof (maturity level 4). Continuous improvement (maturity level 5) within this context relates to technology upgrades and scalability.

5.3. EHR systems (Material-domain)

The maturity scale described below applies equally to the meso-level (local) EHR systems and the macro-level (national) EHR systems.

Syntactic interoperability involves a common data format and a common protocol to structure any data so that the manner of processing the information will be interpretable. When the different systems involved in a telemedicine service are capable of communicating and exchanging data, they are syntactically interoperable [35] and a maturity level of 2 is indicated.

"Beyond the ability of two or more computer systems to exchange information, semantic interoperability is the ability to automatically interpret the exchanged information meaningfully and accurately in order to produce useful results as defined by the end users of both systems" [35]. A maturity level of 3 is awarded if semantic interoperability is achieved. If no intentional record keeping protocol is in place, the maturity is gauged at level 1.

Business intelligence (BI) is a broad category of applications and technologies for gathering, storing, analyzing, and providing access to data to help enterprise users make better business decisions. A maturity level of 4 indicates that such applications and technologies exist. Business analytics (BA) is often used as a synonym for BI. However, for the purposes of this chapter, it is recognized that BA uses statistical methods to develop an understanding of business performance and to provide a feedback loop towards continuous business improvement.

5.4. Change management (Methods-domain)

The need for deliberate and effective change management is echoed throughout studies on the implementation of telemedicine services [1-4, 23, 26, 37]. Change management is the process of changing processes. Within context of the TMSMM, change management is positioned as meso-level process of the Methods-domain (Figure 9).

The majority of telemedicine services are not sustained after the pilot phase [22, 27, 26], even though the concept was technologically proven during the prototype and pilot phases. In those cases, the change management process was ineffective (maturity level 1).

A champion is a user from the community who takes the role of innovator and advocate. Many authors [1, 25, 26] mention the involvement of a so-called champion as a critical success factor to the successful implementation of telemedicine services. Maturity level 2 applies when such a champion is either self-appointed or appointed by the institution.

A maturity level of 3 indicates sustainable institutional commitment to accomplish change. This commitment is demonstrated, firstly, by the formal and permanent appointment of a change agent (champion) and, secondly, by the change management process manifesting in other business processes, for example, during the budget process or the facilities design process.

The effectiveness of the change management process is measured in terms of performance indicators (maturity level 4). Maturity level 5 implies that processes are in place to ensure continuous improvement in terms of these performance indicators.

5.5. Financial sustainability (Money-domain)

The maturity of the micro-level telemedicine service – as far as the money-domain is concerned – is measured in terms of the costs to operate and maintain the service. On the macro-level, the financial sustainability of the money-domain is considered, firstly, with respect to the specific telemedicine service and, secondly, on a higher level, with respect to the macro-economic healthcare system.

This subdomain of the TMSMM is the concern of health economics, a branch of economics concerned with the functioning of macro-economic healthcare systems as well as health-affecting behaviours and interventions – such as the use of technology [36]. Health economists throughout the world still grapple with the challenge of finding financial justification for telemedicine services [37, 38], and no clear-cut financial model has yet been developed. It is not the intention of the TMSMM to prescribe how financial sustainability and return on investment should be measured and managed, but merely to gauge whether they are being measured and managed.

A maturity level of 1 is typical of projects in the research and development or pilot phases, where the focus is on technical feasibility [27]. At level 2, some form of financial management system is in place, but the service still relies on donor or R&D funding. A maturity level of 3 indicates that the telemedicine service is financially sustainable without external funding.

Within the context of health economics, many approaches can be found for the measurement of return on investment (ROI). The TMSMM does not dictate the method of ROI measurement. Rather, a maturity level of 4 indicates that such a measure has been decided upon and that these measures are indeed realized.

Macro-level continuous improvement and optimization within the context of the money-domain requires not only financial sustainability, but also reinvestment – in which case a maturity level of 5 is allocated.
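As with the record-keeping subdomain, the money-domain levels form a cumulative rubric. A minimal sketch, in which all function and flag names are illustrative assumptions rather than part of the published TMSMM:

```python
def money_domain_maturity(financial_system: bool,
                          self_sustaining: bool,
                          roi_realized: bool,
                          reinvestment: bool) -> int:
    """Return the money-domain maturity level (1-5) implied by the capabilities present."""
    level = 1  # pilot/R&D phase: focus on technical feasibility only
    if financial_system:
        level = 2  # a financial management system exists, but donor/R&D funded
    if level >= 2 and self_sustaining:
        level = 3  # financially sustainable without external funding
    if level >= 3 and roi_realized:
        level = 4  # an ROI measure has been decided upon and is realized
    if level >= 4 and reinvestment:
        level = 5  # sustainability plus reinvestment: continuous improvement
    return level
```

For example, a donor-funded service with a bookkeeping system in place but no path to self-sufficiency would be gauged at level 2, regardless of how carefully its ROI is tracked.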


6. Conclusion

In this chapter a telemedicine service maturity model (TMSMM) was presented. The TMSMM was developed in response to the need for a framework by which the maturity of existing and proposed telemedicine projects can be measured, with the purpose of supporting decision making towards sustained telemedicine services.

The TMSMM was developed by following an iterative process involving telemedicine practitioners from five different South African provincial departments of health (DoH). With each iteration, a cross-functional group was involved in workshop format. Self-assessment maturity models, such as the TMSMM, have proved to be particularly effective with cross-functional and interdisciplinary groups.

This descriptive maturity model can be used as the basis for the development of a prescriptive and, eventually, a comparative maturity model, for which the following design principles apply [39]: firstly, a decision calculus is included to assist in the evaluation of different alternatives; secondly, an adoption methodology is included, which features a procedure model and advice on how to concretize and adapt improvement measures.



Acknowledgements

The work presented in this chapter is based on research conducted for the PhD thesis of Liezl van Dyk, for which financial support was provided by the National Research Foundation (NRF) of South Africa. The authors also wish to acknowledge Jill Fortuin (Medical Research Council of South Africa) for her role in the initial workshops that led to the development of the TMSMM, as well as Kim Viljoen and André Hartmann (Stellenbosch University), who contributed to the definition of the capability statements through the execution of their respective MEng projects.


References

  1. Yellowlees, P. Successfully developing a telemedicine system. Journal of Telemedicine and Telecare, vol. 11, no. 7, pp. 331-336, 2005.
  2. Edwards, J. Hype Cycle for Telemedicine (G00214814). Gartner Industry Research, 28 July 2011. [Online]. Available: [Retrieved on 8 March 2012].
  3. Khoja, S., Scott, R., Casebeer, A., Mohsin, M., Ishaq, A. and Gilani, S. E-health readiness assessment tools for healthcare institutions in developing countries. Telemedicine and e-Health, vol. 13, no. 4, pp. 425-432, 2007.
  4. Van Gemert-Pijnen, J.E., Nijland, N., Van Limburg, M., Ossebaard, H.C., Kelders, S.M., Eysenbach, G. and Seydel, E.R. A holistic framework to improve the uptake and impact of eHealth technologies. Journal of Medical Internet Research, vol. 13, no. 4, published online, 2011.
  5. Crosby, P.B. Quality is Still Free: Making Quality Certain in Uncertain Times. McGraw-Hill, 1996. ISBN 0070145326.
  6. De Bruin, T., Freeze, R., Kulkarni, U. and Rosemann, M. Understanding the main phases of developing a maturity assessment model. In Australasian Chapter of the Association for Information Systems, Sydney, 2005.
  7. Wikipedia. Capability Maturity Model. 26 June 2012. [Online]. Available: [Retrieved on 26 June 2012].
  8. Essmann, H. Toward innovation capability maturity. PhD thesis, Stellenbosch University, 2009.
  9. Fraser, P., Moultrie, J. and Gregory, M. The use of maturity models/grids as a tool in assessing product development capability. In IEEE International Engineering Management Conference, Cambridge, UK, vol. 1, pp. 244-249, 2002.
  10. Maier, A.M., Moultrie, J. and Clarkson, P.J. Assessing organizational capabilities: reviewing and guiding the development of maturity grids. IEEE Transactions on Engineering Management, vol. 59, no. 1, pp. 138-159, 2012.
  11. Mettler, T., Rohner, P. and Winter, R. Towards a classification of maturity models in information systems.
  12. Jennett, P., Gagnon, M. and Brandstadt, H. Readiness models for rural telehealth. Journal of Postgraduate Medicine, vol. 51, no. 4, pp. 279-283, 2010.
  13. Légaré, E., Vincent, C., Lehoux, P., Person, D., Kairy, D., Gagnon, M. et al. Telehealth readiness assessment tools. Journal of Telemedicine and Telecare, vol. 16, no. 3, pp. 107-115, 2010.
  14. Sharma, B. Electronic Healthcare Maturity Model (eHMM). June 2008. [Online]. Available: [Retrieved on 10 April 2012].
  15. Holl, M., Piai, S. and Dunbrack, L.A. Healthcare IT Maturity Model: Western European Hospitals - the Leading Countries. IDC Health Insights, Framingham, MA, Tech. Rep. No. H1210231.
  16. Van de Wetering, R. and Batenburg, R. A PACS maturity model: a systematic meta-analytic review on maturation and evolvability of PACS in the hospital enterprise. International Journal of Medical Informatics, vol. 78, pp. 127-140, 2009.
  17. Solli-Sæther, H. and Gottschalk, P. The modeling process for stage models. Journal of Organizational Computing and Electronic Commerce, vol. 20, pp. 279-293, 2010.
  18. Von Wangenheim, C.G., Hauck, J.C., Zoucas, A., Salviano, C.F., McCaffery, F. and Shull, F. Creating software process capability/maturity models. IEEE Software (Voice of Evidence), pp. 92-94, July/August 2010.
  19. Mettler, T. Maturity assessment models: a design science research approach. International Journal of Society Systems Science, vol. 3, no. 1-2, pp. 81-98, 2011.
  20. Van der Zee, D. Developing participative simulation models: framing decomposition principles for joint understanding. Journal of Simulation, vol. 1, pp. 187-202, 2009.
  21. Van Dyk, L., Fortuin, J.B. and Schutte, C.S.L. A systems engineering approach to telemedicine system implementation. In International Conference on Industrial Engineering, Systems Engineering and Engineering Management for Sustainable Global Development, Spier, 2011.
  22. Van Dyk, L., Schutte, C.S.L. and Fortuin, J.B. A maturity model for telemedicine implementation. In The Fourth International Conference on eHealth, Telemedicine and Social Medicine, Valencia, Spain, 2012.
  23. Van Dyk, L. and Schutte, C.S.L. Development of a maturity model for telemedicine. South African Journal of Industrial Engineering, vol. 23, no. 2, pp. 61-72, 2012.
  24. Van Zyl, A.J. and Van Dyk, L. An information system to support telemedicine projects in South Africa. MScEng thesis, Stellenbosch University, 2012, pp. 1-151.
  25. Araki, Y., Scott, R.E. and Lear, S. Challenges of telehealth research in clinical settings. Journal of Telemedicine and Telecare, pp. 425-426, 2007.
  26. Mars, M. Telemedicine in South Africa. In Telehealth in the Developing World, OECD, 2009, pp. 222-231.
  27. Broens, T.H.F., Vollenbroek-Hutten, M.M.R., Hermens, H.J., Van Halteren, A.T., Nieuwenhuis, L.J.M. et al. Determinants of successful telemedicine implementations. Journal of Telemedicine and Telecare, vol. 13, no. 6, pp. 303-309, 2007.
  28. Kuusisto, J., Den Hertog, P., Berghäll, S., Hjelt, M., Ahvenharju, S. and Van der Aa, W. Service typologies as tools for effective innovation policy development. European Policies and Instruments to Support Service Innovation (EPISIS), Pro Inno Europe, 27 May 2011.
  29. Schrapel, N. Connecting Africa - African connections: Africa's engagement with ICTs and their role for development - the case of telemedicine in South Africa. In ICT and Development - Research Voices from Africa, Makerere University, Uganda, International Federation for Information Processing (IFIP), Technical Committee 9 - Relationship between Computers and Society, 22-23 March 2010, pp. 1-11.
  30. Wikipedia. Evidence-based practice. 27 June 2012. [Online]. Available: [Retrieved on 27 June 2012].
  31. Hermann, K., Van Damme, W., Pariyo, G.W., Schouten, E., Assefa, Y., Cirera, A. and Massavon, W. Community health workers for ART in sub-Saharan Africa: learning from experience - capitalizing on new opportunities. Human Resources for Health, vol. 7, no. 31, pp. 1-11, 2009.
  32. Fulton, B.D., Scheffler, R.M., Sparkes, S.P., Auh, E.Y., Vujicic, M. and Soucat, A. Health workforce skill mix and task shifting in low income countries: a review of recent evidence. Human Resources for Health, vol. 9, no. 1, pp. 1-11, 2011.
  33. Information Technology Infrastructure Library. ITIL glossary and abbreviations. 2011. [Online]. Available: [Retrieved on 28 June 2012].
  34. Business Dictionary. Legacy. [Online]. Available:
  35. Wikipedia. Interoperability. 12 August 2012. [Online]. Available: [Retrieved in August 2012].
  36. Wikipedia. Health economics. 2 August 2012. [Online]. Available: [Retrieved in August 2012].
  37. Bashshur, R., Shannon, G. and Hasan, S.H. Telemedicine evaluation. Telemedicine Journal and e-Health, vol. 11, no. 3, pp. 296-316, 2005.
  38. World Health Organization. Health Economics Research Group. [Online].
  39. Röglinger, M., Pöppelbuß, J. and Becker, J. Maturity models in business process management. Business Process Management Journal, vol. 18, no. 2, pp. 328-346, 2012.
