
Using Content Validity for the Development of Objective Structured Clinical Examination Checklists in a Slovenian Undergraduate Nursing Program

By Nino Fijačko, Zvonka Fekonja, Margaret Denny, Brian Sharvin, Majda Pajnkihar and Gregor Štiglic

Submitted: September 14th, 2016. Reviewed: March 10th, 2017. Published: May 17th, 2017.

DOI: 10.5772/intechopen.68454


Abstract

Introduction: The objective structured clinical examination (OSCE) has been adopted by many universities for the assessment of healthcare competencies and as a formative teaching tool in both undergraduate and postgraduate nursing education programs. This pilot study evaluates the validity of OSCE checklists to be used in first‐year undergraduate nurse practice education.

Keywords

  • objective structured clinical examination
  • development
  • checklist
  • content validity index
  • nursing

1. Introduction

The objective structured clinical examination (OSCE) was originally developed for medical education in Scotland by Harden and colleagues in 1975 [1], but has now been widely accepted as a “fit‐for‐purpose” instrument for measuring clinical skills competency in healthcare education [2, 3]. The OSCE is defined as “an approach to the assessment of clinical competence in which the components of competence are assessed in a well‐planned or structured way with attention being paid to objectivity” [4].

The OSCE has been adopted by many universities for the assessment of healthcare competencies, and it is generally accepted as a valid assessment tool and formative teaching approach in both undergraduate and postgraduate nursing education programs [5–8]. The benefits of using the OSCE include the development of students’ confidence [5]; the preparation of students for clinical practice [9]; the achievement of deeper and more meaningful learning [7]; and the ability to provide students with feedback on their clinical skills performance, which additionally enables students to identify their strengths and weaknesses in clinical skills [6].

The OSCE typically consists of a circuit or series of short assessment tasks, each of which is assessed by an examiner using a predetermined, objective marking scheme, making the assessment of clinical skills more objective rather than subjective [10]. In an OSCE, each student has to demonstrate specific skills and behaviors in a simulated environment. The OSCE acronym itself has evolved over the years, and there are now many variations, for example, the Group Objective Structured Clinical Examination [11], the Objective Structured Video Exam [12], the Objective Structured Assessment of Technical Skill [13], and the Objective Structured Teaching Encounter [14]. This number of variations has evolved because of the OSCE’s utility and applicability as an assessment and teaching tool in nursing and interprofessional education [15].

The development of new criteria for assessing clinical skills requires critical scrutiny to ensure that the validity and reliability of each assessment are maximized [10]. Validity focuses on whether a test actually succeeds in addressing the competencies it is designed to test [16]. The assessment checklists used in the OSCE are developed according to evidence-based practice guidelines and standards of nursing care to establish content validity [17]. Evaluating content validity is a critical early step in enhancing the construct validity of an instrument, and content validation is therefore an important topic for clinicians and researchers who require high-quality measurements [18]. The content validity index (CVI), based on expert ratings of relevance, is the most widely used method of quantifying content validity for multi-item scales among nurse researchers [10].

This study describes two interconnected methodological phases that could be considered and implemented when developing and establishing the CVI of a checklist designed to measure nursing student performance during clinical skills assessment using the OSCE. Checklists in an OSCE provide an ideal method for assessing skills that require a series of steps to be completed with consistency and continuity each time the skill is performed [20].

2. Methods

The checklist for the OSCE was developed in three methodological and chronological phases (Figure 1).

Figure 1.

Methodological and chronological phases of developing the checklist for the OSCE.

In the first phase, a comprehensive search of the literature relating to the OSCE in Slovenian nursing was conducted; no published research examining the use of the OSCE in the Slovenian nursing curriculum was found.

In the second phase, the degree of complexity (DOC) phase, a 10-point scale was created and used to evaluate the DOC of each essential nursing skill as perceived by nursing educators. All essential nursing skills included in this study were part of the first-year practical nursing curriculum at a university in Slovenia.

The DOC scale has 10 levels, in which 1 represents “very low complexity” and 10 represents “very high complexity.” The DOC scores for each essential nursing skill were classified into three categories: a score from 1 to 4 belonged to the low-complexity category, a score above 4 and up to 8 to the medium-complexity category, and a score above 8 and up to 10 to the high-complexity category.
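
To illustrate the categorisation described above, a minimal sketch is shown below; it is not part of the original study, and the handling of the boundary scores (4 and 8) is an assumption, since the chapter does not state exactly which category the boundary values fall into.

```python
# Minimal sketch of the DOC categorisation described above. The boundary
# handling (scores of exactly 4 or 8) is an assumption for illustration only.

def doc_category(avg_score):
    """Map an average degree-of-complexity (DOC) score on the 1-10 scale to a category."""
    if avg_score <= 4:
        return "low complexity"
    if avg_score <= 8:
        return "medium complexity"
    return "high complexity"

# One skill from Table 1 and one hypothetical score for contrast
for skill, score in [("Peripheral cannula insertion", 9.00),
                     ("Hypothetical skill", 5.84)]:
    print(f"{skill}: DOC {score:.2f} -> {doc_category(score)}")
```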

Phase three, the content validity index (CVI) phase, used established CVI measures [18]. Educators evaluated each item in the nursing procedures using a four-point Likert-type ordinal scale in which 1 = not relevant, 2 = somewhat relevant, 3 = quite relevant, and 4 = highly relevant. Two metrics were calculated in the scope of the CVI analysis: (1) the item-level content validity index (I-CVI) and (2) the average scale-level content validity index (S-CVI/Ave). The I-CVI for an item is the number of nursing educators who rated that item 3 or 4, divided by the total number of nursing educators. The S-CVI/Ave was calculated by summing all I-CVI values and dividing by the number of items in the essential nursing skill [18].
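
As a minimal sketch of the two CVI metrics defined above (not part of the original study), the following Python snippet computes the I-CVI and S-CVI/Ave from a matrix of expert ratings; the data layout is an assumption, and the example rating patterns are taken from items 1, 3, and 4 of Table 2.

```python
# Minimal sketch of the CVI calculations described above. The data layout
# (one list of 1-4 ratings per checklist item) is an assumption for illustration.

def item_cvi(ratings):
    """I-CVI: proportion of experts rating an item 3 ('quite relevant') or 4 ('highly relevant')."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

def scale_cvi_ave(all_ratings):
    """S-CVI/Ave: mean of the item-level CVIs across all items of a skill."""
    i_cvis = [item_cvi(item) for item in all_ratings]
    return sum(i_cvis) / len(i_cvis)

# Example using the rating patterns of items 1, 3 and 4 from Table 2 (eleven educators)
ratings = [
    [4, 2, 4, 4, 4, 4, 4, 4, 1, 4, 4],  # item 1: 9/11  -> I-CVI = 0.82
    [4, 3, 4, 4, 4, 4, 4, 4, 2, 4, 4],  # item 3: 10/11 -> I-CVI = 0.91
    [4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4],  # item 4: 11/11 -> I-CVI = 1.00
]
print([round(item_cvi(item), 2) for item in ratings])  # -> [0.82, 0.91, 1.0]
print(round(scale_cvi_ave(ratings), 2))                # -> 0.91 for this three-item excerpt
```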

3. Results

Nursing educators (n = 12) systematically evaluated seventy-two essential nursing skills using the 10-point DOC scale. The average DOC score for each procedure was then assigned to one of three categories (low, medium, and high). Twelve essential nursing skills, with an average DOC score of 3.44 (95% confidence interval (CI): 3.07–3.82), fell into the low-complexity category; forty-six essential nursing skills, with an average DOC score of 5.84 (CI: 5.55–6.12), into the medium-complexity category; and fourteen essential nursing skills, with an average DOC score of 8.54 (CI: 8.36–8.71), into the high-complexity category.
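
For illustration, the category mean and 95% confidence interval reported above for the high-complexity category can be reproduced from the fourteen average DOC scores listed in Table 1 below; the use of a t-based interval is an assumption, since the chapter does not state which interval formula was applied.

```python
# Illustrative reconstruction of the mean and 95% CI reported above for the
# high-complexity category, using the fourteen average DOC scores from Table 1.
# The t-based interval is an assumption; the chapter does not specify the method.
from math import sqrt
import statistics

from scipy import stats

scores = [9.00, 9.00, 8.75, 8.75, 8.67, 8.67, 8.67,
          8.50, 8.50, 8.33, 8.25, 8.18, 8.17, 8.08]

mean = statistics.mean(scores)
se = statistics.stdev(scores) / sqrt(len(scores))
t_crit = stats.t.ppf(0.975, df=len(scores) - 1)
print(f"mean {mean:.2f}, 95% CI {mean - t_crit * se:.2f}-{mean + t_crit * se:.2f}")
# -> mean 8.54, 95% CI 8.36-8.71, matching the values reported above
```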

Table 1 presents the essential nursing skills (n = 14) in the high-complexity category, ordered by average DOC score from highest to lowest. Peripheral cannula insertion and female urinary catheterization received the highest average DOC score (9.00, or “very high complexity”). In the DOC phase, the essential nursing skills were also grouped into nursing areas.

Essential nursing skills | Areas in nursing | Average degree of complexity score
Peripheral cannula insertion | Diagnostic/therapeutic essential nursing skills | 9.00
Urinary catheterization: female | Elimination | 9.00
Suctioning the nasopharyngeal airway | Respiratory care | 8.75
Medication: injection of intravenous drugs | Medical management | 8.75
Suctioning the oropharyngeal airway | Respiratory care | 8.67
Tracheostomy: suctioning a patient | Respiratory care | 8.67
Endotracheal suctioning of the adult intubated patient with open suction systems | Respiratory care | 8.67
Venipuncture | Diagnostic/therapeutic essential nursing skills | 8.50
Cleaning infected wound | Diagnostic/therapeutic essential nursing skills | 8.50
Nursing care of tracheostomy | Respiratory care | 8.33
Insertion of a nasogastric tube | Nutrition | 8.25
Pressure ulcer treatment | Diagnostic/therapeutic essential nursing skills | 8.18
Mouth care in unconscious patients | Personal hygiene | 8.17
Rinsing infected wound | Diagnostic/therapeutic essential nursing skills | 8.08

Table 1.

Ranking of essential nursing skills from highest to lowest by average degree of complexity score in high‐complexity category.

The essential nursing skills (n = 6) with the highest average DOC score in each nursing area were selected for further evaluation in the CVI phase. Eleven nursing educators rated the items of each essential nursing skill, with the number of items per skill ranging from 28 to 58. For peripheral cannula insertion, the I-CVI was calculated for 39 items and ranged from 0.82 to 1.00, which represents good content validity. None of the items were deleted during the CVI phase because they met the agreement levels recommended by Polit and colleagues (Table 2 and Figure 2) [19].

Figure 2.

Elements of the content validity index in the peripheral cannula insertion checklist.

Item | E1 | E2 | E3 | E4 | E5 | E6 | E7 | E8 | E9 | E10 | E11 | Number of experts | I-CVI
1 | 4 | 2 | 4 | 4 | 4 | 4 | 4 | 4 | 1 | 4 | 4 | 11 | 0.82
2 | 4 | 2 | 3 | 4 | 4 | 4 | 4 | 4 | 1 | 4 | 4 | 11 | 0.82
3 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 4 | 2 | 4 | 4 | 11 | 0.91
4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 11 | 1.00
5 | 4 | 3 | 4 | 4 | 4 | 4 | 3 | 4 | 3 | 4 | 4 | 11 | 1.00
6 | 4 | 4 | 3 | 4 | 4 | 4 | 3 | 3 | 3 | 4 | 4 | 11 | 1.00
7 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 11 | 1.00
8 | 3 | 2 | 3 | 4 | 1 | 4 | 3 | 4 | 3 | 4 | 4 | 11 | 0.82
9 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 4 | 2 | 4 | 4 | 11 | 0.91
10 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 3 | 4 | 4 | 4 | 11 | 1.00
11 | 4 | 2 | 4 | 4 | 3 | 4 | 4 | 3 | 3 | 4 | 4 | 11 | 0.91
12 | 4 | 3 | 4 | 4 | 3 | 4 | 4 | 4 | 2 | 4 | 4 | 11 | 0.91
13 | 3 | 2 | 3 | 4 | 3 | 4 | 3 | 4 | 2 | 4 | 3 | 11 | 0.82
14 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 4 | 3 | 4 | 4 | 11 | 1.00
15 | 4 | 4 | 4 | 4 | 2 | 4 | 4 | 3 | 4 | 4 | 4 | 11 | 0.91
16 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 11 | 1.00
17 | 3 | 4 | 3 | 4 | 3 | 4 | 4 | 4 | 2 | 4 | 3 | 11 | 0.91
18 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 4 | 3 | 4 | 4 | 11 | 1.00
19 | 4 | 3 | 4 | 4 | 4 | 4 | 3 | 4 | 4 | 4 | 4 | 11 | 1.00
20 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 11 | 1.00
21 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 11 | 1.00
22 | 4 | 3 | 4 | 4 | 2 | 4 | 4 | 4 | 4 | 4 | 4 | 11 | 0.91
23 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 11 | 1.00
24 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 11 | 1.00
25 | 3 | 3 | 4 | 4 | 4 | 4 | 4 | 4 | 3 | 4 | 4 | 11 | 1.00
26 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 4 | 3 | 4 | 4 | 11 | 1.00
27 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 11 | 1.00
28 | 4 | 3 | 3 | 4 | 1 | 4 | 4 | 4 | 3 | 4 | 4 | 11 | 0.91
29 | 4 | 3 | 4 | 4 | 1 | 4 | 4 | 4 | 4 | 4 | 4 | 11 | 0.91
30 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 11 | 1.00
31 | 4 | 2 | 4 | 4 | 4 | 4 | 4 | 4 | 3 | 4 | 4 | 11 | 0.91
32 | 4 | 3 | 4 | 4 | 1 | 4 | 4 | 4 | 3 | 4 | 4 | 11 | 0.91
33 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 4 | 3 | 4 | 4 | 11 | 1.00
34 | 4 | 3 | 4 | 4 | 1 | 4 | 4 | 4 | 4 | 4 | 4 | 11 | 0.91
35 | 4 | 4 | 4 | 4 | 3 | 4 | 4 | 4 | 2 | 4 | 4 | 11 | 0.91
36 | 4 | 4 | 4 | 4 | 3 | 4 | 4 | 4 | 2 | 4 | 4 | 11 | 0.91
37 | 4 | 2 | 4 | 4 | 4 | 4 | 4 | 4 | 3 | 4 | 4 | 11 | 0.91
38 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 4 | 11 | 1.00
39 | 4 | 3 | 4 | 4 | 3 | 4 | 4 | 3 | 4 | 4 | 4 | 11 | 1.00
S-CVI/Ave = 0.95

Table 2.

Item‐level content validity index for peripheral cannula insertion.

I-CVI = item-level content validity index; S-CVI/Ave = average scale-level content validity index.

All I-CVI scores were summed and then divided by the number of items to calculate the S-CVI/Ave: (0.82 + 0.82 + 0.91 + … + 1.00)/39 = 0.95. All calculated S-CVI/Ave values exceeded 0.90, which, in combination with I-CVI values above 0.78 (Table 2), represents excellent content validity (Table 3) [19].
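
As a quick consistency check (not part of the original analysis), the S-CVI/Ave can be recomputed from the I-CVI column of Table 2 and compared against the benchmarks attributed to Polit and colleagues [19]:

```python
# Consistency check of the worked example above, using the 39 item-level CVIs
# from the I-CVI column of Table 2 (grouped by value; order not preserved).
# Benchmarks (I-CVI >= 0.78, S-CVI/Ave >= 0.90) follow the criteria attributed
# to Polit and colleagues [19] in the text.
i_cvis = 4 * [0.82] + 15 * [0.91] + 20 * [1.00]

s_cvi_ave = sum(i_cvis) / len(i_cvis)
print(round(s_cvi_ave, 2))              # -> 0.95, as reported above
print(all(v >= 0.78 for v in i_cvis))   # -> True: every item meets the I-CVI benchmark
print(s_cvi_ave >= 0.90)                # -> True: the scale meets the S-CVI/Ave benchmark
```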

Essential nursing skills | Areas in nursing | Number of items | S-CVI/Ave
Peripheral cannula insertion | Diagnostic/therapeutic essential nursing skills | 39 | 0.95
Urinary catheterization: female | Elimination | 54 | 0.95
Medication: injection of intravenous drugs | Medical management | 28 | 0.94
Mouth care in unconscious patients | Personal hygiene | 28 | 0.93
Insertion of a nasogastric tube | Nutrition | 36 | 0.93
Suctioning the nasopharyngeal airway | Respiratory care | 51 | 0.92

Table 3.

Ranking of essential nursing skills from highest to lowest based on their content validity index average.

4. Discussion and conclusion

The methodological phases described in this pilot study could be considered and implemented when developing and establishing a checklist designed to measure nursing students’ performance during clinical skills assessment using the OSCE. The purpose of developing the DOC score in this study was to represent the range of complexity across essential nursing skills and to identify criteria for further research in the CVI phase. The results of the CVI analysis demonstrated good content validity (I-CVI and S-CVI/Ave) for the essential nursing skills included in the evaluation.

The benefits of using the CVI for OSCE checklists must be weighed against the risk of undermining the essential nursing skill as a whole. For example, the calculated CVI for some items in a procedure might fall below the recommended level. That, in turn, calls into question the need for those items within the essential nursing skill, and yet it can be argued that every item in an essential nursing skill has a purpose. Eliminating items with a low CVI could therefore be detrimental to the whole OSCE essential nursing skill and presents a challenge to nurse educators. On the other hand, the CVI is widely used for developing a variety of methodological research tools [21–25].

Using the OSCE in undergraduate nursing education offers a fresh approach for nurse educators in Slovenia and provides a new opportunity for determining nursing students’ competency levels in a simulated environment.

Findings from the CVI analysis are promising both for developing OSCE checklists and for further research using the OSCE as an assessment modality.
