Open access peer-reviewed chapter

Comparing Students’ Self-Assessment with Teachers’ Assessment of Clinical Skills Using an Objective Structured Clinical Examination (OSCE)

Written By

Zvonka Fekonja, Jasmina Nerat, Vida Gönc, Milena Pišlar, Margaret Denny and Klavdija Čuček Trifkovič

Submitted: 09 September 2016 Published: 17 May 2017

DOI: 10.5772/67956

From the Edited Volume

Teaching and Learning in Nursing

Edited by Majda Pajnkihar, Dominika Vrbnjak and Gregor Stiglic


Abstract

Evaluation of clinical skills is a demanding and complex process that depends on many factors, such as teaching and learning approaches, simulated learning, and psychometrically validated assessment tools. It is therefore imperative that adequate strategies and methods are employed to evaluate the success of a nursing care activity. One such strategy in the field of nursing care is the application of the objective structured clinical examination (OSCE) to a nursing activity. The purpose of this article is to highlight the importance of evaluating nursing activities in a simulated clinical environment with the OSCE and to determine the agreement between teacher and student assessments. A cross-sectional study was carried out in which we compared the evaluation of a nursing activity by the teacher and by 51 students. Summative content analysis was used to analyze an open-ended question about possible improvements to the performed nursing activity. The data showed a large discrepancy (81.9%) between the teacher's and the students' evaluations of the nursing activity; the two assessments agreed on only 18.1% of items. Students were least successful in the areas of knowledge needed for carrying out interventions (36.5%), preparation for interventions (24.3%), and infection management (14.4%). Clinical skills acquisition remains an essential element of a student nurse's development, as competence in nursing skills is essential to patient safety. Simulation is an increasingly popular approach to the teaching and assessment of clinical skills. The process of evaluating a nursing activity demands instruments that ensure objectivity, fairness, impartiality, and comprehensiveness. The OSCE is one such method of promoting reliable and valid assessment of nursing skills.

Keywords

  • nursing care activities
  • assessment
  • OSCE
  • teacher
  • student

1. Introduction

Nursing is a discipline based on clinical practice-based interventions [1] with the purpose of developing nursing and adjusting it to the needs of contemporary society [2]. The study of nursing care must prepare students to carry out and accept professional responsibility in a changing clinical environment [3–5]. Only in a clinical environment does a nurse's knowledge become most evident [6]. Because of the importance of clinical training, appropriate strategies for clinical assessment must be sought, with the purpose of specifying the success rate of clinical training [7]. Such strategies are important for evaluating the quality of teaching and learning processes [8], identifying students' weaknesses, increasing their motivation to achieve competence, and helping teachers to assess their own work [9–12]. Knowledge evaluation is a time-intensive and complicated process that relies largely on the teacher's subjective judgment. In addition, students report that they are not satisfied with current assessment methods and their results [7, 13, 14].

Simulated learning using the objective structured clinical examination (OSCE) has emerged as an alternative teaching method to assist students in acquiring clinical skills competency, because it attempts to replicate a real-life situation in a simulated environment [15, 16]. There is a current trend in nursing education to use the OSCE to assess and examine clinical competence [16, 17] within this simulated setting. The OSCE was originally designed for the medical profession [18] but has been modified and applied in nursing education, providing a means to assess competence in a simulated environment without posing a risk to patient safety [16, 19]. A number of studies using various research designs, including qualitative [20, 21], quantitative [22], and mixed methods [23, 24], have reported positive results from using the OSCE as a teaching methodology [16].

The objective structured clinical examination is a versatile, multipurpose assessment tool that can be used to evaluate the nursing care that students deliver to patients in a clinical setting [25]. It is used in a planned and structured manner with a clear emphasis on objectivity [26]. Additionally, as an assessment tool, it uses a task analysis of general clinical skills [27] specific to the student's program and year of study and, as stated previously, is usually carried out in a simulated environment [28, 29]. In contrast to traditional assessment modalities, the OSCE evaluates areas that are most critical to nursing competence, for example, measurement of blood pressure, pulse, and respiration, communication skills, and various other nursing interventions [25] in a simulated work environment [30]. The OSCE has long been recognized as a formal assessment method in medical education [31] and is now more frequently used as an assessment method in nursing [32, 33]. It is argued that despite the rhetoric of a student-centered approach, education remains wedded to conventional teaching and learning approaches, which fail to engage with the individual and unwittingly silence the individual's voice [34, 35]. Therefore, it is imperative that more contemporary student-centered approaches to teaching and learning are incorporated in nursing assessment.

The OSCE assessment tool was introduced at the University of Maribor Faculty of Health Sciences to examine the performance of various clinical skills in a simulated clinical environment. The course unit of study was "Nursing Care" in the academic year 2015/2016 (first-year students of nursing care). Standardized assessment checklists were evaluated together with the teacher's assessment of clinical training success. Students were encouraged to reflect on various nursing case scenarios in order to recognize mistakes and ensure competent practice. Simulation as a teaching method allows multiple learning objectives to be taught in a realistic clinical environment without causing harm to patients and can provide 'microworlds' whereby important interactions between patients, doctors, nurses, and other health professionals can be highlighted, illustrated, explained, and replayed [16, 36].

Emphasis was placed on recognizing the mistakes made by both students and the teacher when carrying out various nursing care skills and on the aspects of practice they performed well. The following research questions were posed:

  1. Is there consensus between student and teacher evaluations when assessing nursing practice using an OSCE?

  2. Does a student assessed with the OSCE method recognize the mistakes made when carrying out various nursing care skills?

  3. Can the student critically evaluate the performed nursing care activity in concordance with the teacher's evaluation?


2. Methodology

2.1. Design

To answer the research questions, a cross-sectional observational study was used. Nursing students performed care activities within an OSCE situation and then assessed their own performance; the same nursing care was independently assessed by a teacher. In this study, we compared these two assessments. In addition, we analyzed the open-ended question about possible improvements to the nursing care activity using summative content analysis.

2.2. Setting and participants

An initial sample of 51 participants was recruited from a cohort of first-year B.Sc. general nursing students using an available (convenience) sampling approach. Available sampling involves taking a sample of what one would call typical, normal, or average for a particular phenomenon under study [37]. All participants were first-year nursing students in the first-degree study program of Nursing Care who had successfully completed training in the clinical skills laboratory for the nursing care of an adult patient and had passed the mid-term assessment. Of the participants, 46.90% were female. The response rate was 56% (n = 52); that is, more than half of the eligible students completed the OSCE and wrote a reflective piece. Assessment was carried out by higher education teachers of nursing care in January 2016. Students had to meet the following inclusion criteria: (1) active participation in laboratory clinical skills training in a simulated clinical environment (150 hours), (2) assessment based on the OSCE assessment paper, (3) reflective writing after completing the nursing care activity, and (4) a written self-evaluation.

2.3. Ethical considerations

The study complied with the ethical procedures of research, the Declaration of Helsinki, the provisions of the Oviedo Convention, and the Slovenian Code of Ethics for Nurses and Nursing Technicians [38], which were adhered to throughout the process. Participants' anonymity was assured by numerical coding of their written self-evaluation sheets, so that participants could not be identified from the information. Data collection took place in classroom settings, and participation in the research was completely voluntary. Only the research team had access to the data, which were stored on a password-protected computer.

2.4. Data collection

Students' practical work in the course unit Nursing Care was assessed with an objective structured clinical examination (OSCE) checklist, which covered 42 clinical skills. The checklist was composed by the teachers who provide training for students in the clinical skills laboratory. Each checklist of simulated interventions consisted of a series of performance-based observations and rated the student's performance as completed accurately, completed inaccurately, or not completed. Students were acquainted with the method of evaluating their clinical skills competence and had the opportunity to view the assessment checklist items. All OSCE assessors had worked previously with the students in the clinical skills laboratory. An independent evaluator (a teacher who did not participate in the training process) assessed the students during the performance of the nursing care activity by observing and filling out the assessment checklists. Using the OSCE, 42 nursing care activities were evaluated. After performing the nursing intervention, the students had approximately 10 min to evaluate their own nursing care activity using an unstructured method of reflection, with the purpose of analyzing mistakes, endeavoring to improve future practice of such skills, and preventing the student from being awarded a fail grade. To assist the critical reflection process, students were asked: What could you have improved while carrying out the intervention, and would you improve anything?
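
To make the comparison concrete, the following is a minimal sketch, in Python, of how agreement between the student's and the teacher's checklist ratings could be computed. The three-level rating scale is taken from the checklist described above; the data structure, variable names, and example ratings are hypothetical illustrations, not the study's actual records.

```python
# Minimal sketch (hypothetical data): each OSCE checklist item receives a
# rating from the student and, independently, from the teacher on the
# three-level scale used in the study.
RATINGS = ("completed accurately", "completed inaccurately", "not completed")

# One (student_rating, teacher_rating) pair per checklist item.
item_ratings = [
    ("completed accurately", "completed accurately"),
    ("completed accurately", "completed inaccurately"),
    ("not completed", "completed inaccurately"),
    ("completed inaccurately", "completed inaccurately"),
]

# An item counts as agreement only when both raters chose the same category.
agreements = sum(1 for student, teacher in item_ratings if student == teacher)
total = len(item_ratings)
print(f"Agreement: {agreements}/{total} = {agreements / total:.1%}")
```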

2.5. Data analysis

The first and last authors transcribed the students' reflective writing (1 month after the evaluation process). The transcripts were then analyzed and coded by two authors (ZF and KČT) using descriptive statistics and summative content analysis [39] for the open-ended question. This approach contains a quantitative element, because the results show the proportions of how many times certain words or responses are repeated, depending on the particular subject of research. Summative content analysis includes counting and comparing keywords and content, followed by interpretation of their meaning [39]. In the first step, the reflective sheets were read carefully to obtain a sense of the entire contents [39, 40] and then to derive topics. We highlighted the exact words in the text that covered the main idea or concept. The incidence of the identified keywords and concepts was then calculated and presented as percentages. This method was chosen because there was a large quantity of data [40] and the approach made it possible to quantify words or content and then interpret the data.
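
As an illustration of the counting element of summative content analysis, the sketch below tallies pre-identified keywords across reflective texts and reports each keyword's share of all occurrences, mirroring the percentages reported in Section 3. The keyword list and sample reflections are invented for illustration and are not the study's actual coding scheme.

```python
# Illustrative sketch of the summative counting step: tally how often each
# pre-identified keyword appears across the reflective texts and report its
# share of all occurrences. Keywords and texts here are hypothetical.
import re
from collections import Counter

keywords = ["privacy", "disinfection", "documentation", "consent"]
reflections = [
    "I should have assured the patient's privacy and obtained consent first.",
    "I forgot hand disinfection before preparing the material.",
    "Documentation of the intervention could have been more complete.",
]

counts = Counter()
for text in reflections:
    for keyword in keywords:
        # Case-insensitive whole-word-prefix match, e.g. "Documentation".
        counts[keyword] += len(re.findall(rf"\b{re.escape(keyword)}", text, re.IGNORECASE))

total = sum(counts.values())
for keyword, n in counts.most_common():
    share = 100 * n / total if total else 0.0
    print(f"{keyword:15s} {n:3d}  {share:5.1f}%")
```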


3. Results

Answers to the open-ended question were categorized into six themes (see Table 1) and ordered by frequency of occurrence, from the most to the least frequent.

| Theme | Number of items | Percentage (%) | Meaning |
|---|---|---|---|
| Knowledge needed for carrying out interventions | 159 | 36.5 | Theoretical and practical knowledge |
| Preparation for intervention | 106 | 24.3 | Assuring privacy, evaluating the patient's condition, preparing the room and accessories |
| Infection management | 63 | 14.4 | Disinfection of material, accessories, and hands; preventive measures; waste disposal (recycling) |
| Safety assurance | 51 | 11.6 | Placing bed rails and the call bell by the bed, checking patient identity against the doctor's orders, labeling of material |
| Communication skills | 43 | 9.8 | Explaining the purpose, goal, and course of the care intervention; obtaining consent; giving further instructions |
| Treatment after intervention | 15 | 3.4 | Making the patient comfortable, tidying the material and the room, documenting |
| Total | 435 | 100 | |

Table 1.

Content analysis and emerging themes.

The analysis of the clinical skills evaluation (performance of a nursing care activity) showed that students were least successful in the theme "Knowledge needed for carrying out interventions" (159 items [36.5%]), which covered the theoretical and practical knowledge needed for the successful performance of an intervention. The theme "Treatment after intervention" contained the fewest items from both the student's and the teacher's assessment viewpoint (15 [3.4%]).

The student's and the teacher's assessments agreed on 18.1% (79 of 436) of all items on the assessment papers and disagreed on the remaining 81.9% (357 of 436) (Table 2).

| Agreement | Number | Percentage (%) |
|---|---|---|
| Yes | 79 | 18.1 |
| No | 357 | 81.9 |
| Total | 436 | 100 |

Table 2.

Agreement between the student's and the teacher's assessments.

Most discrepancies in evaluation were noted in the items of "Knowledge needed for carrying out interventions," where the discrepancy between the student's and the teacher's assessments was 37%. The fewest discrepancies were noted in the items of "Treatment after intervention" (3.5%) (Table 3).

| Theme | Total number of items | Percentage (%) | Student's grade, N (%) | Teacher's grade, N (%) |
|---|---|---|---|---|
| Preparation for intervention | 88 | 24.3 | 24 (19.7) | 64 (26.6) |
| Infection management | 48 | 13.2 | 23 (18.8) | 25 (10.4) |
| Communication skills | 38 | 10.5 | 17 (13.9) | 21 (8.8) |
| Safety assurance | 43 | 11.8 | 13 (10.6) | 30 (12.5) |
| Knowledge needed for carrying out interventions | 134 | 37.0 | 41 (33.6) | 93 (38.7) |
| Treatment after intervention | 13 | 3.5 | 5 (4.0) | 8 (3.3) |
| Total | 362 | 100 | 122 (100) | 240 (100) |

Table 3.

Discrepancies between the student's and the teacher's assessments by theme.


4. Discussion

For the assessment of a nursing care activity, we sought approaches that help teachers objectively evaluate students' knowledge. One such instrument for evaluating theoretical knowledge and the performance of a nursing care activity is the OSCE [41]. In this research, carried out with 51 first-year nursing care students, we wanted to identify differences between a student's and a teacher's assessment of a nursing care activity in a simulated environment. The data identified a large discrepancy (81.9%) in the evaluation of nursing care; agreement between the teacher and the student was present in only 18.1% of the data. On completion of the activity, students had to write a reflection on their performance and answer a question on possible improvements. In this process, they showed a lack of self-criticism and of in-depth insight into their performance. This supports other literature [25], which identifies communication skills and the ability to manage a patient's inappropriate behavior as the most critical assessment areas. In our research, we found that students were least successful in the areas of knowledge of intervention performance, preparation for intervention, and infection management. The results indicate that students lacked both self-criticism and theoretical knowledge. One study [42] states that objective assessment methods are an appropriate alternative to traditional forms of knowledge assessment. These methods are all the more effective because they provide instant feedback to students and teachers [43]. As previous studies have noted [44, 45], objective assessment methods require a great deal of time, many teachers, and financial investment, although this should be balanced against the greater satisfaction of students and teachers.

4.1. Limitations of this study

Our research has some limitations that require careful consideration. Because of the small sample, generalization is less reliable. Students were not assessed on exactly the same interventions: they chose interventions randomly, and the interventions were not of the same level of difficulty. Audio and video recording, as a method of giving students feedback on their own performance for recognizing mistakes, was not used. A structured reflective model was not used, as students wrote their reflections based on questions posed by the teacher. Finally, the assessment was performed by one teacher; therefore, adequate tests of objectivity cannot be assured.


5. Implications for nursing education

The OSCE is an assessment technique in which students perform nursing interventions under a variety of simulated conditions. Assessing all students on the same skill and in the same place enhances equity and objectivity toward the students. The introduction of a student self-assessment checklist encouraged students to reflect on their own performance of each step of the skill before evaluation with the OSCE and allowed prompt identification of a student's strengths and weaknesses during the performance of skills. Using the OSCE as an assessment method based on realistic scenarios increased students' confidence.


6. Conclusion

There has been increased attention over the past decades on the use of objective structured clinical examinations as a training and assessment tool. The OSCE mainly assesses basic knowledge and the technical performance of an activity. It is harder to assess critical thinking, interactive communication skills, and the student's experience, expectations, and skills needed to treat the patient as a whole. The use of new assessment systems is limited and one-sided; therefore, a richer combination of assessment systems is required. In pursuing clinical competence in nursing students' clinical skills, nursing teachers have to endeavor to find new ways to assess and examine these essential skills, which are necessary for patient safety and care. Future studies should include a larger sample of students, and the use of a reflective model to capture students' perception of learning is recommended.

References

  1. Henderson A., Tyler S. Facilitating learning in clinical practice: Evaluation of a trial of a supervisor of clinical education role. Nurse Educ Pract 2011;11(5):188–92. DOI: 10.1016/j.nepr.2011.01.003.
  2. Farzianpour F., Monzavi A., Yassini E. Evaluating the quality of education at the dentistry school of Tehran University of Medical Sciences. Dent Res J (Isfahan) 2011;8(2):71–9.
  3. Liang L. The gap between evidence and practice. Health Aff 2007;26(2):w119–21. DOI: 10.1377/hlthaff.26.2.w119.
  4. Butler M.P., Cassidy I., Quillinan B., Fahy A., Bradshaw C., Tuohy D., O'Connor M., Mc Namara M.C., Egan G. Competency assessment methods – tools and processes: A survey of nurse preceptors in Ireland. Nurse Educ Pract 2011;11(5):298–303. DOI: 10.1016/j.nepr.2011.01.006.
  5. Chiang V.C., Chan S.S. An evaluation of advanced simulation in nursing: A mixed-method study. Collegian 2014;21(4):257–65.
  6. Mansoorian M.R., Hosseiny M.S., Khosravan S., Alami A., Alaviani M. Comparing the effects of objective structured assessment of technical skills (OSATS) and traditional method on learning of students. Nurs Midwifery Stud 2015;4(2):e27714. DOI: 10.17795/nmsjournal27714.
  7. Noohi E., Motesadi M., Haghdoost A. Clinical teachers' viewpoints towards objective structured clinical examination in Kerman University of Medical Science. Iran J Med Educ 2008;8(1):113–20.
  8. Iqbal M., Khizar B., Zaidi Z. Revising an objective structured clinical examination in a resource-limited Pakistani medical school. Educ Health (Abingdon) 2009;22(1):209.
  9. Joyce B. Developing an Assessment System: Facilitator's Guide. Accreditation Council for Graduate Medical Education 2006;1:15–17.
  10. Epstein R.M. Assessment in medical education. N Engl J Med 2007;356(4):387–96. DOI: 10.1056/NEJMra054784.
  11. Aronson L., Niehaus B., Hill-Sakurai L., Lai C., O'Sullivan P.S. A comparison of two methods of teaching reflective ability in year 3 medical students. Med Educ 2012;46(8):807–14. DOI: 10.1111/j.1365-2923.2012.04299.x.
  12. Fernández-Sola C., Granero-Molina J., Márquez-Membrive J., Aguilera-Manrique G., Castro-Sánchez A.M. Implementation of the new clinical practice training model in Andalusia: A qualitative evaluation of the nursing and physiotherapy degrees. Enferm Clin 2014;24(2):136–41. DOI: 10.1016/j.enfcli.2013.09.004.
  13. Wilkinson T.J., Frampton C.M. Comprehensive undergraduate medical assessments improve prediction of clinical performance. Med Educ 2004;38(10):1111–6. DOI: 10.1111/j.1365-2929.2004.01962.x.
  14. Cheater F.M., Baker R., Reddish S., Spiers N., Wailoo A., Gillies C., Robertson N., Cawood C. Cluster randomized controlled trial of the effectiveness of audit and feedback and educational outreach on improving nursing practice and patient outcomes. Med Care 2006;44(6):542–51. DOI: 10.1097/01.mlr.0000215919.89893.8a.
  15. Decker S., Sportsman S., Puetz L., Billings L. The evolution of simulation and its contribution to competency. J Contin Educ Nurs 2008;39(2):74–80.
  16. Sharvin B. Theory-Practice Integration for Clinical Skills Competence among Undergraduate Nursing Students in Ireland: A Mixed Method Study [doctoral thesis]. Derby: University of Derby; 2016.
  17. Meechan R., Jones H., Valler-Jones T. Students' perspectives on their skills acquisition and confidence. Br J Nurs 2011;20(7):445–50. DOI: 10.12968/bjon.2011.20.7.445.
  18. Harden R.M., Gleeson F.A. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ 1979;13:41–54. DOI: 10.1111/j.1365-2923.1979.tb00918.x.
  19. Cant R., McKenna L., Cooper S. Assessing preregistration nursing students' clinical competence: A systematic review of objective measures. Int J Nurs Pract 2013;19(2):163–76. DOI: 10.1111/ijn.12053.
  20. Morgan D.L. Paradigms lost and pragmatism regained: Methodological implications of combining qualitative and quantitative methods. J Mixed Methods Res 2007;1(1):48–76. DOI: 10.1177/2345678906292462.
  21. Houghton C.E., Casey D., Shaw D., Murphy K. Students' experiences of implementing clinical skills in the real world of practice. J Clin Nurs 2013;22(13–14):1961–9. DOI: 10.1111/jocn.12014.
  22. Nevin M., Neill F., Mulkerrins J. Preparing the nursing student for internship in a pre-registration nursing program: Developing a problem based approach with the use of high fidelity simulation equipment. Nurse Educ Pract 2014;14(2):154–9. DOI: 10.1016/j.nepr.2013.07.008.
  23. McCaughey C.S., Traynor M.K. The role of simulation in nurse education. Nurse Educ Today 2010;30(8):827–32. DOI: 10.1016/j.nedt.2010.03.005.
  24. Hope A., Garside J., Prescott S. Rethinking theory and practice: Pre-registration student nurses experiences of simulation teaching and learning in the acquisition of clinical skills in preparation for practice. Nurse Educ Today 2011;31(7):711–5. DOI: 10.1016/j.nedt.2010.12.011.
  25. Zayyan M. Objective structured clinical examination: The assessment of choice. Oman Med J 2011;26(4):219–22. DOI: 10.5001/omj.2011.55.
  26. Harden R.M. What is an OSCE? Med Teach 1988;10(1):19–22. DOI: 10.3109/01421598809019321.
  27. Ward H., Willis A. Assessing advanced clinical practice skills. Prim Health Care 2006;16(3):22–4. DOI: 10.7748/phc.16.3.22.s23.
  28. Merriman C., Westcott L. Succeed in OSCEs and Practical Exams: An Essential Guide for Nurses. 1st ed. Maidenhead: McGraw-Hill Education; 2010.
  29. Fidment S. The objective structured clinical exam (OSCE): A qualitative study exploring the healthcare student's experience. Stud Engagem Exp J 2012;1(1):1–11. DOI: 10.7190/seej.v1i1.37.
  30. Mitchell M.L., Henderson A., Groves M., Dalton M., Nulty D. The objective structured clinical examination (OSCE): Optimising its value in the undergraduate nursing curriculum. Nurse Educ Today 2009;29(4):398–404. DOI: 10.1016/j.nedt.2008.10.007.
  31. Ward H., Barratt J. Assessment of nurse practitioner advanced clinical practice skills: Using the objective structured clinical examination (OSCE). Prim Health Care 2005;15(10):37–41. DOI: 10.7748/phc2005.12.15.10.37.c563.
  32. Wessel J., Williams R., Finch E., Gémus M. Reliability and validity of an objective structured clinical examination for physical therapy students. J Allied Health 2003;32(4):266–9.
  33. Bartfay W.J., Rombough R., Howse E., Leblanc R. Evaluation. The OSCE approach in nursing education. Can Nurse 2004;100(3):18–23.
  34. Denny M., Redmond-Stokes O., Wells J., Weber E., O'Sullivan K., Taylor M. The learning experience: A new approach to optimise student learning. In: Callara E., editor. Nursing Education Challenges in the 21st Century. 1st ed. New York: Nova Science Publishers; 2008. pp. 23–48.
  35. Denny M., Weber E.F., Wells J., Stokes O.R., Lane P., Denieffe S. Matching purpose with practice: Revolutionising nurse education with mita. Nurse Educ Today 2008;28(1):100–7. DOI: 10.1016/j.nedt.2007.03.004.
  36. Valler-Jones T., Meechan R., Jones H. Simulated practice—a panacea for health education? Br J Nurs 2011;20(10):628–31. DOI: 10.12968/bjon.2011.20.10.628.
  37. Heppner P., Kivlingham D.M., Wampold B.E. Research Design in Counseling. Pacific Grove, CA: Brooks/Cole; 1992.
  38. Ovijač D., Velepič M., Adamič M., Eder J., et al. Kodeks etike v zdravstveni negi in oskrbi Slovenije [Code of Ethics in Nursing and Care of Slovenia]. Ljubljana: Zbornica zdravstvene in babiške nege – Zveza strokovnih društev medicinskih sester, babic in zdravstvenih tehnikov Slovenije; 2014.
  39. Hsieh H.F., Shannon S.E. Three approaches to qualitative content analysis. Qual Health Res 2005;15(9):1277–88. DOI: 10.1177/1049732305276687.
  40. Elo S., Kyngäs H. The qualitative content analysis process. J Adv Nurs 2008;62(1):107–15. DOI: 10.1111/j.1365-2648.2007.04569.x.
  41. Stunden A., Halcomb E., Jefferies D. Tools to reduce first year nursing students' anxiety levels prior to undergoing objective structured clinical assessment (OSCA) and how this impacts on the student's experience of their first clinical placement. Nurse Educ Today 2015;35(9):987–91. DOI: 10.1016/j.nedt.2015.04.014.
  42. Akhtar-Danesh N., Valaitis R., O'Mara L., Austin P., Munroe V. Viewpoints about collaboration between primary care and public health in Canada. BMC Health Serv Res 2013;13:311. DOI: 10.1186/1472-6963-13-311.
  43. Rushforth H.E. Objective structured clinical examination (OSCE): Review of literature and implications for nursing education. Nurse Educ Today 2007;27(5):481–90. DOI: 10.1016/j.nedt.2006.08.009.
  44. Berkenstadt H., Ziv A., Gafni N., Sidi A. Incorporating simulation-based objective structured clinical examination into the Israeli National Board Examination in Anesthesiology. Anesth Analg 2006;102(3):853–8. DOI: 10.1213/01.ane.0000194934.34552.ab.
  45. AbdiShahshahani M., Ehsanpour S., Yamani N., Kohan S. The evaluation of reproductive health PhD program in Iran: The input indicators analysis. Iran J Nurs Midwifery Res 2014;19(6):620–8.
