In high-acuity settings, the nurse's clinical judgment must be accurate and efficient for optimal patient outcomes (Lasater, 2011). Within simulation research, educators have found that as the complexity of the patient scenario increased, students' abilities to transfer knowledge from the theoretical to the contextual diminished (Dreifuerst, 2012; Neill & Wotton, 2011). For this research, clinical judgment was based on Tanner's (2006) clinical judgment model (TCJM)—noticing, interpreting, responding, and reflecting—and was operationalized through the Lasater Clinical Judgment Rubric (LCJR) (Lasater, 2007). Debriefing after simulation has been identified as a critical component of simulated learning experiences. However, inconsistencies in research methodology and framework have prevented the identification of best practices for debriefing (Jensen, 2013; Kuiper, Pesut, & Kautz, 2009; Lasater, 2011; Levett-Jones & Lapkin, 2014).
Scripts show how something should be performed from beginning to end; because they have no scoring or grading criteria, they focus the student's attention on the learning process (O'Donnell, Reeve, & Smith, 2012). Further, scripts have been identified as ideally suited for highly cognitive learning experiences (Panadero, Alonso-Tapia, & Reche, 2013). Therefore, the purpose of this research was to investigate whether the introduction of a standardized clinical judgment script, based on Tanner's (2006) model, into clinical and simulation debriefings could positively affect the development of student clinical judgment skills.
Structured and standardized debriefing allows students to learn and cultivate a metacognitive level of reflection (Dreifuerst, 2012). Scripts have been used to provide structure and standardization during debriefing in an attempt to promote reflective thinking and clarify learning (Mariani, Cantrell, Meakim, Prieto, & Dreifuerst, 2013). Framed in a TCJM-based script, examples of prompts used in this research include:
- What did you notice about your client?
- What was your primary concern?
- What was the plan of care and did the plan change?
- In what ways will this affect your future actions?
Cheng et al. (2013) developed a scripted debriefing for novice instructors in the American Heart Association Pediatric Advanced Life Support course. The study's results demonstrated that a scripted debriefing improved participant knowledge and team leader performance. In educational research, scripts have provided a framework from which students can self-assess and regulate their learning (Panadero et al., 2013). Students are able to self-assess through a process of setting goals, determining strategies for learning, monitoring the effectiveness of the strategies, and making needed adjustments along the way (Stegers-Jager, Cohen-Schotanus, & Themmen, 2012). Although cognitively demanding for the student, prompts and cues contained within the script outline how something should be performed and focus the student's attention on the learning process itself (O'Donnell et al., 2012).
In 2013, Panadero et al. found that scripts helped students during the activity, facilitating specific steps based on a predetermined model or framework. Although both rubrics and scripts are useful to the development of self-regulation and learning, scripts were found to encourage students to activate more learning strategies and are therefore ideally suited for learning experiences that are highly cognitively demanding (Panadero et al., 2013).
Participants and Setting
Approval to conduct the research was obtained prior to collecting data from senior baccalaureate nursing students enrolled in an 8-week synthesis course focused on providing complex, critical care at a large public university in the southeast United States. Seventy-five students enrolled in the course and all were required to participate in simulations and clinical learning experiences. The researcher did not teach in this course; thus, the students were not subject to coercion to participate in the research; only data from students providing written agreement to participate were used in the research.
Students, clinical instructors, and independent raters used a standardized debriefing script based on Tanner's (2006) model. To address the first three research questions, the LCJR was used as a scoring guide of clinical judgment. The fourth research question asked students to rate the effectiveness of the debriefing script as a tool to foster reflective thinking and clinical judgment using an investigator-developed Likert-style survey (i.e., content validity index, 0.90).
The six clinical instructors participated in an introductory workshop developed by the researcher to explain the purpose of the research, Tanner's (2006) model as a framework for the debriefing script, and the importance of reflective thinking during debriefing. Prior to data collection, two independent raters were asked to participate in “Advanced Evaluation,” an online course designed by Adamson and Kardong-Edgren (2014) available through the National League for Nursing Simulation Innovation Resource Center. This course focused on the use of the LCJR in the simulated learning environment and provided the means to establish interrater reliability among the raters (Cohen's kappa = 0.814).
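Interrater agreement of the kind reported here (Cohen's kappa for two raters) can be sketched in a few lines of Python; the ratings below are invented for illustration and are not the study's data:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters assigning categorical scores."""
    n = len(rater1)
    # Observed proportion of agreement.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected chance agreement, from each rater's marginal distribution.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum((c1[cat] / n) * (c2[cat] / n)
              for cat in set(rater1) | set(rater2))
    return (p_o - p_e) / (1 - p_e)

# Invented example: LCJR developmental levels assigned by two raters
# to the same eight students.
r1 = ["beginning", "developing", "developing", "accomplished",
      "accomplished", "exemplary", "developing", "accomplished"]
r2 = ["beginning", "developing", "accomplished", "accomplished",
      "accomplished", "exemplary", "developing", "accomplished"]
print(round(cohens_kappa(r1, r2), 3))  # → 0.818
```

Kappa corrects raw percent agreement for the agreement expected by chance, which is why it is preferred over simple percent agreement when establishing interrater reliability.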
Educators and students have demonstrated an appreciation for the use of simulation in nursing education (Bambini, Washburn, & Perkins, 2009; Cato, Lasater, & Peeples, 2009). However, attempts to approximate actual clinical experience in simulation fail to fully recreate the diversity and urgency of real-world experiences (Blum, Borglund, & Parcells, 2010). Therefore, this research did not limit itself to simulated learning experiences, but also included clinical learning experiences.
During the course, students had six clinical learning experiences in area hospitals and two simulated learning experiences. At the conclusion of each experience, the clinical instructor led a 30- to 45-minute debriefing guided by the standardized debriefing script to promote reflective discussion. Following the debriefing, on the second and fifth clinical days and after both simulated experiences, students rated perceptions of their individual clinical judgment skills using the LCJR and clinical instructors rated each student's reflective skills using only the reflective level of the LCJR. For logistical reasons, the two independent raters were limited to watching and scoring participants' clinical judgment during the simulation experiences. Following the second simulation, students completed the brief survey to rate the effectiveness of the debriefing script. Table 1 summarizes the research questions, instruments of measure, and statistical tests used in this research.
The sample for this study was a convenience sample of 53 participating students, which was sufficient based on a G*Power analysis assuming a medium effect size for the dependent t test and repeated measures analyses (a sample of 45 was identified as sufficient at 0.95 power). Participants were a homogeneous group; most were women (96%), ages 21 to 23 years (98%), and self-identified as Caucasian (90%).
Research Question One
Does the introduction of a standardized clinical judgment script into clinical and simulation debriefings improve student clinical judgment as seen in simulation activities? Participant simulation experiences were observed by two independent raters. Prior to data collection, Cohen's kappa was calculated (0.814), indicating strong interrater reliability. The raters watched participants' recorded simulated learning experiences via a digital video and audio management system and scored each student using the LCJR. Data from the scoring guides were combined and analyzed in SPSS using the dependent t test (α = .05) to compare the first simulation with the second. The paired samples test demonstrated significant improvements in the categories of noticing (t = 5.109, df = 52, p < .001), interpreting (t = 5.463, df = 52, p < .001), and reflecting (t = 6.058, df = 52, p < .001). However, the paired samples test showed a significant decrease in student responding (t = 15.044, df = 52, p < .001).
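The paired-samples comparison described above can be sketched with SciPy's `ttest_rel`; the LCJR scores below are invented for illustration, not the study's data:

```python
from scipy import stats

# Invented LCJR 'noticing' scores for the same five students at
# simulation 1 and simulation 2 (paired observations).
sim1 = [2, 3, 2, 4, 3]
sim2 = [3, 4, 4, 4, 4]

# Dependent (paired) t test: tests whether the mean of the paired
# differences is zero.
t_stat, p_value = stats.ttest_rel(sim1, sim2)
print(t_stat, p_value)  # negative t here means sim2 scores were higher
```

Because each student contributes a score at both time points, the paired test removes between-student variability and is more powerful than an independent-samples test on the same data.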
Research Question Two
Are student perceptions about clinical judgment skills different after the introduction of a standardized clinical judgment script? A one-factor repeated measures ANOVA showed a significant within-groups improvement with the clinical judgment script in student noticing, interpreting, and responding (p < .001) and reflecting (p = .003).
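A one-factor repeated measures ANOVA of this design can be sketched directly from its sum-of-squares decomposition; the self-rated scores below are invented for illustration:

```python
import numpy as np
from scipy import stats

def rm_anova_1way(X):
    """One-factor repeated measures ANOVA.
    X: (n_subjects, k_conditions) array of scores."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    grand = X.mean()
    # Partition total variability into condition, subject, and error terms.
    ss_cond = n * ((X.mean(axis=0) - grand) ** 2).sum()
    ss_subj = k * ((X.mean(axis=1) - grand) ** 2).sum()
    ss_total = ((X - grand) ** 2).sum()
    ss_err = ss_total - ss_cond - ss_subj
    df_cond, df_err = k - 1, (n - 1) * (k - 1)
    F = (ss_cond / df_cond) / (ss_err / df_err)
    p = stats.f.sf(F, df_cond, df_err)
    return F, p

# Invented self-rated scores for six students at four measurement points.
scores = [[2, 2, 3, 3],
          [1, 2, 2, 3],
          [2, 3, 3, 4],
          [2, 2, 3, 4],
          [3, 3, 4, 4],
          [1, 2, 3, 3]]
F, p = rm_anova_1way(scores)
print(F, p)
```

Removing the subject term from the error variance is what distinguishes the repeated measures design from a between-groups ANOVA, making it sensitive to within-student change over the four measurement points.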
Research Question Three
Are clinical instructor perceptions about students' reflective thinking different after the introduction of a standardized clinical judgment script into debriefing sessions? A one-factor repeated measures ANOVA showed significant improvement in students' reflective thinking (p = .002). Cohen's kappa (0.814) indicated good agreement among the clinical instructors' perceptions of student reflecting.
Research Question Four
How do students rate the effectiveness of a standardized clinical judgment script on the fostering of reflective thinking skills needed for the development of clinical judgment? A brief 5-point Likert-style survey (Table 2) indicated the script helped students evaluate and analyze performance (M = 4.42, SD = 0.57), analyze decision making (M = 4.60, SD = 0.53), identify strengths and weaknesses (M = 4.45, SD = 0.64), and develop a plan for improvement (M = 4.50, SD = 0.64), and that the script was a useful tool (M = 4.50, SD = 0.70).
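The descriptive statistics reported for the survey (percent agreement, mean, standard deviation) can be sketched for a single hypothetical item; the responses below are invented:

```python
import statistics

# Invented 5-point Likert responses for one survey item
# (1 = strongly disagree ... 5 = strongly agree).
responses = [5, 4, 5, 4, 4, 5, 3, 4, 5, 4]

mean = statistics.mean(responses)
sd = statistics.stdev(responses)  # sample standard deviation
# Percent responding "agree" (4) or "strongly agree" (5).
pct_agree = sum(r >= 4 for r in responses) / len(responses) * 100
print(mean, round(sd, 2), pct_agree)  # → 4.3 0.67 90.0
```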
Scripts offer students a framework from which to self-assess and regulate learning, thus providing a complete picture of how something should be performed, including prompts to facilitate specific steps. Educational research supports the idea that if used properly, scripts encourage students to activate more learning strategies, making them ideal for learning experiences that are highly cognitive and demanding (Panadero et al., 2013). Clinical judgment involves multiple forms of knowledge and develops in a cyclical pattern; this cyclical process of learning can be stimulated via well-designed, thoughtful questions (Lasater, 2011). It was the intent of this research to determine whether the use of a standardized debriefing script, based on the TCJM, could improve student reflective thinking and, consequently, clinical judgment.
Data collected from independent raters, who watched and scored participants in recorded simulated learning activities, demonstrated statistically significant improvement in student noticing, interpreting, and reflecting. However, student performance in responding decreased. The first simulation was a high-acuity experience requiring students to physically notice, interpret, and respond to patient cues. The second simulation had an interdisciplinary focus, with an emphasis on professional communication. Despite the diverse nature of the two simulations, students demonstrated a decrease in responding in both, a finding consistent with the existing literature. Bogossian et al. (2014) examined student recognition and responding in a patient deterioration simulation and concluded that senior nursing students lacked the essential tools to competently manage a deteriorating patient. Shinnick, Woo, Horwich, and Steadman (2011) also found that scores decreased after the hands-on component of the simulation experience but improved in the experimental group after debriefing, suggesting that learning occurred not during responding but during debriefing.
In this current research, student perceptions of their own noticing, interpreting, responding, and reflecting skills all improved, consistent with previous research where a structured debriefing framework accelerated clinical reasoning and development of clinical judgment (Kuiper et al., 2009; Nielsen, 2009).
Results showed statistically significant improvement in students' reflective thinking skills after the introduction of the debriefing script, supporting the work of Shinnick et al. (2011), who concluded that guided structured reflection is the most valuable component of simulation when measuring gains in knowledge. Further, Panadero et al. (2013) found that the use of scripts encouraged students to activate more learning strategies, self-assess, and think reflectively.
Finally, the short Likert-style survey showed students overwhelmingly found scripted debriefing to be an effective resource in the fostering of reflective thinking. These results are in agreement with those of Kelly, Hager, and Gallagher (2014), who also used the TCJM as their framework for debriefing.
Implications for Nurse Education
The results supported the use of a TCJM-based script to improve clinical judgment in nursing students in both the simulated and patient-based clinical experience. Nurse educators are encouraged to consistently incorporate embedded cues and prompts as part of the teaching–learning process.
Limitations and Recommendations for Future Research
In both simulations, student responding decreased; this decrease in responding is a documented phenomenon in simulation (Bogossian et al., 2014; Shinnick et al., 2011). However, the core competencies for baccalaureate nursing education and the standards of professional nursing practice include inter- and intraprofessional communication skills, team building and collaborative strategies, and effective communication techniques for evidence-based, patient-centered care (American Association of Colleges of Nursing, 2008; American Nurses Association, 2010). The simulations used in this course were team exercises, requiring skillful communication both within the team and interprofessionally. The results of this study suggest that, in addition to knowledge and clinical skills, students need further opportunity to practice communication skills in team and interprofessional environments as a strategy to improve student responding in patient care scenarios.
Clinical judgment involves multiple forms of knowledge, and the process develops cyclically (Lasater, 2011). This study demonstrated that the introduction of a standardized debriefing script based on Tanner's (2006) clinical judgment model did improve student noticing, interpreting, and reflecting in simulated learning experiences. In the clinical and simulated learning environments, student perceptions about their clinical judgment improved after the introduction of a standardized clinical judgment script, and both students and clinical instructors reported that the script was effective in promoting the development of reflective thinking, a key component in the process of clinical judgment.
- Adamson, K. & Kardong-Edgren, S. (2014, October 6). Advanced evaluation [Online course]. Retrieved from http://sirc.nln.org/mod/page/view.php?id=727.
- American Association of Colleges of Nursing. (2008). The essentials of baccalaureate education for professional nursing practice. Retrieved from http://www.aacn.nche.edu/education-resources/BaccEssentials08.pdf.
- American Nurses Association. (2010). Nursing: Scope and standards of practice (2nd ed.). Silver Spring, MD: Author.
- Bambini, D., Washburn, J. & Perkins, R. (2009). Outcomes of clinical simulation for novice nursing students: Communication, confidence, clinical judgment. Nursing Education Perspectives, 30, 79–82.
- Blum, C.A., Borglund, S. & Parcells, D. (2010). High-fidelity nursing simulation: Impact on student self-confidence and clinical competence. International Journal of Nursing Education Scholarship, 7, doi:10.2202/1548-923X.2035 [CrossRef]
- Bogossian, F., Cooper, S., Cant, R., Beauchamp, A., Porter, J., Kain, V. & Phillips, N.M. (2014). Undergraduate nursing students' performance in recognizing and responding to sudden patient deterioration in high psychological fidelity simulated environments: An Australian multi-centre study. Nurse Education Today, 34, 691–696. doi:10.1016/j.nedt.2013.09.015 [CrossRef]
- Cato, M.L., Lasater, K. & Peeples, A.I. (2009). Nursing students' self-assessment of their simulation experiences. Nursing Education Perspectives, 30, 105–108.
- Cheng, A., Hunt, E.A., Donoghue, A., Nelson-McMillan, K., Nishisaki, A., LeFlore, J. & Nadkarni, V.M. (2013). Examining pediatric resuscitation education using simulation and scripted debriefing. JAMA Pediatrics, 167, 528–536. doi:10.1001/jamapediatrics.2013.1389 [CrossRef]
- Dreifuerst, K.T. (2012). Using debriefing for meaningful learning to foster development of clinical reasoning in simulation. Journal of Nursing Education, 51, 326–333. doi:10.3928/01484834-20120409-02 [CrossRef]
- Jensen, R. (2013). Clinical reasoning during simulation: Comparison of student and faculty ratings. Nurse Education in Practice, 13, 23–28. doi:10.1016/j.nepr.2012.07.001 [CrossRef]
- Kelly, M.A., Hager, P. & Gallagher, R. (2014). What matters most? Students' rankings of simulation components that contribute to clinical judgment. Journal of Nursing Education, 53, 97–101. doi:10.3928/01484834-20140122-08 [CrossRef]
- Kuiper, R., Pesut, D. & Kautz, D. (2009). Promoting the self-regulation of clinical reasoning skills in nursing students. The Open Nursing Journal, 3, 70–76. doi:10.2174/1874434600903010076 [CrossRef]
- Lasater, K. (2007). Clinical judgment development: Using simulation to create an assessment rubric. Journal of Nursing Education, 46, 496–503.
- Lasater, K. (2011). Clinical judgment: The last frontier for evaluation. Nurse Education in Practice, 11, 86–92. doi:10.1016/j.nepr.2010.11.013 [CrossRef]
- Levett-Jones, T. & Lapkin, S. (2014). A systematic review of the effectiveness of simulation debriefing in health professional education. Nurse Education Today, 34, e58–e63. doi:10.1016/j.nedt.2013.09.020 [CrossRef]
- Mariani, B., Cantrell, M.A., Meakim, C., Prieto, P. & Dreifuerst, K.T. (2013). Structured debriefing and students' clinical judgment abilities in simulation. Clinical Simulation in Nursing, 9, e147–e155. doi:10.1016/j.ecns.2011.11.009 [CrossRef]
- Neill, M.A. & Wotton, K. (2011). High-fidelity simulation debriefing in nurse education: A literature review. Clinical Simulation in Nursing, 7, e161–e168. doi:10.1016/j.ecns.2011.02.001 [CrossRef]
- Nielsen, A. (2009). Concept-based learning activities using the clinical judgment model as a foundation for clinical learning. Journal of Nursing Education, 48, 350–354. doi:10.3928/01484834-20090515-09 [CrossRef]
- O'Donnell, A.M., Reeve, J. & Smith, J.K. (2012). Educational psychology reflection for action (3rd ed.). Hoboken, NJ: John Wiley & Sons.
- Panadero, E., Alonso-Tapia, J. & Reche, E. (2013). Rubrics vs. self-assessment scripts effect on self-regulation, performance and self-efficacy in pre-service teachers. Studies in Educational Evaluation, 39, 125–132. doi:10.1016/j.stueduc.2013.04.001 [CrossRef]
- Shinnick, M.A., Woo, M., Horwich, T.B. & Steadman, R. (2011). Debriefing: The most important component in simulation? Clinical Simulation in Nursing, 7, e105–e111. doi:10.1016/j.ecns.2010.11.005 [CrossRef]
- Stegers-Jager, K., Cohen-Schotanus, J. & Themmen, A.P.N. (2012). Motivation, learning strategies, participation and medical school performance. Medical Education, 46, 678–688. doi:10.1111/j.1365-2923.2012.04284.x [CrossRef]
- Tanner, C.A. (2006). Thinking like a nurse: A research-based model of clinical judgment in nursing. Journal of Nursing Education, 45, 204–211.
Research Question, Instrument of Measure, and Statistical Test
| Research Question | Instrument of Measure | Statistical Test |
| --- | --- | --- |
| Does the introduction of a standardized clinical judgment script into clinical and simulation debriefings improve student clinical judgment as seen in simulation activities? | LCJR (two independent raters) | Dependent t test |
| Are student perceptions about clinical judgment skills different after the introduction of a standardized clinical judgment script? | LCJR (students) | One-factor repeated measures ANOVA |
| Are clinical instructor perceptions about students' reflective thinking different after the introduction of a standardized clinical judgment script into debriefing sessions? | Modified LCJR (clinical instructors used only the reflective portion of the LCJR) | One-factor repeated measures ANOVA |
| How do students rate the effectiveness of a standardized clinical judgment script on the fostering of reflective thinking skills needed for the development of clinical judgment? | Likert-style student survey | Descriptive statistics |
The Effectiveness of Scripted Debriefing on Fostering Reflective Thinking Survey
| Statement | Students Responding Agree or Strongly Agree | Mean | SD |
| --- | --- | --- | --- |
| 1. The debriefing script helped me evaluate and analyze my clinical performance. | 96% | 4.42 | 0.57 |
| 2. The debriefing script helped me evaluate and analyze key decision making points in the day. | 98% | 4.60 | 0.53 |
| 3. The debriefing script helped me identify my strengths and weaknesses. | 92% | 4.45 | 0.64 |
| 4. The debriefing script helped me develop a plan for personal improvement. | 90.5% | 4.26 | 0.68 |
| 5. The debriefing script helped guide the debriefing/postconferencing discussions. | 94% | 4.49 | 0.64 |
| 6. The debriefing script is a useful simulation/clinical education tool. | 92% | 4.49 | 0.70 |