Critical thinking (CT) and clinical reasoning skills are essential for nursing practice, as hospitalized patients are more acutely ill and require more complex care than in previous decades (Benner, Sutphen, Leonard, & Day, 2010; King, Smith, & Glenn, 2003; Robert Wood Johnson Foundation, 2007). The literature varies on the teaching and learning methods used to develop these skills, but there is evidence that the use of simulation with debriefing and guided reflective journaling can independently foster the development of CT and provide opportunities to practice clinical reasoning skills (Benner et al., 2010; Dillard et al., 2009; Kennison, 2006; Larew, Lessans, Spunt, Foster, & Covington, 2006; Lisko & O'Dell, 2010; Twibell, Ryan, & Hermiz, 2005).
Simulation has helped provide students with opportunities to identify abnormal assessment findings and to prioritize assessments and interventions. Simulation also allows students to develop clinical judgment and to practice communication skills (Bambini, Washburn, & Perkins, 2009; Dillard et al., 2009; Doody & Condon, 2013; Lindsey & Jenkins, 2013; Mould, White, & Gallagher, 2011). When combined with debriefing and reflection, simulation can enhance learning (Lapkin & Levett-Jones, 2011; Wotton, Davis, Button, & Relton, 2010). Lavoie, Pepin, and Boyer (2013) found that using Nielsen, Stragnell, and Jester's (2007) Guide for Reflection as a framework for structured debriefing after simulation helped novice nurses understand their cognitive processes as they thought through the simulation experience, supported the development of clinical judgment, and fostered creative strategies to improve psychomotor and communication skills.
Guided reflective journaling, like reflective debriefing, can help students to learn from their experiences (Boud, 2001; Boud, Keogh, & Walker, 1985; Boud & Walker, 1998; Fink, 2013; Nielsen et al., 2007; Padden, 2011). Reflective journaling has been associated with CT (Hoffman & Elwin, 2004; Kennison, 2006; Twibell et al., 2005). Marchigiano, Eduljee, and Harvey (2011) found that nursing students preferred reflective journaling to written care plans as a method of improving their thinking skills. Tanner (2006) stated that reflection is essential for the development of clinical knowledge and clinical reasoning.
Simulation combined with debriefing and reflection affords students the opportunity to practice decision-making skills in uncertain situations using what Facione and Facione (2009) identified as the Two Parallel Functioning Rational Decision Making Systems (RDMS). The RDMS is composed of System One and System Two thinking. System One thinking is more intuitive in nature and involves associative, well-trained, automatic judgments. System Two thinking involves systematic reflective reasoning and carefully considered criterion-based judgments. Clinical situations in simulation provide opportunities to practice System One thinking, where the student may make accurate or inaccurate reactive decisions and may or may not be able to engage System Two thinking when making these decisions.
System Two thinking may occur during simulation but it may also be fostered with faculty guidance. Facione (2015) explained that System One and System Two thinking are both valuable decision-making tools when faced with uncertainty. However, it is the role of the educator to guide students to be aware of, and to think about, the decisions they made and how they made them in System One thinking while simultaneously guiding students to develop and refine System Two thinking. This can be conducted through skilled debriefing and guided reflection. The goal is to develop critical thinkers who possess strong System One thinking and also recognize and use System Two thinking when making decisions in uncertain situations.
Purpose and Method
The primary purpose of this study was to investigate the use of the combination of clinical simulation with debriefing and guided reflective journaling to stimulate CT in prelicensure baccalaureate degree nursing students. A secondary purpose was to determine whether a relationship exists between level of reflection and CT.
A descriptive correlational design was used for this study. After institutional review board approval was received, 23 of 27 junior baccalaureate degree nursing students (i.e., 17 women and six men), ages 19 to 29 years, enrolled in their second clinical course during the pediatric rotation, consented for their data to be included in this study. All 27 students participated in all learning activities. Data were collected through the spring semester, from February until May 2013. Small groups of students, two or three per group, were observed and rated by two faculty members on their use of CT, using Facione and Facione's (2009) criterion-based Holistic Critical Thinking Skills Rubric (HCTSR), during a clinical simulation experience. The simulation experience included the use of a human simulator for abnormal patient parameters and acting students as patient and family members. All groups participated in the same repeated scenario. Following the simulation, students were guided through a reflective debriefing session and, within 72 hours, submitted a written guided reflective journal entry. During that same semester, students submitted two additional guided reflective journal entries from two real-life patient care clinical experiences that did not include guided reflective debriefing. These two experiences consisted of a 1-day rotation in an acute care pediatric unit and a 1-day rotation in a pediatric long-term care rehabilitation facility.
After all small groups rotated through the simulation laboratory, the entire clinical group and the two faculty raters met in the third classroom for the reflective debriefing. The format for debriefing followed a structure similar to the guided reflective journaling. Students were asked to describe the event, discuss feelings, and interpret the findings, as well as the effects of their decisions and interventions, while also identifying what they did well and noting areas for improvement.
The principal investigator and three clinical faculty members served as observers and raters. The principal investigator rated all simulation observations and guided reflective journal entries with one other clinical faculty member who had observed the same simulation exercise or rated the same journal entries. The HCTSR (Facione & Facione, 2009) was used to rate the level of CT used by the students during simulation as observed by the two faculty raters; ratings were conducted and compared immediately after each simulation exercise. The HCTSR and the Level of Reflection-on-Action Assessment (LORAA; Padden, 2011, 2013) were used to rate the guided reflective journal entries from simulation and the two subsequent clinical experiences for level of CT and level of reflection, respectively. Each observation and reflective journal entry was rated individually and then compared and discussed with the faculty member who observed the same simulation experience or rated the same journal entries. In addition to rating the journal entries, the faculty member responsible for the simulation or clinical experience provided written supportive feedback to the students on each reflective journal entry within 1 week of submission.
Overall, the independent ratings of the faculty and the principal investigator from the observations and the journal entries were consistent with each other 94% of the time. When inconsistencies occurred, the principal investigator and the co-rating faculty member discussed the ratings and were able to achieve 100% agreement. As a final measure of clinical reasoning, students' level scores on the national standardized Assessment Technologies Institute (ATI) (2013) test were also compared with all HCTSR and LORAA scores.
A modified version of the Guide for Reflection (GFR; Nielsen et al., 2007) was used as a format for the debriefing and the reflective journal entries but was not used in data collection. The instruments used for data collection included two criterion-based instruments and one nationally standardized multiple-choice examination. Facione and Facione's (2009) criterion-based HCTSR was used to assess students' level of CT during the simulation exercise and on each of the three guided reflective journal entries. The HCTSR was developed by two internationally recognized experts in CT, and the “validity and reliability of the HCTSR ratings rest on the ability of the rater to recognize and discriminate between varying examples of reasoning processes” (Insight Assessment, 2016, para. 5).
The LORAA (Padden, 2011, 2013) was used to determine the level of reflection achieved on the written journal entries. Content validity for the LORAA was established through a panel of three international experts on guided reflection, and interrater reliability for the LORAA was established in two prior studies, with ratings of .67 (Padden, 2011) and .94 (Padden, 2013).
Prior to the initiation of data collection, the principal investigator and the three faculty co-raters practiced rating five sample student reflective journal entries, not included in the current study, using the HCTSR and the LORAA. Initial interrater reliability among the four raters was .70 for the HCTSR and .80 for the LORAA. After discussion, the raters were able to achieve 100% agreement on each instrument using the five sample entries. Due to the nature of this study, interrater reliability could not be practiced for observation of students' use of CT in simulation, as this pilot study was the first attempt to apply this method of assessment.
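As a minimal sketch of the agreement statistic described above (the ratings below are invented for illustration and are not the study's data), percent agreement is simply the proportion of items on which two raters assigned identical criterion-based scores:

```python
def percent_agreement(rater_a, rater_b):
    """Proportion of items on which two raters assigned identical scores."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must score the same number of items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical HCTSR ratings (levels 1-4) from two raters on ten journal entries
rater_1 = [3, 2, 4, 3, 2, 3, 3, 4, 3, 2]
rater_2 = [3, 2, 4, 3, 3, 3, 3, 4, 3, 2]

agreement = percent_agreement(rater_1, rater_2)
print(f"Agreement: {agreement:.0%}")  # one disagreement in ten entries
```

Note that percent agreement does not correct for chance agreement; a statistic such as Cohen's kappa would be a stricter alternative.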
The ATI 2010 examination for the Nursing Care of Children, Form B, was used as an objective measure of CT and clinical reasoning. This 60-item standardized assessment of students' basic comprehension and mastery of the nursing care of children demonstrated modest reliability (α = .62) in normative data based on 59,956 participant test scores.
Data were analyzed using SPSS software. Descriptive statistics were used to summarize the data. Mean scores on all criterion-based measures were compared using Spearman's rho to determine whether a relationship existed between level of reflection and CT.
Scores from faculty observation of CT in simulation, faculty ratings of CT and level of reflection on the three reflective journal entries, and student ATI level scores are shown in Table 1. The C1 column in Table 1 corresponds to student reflective journal entries on the acute inpatient pediatric experience. The C2 column corresponds to student reflective journal entries from the pediatric long-term care clinical experience. Four students did not submit journal entries on C1 (i.e., the acute inpatient clinical experience) in time for their scores to be included in the data analysis.
Mean scores on both level of reflection and CT were highest on the guided reflective journal entries completed after simulation and debriefing, compared with the guided reflective journal entries on the two additional clinical experiences (Table 2). Spearman's rank-order correlation test was used to assess the degree of association between level of reflection and CT scores on the guided reflective journal entries (Polit & Beck, 2016). The results indicated a statistically significant positive relationship between level of reflection and CT scores on the guided reflective journal entries after simulation and debriefing (r = .543, p < .01), on the guided reflective journal entries on the pediatric long-term care clinical experience (r = .742, p < .01), and on the pediatric acute care clinical experience (r = .718, p < .01). No relationship was found between any of these variables and either the observed HCTSR scores or the ATI level scores.
The results from this study support the relationship between CT and reflection (Hoffman & Elwin, 2004; Kennison, 2006; Twibell et al., 2005). Students were provided with the opportunity to use Facione's (2015) System One thinking during the simulation experience, with direct faculty observation and assessment of CT in action. Observation of CT occurred as the students discussed decision making with each other. Assessment of CT was based on the students' ability to recognize the problem, prioritize, and intervene appropriately to stabilize the patient, and was rated on the HCTSR. The guided reflective debriefing allowed the students to practice the RDMS using System One and System Two thinking simultaneously. Students verbalized their thought processes as they reflected on the situation during the debriefing. They reflected further on the experience in the reflective journaling assignment, and these entries were rated using the HCTSR and the LORAA.
The higher mean scores on the HCTSR and the LORAA following simulation and guided debriefing may indicate that when students are guided through debriefing, CT is improved and reflection is higher. The use of guided reflective debriefing models how one should think through and reflect on a given clinical issue. This faculty-guided model provided students with a structure for thinking, reflecting, and reasoning through a clinical issue.
Although students had received prior reading assignments and didactic instruction about the clinical diagnosis and the care required to respond to the patient problem in the simulation experience, the lower observed HCTSR scores during this experience are not surprising. The scenario was fairly complex, requiring the simultaneous use of assessment, communication, and medication administration skills. Although this was a challenging scenario, the students commented during the debriefing that they appreciated the opportunity to “function like a real nurse.” They supported this statement with anecdotes that they had never had to call a health care provider before and had never been in a situation where they had to think through the problem and respond, as opposed to reporting the problem to the nurse, as they typically do as students. However, the higher mean HCTSR score from the reflective journal entries on simulation (2.9), compared with the score from observation (2.3), is supported by Bulman, Lathlean, and Gobbi's (2014) argument for the need for guidance in reflection using dialogue and questioning to develop CT.
The mean scores on the LORAA are fairly high for all three experiences, at 5.3, 4.8, and 4.8. LORAA scores at this level indicate that the students were able to analyze or interpret the experience (i.e., level 4) and recognize new perceptions as a result of the experience (i.e., level 5). Mean scores on the HCTSR of 2.9, 2.8, and 2.6 from the guided reflective journal entries are close to the acceptable ability to accurately interpret, analyze, and fair-mindedly follow the evidence to nonfallacious conclusions (i.e., level 3).
The discrepancy between individual students' HCTSR and LORAA scores can be explained through the different foci of the instruments and of the phenomena themselves. The HCTSR measures the ability to think through a problem, and the LORAA measures the ability to reflect, or rather to think about the thinking, about that problem. Neither phenomenon is a linear process, and neither should be evaluated as such. Although some students could not critically think through a problem, as evidenced by a level 1 or 2 on the HCTSR, they were able to recognize where they were lacking in knowledge or skill and could identify a plan to address that gap, as evidenced by achievement of level 6 on the LORAA.
The lack of relationship among the variables and the ATI level scores is difficult to explain. The ATI examination may not have been the appropriate instrument to measure CT and clinical reasoning. In addition, the weight of the ATI on the overall course grade is minimal at 2%, which may suggest that students did not take the examination seriously.
Limitations, Conclusions, and Recommendations
The results of this pilot study should be interpreted cautiously due to the small, self-selected sample and cannot be generalized to a larger population. The study was also limited by its single geographic location. This small study further supports the value of simulation, guided reflective debriefing, and guided reflective journaling as methods to stimulate CT and reflection in nursing students. The results also support the relationship between CT and reflection, reinforcing Facione's (2015) RDMS model by allowing faculty to observe CT, as well as guide students to think critically through reflection during debriefing. The abilities to think critically and reflect were captured through ratings on the HCTSR and the LORAA. After the simulation reflective debriefing, students were able to achieve higher scores on CT and reflection on their journal entries than on reflective journal entries on a typical hospital clinical experience without debriefing. Simulation offers the unique ability to design a clinical situation to achieve desired learning outcomes. Guided reflective debriefing, in addition to guided reflective journaling, can provide students with the opportunity to practice CT, reflect, and identify learning needs. The lack of a relationship between the HCTSR and LORAA scores and the ATI scores has been considered, and future research should involve a more appropriate measure of clinical reasoning or an instrument standardized to measure CT. Further research in this area should also involve a larger sample size, possibly more than one simulation exercise with debriefing, and measures of CT over time to determine whether additional practice with simulation and debriefing will improve scores.
An unintended outcome of this study was discovering the value of combining an assessment of CT with an assessment of reflection. Combining the two instruments to evaluate reflective journal entries provided faculty with a clear picture of how students thought through a clinical problem and what the students thought about their thinking through that problem. Written feedback was provided to students on each reflective journal entry, and if faulty reasoning was found, faculty could use prompts included with the LORAA to guide students to reframe the problem or think about it from a different perspective. When students could self-identify their faulty thinking, the use of both instruments allowed faculty to confirm the problem, encourage the student's plan to address it, and possibly suggest additional strategies. Providing both criterion-based instruments to students as guides prior to submission of the reflective journal entries may also help them think through problems and improve their abilities to reflect and think critically.
- Assessment Technologies Institute. (2013). RN Content Mastery Series (CMS)® 2010 technical manual psychometrics. Leawood, KS: Assessment Technologies Institute, LLC.
- Bambini, D., Washburn, J. & Perkins, R. (2009). Outcomes of clinical simulation for novice nursing students: Communication, confidence, clinical judgment. Nursing Education Perspectives, 30, 79–82.
- Benner, P., Sutphen, M., Leonard, V. & Day, L. (2010). Educating nurses: A call for radical transformation. San Francisco, CA: Jossey-Bass.
- Boud, D. (2001). Using journal writing to enhance reflective practice. In English, L.M. & Gillen, M.A. (Eds.). Promoting journal writing in adult education. New directions in adult and continuing education No. 90 (pp. 9–18). San Francisco, CA: Jossey-Bass. doi:10.1002/ace.16 [CrossRef]
- Boud, D., Keogh, R. & Walker, D. (1985). Reflection: Turning experience into learning. London, UK: Kogan Page.
- Boud, D. & Walker, D. (1998). Promoting reflection in professional courses: The challenge of context. Studies in Higher Education, 23, 191–206. doi:10.1080/03075079812331380384 [CrossRef]
- Bulman, C., Lathlean, J. & Gobbi, M. (2014). The process of teaching and learning about reflection: Research insights from professional nurse education. Studies in Higher Education, 39, 1219–1236. doi:10.1080/03075079.2013.777413 [CrossRef]
- Dillard, N., Sideras, S., Ryan, M., Carlton, K.H., Lasater, K. & Siktberg, L. (2009). A collaborative project to apply and evaluate the clinical judgment model through simulation. Nursing Education Perspectives, 30, 99–104.
- Doody, O. & Condon, M. (2013). Using a simulated environment to support students learning clinical skills. Nurse Education in Practice, 13, 561–566. doi:10.1016/j.nepr.2013.03.011 [CrossRef]
- Facione, P.A. (2015). Critical thinking: What it is and why it counts. Retrieved from http://www.insightassessment.com/Resources/Critical-Thinking-What-It-Is-and-Why-It-Counts.
- Facione, P.A. & Facione, N.C. (2009). Insight assessment: Holistic critical thinking scoring rubric. Millbrae, CA: The California Academic Press.
- Fink, L.D. (2013). Creating significant learning experiences: An integrated approach to designing college courses. San Francisco, CA: Jossey-Bass.
- Hoffman, K. & Elwin, C. (2004). The relationship between critical thinking and confidence in decision-making. Australian Journal of Advanced Nursing, 22, 8–12.
- Insight Assessment. (2016). Holistic critical thinking scoring rubric (HCTSR). San Jose, CA: The California Academic Press, LLC. Retrieved from http://www.insightassessment.com/Resources/Holistic-Critical-Thinking-Scoring-Rubric-HCTSR/%28language%29/eng-US
- Kennison, M.M. (2006). The evaluation of students' reflective writing of evidence of critical thinking. Nursing Education Perspectives, 27, 269–273.
- King, M.S., Smith, P.L. & Glenn, L.L. (2003). Entry-level competency needed by BSNs in acute health care agencies in Tennessee in the next 10 years. Journal of Nursing Education, 42, 179–181.
- Lapkin, S. & Levett-Jones, T. (2011). A cost-utility analysis of medium vs. high-fidelity human patient simulation manikins in nursing education. Journal of Clinical Nursing, 20, 3543–3552. doi:10.1111/j.1365-2702.2011.03843.x [CrossRef]
- Larew, C., Lessans, S., Spunt, D., Foster, D. & Covington, B.G. (2006). Innovations in clinical simulation: Application of Benner's theory in an interactive patient care simulation. Nursing Education Perspectives, 27, 16–21.
- Lavoie, P., Pepin, J. & Boyer, L. (2013). Reflective debriefing to promote novice nurses' clinical judgment after high-fidelity simulation: A pilot test. Dynamics (Pembroke, Ont.), 24(4), 36–41.
- Lindsey, P.C. & Jenkins, S. (2013). Nursing students' clinical judgment regarding rapid response: The influence of a clinical simulation education intervention. Nursing Forum, 48, 61–70. doi:10.1111/nuf.12002 [CrossRef]
- Lisko, S.A. & O'Dell, V. (2010). Integration of theory and practice: Experiential learning theory and nursing education. Nursing Education Perspectives, 31, 106–108.
- Marchigiano, G., Eduljee, N. & Harvey, K. (2011). Developing critical thinking skills from clinical assignments: A pilot study on nursing students' self-reported perceptions. Journal of Nursing Management, 19, 143–152. doi:10.1111/j.1365-2834.2010.01191.x [CrossRef]
- Mould, J., White, H. & Gallagher, R. (2011). Evaluation of a critical care simulation series for undergraduate nursing students. Contemporary Nurse, 38, 180–190. doi:10.5172/conu.2011.38.1-2.180 [CrossRef]
- Nielsen, A., Stragnell, S. & Jester, P. (2007). Guide for reflection using the clinical judgment model. Journal of Nursing Education, 46, 513–516.
- Padden, M.L. (2011). The effects of guided reflective journaling on nursing students' level of reflection, self-awareness, and perceived clinical decision making skills [Doctoral dissertation]. Retrieved from ProQuest Dissertations and Theses Database (UMI No. 3500773).
- Padden, M.L. (2013). A pilot study to determine the validity and reliability of the level of reflection-on-action assessment. Journal of Nursing Education, 52, 410–415. doi:10.3928/01484834-20130613-03 [CrossRef]
- Polit, D.F. & Beck, C.T. (2016). Nursing research: Generating and assessing evidence for nursing practice. Philadelphia, PA: Lippincott, Williams, and Wilkins.
- Robert Wood Johnson Foundation. (2007). Charting nursing's future. Facts and controversies about nurse staffing policy: A look at existing models, enforcement issues, and research needs. Retrieved from http://www.rwjf.org/files/research/nursingissue5revfinal.pdf
- Tanner, C.A. (2006). Thinking like a nurse: A research-based model of clinical judgment in nursing. Journal of Nursing Education, 45, 204–211.
- Twibell, R., Ryan, M. & Hermiz, M. (2005). Faculty perceptions of critical thinking in student clinical experiences. Journal of Nursing Education, 44, 71–79.
- Wotton, K., Davis, J., Button, D. & Relton, M. (2010). Third-year undergraduate nursing students' perceptions of high-fidelity simulation. Journal of Nursing Education, 49, 632–639. doi:10.3928/01484834-20100831-01 [CrossRef]
Table 1. Student Scores on CT in Simulation; LORAA and CT on Reflection on Simulation, C1, and C2; and ATI

| Student | HCTSR Simulation Observed | LORAA Reflection on Simulation | HCTSR Reflection on Simulation | LORAA Reflection on C1 | HCTSR Reflection on C1 | LORAA Reflection on C2 | HCTSR Reflection on C2 | ATI Level Score |
Table 2. Mean Scores on CT in Simulation, LORAA and CT on Reflection on Simulation, C1, and C2

| Measure | HCTSR Simulation Observed | LORAA Reflection on Simulation | HCTSR Reflection on Simulation | LORAA Reflection on C1 | HCTSR Reflection on C1 | LORAA Reflection on C2 | HCTSR Reflection on C2 | Mean ATI Level Score |