Journal of Nursing Education

Research Briefs 

Peer Debriefing Versus Instructor-Led Debriefing for Nursing Simulation

Blanca Rueda-Medina, PhD; Jacqueline Schmidt-RíoValle, PhD; Emilio González-Jiménez, PhD; Ángel Fernández-Aparicio, RN; María Encarnación Aguilar-Ferrándiz, PhD; María Correa-Rodríguez, PhD

Abstract

Background:

Debriefing is the reflective process following the simulation experience. We aimed to compare the debriefing assessment and debriefing satisfaction perceived by nursing students who underwent different debriefing methods.

Method:

An experimental study with three groups (instructor-led debriefing, peer debriefing, and combined debriefing) was performed with 177 nursing students. Differences in debriefing satisfaction and debriefing assessment were evaluated using the Clinical Experience Simulation scale, a Visual Analogue scale (VAS), and the Debriefing Assessment for Simulation in Healthcare (DASH).

Results:

VAS scores for satisfaction differed significantly between the instructor-led debriefing, peer debriefing, and combined debriefing groups. Scores on the Clinical Experience Simulation scale were significantly higher in the combined debriefing group than in the instructor-led debriefing group. The total DASH score was significantly higher in the combined debriefing group than in the instructor-led debriefing group, and in the instructor-led debriefing group than in the peer debriefing group.

Conclusion:

Combining peer debriefing and instructor-led debriefing after a simulation session improves debriefing satisfaction and the perceived debriefing assessment among nursing students. [J Nurs Educ. 2021;60(2):90–95.]

Simulation is becoming more common in the nursing profession because it offers an opportunity to practice and learn a range of skills without exposing patients to risk (Baptista et al., 2016; George et al., 2020). Students can develop decision-making and psychomotor skills, and simulation provides a safe and effective tool for preparing them for nursing practice (Morgan et al., 2018).

Debriefing is the reflective process that enables students to analyze, synthesize, and reflect on their performance (Kim & Yoo, 2020). Debriefing is an essential component of simulation because it has been proposed that it is where most of the learning takes place in a simulation experience (Padden-Denmead et al., 2016).

It has been demonstrated that debriefing following a simulation experience helps to improve students' comprehension, knowledge acquisition, critical thinking, and clinical performance ability (Abatzis & Littlewood, 2015; Ali & Miller, 2018). To date, different debriefing approaches have been developed (Gaylle, 2019; Lapum et al., 2019; Ostovar et al., 2018; Ryall et al., 2016; Schuler, 2020; Zhang et al., 2019). Instructor-led debriefing is a traditional debriefing involving an expert facilitator who guides the students in a supportive, respectful, and reflective environment (Ryoo & Ha, 2015). Because this approach requires the presence of an experienced instructor, students may feel intimidated or stressed during a debriefing session for fear of judgement. Recently, peer debriefing, a method in which participants are debriefed by self or by a peer, has been established as a valuable strategy that might improve team competence and encourage students to review their own strengths and weaknesses (Oikawa et al., 2016). Peer debriefing may also promote self-confidence and self-efficacy because students must reflect on their performance (Kang & Yu, 2018; Kim & De Gagne, 2018). However, previous studies comparing peer debriefing with the gold standard of an expert evaluation—an instructor-led debriefing—have reported inconclusive findings (Boet et al., 2011; Boet et al., 2013; Kim & De Gagne, 2018; Oikawa et al., 2016; Ostovar et al., 2018; Verkuyl et al., 2018).

This study aimed to compare the debriefing assessment (strategies and techniques used to conduct debriefings) and debriefing satisfaction perceived by nursing students who experienced instructor-led debriefing, peer debriefing, and combined debriefing, respectively. Our hypothesis was that the combined debriefing would improve the satisfaction and debriefing assessment of nursing students.

Method

Study Design and Participants

A study with an experimental design was performed using a convenience sample of 177 students in a bachelor's in nursing degree program at a public Spanish university. Students with no prior exposure to simulation were encouraged to become involved in this study. Participants provided written informed consent after receiving information about the purpose of the study. It was explained to the students that participation was voluntary and that those who chose not to participate would not be academically disadvantaged. Ethical approval was obtained from the Institutional Review Board of the University of Granada.

Interventions

One week prior to the simulation session, a presentation providing the theoretical background to the scenario was emailed to all students who expressed a desire to participate in the study. The simulation scenario was designed to evaluate the technique used to administer enteral feeding. Students were randomly assigned to one of the three debriefing methods (instructor-led, peer, or combined) based on their scheduled class, and a maximum of 15 participants were admitted to each session. Each simulation with its subsequent debriefing lasted 90 minutes (a 15-minute simulation and a 75-minute debriefing session). The three debriefing methods were conducted as follows:

  • Instructor-led debriefing group: To minimize intragroup differences, the same instructor was involved in all simulation sessions and the same scenarios were used. After the simulation, students reviewed a video recording of their intervention and the instructor encouraged them to discuss their team performance following a structured gather-analyze-summarize (GAS) method.
  • Peer debriefing group: The instructor provided each of the students with a self-debriefing questionnaire structured according to the GAS method. The instructor then left and was not present for the peer debriefing. The participants reviewed the video of their performance scenario by themselves.
  • Combined (peer and instructor-led) debriefing group: The participants first underwent peer debriefing using a structured questionnaire, according to the GAS method. This was followed by an instructor-led debriefing session in which, once the participants had reviewed the video recording of their intervention, the instructor encouraged them to analyze their team performance according to the GAS method.

Because instructor-led debriefing is considered the gold standard, once the participants in the peer debriefing group had completed the questionnaires, they also received an instructor-led team debriefing. After their respective debriefings, all participants completed a posttest debriefing assessment and debriefing satisfaction questionnaire, which took approximately 15 minutes. The study was conducted between October 2018 and February 2019. A sociodemographic questionnaire was used to collect information on age and gender.

Debriefing Satisfaction

Satisfaction was assessed using the Clinical Experience Simulation scale (Baptista et al., 2014), which consists of 17 items that allow students to rate the simulation experience on a 10-point Likert-type scale ranging from 1 (lowest level of satisfaction) to 10 (highest level of satisfaction). The Clinical Experience Simulation scale had a Cronbach's alpha of .891. Debriefing satisfaction was also measured using a 10-cm Visual Analogue scale (VAS). Scores are recorded by making a handwritten mark on a 10-cm line representing a continuum from 0 (not satisfied) to 10 (very satisfied). The VAS is treated as an interval-level scale and can therefore be subjected to arithmetical operations. Higher scores indicate greater satisfaction with the simulation experience.
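
For illustration, the short Python sketch below shows how an internal-consistency coefficient such as the Cronbach's alpha reported above can be computed from an item-by-respondent matrix. The rating data, variable names, and 1 to 10 response range are assumptions for demonstration only, not the study's data; with the real item-level responses, the same formula would yield the alpha values reported for the satisfaction scale and the DASH.

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents x n_items) matrix of ratings."""
        k = items.shape[1]                          # number of items (17 for the satisfaction scale)
        item_vars = items.var(axis=0, ddof=1)       # per-item variances
        total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Hypothetical data: 177 students rating 17 items from 1 (lowest) to 10 (highest satisfaction)
    rng = np.random.default_rng(0)
    ratings = rng.integers(1, 11, size=(177, 17)).astype(float)
    print(f"Cronbach's alpha = {cronbach_alpha(ratings):.3f}")  # random, uncorrelated data gives alpha near 0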

Debriefing Assessment

The Debriefing Assessment for Simulation in Healthcare (DASH; student version) instrument was used to evaluate the strategies and techniques used to conduct debriefings by examining specific behaviors (Simon et al., 2010). The DASH comprises six elements: establishes an engaging learning environment; maintains an engaging learning environment; structures the debriefing in an organized way; provokes engaging discussions; identifies and explores performance gaps; and helps trainees to achieve or sustain good future performance. Students rated the effectiveness of the debriefing sessions on a 7-point scale (1 = extremely ineffective and 7 = extremely effective). Because each element is rated across several specific behaviors, the sum of the six element scores yields a total DASH score with a maximum of 161 points. The Cronbach's alpha for the DASH was .909.
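
The scoring logic can be made explicit with a brief sketch: behavior-level ratings of 1 to 7 are summed into the six element scores and then into the total. The grouping shown here (4, 5, 4, 5, 2, and 3 behaviors per element, 23 in total) is a hypothetical split chosen only so the arithmetic matches the stated 7-point ratings and 161-point maximum; it is not taken from the instrument.

    # Hypothetical grouping of rated behaviors into the six DASH elements (an assumption for
    # illustration; only the 7-point rating and the 161-point maximum come from the text).
    ELEMENT_ITEMS = {
        "Establishes an engaging learning environment": 4,
        "Maintains an engaging learning environment": 5,
        "Structures the debriefing in an organized way": 4,
        "Provokes engaging discussions": 5,
        "Identifies and explores performance gaps": 2,
        "Helps trainees achieve or sustain good future performance": 3,
    }

    def score_dash(ratings):
        """Sum behavior ratings (1-7, in element order) into element scores and a total."""
        assert len(ratings) == sum(ELEMENT_ITEMS.values())
        scores, i = {}, 0
        for element, n_items in ELEMENT_ITEMS.items():
            scores[element] = sum(ratings[i:i + n_items])
            i += n_items
        scores["Total"] = sum(scores[e] for e in ELEMENT_ITEMS)
        return scores

    # A student who rates every behavior 7 (extremely effective) reaches the maximum of 161.
    print(score_dash([7] * 23)["Total"])  # -> 161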

Data Analysis

Data were analyzed using SPSS® version 22.0 software. Continuous variables were reported as mean (SD) and categorical variables as frequencies and percentages. Kolmogorov-Smirnov tests were conducted to assess normality, and homogeneity of variance was tested using Levene's test of equality of error variances. The Kruskal-Wallis (K-W) test was used to compare nonnormally distributed numeric variables between the groups. For significant tests, Bonferroni post hoc analysis was performed for the pairwise comparisons (instructor-led debriefing, peer debriefing, and combined debriefing); p values < .05 were considered statistically significant. Internal consistency of the scales was determined using Cronbach's alpha.
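
As a rough illustration of this pipeline outside SPSS, the following Python/SciPy sketch runs the same sequence of tests on simulated data: Kolmogorov-Smirnov and Levene checks, a Kruskal-Wallis test across the three groups, and Dunn-type pairwise comparisons of mean ranks with a Bonferroni adjustment. The group sizes, simulated scores, and the use of Dunn-type z tests without a tie correction are assumptions for demonstration; they approximate, rather than reproduce, the SPSS procedures used in the study.

    import numpy as np
    from scipy import stats

    # Hypothetical VAS satisfaction scores for the three debriefing groups (not the study's data)
    rng = np.random.default_rng(1)
    groups = {
        "ID": rng.normal(8.4, 2.2, 51).clip(0, 10),          # instructor-led debriefing
        "SSD": rng.normal(8.4, 1.7, 58).clip(0, 10),         # peer debriefing
        "SSD and ID": rng.normal(9.2, 1.1, 68).clip(0, 10),  # combined debriefing
    }

    # Normality (Kolmogorov-Smirnov against a fitted normal) and homogeneity of variance (Levene)
    for name, x in groups.items():
        ks = stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1)))
        print(f"K-S {name}: p = {ks.pvalue:.3f}")
    print(f"Levene: p = {stats.levene(*groups.values()).pvalue:.3f}")

    # Kruskal-Wallis test across the three groups
    kw = stats.kruskal(*groups.values())
    print(f"Kruskal-Wallis = {kw.statistic:.3f}, p = {kw.pvalue:.3f}")

    # Dunn-type pairwise comparisons on mean ranks, Bonferroni-adjusted (no tie correction here)
    labels, values = list(groups), list(groups.values())
    pooled_ranks = stats.rankdata(np.concatenate(values))
    n_total = len(pooled_ranks)
    mean_ranks, sizes, start = [], [], 0
    for v in values:
        mean_ranks.append(pooled_ranks[start:start + len(v)].mean())
        sizes.append(len(v))
        start += len(v)
    n_pairs = len(values) * (len(values) - 1) // 2
    for i in range(len(values)):
        for j in range(i + 1, len(values)):
            diff = mean_ranks[i] - mean_ranks[j]  # difference in mean ranks
            se = np.sqrt(n_total * (n_total + 1) / 12 * (1 / sizes[i] + 1 / sizes[j]))
            p_adj = min(2 * stats.norm.sf(abs(diff / se)) * n_pairs, 1.0)
            print(f"{labels[i]} vs {labels[j]}: rank diff = {diff:.2f}, Bonferroni p = {p_adj:.3f}")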

Results

The main characteristics of the study participants are shown in Table 1. Most students who enrolled in the study were women (n = 151, 84.8%), and the mean age of the sample was 20.73 (SD = 3.62) years. Using the K-W test, significant differences in debriefing satisfaction and debriefing assessment were found between the three debriefing groups (Table 2). Bonferroni-corrected analyses for multiple comparisons are shown in Table 3.

Total VAS scores for satisfaction differed significantly between the instructor-led debriefing, peer debriefing, and combined debriefing groups (K-W = 8.653; p = .013). Bonferroni post hoc analyses showed that VAS scores were significantly higher in the combined debriefing group than in the peer debriefing group (p = .015) (Table 3). Regarding the Clinical Experience Simulation scale, significant differences were observed between the debriefing groups (K-W = 6.508; p = .039); scores in the combined debriefing group were significantly higher than in the instructor-led debriefing group after correction for multiple testing (p = .032).

For the debriefing assessment evaluated using the DASH, total scores differed significantly between the instructor-led debriefing, peer debriefing, and combined debriefing groups (K-W = 12.781; p = .002). Post hoc analyses showed that the total DASH score was significantly higher in the combined debriefing group than in the instructor-led debriefing group (p = .003), and in the instructor-led debriefing group than in the peer debriefing group (p = .008). After correction for multiple testing, we also observed significant differences between the combined debriefing group and the instructor-led debriefing group for the following DASH elements: establishes an engaging learning environment (p = .008); maintains an engaging learning environment (p = .003); and structures the debriefing in an organized way (p = .015). Bonferroni post hoc analyses also showed significant differences between the instructor-led debriefing group and the peer debriefing group for the elements maintains an engaging learning environment (p = .003) and provokes engaging discussions (p = .040).


Discussion

The purpose of this study was to investigate the perception of debriefing satisfaction and debriefing assessment among nursing students who received instructor-led debriefing, peer debriefing, or combined debriefing sessions. No previous research has compared the effect of these three debriefing models after a simulation scenario. Our study demonstrated that the satisfaction level of nursing students who participated in simulated clinical sessions involving a combined debriefing was significantly higher than that of those who participated solely in either peer debriefing or instructor-led debriefing. Furthermore, students in the combined group reported higher DASH scores than those in the instructor-led debriefing group.

In agreement with our findings, a quasi-experimental study of 123 nursing students that compared combined debriefing with instructor-led debriefing alone reported significantly higher debriefing satisfaction scores in the combined group (Kang & Yu, 2018). This is the only previous study to evaluate the effect of combined debriefing versus instructor-led debriefing in simulation-based nursing education. However, it should be noted that, in the study by Kang and Yu (2018), debriefing satisfaction was assessed only by the VAS, whereas our study also used the Clinical Experience Simulation scale (Baptista et al., 2014). As Kang and Yu (2018) pointed out, students in the combined group may have a higher satisfaction level because they have time to review possible errors with their peers prior to an instructor-led debriefing. Furthermore, because the peer debriefing session prior to instructor-led debriefing enables students to review their own weaknesses, it may increase their perceived satisfaction.

Additionally, our study revealed that the combined debriefing enhanced the perceived debriefing assessment score. We found that for three of the six DASH elements, nursing students in the combined debriefing group reported the highest scores. Similarly, Kang and Yu (2018) also showed that combined debriefing improves debriefing assessments in nursing students. It has been reported that within-team debriefing (a reflective process led by the teams themselves rather than by an external trainer) can provide, in addition to a formative self-assessment, a formative peer review component (Boet et al., 2011). Boet et al. (2011) examined the effectiveness of self-debriefing compared with instructor-led debriefing among 50 anesthesiology residents and observed significant improvements in performance in both groups but no differences between the two groups. Although our study did not compare differences in performance, our findings suggest that peer debriefing followed by an instructor-led debriefing session improves the debriefing assessment. Similarly, Zentz et al. (2014) reported that peer-assisted learning is an effective educational strategy for imparting nursing skills that also strengthens the confidence of students.

Of note, all groups reported high scores on the Clinical Experience Simulation scale, the VAS, and the DASH. These results show that all students, regardless of the type of debriefing, had a highly positive experience. They support the available evidence suggesting that, independent of the method, debriefing promotes more effective learning and enhances student satisfaction (Fey & Jenkins, 2015). However, it should be noted that in our study, students in the instructor-led debriefing group reported the lowest scores; this could be because peer debriefing gives participants the opportunity to reflect on their own mistakes without embarrassment by talking them through with their peers first (Kang & Yu, 2018).

In contrast to our findings, some previous studies have reported that peer debriefing is a less accurate method, whereas others have indicated that, for health care students, the debriefing experiences were similar regardless of the method (Boet et al., 2011; Ha & Song, 2015; Verkuyl et al., 2018). Ha and Song (2015) analyzed the effect of structured self-debriefing versus instructor-led debriefing in a sample of 76 nursing students and reported no significant differences in educational satisfaction between the two groups. Inconsistencies among the reported findings could be due to differences in sample characteristics (sample size, year of the bachelor's degree program, and simulation scenarios).

This study has several limitations that must be addressed. First, it involved nursing students taking part in a simulated enteral feeding scenario; therefore, our findings may not be generalizable to students of other degree programs or to other simulation scenarios. Additionally, we cannot disregard the possibility that the students who consented to participate in this study had a more positive perception of clinical simulation than those who did not. Another limitation is that it was difficult to allocate the recommended time for the combined debriefing after the simulation (Rudolph et al., 2006). Because of the schedule of regular teaching sessions, the combined debriefing had the same total duration as the other debriefing methods. Nevertheless, our findings indicate that peer debriefing combined with instructor-led debriefing seems to be the best method, even though less time was available for each part of the debriefing. This suggests that the value lay in completing both methods of debriefing rather than in the time allotted to each. The students who received combined debriefing sessions may have gained more insight from the peer debriefing, so that when the instructor went through the structured debriefing, they were more receptive to feedback and reflection.

Conclusion

Our findings highlight that combining peer debriefing and instructor-led debriefing after a simulation session improves the debriefing satisfaction and the perceived debriefing assessment among nursing students. Peer debriefing should be combined with instructor-led debriefing within the simulation curriculum because adding peer debriefing to traditional debriefing encourages nursing students to analyze their own strengths and weaknesses, giving them the opportunity to reflect on their performance with their peers and increasing their satisfaction.

References

  • Abatzis, V. T. & Littlewood, K. E. (2015). Debriefing in simulation and beyond. International Anesthesiology Clinics, 53(4), 151–162 doi:10.1097/AIA.0000000000000070 [CrossRef] PMID:26397791
  • Ali, A. A. & Miller, E. T. (2018). Effectiveness of video-assisted debriefing in health education: An integrative review. Journal of Nursing Education, 57(1), 14–20 doi:10.3928/01484834-20180102-04 [CrossRef] PMID:29381155
  • Baptista, R. C., Paiva, L. A., Gonçalves, R. F., Oliveira, L. M., Pereira, M. F. & Martins, J. C. (2016). Satisfaction and gains perceived by nursing students with medium and high-fidelity simulation: A randomized controlled trial. Nurse Education Today, 46, 127–132 doi:10.1016/j.nedt.2016.08.027 [CrossRef] PMID:27639211
  • Baptista, R. C. N., Martins, J. C. A., Pereira, M. F. C. R. & Mazzo, A. (2014). Satisfacción de los estudiantes con las experiencias clínicas simuladas: Validación de escala de evaluación [Students' satisfaction with simulated clinical experiences: Validation of an assessment scale]. Revista Latino-Americana de Enfermagem, 22(5), 709–715 doi:10.1590/0104-1169.3295.2471 [CrossRef] PMID:25493664
  • Boet, S., Bould, M. D., Bruppacher, H. R., Desjardins, F., Chandra, D. B. & Naik, V. N. (2011). Looking in the mirror: Self-debriefing versus instructor debriefing for simulated crises. Critical Care Medicine, 39(6), 1377–1381 doi:10.1097/CCM.0b013e31820eb8be [CrossRef] PMID:21317645
  • Boet, S., Bould, M. D., Sharma, B., Revees, S., Naik, V. N., Triby, E. & Grantcharov, T. (2013). Within-team debriefing versus instructor-led debriefing for simulation-based education: A randomized controlled trial. Annals of Surgery, 258(1), 53–58 doi:10.1097/SLA.0b013e31829659e4 [CrossRef] PMID:23728281
  • Fey, M. K. & Jenkins, L. S. (2015). Debriefing practices in nursing education programs: Results from a national study. Nursing Education Perspectives, 36(6), 361–366 doi:10.5480/14-1520 [CrossRef] PMID:26753294
  • Gaylle, D. (2019). In-simulation debriefing increases therapeutic communication skills. Nurse Educator, 44(6), 295–299 doi:10.1097/NNE.0000000000000643 [CrossRef] PMID:30640799
  • George, T. P., Gainey, K. L., Kershner, S. H., Weaver, D. L. & Hucks, J. M. (2020). Junior and senior nursing students: A near-peer simulation experience. Journal of Nursing Education, 59(1), 54–56 doi:10.3928/01484834-20191223-13 [CrossRef] PMID:31945178
  • Ha, E.-H. & Song, H.-S. (2015). The effects of structured self-debriefing using on the clinical competency, self-efficacy, and educational satisfaction in nursing students after simulation. Journal of Korean Academic Society of Nursing Education, 21(4), 445–454 doi:10.5977/jkasne.2015.21.4.445 [CrossRef]
  • Kang, K. & Yu, M. (2018). Comparison of student self-debriefing versus instructor debriefing in nursing simulation: A quasi-experimental study. Nurse Education Today, 65, 67–73 doi:10.1016/j.nedt.2018.02.030 [CrossRef] PMID:29533836
  • Kim, S. S. & De Gagne, J. C. (2018). Instructor-led vs. peer-led debriefing in preoperative care simulation using standardized patients. Nurse Education Today, 71, 34–39 doi:10.1016/j.nedt.2018.09.001 [CrossRef] PMID:30218850
  • Kim, Y. J. & Yoo, J. H. (2020). The utilization of debriefing for simulation in healthcare: A literature review. Nurse Education in Practice, 43, 102698 doi:10.1016/j.nepr.2020.102698 [CrossRef] PMID:32004851
  • Lapum, J. L., Verkuyl, M., Hughes, M., Romaniuk, D., McCulloch, T. & Mastrilli, P. (2019). Self-debriefing in virtual simulation. Nurse Educator, 44(6), E6–E8 doi:10.1097/NNE.0000000000000639 [CrossRef] PMID:30585886
  • Morgan, J., Green, V. & Blair, J. (2018). Using simulation to prepare for clinical practice. The Clinical Teacher, 15(1), 57–61 doi:10.1111/tct.12631 [CrossRef] PMID:28371049
  • Oikawa, S., Berg, B., Turban, J., Vincent, D., Mandai, Y. & Birkmire-Peters, D. (2016). Self-debriefing vs instructor debriefing in a pre-internship simulation curriculum: Night on call. Hawai'i Journal of Medicine & Public Health: A Journal of Asia Pacific Medicine & Public Health, 75(5), 127–132 PMID:27239391
  • Ostovar, S., Allahbakhshian, A., Gholizadeh, L., Dizaji, S. L., Sarbakhsh, P. & Ghahramanian, A. (2018). Comparison of the effects of debriefing methods on psychomotor skills, self-confidence, and satisfaction in novice nursing students: A quasi-experimental study. Journal of Advanced Pharmaceutical Technology & Research, 9(3), 107–112 doi:10.4103/japtr.JAPTR_291_18 [CrossRef] PMID:30338237
  • Padden-Denmead, M. L., Scaffidi, R. M., Kerley, R. M. & Farside, A. L. (2016). Simulation with debriefing and guided reflective journaling to stimulate critical thinking in prelicensure baccalaureate degree nursing students. Journal of Nursing Education, 55(11), 645–650 doi:10.3928/01484834-20161011-07 [CrossRef] PMID:27783819
  • Rudolph, J. W., Simon, R., Dufresne, R. L. & Raemer, D. B. (2006). There's no such thing as “nonjudgmental” debriefing: A theory and method for debriefing with good judgment. Simulation in Healthcare, 1(1), 49–55 doi:10.1097/01266021-200600110-00006 [CrossRef] PMID:19088574
  • Ryall, T., Judd, B. K. & Gordon, C. J. (2016). Simulation-based assessments in health professional education: A systematic review. Journal of Multidisciplinary Healthcare, 9, 69–82 PMID:26955280
  • Ryoo, E. N. & Ha, E.-H. (2015). The importance of debriefing in simulation-based learning. CIN: Computers, Informatics, Nursing, 33(12), 538–545 PMID:26587701
  • Schuler, M. S. (2020). Simulation debriefing using Kim's critical reflective inquiry model. Nurse Educator, 45, 272 Advance online publication. doi:10.1097/NNE.0000000000000819 [CrossRef] PMID:32097241
  • Simon, R., Raemer, D. & Rudolph, J. (2010). Debriefing assessment for simulation in healthcare. Center for Medical Simulation.
  • Verkuyl, M., Atack, L., McCulloch, T., Liu, L., Betts, L., Lapum, J. L., Hughes, M., Mastrilli, P. & Romaniuk, D. (2018). Comparison of debriefing methods after a virtual simulation: An experiment. Clinical Simulation in Nursing, 19, 1–7 doi:10.1016/j.ecns.2018.03.002 [CrossRef]
  • Zentz, S. E., Kurtz, C. P. & Alverson, E. M. (2014). Undergraduate peer-assisted learning in the clinical setting. Journal of Nursing Education, 53(3), S4–S10 doi:10.3928/01484834-20140211-01 [CrossRef] PMID:24512330
  • Zhang, H., Mörelius, E., Goh, S. H. L. & Wang, W. (2019). Effectiveness of video-assisted debriefing in simulation-based health professions education: A systematic review of quantitative evidence. Nurse Educator, 44(3), E1–E6 doi:10.1097/NNE.0000000000000562 [CrossRef] PMID:30015683

Table 1: Participants' Debriefing Satisfaction and Debriefing Assessment in Nursing Students (N = 177)

Variable | Mean (SD)
Satisfaction-VAS | 8.72 (1.72)
ESECS | 141.90 (18.73)
DASH
  Establishes an engaging learning environment | 22.51 (4.22)
  Maintains an engaging learning environment | 30.06 (4.89)
  Structures the debriefing in an organized way | 22.89 (3.69)
  Provokes engaging discussions | 29.75 (4.87)
  Identifies and explores performance gaps | 11.85 (2.12)
  Helps trainees achieve or sustain good future performance | 18.19 (3.21)
  Total score | 135.25 (20.02)
Note. VAS = Visual Analogue scale; ESECS = Clinical Experience Simulation scale; DASH = Debriefing Assessment for Simulation in Healthcare.

Table 2: Differences in Debriefing Satisfaction and Debriefing Assessment Between the Groups in Nursing Students (N = 177)

Variable | ID (n = 51) Mean (SD) | SSD (n = 58) Mean (SD) | SSD and ID (n = 68) Mean (SD) | ID Mean Rank | SSD Mean Rank | SSD and ID Mean Rank | Kruskal-Wallis Value | df | p Value
Satisfaction-VAS | 8.37 (2.24) | 8.40 (1.74) | 9.25 (1.09) | 74.44 | 69.02 | 90.53 | 8.653 | 2 | .013
ESECS | 136.07 (20.81) | 140.71 (20.85) | 146.60 (13.95) | 69.18 | 82.96 | 92.51 | 6.508 | 2 | .039
DASH
  Establishes an engaging learning environment | 21.16 (4.53) | 22.72 (4.25) | 23.22 (3.88) | 64.47 | 83.23 | 88.62 | 7.307 | 2 | .026
  Maintains an engaging learning environment | 28.23 (5.19) | 30.46 (5.41) | 30.93 (4.01) | 59.43 | 88.18 | 88.29 | 12.28 | 2 | .002
  Structures the debriefing in an organized way | 21.37 (3.99) | 23.32 (3.90) | 23.51 (3.09) | 61.47 | 88.52 | 86.73 | 10.071 | 2 | .007
  Provokes engaging discussions | 27.88 (5.30) | 30.03 (5.25) | 30.66 (3.96) | 62.09 | 85.83 | 88.34 | 9.440 | 2 | .009
  Identifies and explores performance gaps | 11.35 (2.46) | 11.98 (2.25) | 12.04 (1.73) | 70.77 | 86.00 | 82.64 | 2.844 | 2 | .241
  Helps trainees achieve or sustain good future performance | 17.49 (3.35) | 18.30 (3.52) | 18.52 (2.84) | 65.44 | 87.59 | 84.87 | 6.489 | 2 | .039
  Total score | 127.49 (21.95) | 136.82 (21.53) | 138.88 (16.26) | 58.98 | 87.76 | 88.90 | 12.718 | 2 | .002
Note. ID = instructor-led debriefing group; SSD = peer debriefing group; SSD and ID = combined debriefing group; VAS = Visual Analogue scale; ESECS = Clinical Experience Simulation scale; DASH = Debriefing Assessment for Simulation in Healthcare.

Table 3: Pairwise Comparisons in Debriefing Satisfaction and Debriefing Assessment Between the Groups in Nursing Students (N = 177)

Satisfaction and Assessment Variable | Pairwise Comparison | Test Statistic | Bonferroni Test
Satisfaction-VAS | SSD-ID | 5.423 | 1.000
Satisfaction-VAS | SSD-SSD and ID | −21.51 | 0.015
Satisfaction-VAS | ID-SSD and ID | −16.09 | 0.139
ESECS | SSD-ID | −13.77 | 0.462
ESECS | SSD-SSD and ID | −9.553 | 0.837
ESECS | ID-SSD and ID | −23.330 | 0.032
DASH: Establishes an engaging learning environment | SSD-ID | −18.556 | 0.160
DASH: Establishes an engaging learning environment | SSD-SSD and ID | −5.389 | 1.000
DASH: Establishes an engaging learning environment | ID-SSD and ID | −23.945 | 0.008
DASH: Maintains an engaging learning environment | SSD-ID | −28.750 | 0.003
DASH: Maintains an engaging learning environment | SSD-SSD and ID | −0.111 | 1.000
DASH: Maintains an engaging learning environment | ID-SSD and ID | −28.861 | 0.004
DASH: Structures the debriefing in an organized way | SSD-ID | −27.055 | 0.014
DASH: Structures the debriefing in an organized way | SSD-SSD and ID | 1.789 | 1.000
DASH: Structures the debriefing in an organized way | ID-SSD and ID | −25.266 | 0.015
DASH: Provokes engaging discussions | SSD-ID | −23.737 | 0.040
DASH: Provokes engaging discussions | SSD-SSD and ID | −2.506 | 1.000
DASH: Provokes engaging discussions | ID-SSD and ID | −26.243 | 0.011
DASH: Helps trainees achieve or sustain good future performance | SSD-ID | −22.148 | 0.059
DASH: Helps trainees achieve or sustain good future performance | SSD-SSD and ID | 2.717 | 1.000
DASH: Helps trainees achieve or sustain good future performance | ID-SSD and ID | −19.431 | 0.089
DASH: Total score | SSD-ID | −28.783 | 0.008
DASH: Total score | SSD-SSD and ID | −1.136 | 1.000
DASH: Total score | ID-SSD and ID | −29.919 | 0.003
Note. ID = instructor-led debriefing group; SSD = peer debriefing group; SSD and ID = combined debriefing group; VAS = Visual Analogue scale; ESECS = Clinical Experience Simulation scale; DASH = Debriefing Assessment for Simulation in Healthcare.
Authors

Dr. Rueda-Medina and Dr. Correa-Rodríguez are Faculty of Health Sciences, Department of Nursing, University of Granada, and Biohealth Research Institute; Dr. Schmidt-RíoValle, Dr. González-Jiménez, and Mr. Fernández-Aparicio are Faculty of Health Sciences, Department of Nursing, University of Granada; and Dr. Encarnación Aguilar-Ferrándiz is Faculty of Health Sciences, Department of Physical Therapy, University of Granada and Biohealth Research Institute Granada, Granada, Spain.

The authors have disclosed no potential conflicts of interest, financial or otherwise.

The authors thank the students who participated in this study and the University of Granada, which approved this study.

Address correspondence to María Encarnación Aguilar-Ferrándiz, PhD, Faculty of Health Sciences, Department of Physical Therapy, University of Granada, 60 Avenida de la Ilustración, Granada, Spain, 18060, email: e_aguilar@ugr.es.

Received: February 14, 2020
Accepted: June 17, 2020

10.3928/01484834-20210120-06
