Dr. Rutherford-Hemming is Assistant Professor, University of Pittsburgh School of Nursing, Pittsburgh, Pennsylvania.
The author has disclosed no potential conflicts of interest, financial or otherwise.
The author thanks Dr. Amy Rose, Dr. Laurel Jeris, Dr. Thomas J. Smith, and Dr. Jeanette Rossetti, Northern Illinois University, for their support and guidance in this research.
Address correspondence to Tonya Rutherford-Hemming, EdD, RN, ANP-BC, Assistant Professor, University of Pittsburgh School of Nursing, 3500 Victoria Street, 415 Victoria Building, Office 421B, Pittsburgh, PA 15261; e-mail: firstname.lastname@example.org.
Despite the growing popularity of simulation in the nursing curriculum, research on simulation with standardized patients is limited. Nursing has used simulation for more than 35 years (Frejlach & Corcoran, 1971); however, few studies pertain to the use of standardized patients in the education of nurse practitioner students. The studies that do exist suggest that this type of simulation is used most often for evaluation purposes (Bramble, 1994; Ebbert & Connors, 2004; Gibbons et al., 2002; Khattab & Rawlings, 2001; McDowell, Nardini, Negley, & White, 1984; O’Connor, Albert, & Thomas, 1999; Vessey & Huss, 2002). Furthermore, studies of student satisfaction with standardized patient experiences indicated that most students thought the learning experience was valuable, improved their clinical skills and competence, reinforced their knowledge, and provided valuable feedback (Becker, Rose, Berg, Park, & Shatzer, 2006; Bramble, 1994; Ebbert & Connors, 2004; Gibbons et al., 2002; Theroux & Pearce, 2006).
Nonetheless, little research has examined whether the use of simulation with standardized patients by nurse practitioner students facilitates a transfer of learning in clinical competency from the laboratory to actual clinical practice (May, Hyun Park, & Lee, 2010; Scherer, Bruce, Graves, & Erdley, 2003). Starkweather and Kardong-Edgren (2008) found that student participation in simulation increased safety and decreased medical errors in the clinical setting, and other research has compared transfer of learning between two groups (Bramble, 1994). Still, the research has not evaluated whether or how transfer of learning occurred, leaving a gap in the literature and unanswered questions about transfer of learning as it pertains to simulation with standardized patients. Until this gap is closed, the notion that simulation helps students acquire competencies more reliably and produces reliable transfer to the clinical setting remains hypothetical.
The purpose of the current study was to examine whether nurse practitioner students who participated in a simulation experience with standardized patients prior to entering their first clinical rotation demonstrated an increase in clinical competency in the clinical practice setting. The following questions guided this study:
- To what extent does the clinical competency of students in the standardized patient simulation setting differ from their clinical competency in the clinical setting?
- Is there a relationship between the clinical competency of students in the standardized patient setting and their clinical competency in the clinical setting?
- How do students describe the effects of a standardized patient simulation on their clinical competence?
The theoretical perspective guiding this study was the transfer of learning. Gagne (1965) observed that individuals rely on prior knowledge when learning complex skills, which indicates that learning is cumulative in nature. According to Ellis (1965), transfer of learning occurs when “experience or performance on one task influences performance on some subsequent task” (p. 3). McKeachie (1987) defined transfer as “the use of previous learning in a situation somewhat different from the situation in which learning took place” (p. 707), and Prawat (1989) defined transfer as “the ability to draw on or access one’s intellectual resources in situations where those resources may be relevant” (p. 1).
This study used a descriptive research design with a small sample (n = 14) of nurse practitioner students who volunteered and participated in a standardized patient simulation prior to entering their first semester of clinical courses. Students were observed in the simulation setting and then in the clinical setting, and they answered written questions regarding whether and how the standardized patient experience affected their learning. The setting for this study was a large private university in the midwestern United States. Prior to the start of the study, institutional review board approval of all forms and procedures was obtained from this university, from the home university of the sample participants, and from all participating clinical sites (hospitals).
Data Collection and Instrumentation
Data were collected by digitally recording each student as he or she interviewed and performed a physical examination on a standardized patient. Additional data were then collected by directly observing the student with a patient in the clinical setting. Only one researcher (T.R-H.) collected the data (i.e., performed the observations).
The researcher developed a checklist, History Taking and Physical Examination Competency, which was used by the researcher at two points in the study—Time 1: watching a videotape of the student in the simulation laboratory with the standardized patient, and Time 2: observing the student in the clinical setting with a patient. The History Taking and Physical Examination Competency checklist was developed to assess the clinical competency (history taking and physical examination skills) of the student. The checklist operationalizes the standard objectives and outcomes currently used in history taking and physical examination courses for nurse practitioner students. It was sent to an expert panel of nursing faculty for evaluation of content and to increase the validity of the checklist, and changes were made based on the feedback received.
For the purpose of analyzing student performance, the History Taking and Physical Examination Competency checklist was divided into 10 domains. The domains were Initiating the Interview, Facilitating Skills, History of Present Illness, Past Medical History, Family History, Social History, Review of Systems, Transition to the Physical Examination, Clinical Courtesy, and Physical Examination.
Six open-ended questions were posed to the students during the study. The questions were:
- Did the standardized patient experience assist you in your clinical knowledge and skills in performing a history and physical examination? If so, how? If not, why not?
- In what ways do you think your history taking and physical examination skills with a patient will change now that you have completed the standardized patient simulation?
- Describe the most beneficial aspect of having a standardized patient experience.
- Describe the least beneficial aspect of having a standardized patient experience.
- What additional comments do you have about the standardized patient experience?
- How did the standardized patient experience prepare you for history taking and physical examination now that you have completed these on an actual patient?
Data in the history taking assessment section of the History Taking and Physical Examination Competency checklist were coded as 1 = yes, 0 = no, and 99 = not applicable. For the physical examination assessment section of the checklist, student performance was coded as 1 = attempted, satisfactory and 2 = attempted, below satisfactory or did not attempt. Scores for each domain were computed by taking the mean of the items for each domain. Total competency scores were computed as the mean of the domain scores for all domains.
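The scoring rules above can be sketched in a few lines. The following is an illustrative reconstruction, not the authors' code; in particular, it assumes that items coded 99 (not applicable) were excluded before averaging, and the item values shown are hypothetical.

```python
def domain_score(item_codes):
    """Mean of the scored items in one domain, ignoring 99 (not applicable)."""
    scored = [c for c in item_codes if c != 99]
    return sum(scored) / len(scored)

def total_competency(domain_scores):
    """Total competency score: the mean of the domain scores."""
    return sum(domain_scores) / len(domain_scores)

# Hypothetical history-taking items for three domains of one student
# (1 = yes, 0 = no, 99 = not applicable)
domains = [
    [1, 1, 0, 99],
    [1, 0, 1, 1],
    [1, 1, 1, 0, 99],
]
scores = [domain_score(d) for d in domains]
print(scores, total_competency(scores))
```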
Student responses to the open-ended questions were read, and traditional thematic analysis techniques (Miles & Huberman, 1994) were used to analyze the data. The researcher identified meaningful segments and units, and a categorization scheme, with corresponding codes, was developed to organize and sort the data. Finally, themes were identified as they emerged. The data were reviewed an additional time in their entirety to ensure all themes had been identified.
The age of the participants ranged from 25 to 56 years, with the average age being 35 years. Four men and 10 women participated in the study.
To assess growth in overall competency, a paired samples t test was conducted, and p values <0.05 were considered statistically significant. Results showed that, on average, participants demonstrated significant and substantial growth in overall competency (Mdiff = 0.08, SE = 0.02, t(13) = 3.03, p = 0.01, r = 0.64). When the competency domains were considered individually, paired samples t tests revealed that participants showed significant and substantial growth in Facilitating Skills (Mdiff = 0.12, SE = 0.05, t(13) = 2.50, p = 0.03, r = 0.57), History of Present Illness (Mdiff = 0.13, SE = 0.04, t(13) = 3.41, p < 0.01, r = 0.69), and Physical Examination (Mdiff = 0.24, SE = 0.07, t(13) = 3.68, p < 0.01, r = 0.71). Growth in the remaining domain scores was not statistically significant. The Table summarizes the competency growth scores of participants.
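The effect sizes reported above can be reproduced from the t statistics alone using the standard conversion r = sqrt(t^2 / (t^2 + df)). This is an illustrative check, not the authors' analysis code:

```python
import math

def r_from_t(t, df):
    """Effect size r for a paired t test: r = sqrt(t^2 / (t^2 + df))."""
    return math.sqrt(t * t / (t * t + df))

# Reported t values (df = 13 throughout) reproduce the reported effect sizes
for label, t in [("Overall", 3.03),
                 ("Facilitating Skills", 2.50),
                 ("History of Present Illness", 3.41),
                 ("Physical Examination", 3.68)]:
    print(label, round(r_from_t(t, 13), 2))
# Overall 0.64, Facilitating Skills 0.57,
# History of Present Illness 0.69, Physical Examination 0.71
```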
Because the difference in domain scores for Clinical Courtesy was somewhat skewed, the Wilcoxon signed-rank test was performed to assess the significance of growth. The results indicated that growth was not statistically significant (z = 0.57, p = 0.57, r = 0.15); this result was consistent with the parametric test described above.
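For a z-based nonparametric test such as the Wilcoxon signed-rank test, the effect size is conventionally computed as r = z / sqrt(n). The reported value can be checked directly (an illustrative calculation, not the study's code):

```python
import math

def r_from_z(z, n):
    """Effect size for a z-based test: r = z / sqrt(n)."""
    return z / math.sqrt(n)

print(round(r_from_z(0.57, 14), 2))  # 0.15, matching the reported value
```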
To assess whether significant change occurred in the two binary outcomes (Family History, Review of Systems) across time, McNemar’s test was conducted. No significant change was evident from Time 1 to Time 2.
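McNemar's test uses only the discordant pairs: students who changed from 0 to 1 on an item between Time 1 and Time 2, and students who changed in the other direction. A minimal sketch of the exact form of the test follows; the counts are hypothetical, because the study does not report them:

```python
from math import comb

def mcnemar_exact_p(b, c):
    """Exact two-sided McNemar p value from the two discordant counts."""
    n = b + c
    k = min(b, c)
    # Two-sided exact binomial probability under H0 (p = 0.5), capped at 1
    p = 2 * sum(comb(n, i) for i in range(k + 1)) * 0.5 ** n
    return min(p, 1.0)

# Hypothetical: 5 students gained the item, 2 lost it -> not significant
print(mcnemar_exact_p(5, 2))  # 0.453125
```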
The results for research question two showed a statistically significant correlation between the overall competency scores of students in the simulation laboratory (Time 1) and the overall competency scores of the same students in the clinical setting (Time 2), with r(12) = 0.63, p < 0.05. The Time 1 and Time 2 scores of the Facilitating Skills, Past Medical History, and Physical Examination domains also showed statistically significant correlations.
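As a sanity check, the reported overall correlation can be converted to a t statistic, t = r * sqrt(df / (1 - r^2)), and compared with the two-tailed critical value t(12) = 2.179 at the 0.05 level. This is an illustrative calculation, not part of the original analysis:

```python
import math

def t_from_r(r, df):
    """t statistic for testing a Pearson correlation against zero."""
    return r * math.sqrt(df / (1 - r * r))

t = t_from_r(0.63, 12)
print(round(t, 2))  # 2.81, which exceeds 2.179, so p < 0.05
```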
If the standardized patient exercise did have an effect, one would expect to see growth in knowledge and skills from the simulation laboratory to the clinical bedside. Likewise, if the standardized patient experience did not have an effect, one would not expect to see growth in competency. However, the presence of growth alone, which was investigated in research question one, is not sufficient evidence of the effect of the standardized patient experience. Because this study lacked a control group, other pieces of evidence from research questions two and three are needed to determine whether the growth in competency was related to the standardized patient experience. Therefore, the results of this study should be viewed holistically, using the findings of research questions one, two, and three to gain a better understanding of the effect the simulation experience had on the growth in competency in the students.
The results of research question one indicated that, on average, students showed growth in their overall competency from the standardized patient simulation setting to the clinical setting. When individual domains were analyzed, some showed significant growth in competency (Facilitating Skills, History of Present Illness, Physical Examination). But why was there growth in some domains and not in others? The composition of the sample may explain some of the variability: all participants were RNs who had elected to return to school for an advanced practice degree as nurse practitioners, and as such they would already have practiced some of the competencies in their roles as nurses. The domains Initiating the Interview, Transition to the Physical Examination, and Clinical Courtesy all reflect competencies that nurses use in their daily practice with patients. If the nurse practitioner students who participated in this study were accustomed to performing these skills, that familiarity may partially explain the nonsignificant growth found in these domains.
However, two domains in which significant growth in competency was observed from the simulation laboratory to the clinical setting do not reflect competencies that are part of a nurse's day-to-day practice. Facilitating Skills and History of Present Illness involve skills that are specific to the role of a nurse practitioner and hence represent knowledge and skills newly learned by the students. The results of this study indicate a significant increase in knowledge and skills from the simulation laboratory to the clinical setting in these two domains.
The third domain in which statistically significant growth was observed was Physical Examination. Although, as nurses, all students who participated in this study would already have been doing physical examinations on patients in practice, the detail and specificity of the physical examinations would be greater for the nurse practitioner who, unlike RNs, will be making medical diagnoses based on the history and physical examination of the patient. This may explain the growth observed from the standardized patient simulation to the clinical setting in this domain.
Regarding research question two, findings indicated a significant overall relationship between the competency scores of students in the simulation laboratory (Time 1) and the competency scores of the same students in the clinical setting (Time 2). The relationship was positive: the more competent the students were in the simulation laboratory, the more competent they were in the clinical setting. Although it cannot be known for certain that the simulation experience itself increased the competence of students, the findings suggest that students who are more competent in a simulation experience with standardized patients prior to seeing an actual patient demonstrate more competence in the clinical setting. This lends support to having students come to the simulation laboratory with a solid foundation of knowledge in history taking and physical examination because, in turn, they may demonstrate increased competence when performing these skills in the patient setting.
Three themes emerged from the responses to research question three (how students describe the effects of a standardized patient simulation on their clinical competence). All themes were positive; no student indicated that the simulation experience failed to positively affect his or her clinical competence. The perceptions of increased confidence, increased preparedness, and valuable feedback from the standardized patients, coupled with the findings from research question one (growth in competence scores) and research question two (the more competent students were in the simulation laboratory, the more competent they were in the clinical setting), lend support to the notion that the standardized patient simulation may have contributed to the growth in competency students showed from the simulation laboratory to the clinical setting.
Limitations and Future Research
Because the study lacked a control group, the quantitative findings alone do not permit causal inferences about the effectiveness of standardized patient simulation. The study should therefore be replicated with a larger sample, whose findings may be more representative of the population. An experimental design with control and treatment groups would allow further exploration of whether the growth in competency from the simulation laboratory to the clinical setting is attributable to the simulation intervention. Future research should also include the development of additional instruments to further test learning transfer.
Implications for Practice
Little is known about the direct translation of simulated skills into real-world health care settings in nurse practitioner education. This study represents one of the first systematic investigations of how students transfer simulated skills into direct clinical practice, and its outcomes can help structure templates of simulated instruction that transfer to advanced practice. The results add to the body of evidence needed to develop best-practice approaches for using simulation with standardized patients with nurse practitioner students.
- Becker, K.L., Rose, L.E., Berg, J.B., Park, H. & Shatzer, J.H. (2006). The teaching effectiveness of standardized patients. Journal of Nursing Education, 45, 103–111.
- Bramble, K. (1994). Nurse practitioner education: Enhancing performance through the use of the objective structured clinical assessment. Journal of Nursing Education, 33, 59–65.
- Ebbert, D.W. & Connors, H. (2004). Standardized patient experiences: Evaluation of clinical performance and nurse practitioner student satisfaction. Nursing Education Perspectives, 25, 12–15.
- Ellis, H.C. (1965). The transfer of learning. New York, NY: Macmillan.
- Frejlach, G. & Corcoran, S. (1971). Measuring clinical performance. Nursing Outlook, 19, 270–271.
- Gagne, R.M. (1965). The conditions of learning. New York, NY: Holt, Rinehart & Winston.
- Gibbons, S.W., Adamo, G., Padden, D., Ricciardi, R., Graziano, M., Levine, E. & Hawkins, R. (2002). Clinical evaluation in advanced practice nursing education: Using standardized patients in health assessment. Journal of Nursing Education, 41, 215–221.
- Khattab, A.D. & Rawlings, B. (2001). Assessing nurse practitioner students using a modified objective structured clinical examination (OSCE). Nurse Education Today, 21, 541–550. doi:10.1054/nedt.2001.0590 [CrossRef]
- May, W., Hyun Park, J. & Lee, J.P. (2010). A ten-year review of the literature on the use of standardized patients in teaching and learning: 1996–2005. Medical Teacher, 31, 487–492. doi:10.1080/01421590802530898 [CrossRef]
- McDowell, J., Nardini, D.L., Negley, S.A. & White, J.E. (1984). Evaluating clinical performance using simulated patients. Journal of Nursing Education, 23, 37–39.
- McKeachie, W.J. (1987). Cognitive skills and their transfer: Discussion. International Journal of Educational Research, 11, 707–712. doi:10.1016/0883-0355(87)90010-3 [CrossRef]
- Miles, M.B. & Huberman, A.M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage.
- O’Connor, F.W., Albert, M.L. & Thomas, M.D. (1999). Incorporating standardized patients into a psychosocial nurse practitioner program. Archives of Psychiatric Nursing, 13, 240–247. doi:10.1016/S0883-9417(99)80034-X [CrossRef]
- Prawat, R.S. (1989). Promoting access to knowledge, strategy, and disposition in students: A research synthesis. Review of Educational Research, 59, 1–41.
- Scherer, Y.K., Bruce, S.A., Graves, B.T. & Erdley, W.S. (2003). Acute care nurse practitioner education: Enhancing performance through the use of clinical simulation. AACN Clinical Issues: Advanced Practice in Acute & Critical Care, 14, 331–341. doi:10.1097/00044067-200308000-00008 [CrossRef]
- Starkweather, A.R. & Kardong-Edgren, S. (2008). Diffusion of innovation: Embedding simulation into nursing curriculum. International Journal of Nursing Education Scholarship, 5, 1–11.
- Theroux, R. & Pearce, C. (2006). Graduate students’ experiences with standardized patients as adjuncts for teaching pelvic examinations. Journal of the American Academy of Nurse Practitioners, 18, 429–435. doi:10.1111/j.1745-7599.2006.00158.x [CrossRef]
- Vessey, J.A. & Huss, K. (2002). Using standardized patients in advanced practice nursing education. Journal of Professional Nursing, 18, 29–35. doi:10.1053/jpnu.2002.30898 [CrossRef]
Table: Study Statistics for Mean Difference (Growth) in Competency Scores

| Domain | Mean Difference | df | t Test | Effect Size (r) |
|---|---|---|---|---|
| Initiating the Interview | 0.02 | 13 | 0.37 | 0.10 (small) |
| Facilitating Skills | 0.12 | 13 | 2.50* | 0.57 (large) |
| History of Present Illness | 0.13 | 13 | 3.41** | 0.69 (large) |
| Past Medical History | 0.08 | 13 | 1.37 | 0.36 (moderate) |
| Social History | −0.16 | 13 | −1.60 | 0.41 (moderate) |
| Transition to the Physical Examination | 0.00 | 13 | 0.00 | 0.00 (no effect) |
| Clinical Courtesy | 0.04 | 13 | 0.59 | 0.16 (small) |
| Physical Examination | 0.24 | 13 | 3.68** | 0.71 (large) |

\* p < 0.05; ** p < 0.01.