Journal of Nursing Education

Research Briefs 

Pilot Test of a Three-Station Palliative Care Observed Structured Clinical Examination for Multidisciplinary Trainees

Amy M. Corcoran, MD; Susan Lysaght, PhD, GNP-BC, ACHPN; Denise LaMarra, MS; Mary Ersek, PhD, RN, FAAN

Abstract

Developing effective communication and symptom assessment skills is an important component of palliative care training for advanced practice nurses (APNs) and other health care providers. The purpose of this project was to develop and pilot test a three-station palliative care Observed Structured Clinical Examination (OSCE) for APN students and physician fellows. Three stations included discussing goals of care, breaking bad news, and assessing delirium. Measures included the Interpersonal Skills Tool, Station Checklists, the OSCE Evaluation Tool, and a focus group to solicit learners’ perspectives about the experience. Findings showed that learners evaluated the exercise as appropriate for their level of training and that standardized patients were convincing and provided helpful feedback. Learner self-evaluation means were significantly lower than those of standardized patient or faculty, and faculty raters demonstrated low interrater reliability. Initial evaluation suggests a three-station palliative care OSCE exercise is effective for multidisciplinary learners, although additional refinement is necessary. [J Nurs Educ. 2013;52(5):294–298.]

Dr. Corcoran is Assistant Professor of Clinical Medicine, Department of Medicine, and Ms. LaMarra is Director, Standardized Patient Program, Perelman School of Medicine, Dr. Lysaght is Research Fellow, and Dr. Ersek is Associate Professor, School of Nursing, University of Pennsylvania, Philadelphia, Pennsylvania.

This study was funded through a University of Pennsylvania School of Nursing Educational Innovations Award. Additional funding was provided by a Health Resources and Services Administration grant (HRSA GACA K01HP20493-01-01) to Dr. Corcoran, and a John A. Hartford Building Academic Geriatric Nursing Capacity Scholar Award and Ruth L. Kirschstein Individual NRSA Predoctoral Fellowship (1F31NR013103) to Dr. Lysaght. This article is the result of work supported with resources and the use of facilities at the Philadelphia Veterans Affairs Medical Center. The information or content and conclusions are those of the authors and should not be construed as the official position or policy of, nor should any endorsements be inferred by, the Department of Health and Human Services, Health Resources and Services Administration, Bureau of Health Professions, Veterans Affairs, or the U.S. Government. The authors thank Salimah Meghani, PhD, CRNP, Barbara Reville, CRNP, and Valerie Cotter, DrNP, CRNP, for providing expert consultation on the clinical scenarios; the Standardized Patient Program at the Perelman School of Medicine, University of Pennsylvania, for their implementation of this project; and Paul Lanken, MD, for allowing the authors to adapt a previously developed standardized patient scenario.

The authors have disclosed no potential conflicts of interest, financial or otherwise.

Address correspondence to Amy M. Corcoran, MD, Assistant Professor of Clinical Medicine, Department of Medicine, University of Pennsylvania, Perelman School of Medicine, 3615 Chestnut Street, Philadelphia, PA 19104; e-mail: amym.corcoran@uphs.upenn.edu.

Received: August 13, 2012
Accepted: December 19, 2012
Posted Online: March 28, 2013

Advanced practice certification in both palliative care and gerontological nursing requires that clinicians demonstrate knowledge and skill in assessing and managing symptoms, communicating with patients and families, and eliciting their goals and preferences for care (American Association of Colleges of Nursing, 2010; Perley & Dahlin, 2007). Pedagogical methods for evaluating these skills must include interactive, performance-based strategies. Although actual clinical exposure to specific patient problems and situations is one useful method, professional education programs often include role-play exercises and, more formally, clinical laboratory simulations using human patient simulators, standardized patient scenarios, or Objective Structured Clinical Examinations (OSCEs) (Alexander, Keitz, Sloane, & Tulsky, 2006; Lane & Rollnick, 2007; O’Sullivan, Chao, Russell, Levine, & Fabiny, 2008; Perelman School of Medicine, 2011; Shaw et al., 2010; von Gunten et al., 2005).

A standardized patient (SP) is a person (often an actor) who has been systematically coached to simulate an actual patient in a consistent manner. In performing the simulation, the SP presents the gestalt of the patient being simulated—not only the patient’s history, but also his or her body language, physical findings, emotional state, and personality characteristics (Donovan, Hutchison, & Kelly, 2003). SPs may or may not be incorporated into OSCEs (Bosse et al., 2010).

An OSCE is a timed examination in which learners interact with one or more SPs in one or more stations that involve the demonstration of clinical skills, such as physical examination, interview, counseling, or patient management (Harden & Gleeson, 1979). The method can also include additional components based on the clinical case presented, such as observer ratings, learner self-evaluations, and written quizzes (Rushforth, 2007; Turner & Dankoski, 2008).

OSCEs and SPs are used for both formative and summative evaluation of learner competencies in clinical skills. Formative evaluation allows for the prospective identification of learners’ strengths and weaknesses; in contrast, summative evaluation assesses concrete achievement of learning objectives (Ebbert & Connors, 2004; Turner & Dankoski, 2008). Using OSCEs for formative evaluation can also mitigate the disadvantage faced by learners who have no prior experience with the OSCE format (Bosse et al., 2012).

OSCEs have been used to evaluate palliative care skills, predominantly specific communication skills, such as breaking bad news and eliciting goals of care (Alexander et al., 2006; Back et al., 2007; Chipman, Beilman, Schmitz, & Seatter, 2007; Clayton et al., 2012; Donovan et al., 2003; von Gunten et al., 2005). Most reports involve medical trainees, although a few studies describe effective nursing education programs using OSCEs for palliative care skills (Kruijver et al., 2001; Paquette, Bull, Wilson, & Dreyfus, 2010). The emphasis on medical and nursing education collaboration (Institute of Medicine, 2009) and the cost of designing and using OSCEs led us to consider the effectiveness of OSCE exercises to evaluate multidisciplinary learners.

We could identify only two studies describing such multidisciplinary efforts. Donovan et al. (2003) incorporated OSCEs into an intensive communication skills workshop designed for nurses and other nonphysician health care providers working with cancer patients and their families. However, their report did not describe the sample or the tools used by SPs to provide feedback to participants. Evaluation by participants and faculty was only briefly described, and it was not clear what types of questions or instruments were used to solicit the feedback. Eid, Petty, Hutchins, and Thompson (2009) created an OSCE to evaluate the communication practices of six hematology–oncology fellows and two experienced advanced practice nurses (APNs). With the use of an average score from three raters on a 21-item skills checklist, learners were evaluated in OSCEs given prior to and then following a brief interactive lecture on the SPIKES protocol for delivering bad news (Baile et al., 2000). The investigators found a statistically significant increase in mean checklist scores following the teaching intervention, indicating improved performance in demonstrating communication skills.

In light of limited literature describing the development and evaluation of palliative care OSCEs for APNs and other learners, the purpose of this project was to develop and pilot a three-station OSCE for multidisciplinary trainees.

Method

Development of the OSCEs

Faculty from the University of Pennsylvania Schools of Nursing and Medicine created three OSCEs to depict clinical situations that are commonly encountered in geriatrics and palliative care: Goals of Care, Breaking Bad News, and Delirium Assessment. They adapted scenarios based on de-identified cases from their own practice or existing scenarios from the Perelman School of Medicine Simulation Center OSCE log book. Individuals portraying the standardized patients participated in a 4-hour training session in which they discussed the goals of the learning experience, practiced role-playing the patients, and learned how to complete the interpersonal skills and station checklists and to provide verbal feedback to learners. Faculty certified the accuracy of SPs’ portrayal and scoring, as well as their effectiveness in delivering verbal feedback on interpersonal skills.

The targeted learners for the OSCE were APN students and medical fellows in geriatrics, oncology, or palliative medicine. Expected learner behaviors and OSCE objectives were developed to reflect the role of the clinician–learner as a palliative care consultant. We chose this specific role because all learners, regardless of their specialty, were preparing for eventual careers in palliative care nursing or medicine. In this role, they would most often be delivering care as a consultant, who assists with specific aspects of patient and family care (e.g., discussing goals of care, managing symptoms) rather than overseeing and managing the overall plan of care.

Learner Preparation

Approximately 3 weeks prior to the OSCE session, all learners received an e-mail describing the purpose of the exercise. As a review of the skills they had learned and practiced in earlier coursework, they also received key reading materials about communication and assessment techniques (Baile et al., 2000; Casarett & Quill, 2007; Inouye, 2006).

Because the APN students generally did not have prior exposure to OSCEs, faculty conducted practice role-play sessions using similar case studies to familiarize students with learner expectations and to provide experiences interviewing SPs. One of the authors (D.L.), who is Director of the Perelman School of Medicine Standardized Patient Program, also attended a face-to-face meeting with the nursing students during which OSCE procedures were reviewed and students were able to ask questions about the actual OSCE session.

OSCE Structure

The OSCE session was conducted during a 3-hour period, which began with a brief orientation to the simulation center. Following each scenario, the SPs discussed both constructive comments and points of positive reinforcement from the patient perspective. Using a guided self-reflection model, SPs also asked each learner to share their own impressions of the scenario.

Evaluation and Measurement Tools

Interpersonal Skills Tool. The Interpersonal Skills Tool is a validated instrument used by the Perelman School of Medicine Simulation Center, University of Pennsylvania. Evaluators use a 4-point Likert scale (1 = poor/almost never demonstrates to 4 = very good/almost always demonstrates) for the following categories: eliciting information, listening, giving information, respectfulness, empathy, and professionalism. A seventh item asks evaluators to rate the likelihood that, based on their interaction with the learner, they would refer a family member or a friend for care provided by the learner. The Interpersonal Skills Tool was completed by the learners, the SP, and two faculty raters for each scenario.

Station Checklists. Three Station Checklists reflected palliative care “best practices.” There were six to seven items on each Station Checklist, and each item was evaluated as “yes,” indicating that the learner demonstrated this “best practice,” or “no,” indicating that the learner did not demonstrate this practice. Examples of items from the Goals of Care discussion station included, “asked patient about his understanding of his disease” and “asked patient if he had discussed his preferences for care with family.” The checklist was completed by the learners, the SP, and two faculty raters for each scenario.
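
To make the scoring concrete, the following sketch (illustrative only; not the authors’ instrument or code) represents the two rater-completed tools as simple Python structures. It assumes that a summary score on the Interpersonal Skills Tool is the sum of the seven 4-point items (possible range = 7 to 28), which is consistent with the magnitude of the means reported in the Table; checklist items beyond the two quoted above are omitted rather than invented.

```python
# Illustrative sketch only (hypothetical representation, not the study's code).
# Assumes the Interpersonal Skills Tool summary score is the sum of the seven
# 4-point Likert items, giving a possible range of 7 to 28.

INTERPERSONAL_SKILLS_ITEMS = [
    "eliciting information",
    "listening",
    "giving information",
    "respectfulness",
    "empathy",
    "professionalism",
    "would refer a family member or friend for care by this learner",
]

GOALS_OF_CARE_CHECKLIST = [
    "asked patient about his understanding of his disease",
    "asked patient if he had discussed his preferences for care with family",
    # ...remaining best-practice items omitted here
]


def interpersonal_summary_score(ratings: dict) -> int:
    """Sum the seven 4-point ratings (1 = poor ... 4 = very good)."""
    assert set(ratings) == set(INTERPERSONAL_SKILLS_ITEMS)
    assert all(1 <= value <= 4 for value in ratings.values())
    return sum(ratings.values())


def checklist_score(responses: dict) -> int:
    """Count the best practices the learner demonstrated ('yes' = True)."""
    return sum(bool(done) for done in responses.values())


# A learner rated 3 ("good") on every interpersonal item scores 21 of 28,
# the same magnitude as the learner means reported in the Table.
example_ratings = {item: 3 for item in INTERPERSONAL_SKILLS_ITEMS}
print(interpersonal_summary_score(example_ratings))  # -> 21
```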

OSCE Evaluation Tool. To evaluate their perceptions of the effectiveness of the OSCE, learners completed an 8-item OSCE Evaluation Tool focusing on characteristics such as clarity of expectations and instructions, helpfulness, and realism. Ratings ranged from 1 = strongly disagree to 4 = strongly agree. Two open-ended questions asked learners to state what they liked most about the experience and to give a suggestion for improving the OSCE.

Directly following the OSCE, two of the authors (M.E., A.M.C.) conducted a debriefing session, which included a focus group interview with the entire group of learners to elicit learners’ views on their overall experience, along with their recommendations for improving future OSCEs. The interview was audio-recorded and transcribed verbatim. The transcript was then reviewed by the second author (S.L.) to identify additional themes that would highlight strengths of the program, as well as aspects of the experience that could be improved for future learners.

All testing and evaluation procedures were reviewed and approved by the University of Pennsylvania Institutional Review Board, and participants provided written consent.

Data Analysis

To address how appropriate and useful learners found the OSCE, we calculated the means and medians for each station. We also performed content analysis (Miles & Huberman, 1994) of the transcript of the focus group interview. To examine differences in mean ratings of students’ performance among SPs, learners, and faculty, we fitted linear mixed models of the rating on the rater group (SP, learner, and faculty) with a random effect for the study subject, and we performed Wald tests to compare the three means. Faculty scores were calculated as the average of the scores of two faculty observers. Post hoc analysis using Tukey’s adjustment for multiple pairwise comparisons was conducted to identify which pairwise differences of means were statistically significant. To evaluate the degree of similarity between the two faculty raters, we calculated intraclass correlations between the two faculty ratings. All quantitative analyses were conducted using PASW Statistics Version 18 (SPSS, 2009) and R version 2.14.0 (R Development Core Team, 2011) software.
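
The analyses above were conducted in PASW/SPSS and R; purely as an illustration, the sketch below shows an analogous workflow in Python using statsmodels and pingouin. The data file and column names are hypothetical, and the Tukey-adjusted comparisons here are applied directly to the group scores rather than within the fitted mixed model, a simplification of the model-based post hoc tests described above.

```python
# Rough Python analogue of the described analyses (the study itself used SPSS
# and R): a linear mixed model of rating on rater group with a random
# intercept per learner, Tukey-adjusted pairwise comparisons, and the
# intraclass correlation between the two faculty raters.
import pandas as pd
import pingouin as pg
import statsmodels.formula.api as smf
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Long format: one summary score per (station, learner, rater), where rater is
# "sp", "self", "faculty1", or "faculty2". File and column names are invented.
df = pd.read_csv("osce_ratings_long.csv")

# Faculty score for the group comparison = mean of the two faculty observers.
df["rater_group"] = df["rater"].replace({"faculty1": "faculty", "faculty2": "faculty"})
scores = df.groupby(["station", "learner", "rater_group"], as_index=False)["rating"].mean()

for station, d in scores.groupby("station"):
    # Linear mixed model: fixed effect of rater group, random intercept per learner.
    fit = smf.mixedlm("rating ~ C(rater_group)", data=d, groups=d["learner"]).fit()
    print(station)
    print(fit.summary())  # includes Wald z-tests on the rater-group coefficients

    # Tukey-adjusted pairwise comparisons of the SP, learner, and faculty means
    # (simplified: run on the scores directly, not within the fitted model).
    print(pairwise_tukeyhsd(d["rating"], d["rater_group"]))

# Interrater reliability: intraclass correlation between the two faculty raters.
faculty = df[df["rater"].isin(["faculty1", "faculty2"])]
for station, d in faculty.groupby("station"):
    icc = pg.intraclass_corr(data=d, targets="learner", raters="rater", ratings="rating")
    print(station)
    print(icc[["Type", "ICC"]])
```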

Results

Twelve learners participated in the pilot study: seven APN students, three geriatric medicine fellows, and two palliative medicine fellows. The learners’ views about the appropriateness and usefulness of the exercise were assessed using quantitative and qualitative data. Learners rated each scenario as appropriate for their level of training (mean = 3.4 to 3.7, SD = 0.5 to 0.8), agreed that the SP was convincing (mean = 3.8 to 3.9, SD = 0.3 to 0.6), and found the SP’s feedback helpful (mean = 3.6 to 3.8, SD = 0.4 to 0.5). Learners reported that the exercise would help them in future interactions with patients (mean = 3.8, SD = 0.4).

The focus group revealed that only one nursing student had prior exposure to OSCEs. The physician-trainees had some prior exposure to OSCEs in other training but none that had focused on palliative care issues. This forum also provided further evidence that the scenarios were relevant and useful to the acquisition of clinical skills. Learners felt that using this method for formative, rather than summative, feedback was beneficial. Participants observed that the exercise provided lessons in important communication skills, especially “leaving space” throughout the conversation, using “empathetic listening,” and combining appropriate verbal and nonverbal communication. One participant reflected, “I was trying my best to communicate nonverbally…but, you know, [the SP] felt maybe that I could also address that a little more in my vocalizations.” Learners appreciated the opportunity to attempt these difficult conversations in a controlled practice setting, and many learners considered feedback from the SP to be the most helpful part of the experience.

Differences among SP, learner, and faculty ratings of students’ performance in interpersonal skills were present for each scenario. Overall means were lower for learners’ self-evaluations than for either faculty or SP scores, whereas faculty scores were slightly higher than SP scores. Differences among SP, learner, and faculty interpersonal skills evaluations for each scenario across all learners (n = 12) are shown in the Table. Differences in scores among the learner, the SP, and faculty were found to be statistically significant in all three scenarios. In addition, post hoc Tukey’s tests yielded significant differences in means between SPs and learners for two scenarios: Goals of Care (p < 0.001) and Breaking Bad News (p = 0.02). Differences between the SP and faculty were not statistically significant.

In assessing different faculty raters’ evaluation of student performances, intraclass correlations (ICC) between the two faculty members’ summary scores were 0.22 for Goals of Care, 0.46 for Breaking Bad News, and 0.52 for Delirium Assessment, indicating low to moderate agreement between raters.

Discussion

This study evaluated multidisciplinary trainees using a three-station OSCE designed to focus on discussing goals of care, delivering bad news, and assessing delirium. Overall, the OSCEs were effective in evaluating multiple disciplines using the same patient scenarios. However, several issues should be considered before implementing this OSCE on an ongoing basis.

Our focus was on formative evaluation—that is, assessing learners’ strengths and weaknesses and offering feedback to enhance future performance. This objective was appropriate because we had not yet validated the specific OSCE and its evaluation methods. Moreover, learners were not being evaluated at the completion of their palliative care preparation. Finally, use of the OSCE as a formative evaluative tool allowed several nursing students to gain experience in what was a novel learning strategy for them. Despite these reasons, the high costs necessary to produce and implement the OSCE call into question its sustainability as a formative teaching exercise.

The total cost of the program was $6,800, which covered program development, administration, SP salaries, parking, and refreshments. To justify this expense, it may be necessary to use the OSCE only as a summative tool. It should also be noted that a two-case version of the program was implemented the following year for 29 learners at approximately the same total cost, indicating that initial development accounts for much of the outlay and that the per-learner cost falls in subsequent implementations. Thus, this OSCE may be sustainable despite high initial costs.

A cheaper option is to examine the effectiveness of conducting role-play exercises, using students or faculty to act as patients (Papadakis, Croughan-Minihane, Fromm, Wilkie, & Ernster, 1997). Although this approach may save money, it must be balanced with the recognition that OSCEs can provide a valid, reliable, authentic, and reproducible experience that may be lacking in peer or faculty role-play exercises (Ebbert & Connors, 2004).

Our findings revealed that learners’ self-evaluation scores were lower than the evaluations of the encounter by SPs or faculty members, which is consistent with findings of a prior study (Lau, Dolovich, & Austin, 2007). This is perhaps due to learners’ anxiety and the high expectations that they had set for themselves for this exercise. Another factor that may account for the differences between faculty and learner evaluations was use of the OSCE exercise as a formative, rather than a summative, evaluation tool. Because faculty were not determining overall competence in these skills but instead were focusing on constructive feedback to help learners further develop their skills, they may have been more positive and more lenient than they would have been under more stringent grading conditions. The clinical significance of these differences is unknown; however, novices who evaluate their clinical skills less positively than teachers and patients may be highly motivated to improve their learning and performance, provided they are not unrealistically negative. We also evaluated the degree of agreement between raters, an aspect that is often ignored in the evaluation of role-play and OSCEs (Lane & Rollnick, 2007).

Recent reports have called for increased professional collaboration and education (Institute of Medicine, 2009; Interprofessional Education Collaborative Expert Panel, 2011). Our program ended with a joint debriefing session for participants that included both nurse and physician learners. However, the ways in which nurses and physicians interact and collaborate around breaking bad news or discussing goals of care with patients and families were not emphasized. This deficit was identified by two nurse learners, who commented that the exercise was not sensitive to the different roles that nurses and physicians have in communicating with seriously ill patients and families. Future SP exercises should embrace the principles of interprofessional education, as well as the unique communication skills needed by nurses (Wittenberg-Lyles, Goldsmith, & Ragan, 2010).

Interrater reliability scores between the two faculty raters were low to moderate for our OSCE. Achieving a high level of agreement between raters is possible with OSCEs, as demonstrated by Hall, Marshall, Weaver, Boyle, and Taniguchi (2011), who reported high interrater reliability between two raters evaluating 158 students. Our findings suggest a need for additional faculty training in rating interpersonal skills; Schwartzman, Hsu, Law, and Chung (2011) reported significant decreases in the variability of SP and faculty ratings of students following a faculty training program.

Our sample was very small, which was a major limitation of the study. Future projects should include a larger, more diverse sample. In addition, other disciplines and learners could be included in the exercise. For example, in some clinical settings, social workers often lead goals of care discussions with patients and families. Thus, this scenario could be used for social work trainees as well.

Implications for Practice and Future Research

Initial evaluation suggests that a three-station OSCE for APN students and physician fellows is beneficial for formative evaluation in both discipline groups and that it would promote a more effective use of resources. Further revisions of the OSCEs could broaden their applicability to evaluating other professions (e.g., social work) and promote the incorporation of interprofessional educational principles in clinical care, in addition to the ongoing goal of evaluating changes in learners’ skills prior to and following training programs.

References

  • Alexander, S.C., Keitz, S.A., Sloane, R. & Tulsky, J.A. (2006). A controlled trial of a short course to improve residents’ communication with patients at the end of life. Academic Medicine, 81, 1008–1012. doi:10.1097/01.ACM.0000242580.83851.ad
  • American Association of Colleges of Nursing. (2010). Adult-gerontology primary care nurse practitioner competencies. Retrieved from http://www.nonpf.com/associations/10789/files/Adult-GeroPCComps2010.pdf
  • Back, A.L., Arnold, R.M., Baile, W.F., Fryer-Edwards, K.A., Alexander, S.C., Barley, G.E. & Tulsky, J.A. (2007). Efficacy of communication skills training for giving bad news and discussing transitions to palliative care. Archives of Internal Medicine, 167, 453–460. doi:10.1001/archinte.167.5.453
  • Baile, W.F., Buckman, R., Lenzi, R., Glober, G., Beale, E.A. & Kudelka, A.P. (2000). SPIKES—A six-step protocol for delivering bad news: Application to the patient with cancer. The Oncologist, 5, 302–311. doi:10.1634/theoncologist.5-4-302
  • Bosse, H.M., Nickel, M., Huwendiek, S., Jünger, J., Schultz, J.H. & Nikendei, C. (2010). Peer role-play and standardised patients in communication training: A comparative study on the student perspective on acceptability, realism, and perceived effect. BMC Medical Education, 10, 27. doi:10.1186/1472-6920-10-27
  • Bosse, H.M., Schultz, J.H., Nickel, M., Lutz, T., Möltner, A., Jünger, J. & Nikendei, C. (2012). The effect of using standardized patients or peer role play on ratings of undergraduate communication training: A randomized controlled trial. Patient Education and Counseling, 87, 300–306. doi:10.1016/j.pec.2011.10.007
  • Casarett, D.J. & Quill, T.E. (2007). “I’m not ready for hospice”: Strategies for timely and effective hospice discussions. Annals of Internal Medicine, 146, 443–449. doi:10.7326/0003-4819-146-6-200703200-00011
  • Chipman, J.G., Beilman, G.J., Schmitz, C.C. & Seatter, S.C. (2007). Development and pilot testing of an OSCE for difficult conversations in surgical intensive care. Journal of Surgical Education, 64, 79–87. doi:10.1016/j.jsurg.2006.11.001
  • Clayton, J.M., Adler, J.L., O’Callaghan, A., Martin, P., Hynson, J., Butow, P.N. & Back, A.L. (2012). Intensive communication skills teaching for specialist training in palliative medicine: Development and evaluation of an experiential workshop. Journal of Palliative Medicine, 15, 585–591. doi:10.1089/jpm.2011.0292
  • Donovan, T., Hutchison, T. & Kelly, A. (2003). Using simulated patients in a multiprofessional communications skills programme: Reflections from the programme facilitators. European Journal of Cancer Care, 12, 123–128. doi:10.1046/j.1365-2354.2003.00394.x
  • Ebbert, D.W. & Connors, H. (2004). Standardized patient experiences: Evaluation of clinical performance and nurse practitioner student satisfaction. Nursing Education Perspectives, 25, 12–15.
  • Eid, A., Petty, M., Hutchins, L. & Thompson, R. (2009). “Breaking bad news”: Standardized patient intervention improves communication skills for hematology-oncology fellows and advanced practice nurses. Journal of Cancer Education, 24, 154–159. doi:10.1080/08858190902854848
  • Hall, P., Marshall, D., Weaver, L., Boyle, A. & Taniguchi, A. (2011). A method to enhance student teams in palliative care: Piloting the McMaster-Ottawa Team Observed Structured Clinical Encounter. Journal of Palliative Medicine, 14, 744–750. doi:10.1089/jpm.2010.0295
  • Harden, R.M. & Gleeson, F.A. (1979). Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education, 13, 39–54. doi:10.1111/j.1365-2923.1979.tb00918.x
  • Inouye, S.K. (2006). Delirium in older persons. New England Journal of Medicine, 354, 1157–1165. doi:10.1056/NEJMra052321
  • Institute of Medicine. (2009). Redesigning continuing education in the health professions. Retrieved from http://www.iom.edu/~/media/Files/Report%20Files/2009/Redesigning-Continuing-Education-in-the-Health-Professions/RedesigningCEreportbrief.pdf
  • Interprofessional Education Collaborative Expert Panel. (2011). Core competencies for interprofessional collaborative practice: Report of an expert panel. Washington, DC: Interprofessional Education Collaborative. Retrieved from http://www.aacn.nche.edu/education-resources/IPECReport.pdf
  • Kruijver, I.P., Kerkstra, A., Kerssens, J.J., Holtkamp, C.C., Bensing, J.M. & van de Wiel, H.B. (2001). Communication between nurses and simulated patients with cancer: Evaluation of a communication training programme. European Journal of Oncology Nursing, 5, 140–153. doi:10.1054/ejon.2001.0139
  • Lane, C. & Rollnick, S. (2007). The use of simulated patients and role-play in communication skills training: A review of the literature to August 2005. Patient Education and Counseling, 67, 13–20. doi:10.1016/j.pec.2007.02.011
  • Lau, E., Dolovich, L. & Austin, Z. (2007). Comparison of self, physician, and simulated patient ratings of pharmacist performance in a family practice simulator. Journal of Interprofessional Care, 21, 129–140. doi:10.1080/13561820601133981
  • Miles, M.B. & Huberman, A.M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage.
  • O’Sullivan, P., Chao, S., Russell, M., Levine, S. & Fabiny, A. (2008). Development and implementation of an objective structured clinical examination to provide formative feedback on communication and interpersonal skills in geriatric training. Journal of the American Geriatrics Society, 56, 1730–1735. doi:10.1111/j.1532-5415.2008.01860.x
  • Papadakis, M.A., Croughan-Minihane, M., Fromm, L.J., Wilkie, H.A. & Ernster, V.L. (1997). A comparison of two methods to teach smoking-cessation techniques to medical students. Academic Medicine, 72, 725–727. doi:10.1097/00001888-199708000-00021
  • Paquette, M., Bull, M., Wilson, S. & Dreyfus, L. (2010). A complex elder care simulation using improvisational actors. Nurse Educator, 35, 254–258. doi:10.1097/NNE.0b013e3181f7f197
  • Perelman School of Medicine at the University of Pennsylvania. (2011). What is a standardized patient? Retrieved from http://www.med.upenn.edu/spprogram/what.shtml
  • Perley, M. & Dahlin, C. (Eds.). (2007). Core curriculum for the advanced practice hospice and palliative nurse. Dubuque, IA: Kendall Hunt.
  • R Development Core Team. (2011). R: A language and environment for statistical computing [Computer software]. Vienna, Austria: R Foundation for Statistical Computing. Retrieved from http://www.R-project.org/
  • Rushforth, H.E. (2007). Objective structured clinical examination (OSCE): Review of literature and implications for nursing education. Nurse Education Today, 27, 481–490. doi:10.1016/j.nedt.2006.08.009
  • Schwartzman, E., Hsu, D.I., Law, A.V. & Chung, E.P. (2011). Assessment of patient communication skills during OSCE: Examining effectiveness of a training program in minimizing inter-grader variability. Patient Education and Counseling, 83, 472–477. doi:10.1016/j.pec.2011.04.001
  • Shaw, E.A., Marshall, D., Howard, M., Taniguchi, A., Winemaker, S. & Burns, S. (2010). A systematic review of postgraduate palliative care curricula. Journal of Palliative Medicine, 13, 1091–1108. doi:10.1089/jpm.2010.0034
  • SPSS, Inc. (2009). PASW Statistics for Windows, Version 18.0 [Computer software]. Retrieved from http://www.spss.com/
  • Turner, J.L. & Dankoski, M.E. (2008). Objective structured clinical exams: A critical review. Family Medicine, 40, 574–578.
  • von Gunten, C.F., Twaddle, M., Preodor, M., Neely, K.J., Martinez, J. & Lyons, J. (2005). Evidence of improved knowledge and skills after an elective rotation in a hospice and palliative care program for internal medicine residents. The American Journal of Hospice and Palliative Care, 22, 195–203. doi:10.1177/104990910502200309
  • Wittenberg-Lyles, E., Goldsmith, J. & Ragan, S. (2010). The COMFORT Initiative: Palliative nursing and the centrality of communication. Journal of Hospice and Palliative Nursing, 12, 282–292. doi:10.1097/NJH.0b013e3181ebb45e

Differences Among Standardized Patient, Learner, and Faculty Interpersonal Skills Tool Ratings for Each Station Across all Learners (N = 12)

Station | Standardized Patient Mean | Learner Mean | Faculty Mean | p
1. Goals of care | 26 | 21 | 25 | < 0.001
2. Breaking bad news | 25 | 22 | 25 | 0.011
3. Delirium assessment | 23 | 21 | 25 | 0.008

doi:10.3928/01484834-20130328-02
