Journal of Nursing Education

The articles prior to January 2012 are part of the back file collection and are not available with a current paid subscription. To access the article, you may purchase it or purchase the complete back file collection here

Major Article 

Computerized Patient Model and Simulated Clinical Experiences: Evaluation With Baccalaureate Nursing Students

Carol E. Feingold, MS, RN; Margaret Calaluce, BSN, RN; Michael A. Kallen, PhD, MPH

Abstract

This study evaluated student and faculty member perceptions regarding the use of a computerized universal patient simulator (Laerdal SimMan Universal Patient Simulator) in a simulated clinical scenario. Students who used SimMan in clinical simulation activities in this baccalaureate program during two consecutive semesters were surveyed using a 20-item tool scored on a 4-point Likert scale. Four faculty members were surveyed using a 17-item tool with the same response scale.

The majority of the student sample consisted of Anglo women between ages 22 and 36. Survey items related to the reality of the simulations, the pace and flow of the clinical simulation, the ability to transfer skills learned to actual clinical settings, and the value of the simulated clinical experiences. Faculty members were also surveyed related to resource issues.

Descriptive statistics were used to analyze survey responses. The majority of the students and faculty members identified the simulations as realistic and valuable. However, only approximately half of the students agreed that the skills learned in the clinical simulation would transfer to a real clinical setting, compared to 100% of the faculty. Faculty members reported that implementing the simulated clinical scenario required additional time and resources.

Ms. Feingold is Clinical Associate Professor, Ms. Calaluce was formerly Assistant Coordinator, Patient Care Learning Center, College of Nursing, and Dr. Kallen is Assistant Director, Office of Educational Development, University of Arizona, College of Medicine, Tucson, Arizona.

Address correspondence to Carol E. Feingold, MS, RN, Clinical Associate Professor, College of Nursing, University of Arizona, 1305 N. Martin, PO Box 210203, Tucson, AZ 85721-0203; e-mail: Feingold@nursing.arizona.edu.

Received: March 14, 2003
Accepted: September 02, 2003

Abstract

This study evaluated student and faculty member perceptions regarding the use of a computerized universal patient simulator (Laerdal SimMan Universal Patient Simulator) in a simulated clinical scenario. Students who used SimMan in clinical simulation activities in this baccalaureate program during two consecutive semesters were surveyed using a 20-item tool scored on a 4-point Likert scale. Four faculty members were surveyed using a 17-item tool with the same response scale.

The majority of the student sample consisted of Anglo women between ages 22 and 36. Survey items related to the reality of the simulations, the pace and flow of the clinical simulation, the ability to transfer skills learned to actual clinical settings, and the value of the simulated clinical experiences. Faculty members were also surveyed related to resource issues.

Descriptive statistics were used to analyze survey responses. The majority of the students and faculty members identified the simulations as realistic and valuable. However, only approximately half of the students agreed that the skills learned in the clinical simulation would transfer to a real clinical setting, compared to 100% of the faculty. Faculty members reported that implementing the simulated clinical scenario required additional time and resources.

Ms. Feingold is Clinical Associate Professor, Ms. Calaluce was formerly Assistant Coordinator, Patient Care Learning Center, College of Nursing, and Dr. Kallen is Assistant Director, Office of Educational Development, University of Arizona, College of Medicine, Tucson, Arizona.

Address correspondence to Carol E. Feingold, MS, RN, Clinical Associate Professor, College of Nursing, University of Arizona, 1305 N. Martin, PO Box 210203, Tucson, AZ 85721-0203; e-mail: Feingold@nursing.arizona.edu.

Received: March 14, 2003
Accepted: September 02, 2003

Clinical simulation is increasingly being used to teach psychomotor skills and critical thinking to nursing students. Reasons for this trend include cost containment changes in the traditional health care system, which have resulted in fewer learning experiences with less supervision and mentoring; increased patient acuity and technological interventions, requiring better prepared novice learners; shifts in nursing education venues from hospital-based programs to university settings; and the availability of patient simulation technologies with increasing levels of realism. This study evaluated senior undergraduate nursing students’ and faculty members’ responses to the use of a computerized patient model during an interactive clinical simulation.

Literature Review

Simulation in Medical and Nursing Education

The number of whole-body, computerized patient simulators used in medical and nursing education has grown rapidly in recent years (Kapur & Steadman, 1998). This increased use is due, in part, to increased availability of the technology (Mallow & Gilje, 1999) and to student and teacher needs (Marshall et al., 2001). Current literature describes medical education using computerized patient simulators for anesthesia training (Chopra et al., 1994; Gaba et al., 1998), cardiovascular disease simulation (Ewy et al., 1987; Gregoratos & Miller, 1999; Jones, Hunt, Carlson, & Seamon, 1997; Waugh et al., 1995), emergency medicine and trauma (Gilbart, Hutchinson, Cusimano, & Regehr, 2000; Marshall et al., 2001; Small et al., 1999), laparoscopic techniques (Derossis et al., 1998), and critical care (Rogers et al., 2000). Benefits of simulation technology include:

  • Improved surgical techniques.
  • Enhanced assessment and decision-making skills.
  • Retention of knowledge related to procedures (Issenberg et al., 1999).
  • Absence of patient risk.
  • Ability to present a complex problem to all students.
  • Ability to halt actions during the simulation to replay or critique performance (Gaba & DeAnda, 1988).

The benefits of clinical simulation as described in nursing education literature are similar to those found in the medical literature. Simulation in nursing has traditionally been used for psychomotor skills training (Alavi, Loh, & Reilly, 1991; Holloway, 1999; Snyder, Fitzloff, Fiedler, & Lambke, 2000). Knight (1998) reviewed the literature related to acquisition of psychomotor skills in nursing and concluded that repeat practice in a safe environment is essential for learning such skills. Clinical simulation can provide the opportunity for that practice. In addition, the literature increasingly promotes the use of clinical simulations to practice problem solving with faculty support in a safe environment (Aronson, Rosa, Anfinson, & Light, 1997; Cioffi, 2001). Morton (1997) listed the benefits of simulating clinical scenarios in nursing education as:

  • Learning in a risk-free environment.
  • Interactive learning.
  • Repeated practice of skills.
  • Immediate faculty or tutor feedback.

Clinical simulation is consistent with cognitive learning theory because it is interactive, builds on prior knowledge, and relates to real clinical problems (Johnson, Zerwic, & Theis, 1999; Knowles, 1990). Active participation in realistic clinical simulations may promote critical-thinking skills in students (Bechtel, Davidhizar, & Bradshaw, 1999) and increase their level of comfort with technology so the patient, not the technology, becomes the focus of care (Morton, 1996).

Simulation can also be used to demonstrate competence outcomes in nursing programs (Lutrell, Lenburg, Scherubel, Jacob, & Koch 1999). This is similar to the use of the Objective Structured Clinical Examination (OSCE) in medical education, which has been demonstrated to be a reliable tool for evaluating clinical knowledge and skills (Blue et al., 1998; Sloan, Donnelly, Johnson, Schwartz, & Strodel, 1993). Finally, recent literature describes the use of structured clinical simulation to orient new graduates and to maintain clinical competence for practicing nurses (Eaves & Flagg, 2001).

Evaluation of Educational Technology

Baker (1999) suggested that evaluation of educational technology must examine what it is intended to do, under what conditions it works best, and faculty members’ and students’ perceptions of it. The literature indicates clinical simulation is intended to provide opportunities for skill acquisition and decision making in a risk-free environment. Hanson (1993) suggested that transfer of learning is facilitated when clinical simulations are as realistic as possible. In addition, practice should occur in short sessions over a long period of time, and incorporate immediate feedback with correction of errors. Recent data indicate students prefer faculty member presence during skill demonstrations and immediate feedback, rather than videotaped demonstrations with delayed faculty member feedback (Miller, Nichols, & Beeken, 2000). Heinecke, Blasi, Milman, and Washington (1999) also advocate that evaluation of technology requires quantitative and qualitative measures, which examine what students learn, how they learn, and what they think of the learning situation. This article describes student and faculty member responses to the use of Laerdal SimMan Universal Patient Simulator in an interactive clinical simulation.

Purpose

The purpose of evaluation in education is to improve teaching and learning (Knowles, 1990). The intent of this evaluation was to elucidate undergraduate nursing students’ and faculty members’ perceptions about the experience of using the computerized patient model, SimMan, for teaching and assessment during simulated clinical scenarios. As a result of the literature review, we hypothesized that clinical simulation involving assessment, clinical decision making, communication, and psychomotor performance would be an adequate test of students’ clinical competence and also would provide a learning experience with high transferability to “real life.” Key questions included students’ and faculty members’ perceptions of patient and scenario realism, students’ ability to transfer knowledge from the simulated clinical scenarios to real clinical experiences, and the value of the learning experience. Survey data will provide direction for SimMan use and justify the continuing use of this new technology.

Method

Faculty members teaching the Advanced Acute Care of the Adult course designed two standard patient scenarios using SimMan in the critical care area of the Patient Care Learning Center (PCLC). A typical patient scenario involves a 65-year-old woman admitted with chronic obstructive pulmonary disease (COPD) exacerbation and pneumonia. The student enters the test area and receives a short verbal report from the faculty member. The client cardiac monitor indicates atrial fibrillation, with a heart rate of 110, blood pressure 90/40, respiratory rate of 32, temperature of 102°F, SaO2 91%, and CO2 50%. These values are displayed on the computerized patient’s monitor. In addition, breath and heart sounds, blood pressure, respiratory rate, bowel sounds, and peripheral pulses are all evident on assessment of the simulated patient. The PCLC is equipped with a critical care bed, suction, oxygen, a monitor, and various intravenous (IV) regulators and controllers. With these tools, and SimMan’s capabilities, the simulated patient can have continuous hemodynamic monitoring, IV therapy, oxygen, suction, a chest tube, a urinary catheter, a tracheotomy, and a nasogastric tube. SimMan can also vocalize with the help of a remote control and preprogrammed sounds and phrases. The computerized patient allows faculty members to program a trend, so that, during a 10-minute period, SimMan’s condition can deteriorate, with corresponding changes in vital signs and hemodynamic values.

After receiving a verbal report, the student receives a list of current laboratory values and new physician orders for the patient. The student greets SimMan and proceeds with care provision. As SimMan’s condition begins to deteriorate, the student must prioritize problems, take action, and communicate with the patient, the patient’s family, and members of the health care team.

Nursing students in the Advanced Acute Care of the Adult course each had two experiences with the patient simulator during the first semester of their senior year. At the beginning of the semester, students were assessed for skill performance and clinical decision making, with the results of this assessment determining each student’s subsequent clinical placement. At the end of the semester, all students were assessed again via SimMan, using a slightly different patient scenario. Both times students were scored based on a checklist of critical behaviors and indicators of clinical decision making. For both assessments, course faculty members observed the students’ performance in real time and provided immediate feedback. Both of the simulated clinical experiences represented summative assessment of the students. However, the immediate feedback from faculty members also represented formative evaluation. We were not involved in any of the student assessments.

Satisfaction survey items were drafted using a tool described in the literature (Halamek et al., 2000). The survey tool was used at the end of the fall semester 2001 with students who had interacted with SimMan for two standard, simulated clinical assessments (Group 1). Students were asked to respond to 20 items related to the value of the experience, the ability to transfer skills learned in simulation to the real clinical world, the realism of the simulation, and the overall value of the learning experience. A 4-point Likert format was used to obtain extent of student-item agreement, with response choices ranging from 4 = Strongly Agree to 1 = Strongly Disagree. Three survey subscales were created to summarize:

  • Realism, using the four realism-of-the-simulation survey items.
  • Transfer, using the three ability-to-transfer-skills survey items.
  • Value, using the six overall-value-of-the-experience survey items.

The remaining seven items on the scale are the individual items listed in Table 2.

Students’ Means and Standard Deviations for Survey Subscales and Nonsubscale Items and Percentage of Student Agreement with Survey Items (N = 65)

Table 2:

Students’ Means and Standard Deviations for Survey Subscales and Nonsubscale Items and Percentage of Student Agreement with Survey Items (N = 65)

A total of 28 of 50 students responded to the first survey. The survey tool was then used unaltered with a second group of students in spring 2002 (Group 2). Students in Group 2 were enrolled in the same clinical course as students in Group 1 had been and were similarly required to interact with SimMan, using identical patient scenarios to those presented to students in Group 1. At the end of the semester all students in Group 2 were then asked to complete the same satisfaction survey. Responses were received from 37 of a total of 47 students in Group 2.

A 17-item survey, using the same four Likert-format response options, was used to solicit faculty member feedback. The faculty tool also included items about the need for faculty support and training related to use of the new technology. Four faculty members worked with students using SimMan during the two semesters. All of them completed the survey.

Sample

Participants in this study were all baccalaureate nursing students enrolled in the Advanced Acute Care of the Adult course during two consecutive semesters of a single academic year. A total of 50 students were enrolled in the fall semester, and a total of 47 students were enrolled in the spring semester. Twenty-eight (56.0%) of the fall semester students (i.e., Group 1) and 37 (78.7%) of the spring semester students (i.e., Group 2) completed the survey, for a total response rate of 67.0% for both semesters combined.

Chi-square tests for differences between respondents and nonrespondents for the demographic variables of gender [χ2 (1, N = 97) = 2.81, p = .094] and age [χ2 (1, N = 97) = 2.24, p = .135] were not significant. The chi-square test for the demographic variable ethnicity [χ2 (1, N = 97) = 3.88, p = .049] was significant, indicating there was a greater percentage of minority students who did not respond than did respond, compared to Anglo students who did and did not respond. Grade point average (GPA) information for students who did not respond was not available, and therefore, no chi-square test for this demographic variable was performed.

All documents related to this study were submitted to our Institutional Review Board, and approval was received before proceeding with the study. Students were informed that participation was voluntary, there was no risk for either participating or not participating, and responding to the survey conveyed their consent to participate in the study.

The demographics of the two student groups were similar and are shown in Table 1. More than 90% of the students were women. Students’ ages ranged from ⩽ 22 to ⩾ 36, with most students’ ages (44.4%) categorized as 23 to 30. Students’ GPAs ranged from ⩽ 3.0 to ⩾ 3.6, with only 9.5% of students’ GPAs categorized as ⩽ 3.0, and the remainder of students’ GPAs fairly evenly divided between the 3.1-to-3.5 category (46.0%) and the ⩾ 3.6 category (44.4%). A large majority of students identified themselves as Anglo (82.5%), with the remainder self-identifying as either minority (14.3%) or mixed ethnicity (3.2%).

Demographics of Student Groups (N = 97)

Table 1:

Demographics of Student Groups (N = 97)

Chi-square tests for differences between students in Groups 1 and 2 for the demographic variables gender [χ2 (1, N = 63) = 2.58, p = .108], age [χ2 (3, N = 63) = .82, p = .845], GPA [χ2 (2, N = 63) = 3.53, p = .171], and ethnicity [χ2 (2, N = 63) = 1.66, p = .436] were all nonsignificant. Therefore, students in Groups 1 and 2 were combined into a single student group for subsequent analyses. Table 1 also includes the combined student group demographic information.

Faculty participants included three full-time faculty members who taught the Advanced Acute Care of the Adult course and a fourth faculty member who implemented a scenario with some students in the Intermediate Acute Care of the Adult course.

Data Analysis

Survey data were first analyzed to obtain survey subscale and item descriptive statistics, including mean, standard deviation, and frequency and percentage of student responses per response-option category. In addition, two-tailed, independent-groups t tests were performed to determine whether there were statistically significant differences between the level of self-reported GPA (categorized into two groups: 3.1 to 3.5 and ⩾ 3.6) and students’ responses to the three survey subscales (i.e., Realism, Transferability, and Value), as well as to individual survey items not included in these subscales. Finally, analyses of variance (ANOVAs) were performed to determine whether there were statistically significant differences between or within the self-reported age (categorized into three groups: ⩽ 22, 23 to 30, and ⩾ 31), and students’ responses to the three survey subscales, as well as to individual survey items not included in any of the three subscales. An alpha level of .05 was used for all statistical tests.

Results

Means and standard deviations of student responses to each of the three survey subscales and to individual survey items not included in the subscales are presented in Table 2. The Value subscale had the highest level of agreement associated with it (mean = 3.04), and the Transferability subscale had the lowest level of agreement associated with it (mean = 2.52). Survey item #11, “The technical skills taught in this course are valuable,” had the highest level of agreement associated with it (mean = 3.53), while item #15, “My interaction with SimMan improved my clinical competence,” had the lowest level of agreement associated with it (mean = 2.50).

Table 2 also presents findings related to the percentage of student agreement with the survey subscales and items, when responses were dichotomized into Agree (including both Agree and Strongly Agree responses) and Disagree categories (including both Disagree and Strongly Disagree responses). The majority of students (86.1%) found the simulated clinical experience with SimMan to be realistic. Students also agreed that the setting (76.2%) and the pace and flow of the scenarios (73.0%) were like a “real-life” critical care setting, and that SimMan was a realistic patient (64.1%). These three items comprise the Realism subscale.

More than 80% of the students agreed the experience was an adequate test of clinical skills (83.0%) and decision making (87.7%). More than two thirds of the students (69.3%) believed it was a valuable learning experience, and 76.5% believed it enhanced learning. Virtually all of the students (96.9%) thought they received adequate feedback about their performance. Unexpected findings were that less than half of the students believed the simulated clinical experiences increased their confidence (46.9%) or improved their clinical competence (46.9%), and only 54.7% believed the simulated clinical prepared them to function in a real clinical environment. These three items comprise the Transferability subscale.

Table 3 presents findings related to the percentage of faculty agreement with faculty survey items, again, when responses were dichotomized into Agree and Disagree categories. All faculty participants believed the experience prepared students to perform in a real clinical setting. The simulated clinical scenario was judged to be a realistic recreation of an acute care clinical setting by 100% of the faculty members. The majority of faculty members also believed the pace and flow was realistic. The faculty members believed the simulation adequately tested clinical and decision-making skills and reinforced clinical objectives. SimMan was identified by 100% of the faculty members as an effective teaching tool that would prepare students to perform in real clinical settings. Questions related to resources indicated that most of the faculty members believed using the simulated patient required extra preparation time and that faculty support for using the technology was inadequate to the point of less than potential usage.

Percentage of Faculty Member Agreement with Survey Items (N = 4)

Table 3:

Percentage of Faculty Member Agreement with Survey Items (N = 4)

Results of the two-tailed, independent-groups t tests showed no statistically significant differences between level of self-reported GPA and level of student agreement with the Realism, Transferability, or Value subscales. However, results of the t test involving individual survey item #11, “The technical skills taught in this course are valuable” (an item not included in any survey subscale summary), did indicate there were statistically significant differences in level of agreement between students with GPAs between 3.1 and 3.5 (mean = 3.69) and those with GPAs ⩾ 3.6 (mean = 3.43) (Table 4).

t Test for Equality of Means (Independent Variable = Level of Self-Reported Cumulative GPA)

Table 4:

t Test for Equality of Means (Independent Variable = Level of Self-Reported Cumulative GPA)

Results of the ANOVAs also showed no statistically significant differences between or within level of self-reported age and level of student agreement with the Realism, Transferability, or Value subscales. However, in this case, follow-up test results of a statistically significant ANOVA involving individual survey item #17, “The pace of the clinical simulation reflected the flow of an actual clinical setting,” did indicate there were statistically significant differences in level of agreement between students who were ⩽ 22 years old (mean = 3.05) and those 23 to 30 years old (mean = 2.54) (Table 5).

ANOVA Results (Independent Variable = Level of Age Group)

Table 5:

ANOVA Results (Independent Variable = Level of Age Group)

Discussion

Transferability

A continuing theme in nursing and medical literature related to use of simulated clinical scenarios is the question about transferability of competence in a simulated setting to a real clinical setting (Berg et al., 2001). Our definition of clinical competence includes cognitive, affective, and psychomotor skills. Objectively defining competence in each area is challenging, and Berg et al. (2001) indicated that because competence is multifactorial, one simulation may not be able to measure all aspects of competence. Chopra et al. (1994) reflected that it may be impossible to quantitatively answer the question about transferability to real clinical performance because a comparative study would involve too much patient risk. Gaba (1992) stated that because human lives depend on skilled performance by medical and nursing professionals, it is not necessary to have unequivocal proof that simulator-based training improves safety.

This survey demonstrated that nearly half of the students believed working with the simulated patient increased their confidence, clinical competence, or prepared them to perform in real clinical settings. It is interesting that only approximately half of the students believed the learning experiences would transfer to real clinical settings, while 100% of the faculty members believed the learning would transfer. Benner (1984) described novice learners as focusing on individual bits of information and lacking a unified view of the whole. Perhaps this finding demonstrates a difference between novice and expert nurses. In addition, the literature (Johnson et al., 1999; Lutrell et al., 1999) indicates that performance testing with simulated clinical experiences increased student confidence in their ability to make clinical decisions and that simulated experiences provided an opportunity to use critical thinking and reinforce prior learning.

Realism

Halamek et al. (2000) suggested that the ability to simulate a clinical environment requires attention to detail and predicted that the most successful simulations will be those that recreate real-life situations. Our efforts to simulate clinical reality were approved by a majority of the students and faculty members. Most thought SimMan provided a realistic patient simulation. This study demonstrated that younger students believed the “pace and flow” of the simulation were more realistic than older students did. This indicates a need to continue to work on increasing realism in the simulations. Possibly, the younger students have had less life experience and are less aware of the realistic details.

Value

Educational theory informs our efforts to evaluate teaching strategies in nursing. Baccalaureate nursing students are a diverse group in terms of age and experience, and most are categorized as “adult learners,” who take ownership of clinical competence (Peters, 2000). Knowles (1990) identified adult learners as self-directed, motivated by learning needs, and oriented to real-life issues. Adult learners use experience as a resource, have a lifelong orientation to learning, and identify learning needs from the changing professional environment (Knowles, 1990).

As a teaching strategy, simulated clinical experiences are consistent with adult learning theory. Data indicate that active learning increases motivation and interest in learning (Cioffi, 2001). A majority of the students in this study believed the interactive, simulated clinical experiences recreated real-life situations; tested clinical skills and decision making; reinforced clinical objectives; and enhanced learning. Both faculty and students valued the simulated clinical experiences. It is interesting that the students with lower self-reported GPAs believed the technical skills taught in the course were more valuable than those with higher GPAs. Students at the lower end of the grade ranking may value the skill practice more and may need more opportunities to practice their skills than students with higher GPAs. SimMan can provide opportunities for increased practice time for these students.

Clinical simulation has value as a method of teaching psychomotor skills. Ericsson, Krampe, and Tesch-Romer (1993) theorized that expert performance is the end result of prolonged efforts to improve practice through repeat performance. In addition, Knight (1998) concluded that a skill is best learned in a systematic approach that includes repeat practice in a safe environment. She stated it is doubtful that clinical settings in the current health care system can provide all of the practice required by students. SimMan can provide the opportunity for repeated practice in a safe environment.

Resources

Fiscal shortages at this state university resulted in cuts to the intramural funding that helped purchase SimMan. Therefore, other funding sources must be secured. New technology is a resource issue because one must continue to upgrade as technology advances. As Mann (1999) observed, “Technology is a ‘train’ you will either be on it or under it” (p. 7). The majority of faculty members indicated that using SimMan for clinical simulation and assessment required more preparation time than traditional experiences. Preparing a well-designed scenario, writing that to a seamless program for SimMan, and setting up the mock clinical site with attention to realistic detail takes time. It is expected that the support of a full-time nurse who is knowledgeable about SimMan and capable of preparing and setting up scenarios would increase use of SimMan and clinical simulation. Data from this study supported the hiring of a full-time, master’s-prepared RN to support teaching and learning in the PCLC.

Conclusion

This initial evaluation suggests that use of a computerized patient model during simulated clinical scenarios in a baccalaureate nursing program has value for both learners and educators. Students valued the learning experiences, and the majority of them felt the simulation was realistic. Faculty members unanimously believed the experiences with SimMan prepared students to perform in real clinical settings. We must continue to explore the reason many students did not believe the simulated practice would prepare them to perform in real clinical settings. An overriding question is this: are there ways to facilitate transfer of learning from the laboratory to the clinical setting (Bjork, 1997)? Anecdotal data from course faculty indicates that student performance in simulated clinical scenarios is predictive of actual clinical performance.

This needs to be validated. This study is limited by not including comparison of grades on simulated and actual clinical experiences. The original plan for this study included a qualitative interview with a number of students, with analysis of those students’ clinical grades. However, students did not volunteer to participate in that part of the study. Angel, Duffey, and Belyea (2000) suggested that the interaction between the learner and the strategy may be more significant to learning outcomes than the particular strategy used. Interviews could provide the students’ own descriptions of what the interaction with SimMan was like, so we may further clarify the conditions under which this technology works best. In addition, videotaping students as they interact with SimMan may provide further information about the process (Gaba et al., 1998).

Simulation experiences in a laboratory will never replace clinical experiences with real patients, role models, and mentors (Knight, 1998). However, cost containment forces educational innovation to fill the need for more patient care experiences and student supervision and mentoring. Novice student nurses deserve adequate preparation before encountering acutely ill patients and the complex technology involved in the care of those patients. Patient safety demands adequately prepared caregivers. Computerized patient models, such as SimMan, can begin to fill that role. Nurse educators must also consider whether technology can address communication, interpersonal interaction during skill performance, compassionate caring, and nursing understanding (Bjork, 1999; Issenberg, Gordon, Gordon, Safford, & Hart, 2001). These are areas traditionally and ideologically within the domain of nursing.

References

  • Alavi, C., Loh, S. & Reilly, D. (1991). Reality basis for teaching psychomotor skills in a tertiary nursing curriculum. Journal of Advanced Nursing, 16, 957–965. doi:10.1111/j.1365-2648.1991.tb01801.x [CrossRef]
  • Angel, B., Duffey, M. & Belyea, M. (2000). An evidence-based project for evaluating strategies to improve knowledge acquisition and critical-thinking performance in nursing students. Journal of Nursing Education, 39, 219–228.
  • Aronson, B., Rosa, J., Anfinson, J. & Light, N. (1997). A simulated clinical problem-solving experience. Nurse Educator, 22 (6), 17–19. doi:10.1097/00006223-199711000-00012 [CrossRef]
  • Baker, E. (1999). Technology: How do we know it works? The Secretary’s Conference on Educational Technology, Whitepapers [Paper 5], 1–5.
  • Bechtel, G., Davidhizar, R. & Bradshaw, M. (1999). Problem-based learning in a competency-based world. Nurse Education Today, 19, 182–187. doi:10.1016/S0260-6917(99)80003-3 [CrossRef]
  • Benner, P. (1984). From novice to expert: Excellence and power in clinical nursing practice. Menlo Park, CA: Addison-Wesley.
  • Berg, D., Raugi, G., Gladstone, H., Berkley, J., Weghorst, S. & Ganter, M. et al. (2001). Virtual reality simulators for dermatologic surgery: Measuring their validity as a teaching tool. Dermatologic Surgery, 27, 370–374.
  • Bjork, I. (1997). Changing conceptions of practical skill and skill acquisition in nursing education. Nursing Inquiry, 4, 184–195. doi:10.1111/j.1440-1800.1997.tb00098.x [CrossRef]
  • Bjork, I. (1999). What constitutes a nursing practical skill? Western Journal of Nursing Research, 21, 51–70. doi:10.1177/01939459922043703 [CrossRef]
  • Blue, A., Stratton, T., Plymale, M., DeGnore, L., Schwartz, R. & Sloan, D. (1998). The effectiveness of the structured clinical instruction module. American Journal of Surgery, 176, 67–70. doi:10.1016/S0002-9610(98)00109-3 [CrossRef]
  • Chopra, V., Gesink, B., DeJong, J., Bovill, J., Spierdijk, J. & Brand, R. (1994). Does training on an anesthesia simulator lead to improvement in performance? British Journal of Anaesthesia, 73, 293–297. doi:10.1093/bja/73.3.293 [CrossRef]
  • Cioffi, J. (2001). Clinical simulations: Development and validation. Nurse Education Today, 21, 477–486. doi:10.1054/nedt.2001.0584 [CrossRef]
  • Derossis, A., Fried, G., Abrahanowicz, M., Sigman, H., Barkun, J. & Meakins, J. (1998). Development of a model for training and evaluation of laparoscopic skills. The American Journal of Surgery, 175, 482–487. doi:10.1016/S0002-9610(98)00080-4 [CrossRef]
  • Eaves, R. & Flagg, A. (2001). The U.S. Air Force pilot simulated medical unit: A teaching strategy with multiple applications. Journal of Nursing Education, 40, 110–115.
  • Ericsson, K., Krampe, R. & Tesch-Romer, C. (1993). The role of deliberate practice in the acquisition of expert performance. Psychological Review, 100, 363–406. doi:10.1037/0033-295X.100.3.363 [CrossRef]
  • Ewy, G., Felner, J., Juul, D., Mayer, J., Sajid, A. & Waugh, R. (1987). Test of a cardiology patient simulator with students in fourth-year electives. Journal of Medical Education, 62, 738–743.
  • Gaba, D. (1992). Improving anesthesiologists’ performance by simulating reality. Anesthesiology, 76, 491–494. doi:10.1097/00000542-199204000-00001 [CrossRef]
  • Gaba, D. & DeAnda, A. (1988). A comprehensive anesthesia simulation environment: Recreating the operating room for research and training. Anesthesiology, 76, 387–394. doi:10.1097/00000542-198809000-00017 [CrossRef]
  • Gaba, D., Howard, S., Flanagan, B., Smith, B., Fish, K. & Botney, R. (1998). Assessment of clinical performance during simulated crises using both technical and behavioral ratings. Anesthesiology, 89, 8–18. doi:10.1097/00000542-199807000-00005 [CrossRef]
  • Gilbart, M., Hutchinson, C., Cusimano, M. & Regehr, G. (2000). A computer-based trauma simulator for teaching trauma management skills. The American Journal of Surgery, 179, 223–228. doi:10.1016/S0002-9610(00)00302-0 [CrossRef]
  • Gregoratos, G. & Miller, A. (1999). Cardiology teaching and the changing health care scene. Journal of the American College of Cardiology, 33, 1091–1127.
  • Halamek, L., Kaegi, D., Gaba, D., Sowb, Y., Smith, B. & Smith, B. et al. (2000). Time for a new paradigm in pediatric medical education: Teaching neonatal resuscitation in a simulated delivery room environment. Pediatrics, 106(4), 1–6. doi:10.1542/peds.106.4.e45 [CrossRef]
  • Hanson, G. (1993). Refocusing the skills laboratory. Nurse Educator, 18(2), 10–12. doi:10.1097/00006223-199303000-00008 [CrossRef]
  • Heinecke, W., Blasi, L., Milman, N. & Washington, L. (1999). New directions in the evaluation of the effectiveness of educational technology. The Secretary’s Conference on Educational Technology, Whitepapers [Paper 8], 1–7.
  • Holloway, K. (1999). Developing an evidence base for teaching nursing practice skills in an undergraduate nursing program. Nursing Praxis in New Zealand, 14(1), 22–32.
  • Issenberg, S., Gordon, M., Gordon, S., Safford, R. & Hart, I. (2001). Simulation and new learning technologies. Medical Teacher, 23(1), 16–23. doi:10.1080/01421590020007324 [CrossRef]
  • Issenberg, S., McGaghie, W., Hart, I., Mayer, J., Felner, J. & Petrusa, E. et al. (1999). Simulation technology for health care professional skills training and assessment. Journal of the American Medical Association, 282, 861–866. doi:10.1001/jama.282.9.861 [CrossRef]
  • Johnson, J., Zerwic, J. & Theis, S. (1999). Clinical simulation laboratory. Nurse Educator, 24(5), 37–41. doi:10.1097/00006223-199909000-00016 [CrossRef]
  • Jones, J., Hunt, S., Carlson, S. & Seamon, J. (1997). Assessing bedside cardiologic examination skills using “Harvey,” a cardiology patient simulator. Academic Emergency Medicine, 4, 980–985. doi:10.1111/j.1553-2712.1997.tb03664.x [CrossRef]
  • Kapur, P. & Steadman, R. (1998). Patient simulator competency testing: Ready for takeoff? Anesthesia and Analgesia, 86, 1157–1159.
  • Knight, C. (1998). Evaluating a skills centre: The acquisition of psychomotor skills in nursing: A review of the literature. Nurse Education Today, 18, 441–447. doi:10.1016/S0260-6917(98)80169-X [CrossRef]
  • Knowles, M. (1990). The adult learner: A neglected species (4th ed.). Houston: Gulf.
  • Lutrell, M., Lenburg, C., Scherubel, J., Jacob, S. & Koch, R. (1999). Competency outcomes for learning and performance assessment: Redesigning a BSN curriculum. Nursing and Health Care Perspectives, 20, 134–141.
  • Mallow, G. & Gilje, F. (1999). Technology-based nursing education: Overview and call for further dialogue. Journal of Nursing Education, 38, 248–251.
  • Mann, D. (1999). Documenting the effects of instructional technology: A fly-over of policy questions. The Secretary’s Conference on Educational Technology Whitepapers [Paper 6], 1–8.
  • Marshall, R., Smith, S., Gorman, P., Krummel, T., Haluck, R. & Cooney, R. (2001). Use of a human patient simulator in the development of resident trauma management skills. The Journal of Trauma, 51(1), 17–21. doi:10.1097/00005373-200107000-00003 [CrossRef]
  • Miller, H., Nichols, E. & Beeken, J. (2000). Comparing videotaped and faculty-present return demonstrations of clinical skills. Journal of Nursing Education, 39, 237–239.
  • Morton, P.G. (1996). Creating a laboratory that simulates the critical care environment. Critical Care Nurse, 16(6), 76–81.
  • Morton, P.G. (1997). Using a critical care simulation laboratory to teach students. Critical Care Nurse, 17(6), 66–69.
  • Peters, M. (2000). Does constructivist epistemology have a place in nursing education? Journal of Nursing Education, 39, 166–172.
  • Rogers, P., Jacob, H., Thomas, E., Harwell, M., Willenkin, R. & Pinsky, M. (2000). Medical students can learn the basic application, analytic, evaluative, and psychomotor skills of critical care medicine. Critical Care Medicine, 28, 550–554. doi:10.1097/00003246-200002000-00043 [CrossRef]
  • Sloan, D., Donnelly, M., Johnson, S., Schwartz, R. & Strodel, W. (1993). Use of an objective structured clinical examination (OSCE) to measure improvement in clinical competence during surgical internship. Surgery, 114, 343–351.
  • Small, S., Wuerz, R., Simon, R., Shapiro, N., Conn, A. & Setnik, G. (1999). Demonstration of high-fidelity simulation team training for emergency medicine. Academic Emergency Medicine, 6, 312–323. doi:10.1111/j.1553-2712.1999.tb00395.x [CrossRef]
  • Snyder, M., Fitzloff, B., Fiedler, R. & Lambke, R. (2000). Preparing nursing students for contemporary practice: Restructuring the psychomotor skills laboratory. Journal of Nursing Education, 39, 229–230.
  • Waugh, R., Mayer, J., Ewy, G., Felner, J., Issenberg, B. & Gessner, I. et al. (1995). Multimedia computer-assisted instruction in cardiology. Archives of Internal Medicine, 155, 197–203. doi:10.1001/archinte.1995.00430020089011 [CrossRef]

Demographics of Student Groups (N = 97)

Characteristic                       Group 1 n (%)   Group 2 n (%)   Combined n (%)
Response rate                        28 (56.0)       37 (78.7)       65 (67.0)
Female gender                        26 (93.0)       35 (100.0)      61 (96.8)
Age
  ⩽ 22                               8 (28.6)        12 (34.3)       20 (31.7)
  23 to 30                           13 (46.4)       15 (42.9)       28 (44.4)
  31 to 35                           4 (14.3)        3 (8.6)         7 (11.1)
  ⩾ 36                               3 (10.7)        5 (14.3)        8 (12.7)
Self-reported grade point average
  ⩽ 3.0                              1 (3.6)         5 (14.3)        6 (9.5)
  3.1 to 3.5                         16 (57.1)       13 (37.1)       29 (46.0)
  ⩾ 3.6                              11 (39.3)       17 (48.6)       28 (44.4)
Ethnicity
  Anglo                              24 (85.7)       28 (80.0)       52 (82.5)
  Minority                           4 (14.3)        5 (14.3)        9 (14.3)
  Mixed                              0 (0.0)         2 (5.7)         2 (3.2)

Students’ Means and Standard Deviations for Survey Subscales and Nonsubscale Items and Percentage of Student Agreement with Survey Items (N = 65)

Subscale/Item                                       Mean (SD)    % of Student Agreement   n
Transferability subscale                            2.52 (.63)   50.8                     65
Realism subscale                                    2.83 (.43)   84.6                     65
Value subscale                                      3.04 (.44)   92.3                     65
Individual items
  I was prepared for testing with SimMan            2.78 (.71)   70.8                     65
  Pace and flow reflect real clinical environment   2.79 (.65)   73.0                     63
  Comfortable room temperature                      3.00 (.43)   90.7                     65
  I needed orientation before testing               3.09 (.82)   76.9                     65
  Adequate room lighting                            3.23 (.42)   100.0                    65
  Decision making taught is valuable                3.46 (.53)   100.0                    65
  Skills taught in course are valuable              3.53 (.50)   100.0                    64

Percentage of Faculty Member Agreement with Survey Items (N = 4)

Subscale/Item                                         % Agreement
Transferability subscale
  Prepared students for real clinical environment     100
Realism subscale
  Scenario recreates real-life situations             100
  Space resembled a real critical care setting        100
  SimMan model provided a realistic patient           100
Value subscale
  Scenario tested clinical skills                     75
  Scenario tested decision making                     100
  SimMan reinforced course objectives                 100
  Overall, this is an effective teaching tool         100
Individual items
  Faculty support for SimMan is adequate              25
  Pace and flow reflected real clinical environment   75
  SimMan required extra preparation time              75
  I would use it more with more support               75
  I was prepared for testing with SimMan              100
  I needed an orientation before testing              100
  Comfortable room temperature                        100
  Adequate room lighting                              100

t Test for Equality of Means (Independent Variable = Level of Self-Reported Cumulative GPA)

Subscale/Item by GPA       Mean (SD)    t       df    p
Realism subscale
  3.1 to 3.5               2.84 (.49)   .418    55    .678
  ⩾ 3.6                    2.79 (.41)
Transferability subscale
  3.1 to 3.5               2.44 (.58)   .508    55    .614
  ⩾ 3.6                    2.52 (.71)
Value subscale
  3.1 to 3.5               3.03 (.43)   .235    55    .815
  ⩾ 3.6                    3.01 (.49)
Item 11
  3.1 to 3.5               3.69 (.47)   2.022   55    .048*
  ⩾ 3.6                    3.43 (.50)

ANOVA Results (Independent Variable = Level of Age Group)

Subscale by Age Group      df    SS      F      p
Realism subscale
  Between                  2     .27     .69    .507
  Within                   60    11.69
Transferability subscale
  Between                  2     2.32    3.11   .052
  Within                   60    22.32
Value subscale
  Between                  2     .07     .18    .839
  Within                   60    11.94
Pace/flow
  Between                  2     3.43    4.43   .016*
  Within                   59    22.85

doi:10.3928/01484834-20040401-03
