Journal of Nursing Education

Major Article 

Diversity of Nursing Student Views About Simulation Design: A Q-Methodological Study

Jane B. Paige, PhD, RN, CNE, CHSE; Karen H. Morin, PhD, RN, ANEF, FAAN

Abstract

Background:

Education of future nurses benefits from well-designed simulation activities. Skillful teaching with simulation requires educators to be constantly aware of how students experience learning and perceive educators’ actions. Because revision of simulation activities considers feedback elicited from students, it is crucial to understand the perspective from which students base their response.

Method:

In a Q-methodological approach, 45 nursing students rank-ordered 60 opinion statements about simulation design into a distribution grid.

Results:

Factor analysis revealed that nursing students hold five distinct and uniquely personal perspectives—Let Me Show You, Stand By Me, The Agony of Defeat, Let Me Think It Through, and I’m Engaging and So Should You.

Conclusion:

Results suggest that nurse educators need to reaffirm that students clearly understand the purpose of each simulation activity. Nurse educators should incorporate presimulation assignments to optimize learning and help allay anxiety. The five perspectives discovered in this study can serve as a tool to discern individual students’ learning needs. [J Nurs Educ. 2015;54(5):249–260.]

Dr. Paige is Associate Professor, School of Nursing, Milwaukee School of Engineering, and Dr. Morin is Professor Emerita, University of Wisconsin-Milwaukee, Milwaukee, Wisconsin.

This work was supported in part by funding from the Harriet Werley Doctoral Student Research Award from the University of Wisconsin-Milwaukee College of Nursing and Sigma Theta Tau International–Eta Nu Chapter Graduate Student Scholarship Award.

Dr. Morin was not involved in the peer review or decision-making process for this manuscript.

The authors have disclosed no potential conflicts of interest, financial or otherwise.

Address correspondence to Jane B. Paige, PhD, RN, CNE, CHSE, Associate Professor, Milwaukee School of Engineering, School of Nursing, 1025 North Broadway St., Milwaukee, WI 53202; e-mail: paige@msoe.edu.

Received: August 26, 2014
Accepted: January 21, 2015

Benner, Sutphen, Leonard, and Day (2010) reported that nursing education programs are deficient in preparing future nurses. In response to this identified deficiency, calls to transform the education of future nurses (Benner et al., 2010; Institute of Medicine, 2010; McNelis et al., 2014) have intensified the exploration of existing and new pedagogical approaches. The metamorphosis of simulation as an educational strategy is one such approach (Ironside & Jeffries, 2010). Yet, as educational research on simulation proceeds (Cant & Cooper, 2009; Lapkin, Levett-Jones, Bellchambers, & Fernandez, 2010), investigators struggle to keep pace as simulation is integrated into nursing curricula (Schiavenato, 2009). An essential and sometimes absent focus for educational research is discovering how new pedagogies are seen from the students’ viewpoint (Le Couteur & Delfabbro, 2001; Pratt, 1998). Because students frequently score and provide feedback on simulation activities, it is crucial to know from what perspective they base their response. If educators misinterpret what students mean in their commentaries, then subsequent revision of teaching practices may occur based on faulty information. A point of view (perspective) is a complex phenomenon to explore, as it reflects subjective feelings and underlying beliefs (Pratt, 1998). Yet, investigation of the subjectivity inherent in perspectives offers valuable insight into human behavior and ways of thinking (Stephenson, 1953). As a part of a larger study about simulation design (Paige, 2013), this article reports on the research question, “What are nursing students’ perspectives about the design of simulation activities as operationalized by nurse educators?”

Background

Bland, Topping, and Wood (2010) conceptualized simulation as a hypothetical opportunity created to facilitate student engagement and integrate practical and theoretical learning. The expected benefit of simulation is the ability to foster clinical judgment (Ironside & Jeffries, 2010) and develop students’ “sense of salience” (Benner et al., 2010, p. 14) about what is most urgent in each clinical situation. However, as simulation activities occur, it becomes apparent that not all simulations are equally effective nor are the simulation design characteristics of equal importance (Kneebone, 2005; Waxman, 2010). Although some investigators have reported key simulation design features, including repetitive practice, debriefing, range of difficulty level, realism, and student support (Issenberg, McGaghie, Petrusa, Lee Gordon, & Scalese, 2005; Jeffries, 2012), these are broad, conceptually based categories containing terms that lack clear, conceptual definitions (Groom, Henderson, & Sittner, 2014). As educators deliberate on the application of simulation design characteristics, they need to make choices. What is yet unknown is how students view the choices that educators make.

Theoretical Framework

The National League for Nursing–Jeffries Simulation Framework (NLN-JSF) is a comprehensive framework developed to provide theoretical direction as educators plan, conduct, and evaluate simulation activities (Jeffries, 2012). Visually, the NLN-JSF depicts an interaction among three educational components (facilitator [formerly named teacher], participant [formerly named student], and educational practices) and five simulation design characteristics (objectives, fidelity, problem solving, student support, and debriefing) (Figure 1). Guided by this interaction, a population of opinion statements about how to design and conduct simulation activities was gathered. Using these opinion statements as the unit of analysis, investigators applied a Q-methodological approach to explore student views about simulation design.

Figure 1.

Jeffries simulation framework. From Simulation in nursing education: From conceptualization to evaluation (p. 37), by P. Jeffries, 2012, New York, NY: National League for Nursing. Copyright 2012 by the National League for Nursing. Reprinted with permission.

Method

Q-Methodology

Q-methodology is a research approach that provides investigators with the ability to investigate and measure subjectivity (Brown, 1980). When Q-methodology is used, the aim is to reveal patterns of thinking, rather than the proportion of people holding a particular perspective. Q-methodology uses a combination of qualitative and quantitative techniques with unique terminology and particular methodological processes (Watts & Stenner, 2012). In brief, Q-studies begin with a comprehensive collection of opinions on a topic of interest (Brown, 1980; Stephenson, 1953). The opinion statements (termed a concourse) become the population of interest. However, the concourse can potentially contain hundreds of statements. Consequently, it is necessary to reduce the concourse to a workable and representative subset (a Q-sample). Typically, 40 to 60 statements sampled from the concourse are sufficient in number to elicit existing points of view (Brown, 1980). Recruiting people for Q-studies is purposeful and based on participants’ potential to hold differing points of view, with relatively small numbers of individuals (typically 40 to 60) needed (Brown, 1980). The recruited participants (a P-set) rank and order the Q-sample statements into a quasinormal distribution grid. This rank-ordering process (Q-sorting) forces participants to judge each statement relative to all other statements and, in so doing, reveal underlying values, beliefs, and ways of thinking. Participants’ individual Q-sorts (each a unique arrangement of Q-sample statements) undergo correlation and factor extraction–rotation procedures. Interpretation of the resulting factors proceeds by converting factor z scores into factor array scores. A factor array displays a reconfigured Q-sort based on the composite and weighted z scores from all participants who define a particular factor (Watts & Stenner, 2012). 
During factor interpretation, distinguishing statements (i.e., statements placed in statistically different positions, compared with all other factors) and characterizing statements (i.e., statements placed at the polar ends) are given special attention (Watts & Stenner, 2012). Likewise, factor interpretation considers post-sort explanations offered by participants on their reason for placement of statements within the distribution grid. Together, these factor analytic procedures and interpretive approaches determine how individuals group together to reveal differing points of view and patterns of thinking (Brown, 1980).
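The by-person analytic sequence described above can be sketched in a few lines of code. This is an illustrative sketch only: it uses synthetic Q-sort data in place of the study's actual sorts, and it stops at principal component extraction (PQMethod adds varimax rotation and weighted factor-array computation on top of this step).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 60 statements (rows) x 45 participants
# (columns); each column is one participant's Q-sort (positions -5..+5).
sorts = rng.integers(-5, 6, size=(60, 45)).astype(float)

# By-person correlation: participants, not variables, are correlated
# with one another -- the defining move of Q (vs. R) factor analysis.
corr = np.corrcoef(sorts, rowvar=False)  # 45 x 45 matrix

# Principal component extraction on the person-by-person correlations.
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Unrotated loadings for a 5-factor solution; PQMethod would follow
# this with varimax rotation before computing factor arrays.
n_factors = 5
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
print(loadings.shape)  # (45, 5): one loading per participant per factor
```

With random data the loadings are meaningless; the point is only the shape of the computation, a participant-by-factor loading matrix derived from person-to-person correlations.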

Concourse and Q-Sample

In the current study, 392 opinion statements about simulation design gathered from 35 nurse educators and the simulation literature base were used to populate the concourse. Guided by the NLN-JSF, a 3×5 factorial design (teacher, student, educational practices × the five simulation design characteristics) provided the structure to construct a 60-statement Q-sample. The steps in Q-sample construction undertaken for the current study are reported elsewhere (Paige & Morin, 2014). Prior to conducting the Q-study, a pilot study tested the Q-sample, Q-sorting process, and recruitment strategies.
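The 3×5 factorial sampling logic can be illustrated in code. Everything here is hypothetical: the cell names follow the NLN-JSF components and design characteristics, the toy concourse is invented, and the even draw of 4 statements per cell (15 cells × 4 = 60) is an assumption, because the article does not state how statements were apportioned across cells.

```python
import random

# NLN-JSF structure taken from the text; the even 4-per-cell draw
# below is an assumption made so that 15 cells yield 60 statements.
components = ["teacher", "student", "educational_practices"]
characteristics = ["objectives", "fidelity", "problem_solving",
                   "student_support", "debriefing"]
PER_CELL = 4  # 15 cells x 4 statements = 60

def build_q_sample(concourse, seed=1):
    """concourse maps (component, characteristic) -> list of statements.
    Draws PER_CELL statements from every cell of the factorial design."""
    rng = random.Random(seed)
    sample = []
    for comp in components:
        for char in characteristics:
            sample.extend(rng.sample(concourse[(comp, char)], PER_CELL))
    return sample

# Toy concourse with placeholder statements in every cell.
concourse = {(c, ch): [f"{c}/{ch} statement {i}" for i in range(10)]
             for c in components for ch in characteristics}

q_sample = build_q_sample(concourse)
print(len(q_sample))  # 60
```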

Participant Selection (P-Set)

Nursing students’ experience with simulation and the enrollment size of nursing programs are possible demographics that could influence students’ opinions (perspectives) about simulation design. To reduce the risk of missing viewpoints, it was important to recruit students with varying levels of experience with simulation. Students who are first-time participants, as well as students who have participated in multiple simulation experiences, have opinions that need to be heard. Thus, to locate variation in possible points of view, a 3×3 (nine-cell) P-set matrix provided the sampling frame to recruit 45 nursing students (Table 1).

Table 1:

P-Set Matrix for Recruitment and Recruitment Results

Nursing Student Recruitment

The National Student Nurses Association (NSNA), with more than 60,000 members (NSNA, 2012), provided the vehicle for accessing nursing students. Following institutional review board approval, recruitment memorandums were posted in the NSNA weekly newsletter—one in September 2012 (48 replies of interest) and a second in March 2013 (47 replies of interest). Given the aim to recruit five participants per cell of the nine-cell P-set matrix, study packets were mailed to 58 responders, and 32 were returned (55% return rate). Because some P-set matrix cells still lacked respondents, a second recruitment strategy accessed students attending the February 2013 Wisconsin Student Nurses Association conference. These two recruitment strategies resulted in a P-set of 45 nursing students. As evident in Table 1, recruitment for the nine cells ranged from two to six nursing students per cell. Although five participants per matrix cell were desired, a completely balanced P-set matrix was unnecessary because, according to Brown (1980), the matrix serves only as a guide to locate diverse views. Demographic descriptors of the P-set are presented in Table 2.

Table 2:

Demographics of Nursing Student P-Set

Procedure

Nursing students received a $5 gift card as an incentive, a letter stating that completion of the card-sorting activity served as their consent to voluntarily participate in the study, and the following study items: (a) a deck of 60 cards (each card included one opinion statement from the Q-sample and was randomly numbered from 1 to 60), (b) the condition of instruction for the card sort, (c) a distribution grid (Figure 2), and (d) a tabulation sheet. Students sorted the cards according to the question, “What would you most recommend, or most not recommend, in the design of a simulation activity in nursing education?” Students provided written explanations for cards placed at the −5 and +5 ends of the distribution grid.

Figure 2.

Distribution grid of the card sort.
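A forced quasinormal distribution can be expressed as fixed column counts that every completed Q-sort must match. The counts below are an illustrative guess; the article's Figure 2 shows the actual grid shape used, and only the total of 60 cards is taken from the text.

```python
from collections import Counter

# Hypothetical column counts for a forced quasinormal distribution of
# 60 cards across the -5..+5 grid (illustrative guess; only the
# 60-card total comes from the article).
columns = {-5: 2, -4: 3, -3: 4, -2: 6, -1: 8, 0: 14,
           1: 8, 2: 6, 3: 4, 4: 3, 5: 2}
assert sum(columns.values()) == 60

def validate_sort(placements):
    """placements maps card number (1-60) -> grid column (-5..+5).
    Returns True when the sort matches the forced distribution."""
    counts = Counter(placements.values())
    return all(counts.get(col, 0) == n for col, n in columns.items())

# A mechanically built sort that fills the grid exactly.
demo, card = {}, 1
for col, n in columns.items():
    for _ in range(n):
        demo[card] = col
        card += 1
print(validate_sort(demo))  # True
```

The forced shape is what makes each placement a relative judgment: a card can occupy +5 only at the expense of every other card competing for those few slots.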

Analysis

By-person factor analysis (i.e., factoring individuals by their thinking patterns, rather than by traits, as in conventional factor analysis) involved the sequential application of correlations, factor extraction–rotation, and computation of weighted factor arrays (McKeown & Thomas, 2013). A principal component extraction method with varimax rotation (Watts & Stenner, 2012) located the best factor solution: one that explained the maximal amount of variance in the correlation matrix, minimized the number of confounding and nonsignificant sorts, and avoided significant interfactor correlations. A 0.01 significance level determined factor loading. A free software program specifically created for Q-methodology, PQMethod 2.33 (Schmolck, 2012), facilitated the statistical calculations and generation of factor arrays. The qualitative techniques applied a constant comparative process in which the resulting factor arrays were compared for differences and similarities (Brown, 1980). Students’ written explanations for cards placed at the ends of the distribution grid, along with attention to distinguishing and characterizing statements, contributed interpretative value (Brown, 1980). Together, these procedures facilitated a gestalt approach to factor interpretation.
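The 0.01 significance criterion for factor loadings has a conventional computation in Q-methodology (Brown, 1980): the standard error of a zero-order loading is 1 divided by the square root of the number of Q-sample statements, and a loading is significant when its absolute value exceeds the critical z multiplied by that standard error. A sketch for this study's 60-statement Q-sample:

```python
import math

N_STATEMENTS = 60

# Standard error of a zero-order factor loading (Brown, 1980):
# SE = 1 / sqrt(number of statements in the Q-sample).
se = 1 / math.sqrt(N_STATEMENTS)

# At the 0.01 significance level the critical z is ~2.58, so a
# participant's loading defines a factor when it exceeds:
threshold = 2.58 * se
print(round(threshold, 3))  # 0.333
```

So for this study, any participant whose loading on a factor exceeded roughly ±0.33 (and who met that criterion on only one factor) would count as defining that factor.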

Results

Inspection of results revealed five distinct factors (perspectives) held by nursing students that explained 42% of the study variance. Twenty-seven of the 45 nursing students loaded solely on one of the five factors, 15 students loaded (confounded) on two factors, and three students did not load on any factor (Table A; available in the online version of this article). Nonsignificant interfactor correlations (p > 0.01) indicated that each factor represented a distinct perspective. To avoid obscuring factor clarity, Q-sorts confounded on more than one factor were excluded from computation of the factor arrays and subsequent factor interpretation (Watts & Stenner, 2012). Factor descriptions follow, exemplified with Q-sample statements (card number [1 to 60], array score [−5 to +5]) and written explanations (quotes) from students on why they placed cards at −5 and +5. Factor array tables (Tables 3–7) compare the ranking of statements across factors.
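The partition of participants into sole loaders, confounded sorts, and non-loaders can be sketched as a simple classification over the loading matrix. The toy loadings below are invented for illustration; only the threshold logic reflects the procedure described in the text.

```python
import numpy as np

# p < .01 cutoff for a 60-statement Q-sample: 2.58 / sqrt(60) ~ 0.333.
THRESHOLD = 0.333

def classify_sorts(loadings, threshold=THRESHOLD):
    """loadings: (participants x factors) array.
    Returns sole loaders per factor, confounded sorts, and non-loaders."""
    sig = np.abs(loadings) > threshold
    n_sig = sig.sum(axis=1)
    sole = {f: [i for i in range(len(loadings))
                if n_sig[i] == 1 and sig[i, f]]
            for f in range(loadings.shape[1])}
    confounded = [i for i in range(len(loadings)) if n_sig[i] > 1]
    unloaded = [i for i in range(len(loadings)) if n_sig[i] == 0]
    return sole, confounded, unloaded

# Toy example: 4 participants, 2 factors.
L = np.array([[0.6, 0.1],    # sole loader on factor 0
              [0.5, 0.5],    # confounded across both factors
              [0.1, 0.0],    # loads on no factor
              [0.2, 0.7]])   # sole loader on factor 1
sole, confounded, unloaded = classify_sorts(L)
print(sole, confounded, unloaded)  # {0: [0], 1: [3]} [1] [2]
```

Only the sole loaders contribute to a factor array, mirroring the study's exclusion of confounded sorts from factor computation and interpretation.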

Table A:

Nursing Student - Factor Loadings

Table 3:

Factor Array for the “Let Me Show You” (Factor 1) Perspective

Table 4:

Factor Array for the “Stand By Me” (Factor 2) Perspective

Table 5:

Factor Array for the “Agony of Defeat” (Factor 3) Perspective

Table 6:

Factor Array for the “Let Me Think It Through” (Factor 4) Perspective

Table 7:

Factor Array for the “I’m Engaging and So Should You” (Factor 5) Perspective

Factor 1: The “Let Me Show You” Perspective

Four nursing students loaded solely on factor 1, which explained 11% of the study variance (Table 3). Students holding this perspective want to figure things out on their own (#20, +4), want to receive minimal assistance and cueing (#22, +4), and let the simulation happen as it happens (#57, +3). These students want to talk during the debriefing to figure out what they know (#40, +4). They prefer verbal debriefing, rather than written debriefing (#50, −5), which is most likely related to their comfort level with talking. They are indifferent regarding whether learning objectives are specific (#17, 0) or that cues are scripted and consistent among students (#47, −4). They expect all students to prepare for all simulation roles (#13, +5). They are not interested in playing non-nursing roles (#25, +5). They see no benefit in mixing students across different levels within the program (#54, −5).

Factor 2: The “Stand By Me” Perspective

Eleven nursing students loaded solely on factor 2, which explained 10% of the study variance (Table 4). Students holding this perspective want structure and guidance in their learning. They want an orientation and the opportunity to practice with the manikins (#23, +4). They desire specific learning objectives (#17, −5) and find it helpful when the learning objectives are verbally reviewed (#16, +3); if they are uncertain what to expect, mistrust may develop. Students recommend that simulations follow theoretical content (#29, +4). They are least interested in non-nursing role-playing (#25, +5; #15, −4), as this “reduces the reality” of the simulation and could “confuse the student” if the role is not well “scripted.” These students clearly prefer interacting with actual patients in the clinical setting, rather than with simulated patients (#56, −4), in part because, as one student indicated, “two less hours spent in a simulation is cheating the student out of learning time they paid for.” Students appreciate working “together as it calms anxiety,” and they accept the educator being present in the simulation room (#9, −4) to offer direction on use of equipment and guidance in figuring out the situation (#20, −5; #58, +3). They consider it acceptable to stop a simulation to correct mistakes as they happen (#57, −2). During the debriefing, students count on the educator to ask questions (#6, +5; #40, −2) to help them understand their own thinking process.

Factor 3: The “Agony of Defeat” Perspective

Five nursing students loaded solely on factor 3, which explained 8% of the study variance (Table 5). Students holding this perspective are most concerned about how they feel following the simulation experience, as one student stated: “It is very important that everyone feels like a ‘super’ nurse when they leave.” Students want to leave the simulation feeling good about themselves, as opposed to feeling defeated (#60, +5). In part, a feeling of defeat relates to whether grading of simulations occurs (#30, +5). Instead, students recommend that points be allocated for “showing up prepared and participating” or as “a pass-or-fail” assessment. Compared with other perspectives, students are least likely to value presimulation assignments (#42, −2) or review of learning objectives (#16, −2), perhaps because they can rely on each other to get through the simulation (#10, +4). These students do not recommend singling out weaker students (#8, −5) because “it puts too much pressure on them and could be embarrassing.” It is okay to stop a simulation to offer guidance (#57, −4). Students consider the use of humor to be important (#39, +4) and value the opportunity to role-play nonnursing characters (#25, −4).

Factor 4: The “Let Me Think It Through” Perspective

Three nursing students loaded solely on factor 4, which explained 7% of the study variance (Table 6). Students holding this perspective see greater value from simulation if educators are properly trained in simulation technology (#38, +5; #4, +3) and understand how to use it (#46, +4), “as it doesn’t help us learn when the main piece of equipment (manikin) is broken and no one can fix it.” Students may see a connection between educators’ level of training and teaching expertise with their feelings of defeat (#60, +5) or being singled out if struggling (#31, −5). For example, a preference exists in not being interrupted to provide assistance with equipment (#58, −4) or redirected by cueing (#41, −5) because it throws off one’s train of thought, as indicated by one student: “I don’t like it when my thoughts are stopped, it makes me feel stupid and makes me more nervous.” Students prefer not stopping a simulation (#57, +3) or having others think aloud (#7, −3) because it could interfere with independent thought, as “students need to learn on their own without someone else putting the idea in their head.” Diverging from other perspectives, these students recommend written, in addition to verbal, debriefings (#50, +4) and are less interested in being questioned during debriefing (#6, +1). These students have no qualms about making things up (#33, −2), and pretending (#14, −3) during a simulation is acceptable.

Factor 5: The “I’m Engaging and So Should You” Perspective

Four nursing students loaded solely on factor 5, which explained 6% of the study variance (Table 7). Although all perspectives recommend creating a realistic simulation, students holding this perspective have the strongest feelings about realism. They see reality created in the detail and functioning of the equipment (#35, +5), as well as how seriously educators (#36, +4; #39, −4) and students (#21, +4) take the simulations. Focusing on the lack of realism is unnecessary (#24, −5) and use of the word pretend is not acceptable (#14, +5). Contrary to other student perspectives, it is acceptable to allow grading of simulations (#30, −4; #34, +2) and deliver consequences if students do not take the simulation seriously (#21, +4). Students sharing this perspective recommend viewing video recordings of the simulations (#51, −5) and having presimulation assignments (#42, +3), and they are neutral regarding whether “weaker” students are placed in roles that force them to perform (#8, 0), stating that “weak students need help! Simulation is a wake-up call for them.” Of all perspectives, those sharing this view are least concerned about students feeling defeated following a simulation (#60, −1).

Discussion of Perspectives

Study results revealed that nursing students hold five distinct perspectives about the way nurse educators design simulations. Inspection of the findings indicates that participation in simulation activities evokes different emotional responses. Several possible reasons exist for these findings. Anxiety is a common emotional response, and some of the particular circumstances contributing to anxiety are revealed in the perspectives. Students holding the Stand By Me and The Agony of Defeat perspectives indicate that anxiety increases if educators are not able to offer assistance or if they feel singled out as a weaker student. These findings are comparable to those of other studies. Cordeau (2010) found that perceived anxiety occurs when students do not know what to expect, when they are being videorecorded, and when they fear failure. Kelly, Hager, and Gallagher (2014) found that what matters most to students is having academic support during simulation delivery, a practice that varies among educators. Ganley and Linnard-Palmer (2012) and Nielsen and Harder (2013) reported videotaping to be a contributor to student anxiety; however, the five perspectives revealed in the current study indicate that students had no qualms about being videotaped.

A feeling of defeat is an emotional response that exists in the Agony of Defeat perspective. Acknowledging the existence of this perspective is vital, but more important is gaining an understanding of what contributes to this defeated feeling. Written explanations for cards placed at −5 or +5 provide helpful insight into the differing accounts for this feeling. In the Agony of Defeat perspective, students indicate they want to feel good about themselves and feel bad and inadequate if they do not perform up to expectations. Conceivably, this feeling of defeat relates to the visible identification of learning gaps. During simulations, students witness each other’s floundering, as opposed to other learning activities during which another student’s performance is not as obvious. Parker and Myrick (2012) labeled this type of situation as “performing in the fishbowl” (p. 368).

A finding that deserves further investigation is the discovery that students holding the Agony of Defeat perspective are least likely to recommend use of presimulation assignments or review of learning objectives. This finding calls into question whether student preparation, or the lack thereof, influences the degree to which students experience a feeling of defeat. Furthermore, students who held the Agony of Defeat perspective, in part, associate their defeated feeling with the grading of simulations. However, it is unclear what defines a grade. Even though the topic of grading simulations is discussed in the literature (Cordeau, 2010; Sportsman, Schumacker, & Hamilton, 2011), it is unclear whether this refers to a team or an individual grade, or whether the grade is based on points for performance, for showing up prepared, or for participation. The student perspectives revealed in the current study may reflect this variation in grading practice. Of note is the finding that students holding the I’m Engaging and So Should You perspective consider grading of simulations acceptable, and the feeling of defeat carries little salience for them. Rather, these students express frustration with their peers and are more likely to recommend consequences for students who do not take simulation seriously. Considering these findings, it is important to recognize that simulations may be designed as a learning activity (formative assessment) or as an evaluation activity (summative or high stakes). It is possible that the student perspectives revealed in this study reflect students thinking of simulation as either a learning activity or an evaluative activity.

Results of this study reveal new findings. The Let Me Think It Through perspective has not been described in the simulation literature. Students who hold this perspective need extra time to work things out in their minds and can get off track if their train of thought is interrupted. It is conceivable that students holding the Let Me Think It Through perspective may have additional difficulty recovering from an interruption in thought. What remains unknown is whether characteristics exist that place students at higher risk for this interruption in thought. Various studies have investigated task interruptions (Altmann, Trafton, & Hambrick, 2014; Brumby, Cox, Back, & Gould, 2013), including the interruptions of nurses as they work in health care environments (Grundgeiger, Sanderson, MacDougall, & Venkatesh, 2010). It may be helpful to explore whether students (and future nurses) have particular tendencies that limit their ability to recover from an interruption in their thought process. Students holding the Let Me Think It Through perspective may benefit from a written debriefing assignment that can provide this opportunity. This was actually recommended (+4) as an option by students holding this perspective. Students holding the Let Me Think It Through perspective may be a subset of students who would benefit from repeating a simulation.

In addition, a finding not previously reported in the literature is the variability in how students view stopping a simulation. For example, students holding the Let Me Think It Through perspective consider that stopping a simulation could interfere with their train of thought. On the other hand, students holding The Agony of Defeat and Stand By Me perspectives expect simulations to be stopped if they are doing something wrong. At the same time, students holding the Let Me Show You perspective want the opportunity to figure things out on their own, receive minimal assistance and cueing from educators, and prefer not to stop simulations. Students holding the I’m Engaging and So Should You perspective take offense when other students are unprepared and prefer not to stop a simulation to offer help. The reasons for these diverse preferences on whether to stop a simulation likely relate to each student’s unique needs, such as learning style, level of academic ability, level of preparation, and comfort with simulation, to name a few.

Perspectives Within the Context of NLN-JSF

The NLN-JSF (Jeffries, 2012) conceptualizes five simulation design characteristics that nurse educators are to consider when designing simulation activities. Objectives, as one design characteristic, are to be clear, concise, realistic, and correspond to students’ level of knowledge and experience (Jeffries, 2012). However, the degree of specificity for a learning objective remains unknown (Groom et al., 2014). As revealed in the current study, students who hold the Stand By Me, Agony of Defeat, or I’m Engaging and So Should You perspective recommend specifically written objectives, whereas students who hold a Let Me Show You or Let Me Think It Through perspective are indifferent as to whether objectives are specific or general.

Student support, as a design characteristic, occurs when assistance is provided to students but does not interfere with their independent thought (Jeffries, 2012). Allowing time for students to problem solve and make decisions is congruent with the perspectives revealed in the current study. However, in the NLN-JSF, student support connotes an instructional approach initially derived from use of cues (Jeffries, 2012), whereas the perspectives in the current study reveal the importance of an emotional component to support. Findings from the current study suggest that it may be necessary to reexamine student support, not only from an instructional approach but also to include an emotional approach.

Findings revealed that fidelity is an important design characteristic across all perspectives and is achieved when equipment is functional and educators are proficient in its operation. Therefore, in addition to creating realism, it is equally important that educators know how to maintain it by being properly educated in how to effectively use and troubleshoot the technology.

Problem solving, as a design characteristic, happens when opportunities are designed into a simulation that engage students in tasks that increase knowledge and skills and challenge beliefs (Jeffries, 2012). Yet, student perspectives in the current study differed in their recommendations for this design characteristic. Some students wanted to problem solve independently, with minimal educator or peer assistance, whereas other students depended on others to help them along in their thinking.

Finally, debriefing, as a design characteristic, occurs when the educator facilitates students’ reexamination of the clinical encounter to foster clinical reasoning and judgment (Jeffries, 2012). This characteristic was important across perspectives, as students wanted educators to help them understand their own ways of thinking. Yet, the level of student participation expected during debriefing varied across perspectives. Conceivably, this is due to the varying level of students’ comfort with their knowledge, as well as the time individual students need to process information.

Implications for Educational Practice

Brookfield (2006) claimed that educators need constant awareness regarding how students experience learning and perceive educators’ actions. However, given that students may not always be honest, upfront, or comfortable expressing their views, uncovering the beliefs underpinning student perspectives can be a challenge—hence the value Q-methodology contributes in revealing the subjectivity inherent in perspectives (Brown, 1980). Based on the perspectives that emerged from this study, it became apparent that students experience simulations in personal and diverse ways. Understanding these perspectives is important, given the trend to incorporate simulation into nursing curricula. Prior to making curricular decisions based on student learning outcomes following simulation activities, confidence in the simulation activity as the educational intervention must be established. The five nursing student perspectives, as discovered in this study, offer a student-centered viewpoint on ways to design and conduct simulation activities. The following are new ideas for nurse educators to consider as they design, conduct, and use student feedback to evaluate and revise simulations.

First, the diversity in perspectives requires educators to understand their particular group of students. Bearing in mind that simulations typically involve groups of students, any particular simulation is likely to include students holding one or more of the perspectives discovered in this study. A novel way to assess the diversity in student views is to poll students on how well each of the five perspectives matches their way of thinking. The first author (J.B.P.) has undertaken such activities and found them to yield useful information. For example, if students hold a Let Me Think It Through perspective, then instructional delivery and the timing of cues can be tailored to meet the needs of these students, who need more time to process information. If such a group of students exists, it may be beneficial to encourage them to meet with educators at a later point following the simulation to reinforce their learning. Of note, when students read the descriptions of the five perspectives, they acquire insight into the thinking of their peers and into factors that may influence teamwork dynamics. This insight is useful and can be transferred into future practice, as members of health care teams similarly have different ways of thinking.

Second, it is important that nurse educators reaffirm that students and fellow educators understand the purpose of the simulation activity. If this does not occur, students may perceive incongruences among educators, and mistrust may result. In the current study, students used phrases such as “being set up to fail,” “trying to trick me,” and “sink or swim” in their explanations for card placement. These phrases indicate that students may mistrust educators’ intent behind the simulation activity. Even if students review the learning objectives, they also need to be clear about the nature of the simulation—that is, whether it is a formative, summative, or high-stakes evaluation (Sando et al., 2013). In formative assessments, students are still learning the material, and simulations help students make connections between theory and practice. Mistakes are expected, and students need reassurance that this is acceptable. In such simulations, investing time up front to discuss how mistakes will be handled and how support will be provided can help allay anxiety. On the other hand, summative or high-stakes evaluations assess whether students meet preestablished criteria. In these types of high-stakes simulations (which may result in student failure), it is conceivable that students feel they are “being set up to fail.” To counter this feeling, it is important that students are clear on the evaluative criteria and that the instruments used to make these determinations are valid and reliable (Robinson & Dearmon, 2013; Sando et al., 2013).

Third, requiring students to complete presimulation assignments that review knowledge and skills for the particular simulation activity can help allay anxiety and promote achievement of learning objectives (Elfrink, Nininger, Rohig, & Lee, 2009; Nielsen & Harder, 2013). Although students may view presimulation assignments as extra work, in retrospect, students holding four of the five perspectives in this study found presimulation activities to be beneficial.

Identification of these five perspectives generates further questions to investigate. For example, what impact would assigning students to groups based on similar or divergent perspectives have on learning outcomes or level of anxiety? If students who hold a Let Me Show You perspective were grouped together, would these students be able to reach learning objectives more quickly? If students who hold a Let Me Think It Through perspective were grouped together, would these students be able to figure out and deal with the problem if given enough time? When using feedback to evaluate simulations, is it possible to correlate students’ perspectives about simulation design with their ratings of simulation activities to yield more meaningful evaluative data?

Limitations

Several limitations to this study need acknowledgment. First, post-Q-sort, in-person interviews to understand the reasons behind card placement are common in Q-methodology. Because this study recruited students from across the United States, the investigators instead relied on participants’ written explanations for cards placed at the ends of the grid. Although the written explanations offered by students were generally detailed and informative, in-person interviews may have provided additional information.

A second possible limitation was having students sort opinion statements that were gathered from nurse educators. Typically, in Q-studies, participants completing the sorting process are characteristically similar to those who provide the opinion statements (Brown, 1980; Watts & Stenner, 2012). However, in the current study, it was important to understand students’ perspectives about the actions nurse educators take during simulation design. To control for this limitation, the investigators conducted a pilot study to test the opinion statements (Q-sample) with nursing students.

Furthermore, no male nursing students participated in this study. It is possible that male students hold differing points of view that were missed. Finally, as students participate in simulation activities, attitudes toward simulation may change. Therefore, the current study provides a snapshot in time of the perspectives students hold about simulation design. As such, there is no guarantee that this one Q-study located all existing perspectives (Brown, 1980), yet the five perspectives it did discover are real and do exist.

Conclusion

In the current study, 45 purposely selected nursing students rank-ordered 60 opinion statements theoretically drawn from a concourse of 392 opinions about simulation design gathered from nurse educators. As opposed to surveys that measure opinions against predetermined criteria (Woods, 2011), participants in the current Q-methodological study ranked opinion statements in an interactive process and, in so doing, revealed their personal choices, feelings, and beliefs. It was through this sorting and ranking process that the variability in nursing students’ views about simulation design was revealed. In light of this study’s findings, it is recommended that nurse educators have processes to reaffirm that students understand the purpose of each simulation activity. Educators should incorporate presimulation assignments to optimize students’ learning and to subsequently help allay student anxiety. Educators may find value in employing the five perspectives discovered in this study as a tool to discern students’ particular learning needs. Such knowledge would be beneficial to educators as they strategize means to operationalize the design of simulation activities.

References

  • Altmann, E.M., Trafton, J.G. & Hambrick, D.Z. (2014). Momentary interruptions can derail the train of thought. Journal of Experimental Psychology: General, 143, 215–226. doi:10.1037/a0030986 [CrossRef]
  • Benner, P., Sutphen, M., Leonard, V. & Day, L. (2010). Educating nurses: A call for radical transformation. San Francisco, CA: Jossey-Bass.
  • Bland, A., Topping, A. & Wood, B. (2010). A concept analysis of simulation as a learning strategy in the education of undergraduate nursing students. Nurse Education Today, 31, 664–670. doi:10.1016/j.nedt.2010.10.013 [CrossRef]
  • Brookfield, S. (2006). The skillful teacher: On technique, trust, and responsiveness in the classroom (2nd ed.). San Francisco, CA: Jossey-Bass.
  • Brown, S. (1980). Political subjectivity: Applications of Q methodology in political science. New Haven, CT: Yale University Press.
  • Brumby, D.P., Cox, A.L., Back, J. & Gould, S.J.J. (2013). Recovering from an interruption: Investigating speed-accuracy trade-offs in task resumption behavior. Journal of Experimental Psychology: Applied, 19, 95–107. doi:10.1037/a0032696 [CrossRef]
  • Cant, R. & Cooper, S. (2009). Simulation-based learning in nurse education: Systematic review. Journal of Advanced Nursing, 66, 3–15. doi:10.1111/j.1365-2648.2009.05240.x [CrossRef]
  • Cordeau, M. (2010). The lived experience of clinical simulation of novice nursing students. International Journal for Human Caring, 14(2), 9–15.
  • Elfrink, V.L., Nininger, J., Rohig, L. & Lee, J. (2009). The case for group planning in human patient simulation. Nursing Education Perspectives, 30, 83–86.
  • Ganley, B.J. & Linnard-Palmer, L. (2012). Academic safety during nursing simulation: Perceptions of nursing students and faculty. Clinical Simulation in Nursing, 8(2), e49–e57. doi:10.1016/j.ecns.2010.06.004 [CrossRef]
  • Groom, J.A., Henderson, D. & Sittner, B.J. (2014). NLN/Jeffries Simulation Framework State of the Science project: Simulation design characteristics. Clinical Simulation in Nursing, 10, 337–344. doi:10.1016/j.ecns.2013.02.004 [CrossRef]
  • Grundgeiger, T., Sanderson, P., MacDougall, H.G. & Venkatesh, B. (2010). Interruption management in the intensive care unit: Predicting resumption times and assessing distributed support. Journal of Experimental Psychology: Applied, 16, 317–334. doi:10.1037/a0021912 [CrossRef]
  • Institute of Medicine. (2010). The future of nursing: Leading change, advancing health. Washington, DC: National Academies Press.
  • Ironside, P. & Jeffries, P. (2010). Using multiple-patient simulation experiences to foster clinical judgment. Journal of Nursing Regulation, 1(2), 38–41.
  • Issenberg, S.B., McGaghie, W.C., Petrusa, E.R., Lee Gordon, D. & Scalese, R.J. (2005). Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Medical Teacher, 27(1), 10–28. doi:10.1080/01421590500046924 [CrossRef]
  • Jeffries, P. (Ed.). (2012). Simulation in nursing education: From conceptualization to evaluation (2nd ed.). New York, NY: National League for Nursing.
  • Kelly, M., Hager, P. & Gallagher, R. (2014). What matters most? Students’ rankings of simulation components that contribute to clinical judgment. Journal of Nursing Education, 53, 97–101. doi:10.3928/01484834-20140122-08 [CrossRef]
  • Kneebone, R. (2005). Evaluating clinical simulations for learning procedural skills: A theory-based approach. Academic Medicine, 80, 549–553. doi:10.1097/00001888-200506000-00006 [CrossRef]
  • Lapkin, S., Levett-Jones, T., Bellchambers, H. & Fernandez, R. (2010). Effectiveness of patient simulation manikins in teaching clinical reasoning skills to undergraduate nursing students: A systematic review. Clinical Simulation in Nursing, 6, e207–e222. doi:10.1016/j.ecns.2010.05.005 [CrossRef]
  • Le Couteur, A. & Delfabbro, P. (2001). Repertoires of teaching and learning: A comparison of university teachers and students using Q methodology. Higher Education, 42, 205–235. doi:10.1023/A:1017583516646 [CrossRef]
  • McKeown, B. & Thomas, D. (2013). Q methodology (2nd ed.). Los Angeles, CA: Sage. doi:10.4135/9781483384412 [CrossRef]
  • McNelis, A., Ironside, P., Ebright, P., Dreifuerst, K., Zvonar, S. & Conner, S. (2014). Learning nursing practice: A multisite, multimethod investigation of clinical education. Journal of Nursing Regulation, 4(4), 30–35.
  • Nielsen, B. & Harder, N. (2013). Causes of student anxiety during simulation: What the literature says. Clinical Simulation in Nursing, 9, e507–e512. doi:10.1016/j.ecns.2013.03.003 [CrossRef]
  • National Student Nurses Association. (2012). Home page. Retrieved from http://www.nsna.org/default.aspx
  • Paige, J.B. (2013). Simulation design characteristics: Perspectives held by nurse educators and nursing students (Doctoral dissertation). Available from ProQuest Dissertations and Theses database. (UMI No. 3614774)
  • Paige, J.B. & Morin, K.H. (2014). Q-sample construction: A critical step for a Q-methodological study. Western Journal of Nursing Research. Advance online publication. doi:10.1177/0193945914545177 [CrossRef]
  • Parker, B. & Myrick, F. (2012). The pedagogical ebb and flow of human patient simulation: Empowering through a process of fading support. Journal of Nursing Education, 51, 365–372. doi:10.3928/01484834-20120509-01 [CrossRef]
  • Pratt, D.D. (1998). Five perspectives on teaching in adult and higher education. Malabar, FL: Krieger.
  • Robinson, B.K. & Dearmon, V. (2013). Evidence-based nursing education: Effective use of instructional design and simulated learning environments to enhance knowledge transfer in undergraduate nursing students. Journal of Professional Nursing, 29, 203–209. doi:10.1016/j.profnurs.2012.04.022 [CrossRef]
  • Sando, C.R., Coggins, R.M., Meakim, C., Franklin, A.E., Gloe, D., Boese, T. & Borum, J. (2013). Standards of best practice: Simulation standard VII: Participant assessment and evaluation. Clinical Simulation in Nursing, 9(6, Suppl.), S30–S32. doi:10.1016/j.ecns.2013.04.007 [CrossRef]
  • Schiavenato, M. (2009). Reevaluating simulation in nursing education: Beyond the human patient simulator. Journal of Nursing Education, 48, 388–394. doi:10.3928/01484834-20090615-06 [CrossRef]
  • Schmolck, P. (2012). PQMethod software. Kent, OH: Kent State University. Retrieved from http://schmolck.userweb.mwn.de/qmethod/
  • Sportsman, S., Schumacker, R.E. & Hamilton, P. (2011). Evaluating the impact of scenario-based high-fidelity patient simulation on academic metrics of student success. Nursing Education Perspectives, 32, 259–265. doi:10.5480/1536-5026-32.4.259 [CrossRef]
  • Stephenson, W. (1953). The study of behavior: Q-technique and its methodology. Chicago, IL: University of Chicago Press.
  • Watts, S. & Stenner, P. (2012). Doing Q methodological research: Theory, method and interpretation. London, UK: Sage.
  • Waxman, K.T. (2010). The development of evidence-based clinical simulation scenarios: Guidelines for nurse educators. Journal of Nursing Education, 49, 29–35. doi:10.3928/01484834-20090916-07 [CrossRef]
  • Woods, C.E. (2011). Using Q methodology to explore leadership: The role of the school business manager. International Journal of Leadership in Education: Theory and Practice, 14, 317–335. doi:10.1080/13603124.2010.507877 [CrossRef]

P-Set Matrix for Recruitment and Recruitment Results

                               No. of Simulation Experiences
Program Enrollment Size        < 3     3–5     > 5
  < 100 students                6       5       5
  100–250 students              6       6       4
  > 250 students                2       5       6

Demographics of Nursing Student P-Set

Demographic                           n (%)
Gender
  Female                              45 (100)
  Male                                0 (0)
Age (years)
  ≤ 20                                5 (11)
  21–25                               15 (33)
  26–30                               10 (22)
  31–40                               7 (16)
  41–50                               6 (13)
  > 50                                2 (4)
Program type
  Associate’s degree in nursing       15 (33)
  Diploma                             1 (2)
  Baccalaureate degree in nursing     29 (64)
Region
  U.S. Northeast                      3 (7)
  U.S. Midwest                        23 (51)
  U.S. South                          6 (13)
  U.S. West                           13 (29)

Factor Array for the “Let Me Show You” (Factor 1) Perspective

Card   F1   F2   F3   F4   F5   Statement
#13    +5*  +2   +2   +1    0   Assign student roles randomly at the start of the simulation. This way, students need to be prepared for all roles and not just their assigned role.
#25    +5   +5   −4   −1    0   Do not assign students roles outside their scope of practice, such as doctor or respiratory therapist, as they may not have a clear impression about when or how they are required to act in this role.
#40    +4*  −2    0   +2   +1   During debriefing, let students do most of the talking on how they came to conclusions. The nurse educator interferes only if conclusions are erroneous.
#20    +4*  −5   −3   −2   +1   Students should be left to figure out problems on their own during the actual running of the simulation.
#22    +4   +3   +2   +1   +3   Nurse educators conducting simulations need to control the impulse to prematurely cue or interrupt the student during simulation. This allows students time to think and process information.
#57    +3   −2   −4   +3   +2   Do not stop a simulation for any reason. What happens, happens. It is then discussed in the debriefing.
#17     0   −5   −3    0   −4   Design and keep objectives general so that students are not informed of the specific focus of the simulation.
#47    −4   −3   +1   −4   −2   Script and deliver cues in the same way for each simulation, including number of times offered, how, and when.
#50    −5*  −3   +1   +4    0   Use both verbal and written debriefing for simulations, where students need time to consider and think through events such as end-of-life simulations. Comments by students 1 week later are much richer and thoughtful than during the immediate debriefing.
#54    −5*   0   +2   −1   +2   Consider mixing students from different levels in the program. This allows senior students to practice delegation and junior students to see how smart they will be/should be closer to graduation.

Factor Array for the “Stand By Me” (Factor 2) Perspective

Card   F1   F2   F3   F4   F5   Statement
#6     +3   +5   +2   +1   +4   During debriefing, ask questions that get at why students decided to do what they did. Many times students make decisions based on false assumptions.
#25    +5   +5   −4   −1    0   Do not assign students roles outside their scope of practice, such as doctor or respiratory therapist, as they may not have a clear impression when or how they are required to act in this role.
#29    +2   +4   +2   +2   +2   Schedule simulations following theoretical content for students to apply concepts learned in the classroom.
#23    +2   +4    0   +2   −1   Prior to the first simulation, have students observe a simulation and then allow hands-on orientation with the manikin.
#58    −1   +3*  −1   −4   −3   Freely assist students on how to operate equipment during the simulation so as not to distract from the content of the simulation. For example, if students need help programming the intravenous pump, they should say it out loud, and someone will come out of the control room to help.
#16    +1   +3   −2   −1   −1   Review simulation objectives verbally with students. This allows time for nurse educators to stress the purpose of the simulation and how meeting these objectives will facilitate learning.
#40    +4   −2*   0   +2   +1   During debriefing, let students do most of the talking on how they came to conclusions. The nurse educator interferes only if conclusions are erroneous.
#57    +3   −2*  −4   +3   +2   Do not stop a simulation for any reason. What happens, happens. It is then discussed in the debriefing.
#15     0   −4*   0   +3   +1   Assign students to play family role characters. This allows students a better understanding of the experience of family members.
#9     +3   −4    0   −3   +1   Nurse educators should not be present in the room during a simulation, as students tend to rely on the educator to get through the scenario.
#56    −3   −4   +1   −3   −3   It is acceptable to use 4 hours of simulation time to replace 6 hours of clinical experience.
#17     0   −5*  −3    0   −4   Design and keep objectives general so that students are not informed of the specific focus of the simulation.
#20    +4   −5*  −3   −2   +1   Students should be left to figure out problems on their own during the actual running of the simulation.

Factor Array for the “Agony of Defeat” (Factor 3) Perspective

Card   F1   F2   F3   F4   F5   Statement
#30    +2    0   +5*  +2   −4   Do not grade simulations. There are too many variables that cannot be controlled to make it fair for all students.
#60    +1   +3   +5   +5   −1   Take into consideration that students should not feel defeated when leaving the simulation laboratory.
#10    +2   +1   +4*  −1   −3   Run simulations with two to three students to promote the “one whole brain” concept. Between the 3 of them, they should be able to remember enough to get through the simulation.
#39    −1   −2   +4*  −2   −4   Use of humor is important in simulations.
#16    +1   +3   −2   −1   −1   Review simulation objectives verbally with students. This allows time for nurse educators to stress the purpose of the simulation and how meeting these objectives will facilitate learning.
#42     0   +2   −2*  +2   +3   Give students presimulation assignments to help students be more prepared to take care of the simulated patient.
#25    +5   +5   −4*  −1    0   Do not assign students roles outside their scope of practice, such as doctor or respiratory therapist, as they may not have a clear impression when or how they are required to act in this role.
#57    +3   −2   −4*  +3   +2   Do not stop a simulation for any reason. What happens, happens. It is then discussed in the debriefing.
#8     −4   −2   −5   −4    0   Place “weaker” students in roles that force them to perform. Doing so allows nurse educators to better evaluate these students.

Factor Array for the “Let Me Think It Through” (Factor 4) Perspective

Card   F1   F2   F3   F4   F5   Statement
#38    +1   +4   −2   +5   +1   When running a simulation, use only nurse educators who are very familiar and proficient with operating the simulator and have sufficient content knowledge about the scenario.
#60    +1   +3   +5   +5   −1   Take into consideration that students should not feel defeated when leaving the simulation laboratory.
#46     0   +1   −4   +4*  −4   Nurse educators who use simulation should be master’s prepared, as most clinical instructors are required to be.
#50    −5   −3   +1   +4*   0   Use both verbal and written debriefing for simulations, where students need time to consider and think through events such as end-of-life simulations. Comments by students 1 week later are much richer and thoughtful than during the immediate debriefing.
#4     −2   −1   −3   +3*   0   Ideally, three key positions are needed for simulation programs: a subject matter expert (educator with expertise in topic content), an instructional designer (person with expertise in teaching techniques), and an information technology specialist (person with technological expertise).
#57    +3   −2   −4   +3   +2   Do not stop a simulation for any reason. What happens, happens. It is then discussed in the debriefing.
#6     +3   +5   +2   +1   +4   During debriefing, ask questions that get at why students decided to do what they did. Many times students make decisions based on false assumptions.
#33    +1    0   +1   −2   +1   Prior to a simulation, caution students to not make things up (assessment data/findings) or assume things (i.e., do not need to do something) if they do not have what they are looking for.
#7     −2   +2   −1   −3   −2   Ask students to “think aloud” during the simulation. This helps other students, who do not deal with the situation as quickly, to hear what other students are thinking.
#14    +2    0   +3   −3*  +5   Do not use the word “pretend.” During prebriefing, tell students whether they are going to carry out an action, then do it (i.e., give medications, wash hands).
#58    −1   +3   −1   −4   −3   Freely assist students on how to operate equipment during the simulation so as not to distract from the content of the simulation. For example, if students need help programming the intravenous pump, they should say it out loud and someone will come out of the control room to help.
#31    +3   +1   +1   −5*  +3   Use simulation for one-to-one learning and evaluation of students who are struggling or who are possibly unsafe in clinical.
#41    −1    0    0   −5*  −1   If students are going to make an error during a simulation, first give them cues to change their minds. But, if they say, “I am good” or “let’s go do this,” let students make the error and help them to discover the error or omission in debriefing.

Factor Array for the “I’m Engaging and So Should You” (Factor 5) Perspective

Card   F1   F2   F3   F4   F5   Statement
#14    +2    0   +3   −3   +5*  Do not use the word “pretend.” During prebriefing, tell students whether they are going to carry out an action, then do it (i.e., give medications, wash hands).
#35    +4   +4   +3   +4   +5   Creating reality is very important and is in the details. That means that manikins need to function properly, audio should be as high quality as possible, body sounds should be as realistic as possible, and equipment should be as true to what is used in real practice as possible.
#21    −3    0   −2   +1   +4*  There should be consequences for students if they do not take simulation seriously.
#36    +3   +2   −2   −1   +4   Nurse educators need to treat the simulation room and patient like a real person, since students take simulation as seriously as do the educators.
#42     0   +2   −2   +2   +3   Assign students presimulation assignments to help students be more prepared to take care of the simulated patient.
#34    −4   −4   −5   −1   +2*  When grading a simulation, record the number of cues given and factor this in when determining the student’s grade.
#8     −4   −2   −5   −4    0*  Place “weaker” students in roles that force them to perform. Doing so allows nurse educators to better evaluate these students.
#60    +1   +3   +5   +5   −1*  Take into consideration that students should not feel defeated when leaving the simulation laboratory.
#39    −1   −2   +4   −2   −4   Use of humor is important in simulations.
#30    +2    0   +5   +2   −4*  Do not grade simulations. There are too many variables that cannot be controlled to make it fair for all students.
#24    −3   −1    0   +2   −5   Be “real” about the lack of reality in a simulation. This is appreciated by students, and they engage more fully than if this issue is not discussed.
#51    −4   −3   −4    0   −5   Videotaping simulation is unnecessary and a waste of time. If debriefing is done immediately after a simulation, students remember perfectly well what they just did. Instead, spend time discussing, asking questions, and going over thought processes and decisions made.

Nursing Student Factor Loadings

Q-Sort No.   Code      Factor 1   Factor 2   Factor 3   Factor 4   Factor 5
Student 3    Sa35Bd     (.71)      −.01       .18        .00        .20
Student 7    Sb28A      (.64)      .17        .05        −.02       .12
Student 41   Lc28B      (.49)      .22        −.17       .25        .22
Student 12   Sc35A      (.44)      .10        .14        .26        −.32
Student 35   Lb45A      .12        (.66)      .01        −.19       .05
Student 32   Ma45A      −.21       (.64)      .11        −.06       −.04
Student 2    Sa20B      .22        (.55)      .02        .32        .08
Student 19   Ma28B      .14        (.54)      −.01       −.03       .10
Student 6    Sb45B      −.05       (.52)      .25        .29        .00
Student 31   Mb23A      .17        (.52)      .25        .21        .17
Student 27   Mc23Bd     .00        (.51)      −.03       .25        −.07
Student 28   Mc23Bd     .11        (.46)      .27        .27        .18
Student 36   Lb28A      .31        (.45)      .31        .13        .30
Student 21   Mb23B      .24        (.44)      .27        .02        .31
Student 1    Sa35A      .25        (.40)      .02        .32        .05
Student 40   Lc23B      .06        −.01       (.82)      −.08       −.03
Student 14   Sc28Bd     −.02       .11        (.67)      .30        .25
Student 38   Lb23Bd     .22        −.03       (.55)      .13        −.22
Student 15   Sc38A      .15        −.06       (.47)      .06        .12
Student 30   Sa28B      −.07       .30        (.42)      .07        −.03
Student 45   Lc23B      .08        .18        .11        (.70)      −.20
Student 24   Mb23Bd     .02        −.19       .10        (.59)      .22
Student 43   Lc45A      .06        .06        .10        (.46)      .16
Student 39   Lb20B      .05        .00        .03        .08        (.64)
Student 34   La28B      .29        −.03       −.05       .17        (.61)
Student 10   Sb23Bd     .01        .16        .20        .03        (.42)
Student 37   Lb28Bd     −.07       .11        .00        .04        (.42)
Student 5    Sa35A      .42        .13        .33        .28        .00
Student 9    Sb50B      .44        −.11       −.26       −.10       .38
Student 11   Sc23B      .40        .23        .35        .14        .00
Student 13   Sc50B      .51        −.10       .27        .35        .10
Student 18   Ma20A      .58        .07        .27        .34        −.07
Student 22   Mb23Bd     .47        .00        .20        .36        .19
Student 23   Mb23Bd     .63        .08        −.04       .37        −.05
Student 25   Mb20Bd     .40        .02        .42        −.08       −.05
Student 26   Mc28A      .46        .40        .11        −.11       −.20
Student 29   Mc20Bd     .47        .46        .01        −.16       −.12
Student 33   La45A      .41        .40        .30        −.14       .14
Student 42   Lc35B      .43        .32        .14        .45        .12
Student 17   Ma23A      .09        .46        .19        .39        .07
Student 20   Ma38A      .25        .42        −.04       .23        .41
Student 8    Sb23B      .01        .32        .36        .42        .24
Student 16   Ma23D      .26        .23        .32        −.10       .22
Student 4    Sa45Bd     .12        .14        −.14       .32        −.02
Student 44   Lc28B      .00        .25        −.03       .18        −.24

Variance                11%        10%        8%         7%         6%
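For readers unfamiliar with the mechanics behind factor loadings such as those in the table above: Q-methodological analyses are typically run in dedicated software such as PQMethod (Schmolck, 2012), and the general procedure is to intercorrelate Q-sorts person by person, extract factors, and rotate them so that each participant loads cleanly on one factor. The following minimal Python sketch illustrates this procedure only; it is not the authors' analysis, and all data and names in it are synthetic and hypothetical.

```python
# Illustrative sketch only -- NOT the authors' analysis. Synthetic Q-sorts are
# intercorrelated person by person, principal components are extracted, and
# loadings are varimax-rotated toward "simple structure."
import numpy as np

def varimax(loadings, iters=50, tol=1e-6):
    """Varimax rotation of a loading matrix (standard SVD-based algorithm)."""
    n, k = loadings.shape
    rotation = np.eye(k)
    last = 0.0
    for _ in range(iters):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - rotated @ np.diag((rotated ** 2).sum(axis=0)) / n))
        rotation = u @ vt
        if s.sum() < last * (1 + tol):
            break
        last = s.sum()
    return loadings @ rotation

def q_factor_loadings(sorts, n_factors=2):
    """sorts: (participants x statements) array of rank values."""
    r = np.corrcoef(sorts)                  # person-by-person correlations
    eigvals, eigvecs = np.linalg.eigh(r)    # eigenvalues in ascending order
    top = np.argsort(eigvals)[::-1][:n_factors]
    unrotated = eigvecs[:, top] * np.sqrt(eigvals[top])
    return varimax(unrotated)

# Synthetic data: sorts 0-2 resemble one hypothetical prototype sort, 3-5 another
rng = np.random.default_rng(0)
proto_a = rng.permutation(np.arange(60.0))  # a hypothetical 60-card sort
proto_b = rng.permutation(np.arange(60.0))
sorts = np.array([proto_a + rng.normal(0, 5, 60) for _ in range(3)]
                 + [proto_b + rng.normal(0, 5, 60) for _ in range(3)])
loadings = q_factor_loadings(sorts)
print(np.round(loadings, 2))  # each row: one high loading, one near zero
```

The varimax rotation is what produces the pattern visible in the table above, in which most students load strongly on a single factor and only weakly on the others.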

doi:10.3928/01484834-20150417-02
