Journal of Nursing Education

Major Articles 

Commentary on “Realist Evaluation as a Framework for the Assessment of Teaching About the Improvement of Care”

Shirley M. Moore, PhD, RN, FAAN

Abstract

Dr. Moore is Edward J. and Louise Mellen Professor of Nursing, Frances Payne Bolton School of Nursing, Case Western Reserve University, Cleveland, Ohio.

The author has no financial or proprietary interest in the materials presented herein.

Address correspondence to Shirley M. Moore, PhD, RN, FAAN, Edward J. and Louise Mellen Professor of Nursing, Frances Payne Bolton School of Nursing, Case Western Reserve University, 10900 Euclid Ave., Cleveland, OH 44106; e-mail: smm8@case.edu.

In their article, “Realist Evaluation as a Framework for the Assessment of Teaching About the Improvement of Care,” Ogrinc and Batalden describe the use of a new model of program evaluation: the realist evaluation framework. They apply realist evaluation to measure the effectiveness of a clinical teaching approach for process improvement in resident and medical student education. Realist evaluation is a program evaluation method that emphasizes elucidating the contextual influences and mechanisms by which an education intervention achieves the desired outcomes (Pawson, 2006; Pawson & Tilley, 1997). Such an approach promotes the systematic building of education intervention theories that describe what works, for whom, and in what circumstances. In response to Ogrinc and Batalden’s illustration of using the realist evaluation framework to assess clinical medical education, this commentary describes how realist evaluation can be used to build theory that addresses the complex nursing education interventions needed to facilitate learning about the knowledge, skills, and attitudes associated with quality and safety.

Many nursing education interventions are complex, with several interacting components, as well as some of the following dimensions (Campbell et al., 2007):

  • Wide range and variability of possible outcomes.
  • Difficulty standardizing the delivery and receipt.
  • Variability in the target population.
  • Sensitivity to features of local context.
  • Degree of flexibility or tailoring of the intervention permitted.
  • Long causal chains linking the intervention with its outcome(s).

It is clear that complex education interventions are needed to address the learning goals of many of the focus areas in quality and safety education. Examples of quality and safety education topics requiring complex interventions are interprofessional communication and collaboration, fair and just culture, root cause analysis, systems thinking, workarounds, health literacy, medication reconciliation, and family inclusion in care planning. Complex education interventions commonly used to teach these quality and safety focus areas are simulation, unfolding cases, interactive enactments, gaming, and interprofessional learning. The multiple interacting components of these interventions, however, have made it challenging to evaluate their effectiveness and build theory about the mechanisms by which the outcomes of these interventions are produced. Realist evaluation provides a promising approach that advances our explanatory quest to determine the “active ingredients” in successful nursing education interventions.

As we design the learning activities and curricula for quality and safety education, we have a unique opportunity to build theory about our education interventions: what works, for whom, and under what conditions. Using a realist evaluation approach, we can test our hypotheses—or sometimes hunches—about how an education intervention produces its outcomes. In other words, what are the particular components (mechanisms) that account for successful outcomes of our interventions? What is the “dose” of that component that is needed to produce the outcomes? What are the contextual factors that affect the implementation and success of the education intervention?

Realist evaluation proposes that context is an important consideration in evaluating programs. Context refers to the features of the conditions in which the interventions are introduced that are relevant to the intervention processes (Pawson, 2006). It is assumed that interventions are effective only under certain circumstances. Taking different contexts into account is not new to nurses and is consistent with the basic nursing view of the importance of environment as an influencing variable on nursing outcomes.

However, we have been less overt in our acknowledgment of the importance of context when evaluating nursing education interventions. Careful consideration of the contexts that influence our quality and safety education interventions will help us to address the issues of “for whom” and “in what circumstance” they will work. Examples of nursing education contexts include the background and experience of trainees; whether the learners are from a single discipline or multiple disciplines; whether delivery is to groups or individuals, face-to-face or technology-assisted; and whether the setting is didactic or clinical.

Examples of the use of the realist evaluation approach to assess two commonly used nursing quality and safety education interventions, simulation and interdisciplinary courses, follow. These examples illustrate the use of the realist evaluation model of context-mechanism-outcome pattern configuration for evaluation and possible theory building.

Simulation is a good example of a complex education intervention for which we need more evidence about what makes it successful, for what group of learners, and under what conditions. We need to build theory about what “dose” of simulation is needed, how its different subcomponents interact and differ under varying conditions, and the number and difficulty of behaviors required by those delivering or engaging in simulation learning. Using the realist evaluation approach, we would first create working theories about how simulation works. For example, we could hypothesize that simulation provides repetition, problem-solving opportunities, peer modeling, and feedback for learners, with the goal of increasing self-confidence, performance ability, and speed in doing a specific task. We would then design teaching strategies (mechanisms) consistent with these hypotheses that we think will provide repetition, problem-solving opportunities, peer modeling, and feedback (i.e., debriefing). We would next create a context-mechanism-outcome grid (Table 2 in the article by Ogrinc and Batalden) that individually lists each of these mechanisms and considers the respective mechanism in light of a particular context to produce a specific outcome. Examples of different contexts could be learner experience with simulation, level of learner experience with the content, and whether the learner group is single-discipline or multidisciplinary. Next, data about the effectiveness of the learning activities under the various conditions are collected, and finally, the theory is further refined and tested.
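For readers who wish to operationalize the grid-building step described above, the context-mechanism-outcome configurations can be sketched as a simple data structure. The mechanisms, contexts, and outcomes below are illustrative placeholders drawn from the simulation example, not findings or instruments from the Ogrinc and Batalden article.

```python
# Illustrative sketch of a context-mechanism-outcome (CMO) grid for a
# simulation-based education intervention. All entries are hypothetical
# examples; in practice, they would come from the working theory.
from itertools import product

mechanisms = ["repetition", "problem solving", "peer modeling", "debriefing feedback"]
contexts = ["novice to simulation", "experienced with content", "multidisciplinary group"]
outcomes = ["self-confidence", "performance ability", "task speed"]

# Each CMO configuration is one testable hypothesis:
# "In context C, mechanism M produces outcome O."
cmo_grid = [
    {"context": c, "mechanism": m, "outcome": o, "observed_effect": None}
    for c, m, o in product(contexts, mechanisms, outcomes)
]

print(f"{len(cmo_grid)} configurations to evaluate")
for row in cmo_grid[:3]:
    print(row["context"], "->", row["mechanism"], "->", row["outcome"])
```

Each row of the grid then becomes a slot for evaluation data: as evidence is collected under the various conditions, the `observed_effect` field is filled in and the working theory refined.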

Another example of an increasingly common education intervention for quality and safety is an interdisciplinary course in teamwork and communication. Evaluating the program effectiveness of interprofessional courses and building theory about the most effective approaches to interprofessional education have been challenging because of the complexity of these interventions. Using a realist evaluation approach, one could hypothesize that interprofessional education produces greater collaboration and better communication among health care professionals of different disciplines because the familiarity that comes of learning together promotes trust, reduces language differences, and promotes the building of shared mental models. Thus, one could design multidisciplinary learning activities with components (mechanisms) that promote learning about the cultures of the other health care disciplines, how to tap into and understand the mental models of those disciplines, and how to make one’s own mental models known to others. These different mechanisms can be displayed in the context-mechanism-outcome table previously referred to and the accompanying contexts identified. Contexts for interdisciplinary education could be the number of different disciplines involved, years of experience of the learners in their respective disciplines, experiences of faculty with interdisciplinary teaching, and where in the training program interdisciplinary learning is introduced. This hypothesis of what comprises effective interprofessional education and the mechanisms by which it works could be tested under the varying contexts. This approach to organizing our evaluation of education interventions is systematic and supports theory building. It goes beyond assessing student outcomes and satisfaction in that it includes a priori consideration of learning about what works, for whom, and under what conditions.

As we move toward our goal of implementing quality and safety education across the country, we have an opportunity to build theory in nursing education about what interventions work and why they work. Realist evaluation provides an interesting new approach to education evaluation that may provide more useful information about the effectiveness of complex nursing education interventions than traditional models of education evaluation. Realist evaluation can help nurse educators uncover the “root causes” of the successes in our nursing education interventions.

References

  • Campbell, N.C., Murray, E., Darbyshire, J., Emery, J., Farmer, A., Griffiths, F., et al. (2007). Designing and evaluating complex interventions to improve health care. BMJ, 334(7591), 455–459. doi:10.1136/bmj.39108.379965.BE
  • Kirkpatrick, D. (1976). Evaluation of training. In Craig, R. (Ed.), Training and development handbook. New York: McGraw-Hill.
  • Pawson, R. (2006). Evidence-based policy: A realist perspective. London: Sage.
  • Pawson, R. & Tilley, N. (1997). Realistic evaluation. London: Sage.
doi:10.3928/01484834-20091113-09
