The Journal of Continuing Education in Nursing

Administrative Angles 

An Update on Kirkpatrick's Model of Evaluation: Part Two

Lynore D. DeSilets, EdD, RN-BC-Retired

Abstract

Since Kirkpatrick first published his evaluation model, many continuing nursing education providers have used it. The model has stood the test of time. In 2016, the model was updated by Kirkpatrick's son and daughter-in-law through the addition of five simple principles that can serve as a blueprint to maximize stakeholders and organizational investments, while validating the value of provider units.

J Contin Educ Nurs. 2018;49(7):292–293.


For many years, evaluation was the last step in the instructional design process. In fact, A-D-D-I-E, which most of us have used for planning, begins with analysis and moves on to design and development of the activity, including the use of the best available evidence, followed by implementation and, finally, evaluation. More contemporary reasoning emerged with the development of a model by Kirkpatrick and Kirkpatrick (2006).

Original Kirkpatrick Model

Kirkpatrick's model provides a useful road map that integrates evaluation into the initial planning process. The logic behind this is that when you do not know where you want to end up, it is hard to develop a road map. The model is popular and straightforward, and it includes four levels of evaluation: (a) reaction, (b) learning, (c) behavior, and (d) results (p. 10). The levels build on one another in a stair-step fashion.

Level 1

The lowest level and the easiest to accomplish is level 1, reaction. For evaluation at this level, participants provide information, usually at the end of the program. The data equate to customer satisfaction, reflecting the quality of the activity and satisfaction with the instructor. Participants answer questions such as:

  • Was the speaker knowledgeable?
  • Did the program hold your interest?
  • Were you able to meet the objectives?

Level 2

Level 2, learning, measures the knowledge or skill acquired during the learning activity. Evaluation at this level can be accomplished with a test, an audience response system, a case study, or an evaluation question such as “What did you learn during the activity?”

Level 3

Data for level 3, behavior, are typically gathered after the activity is over. The information collected here shows how, or whether, participants have been able to apply what they learned in practice. Questions at this level might be “How do you plan to apply what you learned in your work?” or “How has your practice changed since participating in this course?” Answers to the questions at the three levels can be collected in a single, blended survey, either at the completion of the program or sent out after the activity is over. If behavior change is to be measured, however, the evaluation needs to be completed weeks to months after the activity ends.

Level 4

The highest level of evaluation, and the most difficult and time-consuming to complete, is level 4, results. Data collected here answer the question “What was the impact of the learning on the organization?” The focus can be on cost analysis, financial value, quality, or output, and the data can be used to guide executive decision making. The results at this level are influenced not only by the educational experience but also by other intervening variables, such as the physical environment, the culture of the practice site, reminder systems for participants, or accountability to managers or supervisors. Although there is no consistent way to evaluate level 4, data can be collected from a variety of sources, including (a) staff turnover, (b) infection rates, (c) length of stay, (d) cost savings, (e) patient indicators, (f) job satisfaction, (g) retention rates, or (h) other quality measures. Arcand (2009) recommended using this level only for high-cost, high-priority, or strategic educational activities.

Updated Model

In 2016, Kirkpatrick's son Jim and daughter-in-law Wendy updated the original, groundbreaking work and published it in their book, Kirkpatrick's Four Levels of Training Evaluation. The foundational principles they added serve as underpinnings that guide the application of the revised model (Kirkpatrick & Kirkpatrick, 2016, p. 33):

  • The end is the beginning.
  • Return on expectations is the ultimate indicator of value.
  • Business partnership is necessary to bring about positive return on expectations.
  • Value must be created before it can be demonstrated.
  • A compelling chain of evidence demonstrates your bottom-line value.

Principle 1

To use the model effectively, the desired results serve as the first step in the planning process. Those of us in nursing professional development are familiar with a planning process that begins with identification of a professional practice gap; we are already doing this. Knowing the desired outcome and the gap will answer the question “What behavior (level 3) is needed to achieve these results?” Program developers then decide what attitudes, knowledge, and skills (level 2) are needed to realize the desired behaviors, and they design the activity so that participants learn what they need to know and respond positively to the program (level 1).

Principle 2

Return on expectations involves understanding what stakeholders expect. This understanding helps to identify the value of the activity and allows measurable results to be attained. Not all nursing continuing education activities involve business partnerships, but when they do, planners need to partner with managers and supervisors to prepare participants for the activity in advance. These stakeholders also have key roles to play in reinforcing the application of the newly acquired knowledge and skills.

Principle 3

Kirkpatrick and Kirkpatrick (2016) reported that the learning activity in itself will typically result in just 15% of on-the-job application (p. 34). Partnerships with stakeholders, such as managers and supervisors, will be important in preparing participants for the education, as well as in reinforcing the new skills or knowledge. The degree to which these affiliations occur relates directly to the achievement of positive outcomes.

Principle 4

Often, the major portion of a planner's effort and resources is spent on developing and delivering the learning activity, whereas little time is typically spent on the work before and after the training that supports behavior change, which is the result stakeholders want. In many instances, providers should redefine their roles to focus more on the achievement of behavior change. This may be a challenge for many of us, but it is an important area to consider for future development.

Principle 5

By using the Kirkpatrick model and the foundational principles, a chain of evidence can be created that demonstrates the worth of the learning experience. The bottom-line value of the activity, whether qualitative or quantitative, can be measured and shared with stakeholders and the organization. This is an important way for educators to demonstrate their value to the organization.

Summary

When developing learning activities, planning is just one part of a three-step process: planning, implementation, and demonstration of value. Using the original Kirkpatrick model, planning for evaluation begins with level 4, then moves to level 3, then to level 2, and finally to level 1. Buy-in from learners, managers, and supervisors, along with postactivity reinforcement and accountability, is needed to ensure changes in, or maintenance of, performance. Applying the principles from the updated Kirkpatrick model will enhance positive outcomes not only for planners, but also for participants, organizations, and stakeholders.

A note from the column editor, Dr. Shinners: Dr. DeSilets is a prior Administrative Angles column editor. During her 10 years as an associate editor for this column, she wrote two articles that focused on evaluation (DeSilets, 2009, 2010). According to the Journal's site analytics, those articles have been frequently viewed or cited over the years. This article offers a brief review of Dr. DeSilets's previous work and a look at the revisions that can be used to demonstrate the value of Kirkpatrick's evaluation method in continuing nursing education.

References

  • Arcand, L.L. (2009, October). Basic concepts of program evaluation for staff development and continuing education. Paper presented at Transforming Curricula and Lifelong Learning for Quality and Safety, Mayo Clinic, Rochester, MN.
  • DeSilets, L.D. (2009). Connecting the dots of evaluation. The Journal of Continuing Education in Nursing, 40, 532–533. doi:10.3928/00220124-20091119-09
  • DeSilets, L.D. (2010). Another look at evaluation models. The Journal of Continuing Education in Nursing, 41, 12–13. doi:10.3928/00220124-20091222-02
  • Kirkpatrick, D.L. & Kirkpatrick, J.D. (2006). Evaluating training programs: The four levels (3rd ed.). San Francisco, CA: Berrett-Koehler.
  • Kirkpatrick, J.D. & Kirkpatrick, W.K. (2016). Kirkpatrick's four levels of training evaluation. Alexandria, VA: ATD Press.
Authors

Dr. DeSilets is a previous Associate Editor of this column and former Assistant Dean, Fitzpatrick College of Nursing, Villanova University, Villanova, Pennsylvania.

The author has disclosed no potential conflicts of interest, financial or otherwise.

Author correspondence to Lynore D. DeSilets, EdD, RN-BC-Retired, previous Associate Editor of this column and former Assistant Dean, Fitzpatrick College of Nursing, Villanova University; e-mail: lyn.desilets@villanova.edu.

10.3928/00220124-20180613-02
