Journal of Nursing Education

Major Article 

Optimizing NCLEX-RN Pass Rate Performance Using an Educational Microsystems Improvement Approach

Brant J. Oliver, PhD, MS, MPH; Mimi Pomerleau, DNP, RN; Mertie Potter, DNP, PMHNP-BC; Andrew Phillips, PhD, RN; Susan Carpenter, MSN, RN; Steve Ciesielski, MA; Eleanor Pusey-Reid, DNP, RN; Alexia Marcous, MBA, BSN, RN; Pat Grobecker, DNP, RN

Abstract

Background:

Improvement methods were applied to optimize NCLEX-RN first-attempt pass rates in an Accelerated Bachelor of Science in Nursing (ABSN) program.

Method:

An improvement team was formed comprising nursing faculty, a student, faculty leading a course in the ABSN program involved in preparing students for NCLEX-RN, and a faculty expert in improvement science. Two Plan-Do-Study-Act (PDSA) cycles aimed at increasing practice and mastery were conducted. Effects were assessed using inferential statistics and statistical process control analyses.

Results:

One hundred eighty-four ABSN students participated over two semesters. Average practice questions per student per semester increased from 1,000 questions at baseline to 2,130 questions (p < .01) postintervention, average practice examinations from 2.9 to 3.2 (p < .05), and average practice test mastery from 6.8 to 7.2 on an 8-point scale (p < .05). First-attempt NCLEX-RN pass rates increased from 76.7% to 86.2% (p < .05).

Conclusion:

ABSN NCLEX-RN performance improved subsequent to this microsystem-based improvement effort. [J Nurs Educ. 2018;57(5):265–274.]


Optimizing new graduate professional licensure examination performance can affect many aspects of quality in nursing education. From the health care system perspective, it contributes to a timely supply of qualified nurses, which in turn can influence access to care. From an educational institution (school of nursing) perspective, first-time NCLEX performance is a standardized measure of the quality of instruction and the curriculum. Finally, it is important to nursing students, who expect to pass the NCLEX-RN and enter practice directly after completing their program of study. The following describes the work of a quality improvement action committee (QIAC) to optimize NCLEX-RN pass rates in an Accelerated Bachelor of Science in Nursing (ABSN) program using formal improvement methods similar to those employed in modern health care quality improvement.

Description of the Problem

The context is a nonprofit interprofessional health professions education institution that is part of a large academic health care system; it offers undergraduate, graduate, and doctoral degree programs in five health professions and includes a school of nursing (SON). The SON offers generalist (Bachelor of Science in Nursing [BSN]/RN), graduate (Master of Science [MS]/nurse practitioner [NP]), and doctoral (DNP) level programs. The ABSN program is a rigorous 14-month program for students with prior bachelor's or higher degrees in other fields that culminates in the BSN degree and eligibility to take the NCLEX-RN.

The ABSN program experienced a significant reduction in first-time NCLEX-RN pass rates in 2013. This followed a period of considerable growth in the number of faculty members and student enrollment and the introduction of new online and hybrid teaching methods. The reduction also coincided with changes to the format and structure of the NCLEX-RN test plan, which adversely affected pass rates in many schools of nursing; our program's first-time pass rate decreased from 90% in 2012 (National Council of State Boards of Nursing [NCSBN], 2012) to 77% in 2013 (NCSBN, 2013). Recognizing this problem and the need for a rapid and effective response, the dean called for ABSN program faculty to focus on improvement in this area.

The QIAC identified factors known to contribute to higher first-time NCLEX-RN pass rates:

Educational strategies can improve mastery of higher cognitive level questions on the NCLEX-RN (Lavin & Rosario-Sim, 2013), including the use of remediation strategies (Brodersen & Mills, 2014) and of "exam blueprinting," which links examination questions to course objectives and the NCLEX-RN test plan (McDonald, 2014). Penprase and Harris (2013) found that standardized testing given in the last semester of a nursing program was correlated with increased NCLEX-RN performance. Finally, Serembus (2016) suggested that a temporal 3-year trend exists in which first-time pass rates oscillate following substantive changes in NCLEX-RN format or administration.

Planning

The dean established a formal improvement committee, the QIAC, comprising faculty and a student representative. The dean made the issue and the QIAC highly visible, placed a high priority on the work, and supported the application of a formal improvement approach to address it. For example, QIAC faculty presentations of interim data and related discussions (with the dean actively participating) were a priority item at monthly faculty meetings throughout the work, and the dean also met regularly and frequently with the QIAC team leader for strategic planning and guidance.

Educational Microsystems

An educational microsystem (EM) perspective was adopted by the QIAC to understand the problem and improve performance. The EM model, introduced elsewhere (Oliver et al., 2018), derives heavily from clinical microsystems theory and assumes that health professions education is similarly composed of processes and systems that can be continuously improved by the people within them. An EM is a defined group of people who work together in a specified educational setting or capacity on a regular basis to provide educational services to customers (members of a discrete subpopulation of students, faculty, or administrators). Quinn (1992) asserted that smallest replicable units (SRUs) are a critical focus for quality generation in industry, and Nelson et al. (2002) adapted SRUs for health care application to create clinical microsystems theory. Extending this reasoning to educational contexts, rigorous, sustained efforts to optimize EM performance can arguably be expected to generate better educational service delivery quality and outcomes.

Interventions adapting health care improvement approaches have been reported previously in nursing education. Thies and Ayers (2007) utilized the clinical microsystems framework to conceptualize and evaluate quality in a community health nursing practicum. Serembus (2016) applied basic improvement principles derived from Deming and the Institute for Healthcare Improvement (IHI) Model for Improvement, such as the Plan-Do-Study-Act (PDSA) approach, at the whole-system level to drive large-scale interventions. Our approach builds on these prior works by incorporating a more formalized improvement approach, including an EM context assessment, linked analytic outcomes measurement, use of formalized health care improvement methods, and reporting of results using the SQUIRE 2.0 guidelines for improvement scholarship.

Because the SON was seeking rapid intervention to address the problem, the QIAC sought to identify immediately actionable drivers of NCLEX-RN performance. The committee synthesized our understanding of available knowledge using a fishbone cause-and-effect diagram (Figure 1). Actionable factors appropriate for short-term intervention that were in alignment with the available knowledge synthesis included examination preparation in the final semester of nursing school using standardized approaches (Penprase & Harris, 2013) and developing targeted remediation and test-taking skill-building approaches (Brodersen & Mills, 2014). Other factors were deemed amenable to larger (and slower) scale program level changes, such as those conducted by Serembus (2016), including program level implementation of examination blueprinting, revising admissions policies, and designing new student support and remediation approaches.


Figure 1.

High-level Ishikawa (cause-and-effect, or fishbone) diagram of factors contributing to NCLEX-RN performance. Dashed circles denote foci of improvement interventions selected by the quality improvement action committee.

Of the two short-term options available, a focus on examination preparation had the highest feasibility for immediate action in our context. Course coordinators for a final-semester course in the ABSN program were highly motivated and available to collaborate with the QIAC, and the course already incorporated some NCLEX-RN test preparation. Through this process, the EM was defined as this final-semester nursing course.

Assessment

Following the general tenets of the Batalden and Davidoff (2007) formula for improvement, the QIAC improvement team synthesized the available generalizable knowledge about improving NCLEX-RN test performance (discussed previously) with a context assessment of the EM (the immersion course). We adapted the "5P" assessment process commonly utilized in the clinical microsystems model (Dartmouth Institute Microsystem Academy, n.d.) to assess the context. The 5Ps are: (a) the purpose of the EM; (b) the people served by the EM (e.g., students); (c) the professionals engaged in the work of the EM (e.g., faculty and staff); (d) the processes of the EM (e.g., educational services); and (e) the patterns of performance of the EM (e.g., utilization of standardized examination practice).

The purpose of the EM (the immersion course) is to provide the culminating clinical and educational experience of the nursing program and to prepare students for entry to practice, including preparation for the licensure examination (NCLEX-RN). The people engaged in the EM were ABSN students from two cohorts (spring and summer 2015), with 101 and 93 students, respectively. The professionals engaged in the EM included two course faculty, who coordinated both cohorts. The NCLEX-RN preparation processes of the EM centered on an online adaptive learning resource (Lippincott PassPoint©), which gave students adaptive practice quizzes and examinations along with access to evidence-based content to reinforce material missed or not understood. The course coordinators monitored the progress and results of students' practice quizzes and examinations, then followed up with e-mails and meetings to encourage and remind students to stay focused on completing the targeted number of weekly assigned questions. Key patterns in the EM related to expectations about the number of practice quiz questions and completion of three practice examinations. Each student was encouraged to complete three practice examinations (preterm, midterm, and final), which had 100, 265, and 265 questions, respectively. Because the questions were adaptive, their complexity increased with demonstrated mastery, exposing students to progressively more challenging content. Completion of practice questions (other than taking practice examinations) was strongly recommended but not specifically stipulated as a requirement in the course syllabus.
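
As a concrete illustration of this monitoring-and-reminder pattern, the minimal sketch below flags students whose cumulative practice questions trail a target pace so that coordinators can follow up. The data structure, field names, and weekly target are hypothetical assumptions, not the course's actual tooling.

```python
# Illustrative sketch of pace monitoring for follow-up e-mails; the weekly
# target and the progress records are invented assumptions.

WEEKLY_TARGET = 100  # assumed weekly question goal (hypothetical)

def students_needing_followup(progress, week):
    """Return names of students whose cumulative questions trail the pace.

    progress: dict mapping student name -> cumulative questions completed.
    week: current week of the semester (1-indexed).
    """
    expected = WEEKLY_TARGET * week
    return [name for name, done in sorted(progress.items()) if done < expected]

if __name__ == "__main__":
    progress = {"Student A": 450, "Student B": 180, "Student C": 300}
    print(students_needing_followup(progress, week=3))  # ['Student B']
```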

Improvement Approach

The QIAC utilized a hybrid improvement model modified for application in an educational services delivery setting. The improvement team was composed predominantly of novice improvers who were highly motivated to address this problem and to engage in experiential applied learning about improvement. Our improvement model followed a sequential process beginning with assessment and progressing to the development of aims, change ideas (interventions), and measures. This first portion of the approach derived from the Dartmouth Microsystem Improvement Curriculum (DMIC; Dartmouth Institute Microsystem Academy, n.d.), which uses an adapted Lean Define-Measure-Analyze-Improve-Control (DMAIC) model (Aveta Business Institute, n.d.). Improvement cycles for the intervention phase were derived from the IHI Model for Improvement (IHI, n.d.), with a particular emphasis on the PDSA method.

Global and Specific Aims

The global aim of this work was to increase first-attempt NCLEX-RN pass rates for ABSN students from the current level of 77% to greater than 85% by focusing on examination preparation using standardized mechanisms during the last semester of nursing school. We sought to achieve this performance level within 12 months following the intervention period.

Three specific aims were formulated which derived from the global aim (Table 1):

  • Specific Aim #1: Increase the percentage of ABSN students completing standardized practice examinations from the preintervention baseline of 85% to greater than 95%.
  • Specific Aim #2: Increase the average total number of standardized practice questions completed per student per semester from the preintervention baseline of 1,000 to greater than 2,000.
  • Specific Aim #3: Significantly increase the average practice examination proficiency level from the preintervention baseline of 6.0 to greater than 7.0 (high proficiency).

Table 1:

Aims and Measures

Method

Interventions

The QIAC developed change ideas (interventions) that would align with the improvement aims and fit with minimal disruption into the usual and customary work of the EM. These ideas included optional practice opportunities offered outside the course, the same opportunities offered within the course, recommendations to increase utilization of external test preparation resources, and a required minimum level of practice as an expectation of the course. The team used the context assessment and applied Lean principles to guide selection of the initial improvement approach, prioritizing the most readily actionable option for rapid improvement work in the short term. Given that the EM already contained an embedded standardized examination preparation mechanism available to students (PassPoint), the QIAC decided that the most readily actionable interventions should aim to increase practice frequency and consistency, as well as practice test mastery, using the existing system. It was assumed that this increase would correlate with increased NCLEX-RN pass rates, as Penprase and Harris (2013) suggested. A minimum number of practice questions required of each student in the course was explicitly designated, and the expectation for consistent practice was incorporated into the course syllabus.

Measures

Operationally defined measures were developed to assess NCLEX-RN outcomes and EM system process performance, aligned with the global and specific aims. A modified clinical value compass approach (Nelson et al., 2002; Nelson, Splaine, Batalden, & Plume, 1998) was utilized to conceptualize the measurement framework. Nelson's model is commonly used in health care improvement and identifies four global quality measurement domains: clinical outcomes, functional outcomes, patient experience, and cost and utilization. Adapted for a focus on EMs and for specific application to NCLEX-RN performance, our EM value compass domains were defined as: NCLEX-RN first-time pass rate (outcomes), EM process performance (practice test utilization and mastery), and student experience. An analysis of cost was not included. The specific aims addressed the first two domains (outcomes and process performance). The third domain (student experience) was assessed postintervention using an anonymous survey developed by the student representative on the QIAC. Because this survey has not yet been validated, it is discussed as an exploratory measure.
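
For illustration only, the adapted value compass domains and their linked measures could be represented as a simple data structure such as the sketch below; the representation is ours, not an instrument from the study, and the target strings paraphrase the aims in Table 1.

```python
# A minimal sketch of the adapted EM value compass as a data structure;
# illustrative, not the QIAC's actual measurement instrument.

from dataclasses import dataclass

@dataclass
class Measure:
    domain: str   # value compass domain
    name: str     # operational measure
    target: str   # aim-linked target

EM_VALUE_COMPASS = [
    Measure("Outcomes", "First-attempt NCLEX-RN pass rate", "> 85%"),
    Measure("Process", "Practice examination completion", "> 95% of students"),
    Measure("Process", "Practice questions per student per semester", "> 2,000"),
    Measure("Process", "Mean practice examination mastery (8-point scale)", "> 7.0"),
    Measure("Experience", "Student experience survey (exploratory)", "n/a"),
]
```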

Analysis

Descriptive analyses were conducted for each process measure for each cohort (spring and summer 2015, respectively). Student's t tests (for continuous data) and tests of proportions (for proportions data) were used to assess for significant differences in summary data, with significance defined at p < .05. Shewhart statistical process control (SPC) analyses for proportions data (p charts; Benneyan, 1998; Cheung, Jung, Sohn, & Ogrinc, 2012), commonly used in modern health care improvement measurement, were utilized to assess longitudinal NCLEX-RN performance.
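
For readers unfamiliar with p charts, the following minimal sketch shows how the center line and 3-sigma control limits are computed for proportions data, following the standard formulation described by Benneyan (1998): the center line is the pooled proportion, and each subgroup's limits are the center line plus or minus three binomial standard errors. The quarterly counts are invented placeholders, not the program's data.

```python
# Sketch of Shewhart p-chart limits for quarterly first-attempt pass rates.
# Center line: p_bar = total passes / total takers.
# Limits per subgroup of size n: p_bar +/- 3 * sqrt(p_bar * (1 - p_bar) / n).

import math

def p_chart(passes, takers):
    """Return the center line and per-quarter (LCL, UCL) 3-sigma limits."""
    p_bar = sum(passes) / sum(takers)  # pooled mean proportion (center line)
    limits = []
    for n in takers:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)  # binomial SE of a proportion
        limits.append((max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)))
    return p_bar, limits

passes = [18, 20, 17, 21]  # hypothetical quarterly pass counts
takers = [24, 25, 23, 26]  # hypothetical quarterly test takers
center, limits = p_chart(passes, takers)
print(f"center line = {center:.3f}")
for (lcl, ucl), x, n in zip(limits, passes, takers):
    status = "special cause" if not (lcl <= x / n <= ucl) else "common cause"
    print(f"p = {x / n:.3f}, limits = ({lcl:.3f}, {ucl:.3f}) -> {status}")
```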

Ethical Considerations

This work qualifies as a quality improvement project and is therefore exempt from institutional review board approval.

Results

Study of the Intervention

The first PDSA cycle (Table 2) was conducted in the spring semester of 2015 and tested a bundle of two change ideas:

  • The course expectations in the EM were modified to include the completion of at least 1,500 practice questions (a graded course participation requirement explicitly stated in the syllabus and reinforced by the course instructors).
  • Introduction of a simple process performance dashboard (Figure 2) to improve transparency and provide real-time feedback on performance to the QIAC.

Table 2:

Plan-Do-Study-Act (PDSA) Cycles


Figure 2:

Process performance results for Plan-Do-Study-Act (PDSA) cycles 1 and 2. Image on the left refers to average practice questions; center image refers to average mastery; image on the right refers to average practice examinations.

Use of the data dashboard helped the QIAC assess PDSA cycle 1 performance and adjust for the second cycle. The QIAC discovered that the syllabus changes were feasible to implement and more effective than anticipated; for example, the minimum practice target of 1,500 questions had been set too low (students completed 2,330 questions on average; Figure 2). The dashboard also showed that global mastery on practice examinations remained below target at 6.8 (goal > 7.0).
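
The sketch below illustrates the kind of per-cohort summary such a dashboard might compute from a per-student export; the column names and records are assumptions for illustration, not the actual PassPoint export schema or study data.

```python
# Illustrative dashboard aggregation from a hypothetical per-student export.

import pandas as pd

records = pd.DataFrame({
    "cohort":    ["spring", "spring", "summer", "summer"],
    "questions": [2100, 2560, 2300, 2050],   # invented per-student counts
    "exams":     [3, 2, 3, 3],
    "mastery":   [6.9, 6.7, 7.3, 7.1],
})

dashboard = records.groupby("cohort").agg(
    avg_questions=("questions", "mean"),
    avg_exams=("exams", "mean"),
    avg_mastery=("mastery", "mean"),
    pct_completing_exams=("exams", lambda s: 100 * (s >= 3).mean()),
)
print(dashboard.round(2))
```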

The second PDSA cycle was conducted in summer 2015 and incorporated a bundle of change ideas planned on the basis of PDSA cycle 1 results: the minimum number of required practice questions was increased from 1,500 to 2,000; the number of required practice examinations was increased from two to three; and real-time reporting of mastery scores on practice examinations back to students was introduced. We observed that practice question and practice examination totals met or exceeded the new process goals and that mastery increased above 7.0 (Figure 2). Explicit accountability expectations and transparency in performance monitoring, two elements known to influence improvement, appeared to contribute to progress in this cycle.

Process Performance

Summary statistics for both PDSA cycles (spring and summer 2015) are given in Figure 2. Practice examination completion was 92% (n = 101) for spring 2015 and 91% (n = 93) for summer 2015, approaching the goal of 95% (Specific Aim #1). The summer cohort completed a significantly higher average number of examinations than the spring cohort (3.2 versus 2.9, p < .05), and both achieved approximately three examinations per semester. In addition, both cohorts completed more than 2,000 questions per semester (Specific Aim #2). Average mastery on the 8-point performance scale from the standardized practice examination system (PassPoint) was 6.8 for the spring cohort and 7.2 for the summer cohort; scores above 7 on this scale are considered to indicate high proficiency. Mastery was significantly higher in summer 2015 than in spring 2015 (p < .05) and demonstrated overall achievement of a high mastery level for both cohorts (Specific Aim #3).
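
As a hedged illustration of the tests named in the Analysis section, the sketch below applies a two-sample t test to per-student mastery scores and a two-proportion z test to pass-rate counts. All values are invented stand-ins; the pass counts are chosen only to approximate the reported 76.7% and 86.2% rates and are not the study data.

```python
# Illustrative cohort comparisons with standard tests; data are invented.

import numpy as np
from scipy import stats
from statsmodels.stats.proportion import proportions_ztest

rng = np.random.default_rng(0)
spring_mastery = rng.normal(6.8, 0.5, 101)  # hypothetical per-student scores
summer_mastery = rng.normal(7.2, 0.5, 93)

# Two-sample t test for difference in mean mastery (continuous data)
t, p = stats.ttest_ind(spring_mastery, summer_mastery)
print(f"t = {t:.2f}, p = {p:.4f}")

# Two-proportion z test for difference in pass rates (proportions data);
# counts are hypothetical (79/103 ~ 76.7%, 94/109 ~ 86.2%)
z, p = proportions_ztest(count=[79, 94], nobs=[103, 109])
print(f"z = {z:.2f}, p = {p:.4f}")
```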

Outcomes Performance

The global aim of this work was to increase first-attempt NCLEX-RN pass rates for ABSN students. The historic pass rate from 2009 to 2014 at our SON was approximately 84%. However, as others have noted (Serembus, 2016), changes in NCLEX-RN test format are often associated with decreased pass rates. Using SPC analyses, a cluster of decreased performance was observed following test format changes in 2010 that almost reached statistical significance, and another following changes in 2013 that did reach significance (special cause variation). A new preintervention performance level of approximately 77% was calculated using the period from the special cause variation observed in 2013 until the start of the improvement intervention in 2015 (i.e., 2013–2014). This mean was held constant to provide a comparison with intervention period performance in 2015–2016 (Figure 3). Following the two PDSA cycles conducted in 2015, the NCLEX-RN pass rate at the close of the 2015–2016 academic year increased to 86% (p < .01), a change of almost 10 percentage points that exceeded our target of 85% (Global Aim, Table 1).


Figure 3.

First-attempt pass rate by quarter, 2012 to 2016. The center line denotes the overall mean proportion passing the NCLEX-RN (2013–2015 baseline) prior to the intervention and is held constant during the intervention period to detect change from baseline postintervention. The squares and the lines connecting them denote the mean NCLEX-RN pass rate at each individual administration time point (each quarter). The uppermost and lowermost solid lines denote the three-sigma upper and lower control limits, respectively (approximately 3 standard deviations above and below the mean). The circled point denotes a special cause signal (a point above the upper control limit), indicating a significant deviation from the overall mean (p < .01). Arrows (left to right): the first arrow indicates changes in NCLEX-RN format and administration; the second denotes the beginning of quality improvement action committee work; and the final two denote Plan-Do-Study-Act (PDSA) cycles 1 and 2, respectively.

Using SPC analyses, a special cause variation signal was observed at this time, demonstrating significantly improved performance in summer 2016 (Figure 3). A special cause signal represents a deviation from the measure of central tendency (here, the mean NCLEX-RN pass rate) that is not due to chance (random) variation; such signals are often assignable to nonrandom internal or external influences on system performance, such as PDSA improvement interventions or changes in NCLEX-RN test format. In another SPC analysis, of annual NCLEX-RN pass rates from 2009 to 2016, we observed a return to the prior baseline following a special cause variation from 2013 through early 2015 (decreased performance following the 2013 change in NCLEX-RN test format). This special cause signal resolved by 2016, following the intervention period (Figure 4).
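
The frozen-baseline logic behind Figure 3 can be sketched as follows: the center line and limits are computed from preintervention subgroups only and then held constant, so a later point above the upper control limit signals special cause. All counts here are invented placeholders, not the program's data.

```python
# Sketch of special cause detection against a frozen preintervention baseline.

import math

def frozen_limits(baseline_passes, baseline_takers, n_new):
    """Control limits for a new subgroup using a frozen preintervention mean."""
    p_bar = sum(baseline_passes) / sum(baseline_takers)  # held constant
    sigma = math.sqrt(p_bar * (1 - p_bar) / n_new)
    return p_bar, max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)

# Hypothetical annual baseline subgroups and one postintervention year
p_bar, lcl, ucl = frozen_limits([78, 75, 77], [101, 98, 100], n_new=104)
new_rate = 94 / 104  # hypothetical postintervention pass proportion
print(f"baseline = {p_bar:.3f}, UCL = {ucl:.3f}, new = {new_rate:.3f}")
if new_rate > ucl:
    print("special cause: improvement beyond common-cause variation")
```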


Figure 4.

Annual NCLEX-RN performance. Statistical process control proportions chart for first-attempt pass rate by year of test administration. The center line denotes the overall mean proportion passing the NCLEX-RN based on all values from 2009 to 2016. Diamonds and the lines connecting them denote the mean NCLEX-RN pass rate at each individual administration time point (each year). The uppermost and lowermost solid lines denote the three-sigma upper and lower control limits, respectively (approximately 3 standard deviations above and below the mean). The circled points denote special cause signals, indicating significant deviations from the overall mean (p < .01). Arrows (left to right): the first two arrows denote changes in NCLEX-RN format or administration; the final arrow denotes Plan-Do-Study-Act (PDSA) cycles 1 and 2 (both conducted in 2015). The text boxes at the top of the figure describe the overall time line coinciding with the observed data trends.

Discussion

The improvement intervention (increased NCLEX-RN practice and mastery) significantly improved process and outcomes performance and fostered greater appreciation of, and engagement in, improvement activities focused on educational performance among ABSN faculty and students.

Nature of Association Between Intervention and Outcomes

The effect of the QIAC work on process performance appears to be attributable to the intervention, as evidenced by the temporal relationship between intervention and outcomes. There was a strong partnership between the QIAC and the EM faculty, and the improvement intervention was the only substantive, formalized, and systematic modification made to the EM for NCLEX-RN preparation during the intervention period. However, a definitive causal association between the intervention and NCLEX-RN pass rate outcomes cannot be made, owing to potential unmeasured confounding factors and the lack of a randomized design or control group. One such factor was increased student awareness of the drop in NCLEX-RN pass rates in late 2013, which may have increased motivation for additional preparation outside the EM. The NCLEX-RN performance improvements noted pre- and postintervention also may be a product, in part or in sum, of the temporal variations previously observed by Serembus (2016).

Comparison With Other Studies

Cox-Davenport and Phelan (2015) studied a small sample of 69 undergraduate nursing students and found that adaptive quiz questions were helpful in remediating students who had not performed well on predictor examinations; they shifted their program's focus from a more punitive approach, with predictor examinations tied to program completion, toward remediation quizzes targeting identified weaknesses. Serembus (2016) reported on a large-scale continuous improvement plan for a BSN program whose objective was to raise first-time pass rates among undergraduate students in a traditional 4-year nursing program. That effort increased the first-attempt NCLEX pass rate from 73% to 96% after 1 year of implementation, sustained above 90% for 3 years, through large-scope, total program changes, including course mapping, increased admissions requirements (a science GPA of 2.75 and an overall GPA of 3.0), objective testing with NCLEX-type questions constituting 90% of course evaluation methods, and progression policy changes. The QIAC work realized a similar effect following a shorter, 1-year intervention focused on a single EM.

Student Experience

The student representative on the QIAC led the design of an anonymous self-report survey to assess the student experience of the intervention. During development, survey questions were shared with class representatives for feedback. The survey was sent to all 91 students in the summer 2015 cohort, and 37 responses were collected anonymously (40.7% response rate). Of these, 91.9% reported that PassPoint was helpful in preparing for the NCLEX-RN, 75.7% reported that using PassPoint increased their confidence going into the NCLEX-RN "somewhat" or "very much," and 70.2% endorsed that weekly minimum question requirements helped with their NCLEX-RN preparation. All respondents reported using another source for test preparation in addition to PassPoint, and 62.1% indicated that PassPoint questions were easier than questions from other sources. An analysis of free-text responses revealed that 40% of respondents found other sources more helpful than PassPoint in preparing for the NCLEX-RN. In light of these findings, the improvement in the NCLEX-RN pass rate could be attributable to the imposition of a more disciplined study approach rather than to a benefit of the particular software used in the intervention. The findings also suggested options for future improvement, such as exploring a new software platform. Finally, and perhaps most importantly, the overall changes made in the EM during the intervention period were perceived by students to be valuable.

Impact on Culture, Systems, and Sustainability

Deming's theory of profound knowledge (Deming Institute, n.d.) suggests that "everybody wins" when the entire culture is engaged in improvement work. This effort represents the first systematized application of formal improvement methods, including linked improvement measures focused on a specifically identified EM, in a school of nursing. In this way, the QIAC work, including its publication using the SQUIRE guidelines, represents a developmental step taken by the faculty culture toward becoming more engaged and capable in basic improvement science methods and scholarship. Building improvement capability in teams as they do the work is also a fundamental aspect of microsystems-oriented improvement.

This work continued a forward momentum influencing culture change in the SON, involving increasing skill in, and application of, improvement science in education. Many of the potential confounding factors mentioned previously were separate efforts initiated by the faculty after, or simultaneously with, the initiation of the QIAC work. Although these efforts were not formally organized or evaluated as the QIAC work was, they exemplified a shift in culture from reliance on traditional educational mechanisms of change toward a more engaged, empowered, rapid, and activated improvement orientation.

Finally, this work appears to be sustained and propagating. The EM-based improvement work led to direct, sustained changes in the EM: NCLEX-RN practice question and practice examination requirements remain stipulated expectations in the course syllabus at the new levels achieved after PDSA cycle 2, and the course directors plan to continue to track and continuously improve performance over time. In turn, the success and visibility of this effort in a single EM has created interest in expanding the approach to other educational need areas, including replication by other faculty in other courses and incorporation into the activities of established formal academic committees. The speed and low resource requirements of the EM improvement approach have made it attractive and feasible.

There is also evidence of future growth and sustainability at higher system levels. Faculty participation on the QIAC has been recognized as service in annual performance evaluations. The QIAC itself, originally a chartered dean's committee, later was incorporated into the activities of two standing committees at the school. Finally, the greatest evidence of sustainability may be the members of the QIAC themselves, who, as a result of participating in this work, now represent the most highly prepared improvers on the faculty. These faculty now have greater capacity to teach quality and safety principles and to incorporate Quality and Safety Education for Nurses (QSEN) competencies into the nursing curriculum. Many have participated in concerted innovation and curriculum integration efforts throughout the SON, as described elsewhere (Oliver et al., 2017).

Limitations

The outcomes observed here cannot be assumed to be solely attributable to the improvement intervention, owing to the potential presence of unmeasured confounders. As formal improvement work was underway on this initiative, additional efforts to improve performance were started by other groups in the SON. These included slower mid-range and long-range interventions: use of academic support services for students at risk, mapping of some courses to the NCLEX-RN test plan to identify gaps, and encouraging ABSN faculty to increase the proportion of higher cognitive level questions on examinations. Although many of these interventions started long after the initiation of our QIAC work and did not utilize formal improvement methods or assessment, their potential effects on outcomes cannot be ignored. Because they were not quantitatively evaluated, we cannot empirically assess their contribution or potential confounding effects.

The temporal trend effect suggested by Serembus (2016) could have contributed to the outcomes findings; however, our analysis suggests otherwise. We did not observe the pattern of 3-year temporal trends that Serembus described (over a 7-year period, we observed only one special cause variation, lasting 2 years; Figure 4), and our SPC analyses also argue against the presence of a temporal trend, based on the special cause signal findings in 2016 (Figure 3). In addition, we did not observe similar performance trends in our non-ABSN generalist nursing cohort either pre- or postintervention: first-attempt NCLEX-RN pass rates for this cohort neither dropped after the NCLEX-RN format changes in 2013, as our ABSN cohort rates did, nor improved in 2015–2016 during and after the intervention; rather, they remained stable and above 80% throughout. Such dissimilar observations between cohorts are what one would expect in the absence of a temporal trend affecting the entire student population.

Finally, we recognized the limitations of single-criterion measurement: focusing on NCLEX-RN outcomes as the sole measure of performance. The QIAC attempted to mitigate this by using a multidimensional measurement approach that included process and outcome measures as well as a preliminary assessment of student experience.

Conclusion

This effort builds on prior similar work focused on improving NCLEX-RN performance in undergraduate nursing populations and makes several new contributions. First, our work focuses on an accelerated program (the ABSN), which differs in many respects from traditional undergraduate nursing programs. Second, unlike larger program-level efforts, our work focused on optimizing the performance of a single EM—a more rapid and much less resource-intensive approach. We assumed that optimizing microsystem performance in an educational context would have a similar effect on overall system performance as has been observed in industry and health care, and our results support this. Finally, faculty and students engaged in high-level experiential learning, applied formal improvement techniques, measured performance, and reported results using the latest guidelines for health care improvement scholarship (SQUIRE 2.0). This will generate initial publications for some of the faculty and for the student representative engaged in this work. Taken in sum, this work achieved more than measurable improvements in NCLEX-RN related performance—it actively developed future faculty and student improvement leaders and scholars at the SON and reinforced its growing improvement culture.

REFERENCES

  • Abbott, A.A., Schwartz, M.M., Hercinger, M., Miller, C.L. & Foyt, M.E. (2008). Predictors of success on National Council Licensure Examination for Registered Nurses for accelerated baccalaureate nursing graduates. Nurse Educator, 33, 5–6. doi:10.1097/01.NNE.0000299489.07872.b0 [CrossRef]
  • Aveta Business Institute. (n.d.). The Define-Measure-Assess-Improve-Control (DMAIC) method. Retrieved from http://www.sixsigmaonline.org/six-sigma-training-certification-information/the-dmaic-model-in-six-sigma/
  • Batalden, P.B. & Davidoff, F. (2007). What is "quality improvement" and how can it transform healthcare? BMJ Quality & Safety, 16, 2–3. doi:10.1136/qshc.2006.022046 [CrossRef]
  • Benneyan, J.C. (1998). Statistical quality control methods in infection control and hospital epidemiology, part I: Introduction and basic theory. Infection Control and Hospital Epidemiology, 19, 194–214. doi:10.2307/30143442 [CrossRef]
  • Brodersen, L.D. & Mills, A.C. (2014). A comparison of two nursing program exit exams that predict first-time NCLEX-RN outcome. Computers, Informatics, Nursing: CIN, 32, 404–412. doi:10.1097/CIN.0000000000000081 [CrossRef]
  • Cheung, Y.Y., Jung, B., Sohn, J.H. & Ogrinc, G. (2012). Quality initiatives: Statistical control charts: Simplifying the analysis of data for quality improvement. Radiographics, 32, 2113–2126. doi:10.1148/rg.327125713 [CrossRef]
  • Cox-Davenport, R.A. & Phelan, J.C. (2015). Laying the groundwork for NCLEX success: An exploration of adaptive quizzing as an examination preparation method. Computers, Informatics, Nursing, 33, 208–215. doi:10.1097/CIN.0000000000000140 [CrossRef]
  • Dartmouth Institute Microsystem Academy. (n.d.). Curriculum. Retrieved from http://clinicalmicrosystem.org/knowledge-center/curriculum/
  • Deming Institute. (n.d.). The Deming System of Profound Knowledge (SoPK). Retrieved from https://deming.org/explore/so-pk
  • Institute for Healthcare Improvement. (n.d.). How to improve. Retrieved from http://www.ihi.org/resources/Pages/HowtoImprove/default.aspx
  • Lavin, J. & Rosario-Sim, M.G. (2013). Understanding the NCLEX: How to increase success on the revised 2013 examination. Nursing Education Perspectives, 34, 196–198.
  • McDonald, M.E. (2014). The nurse educator's guide to assessing learning outcomes (3rd ed.). Burlington, MA: Jones & Bartlett Learning.
  • National Council of State Boards of Nursing. (2012). 2012 NCLEX pass rates. Retrieved from https://www.ncsbn.org/Table_of_Pass_Rates_2012.pdf
  • National Council of State Boards of Nursing. (2013). 2013 NCLEX pass rates. Retrieved from https://www.ncsbn.org/Table_of_Pass_Rates_2013.pdf
  • Nelson, E.C., Batalden, P.B., Huber, T.P., Mohr, J.J., Godfrey, M.M., Headrick, L.A. & Wasson, J.H. (2002). Microsystems in health care: Part 1: Learning from high performing front line clinical units. The Joint Commission Journal on Quality Improvement, 28, 472–493. doi:10.1016/S1070-3241(02)28051-7 [CrossRef]
  • Nelson, E.C., Splaine, M.S., Batalden, P.B. & Plume, S.K. (1998). Building measurement and data collection into medical practice. Annals of Internal Medicine, 128, 460–466. doi:10.7326/0003-4819-128-6-199803150-00007 [CrossRef]
  • Oliver, B.J., Phillips, A., Potter, M., Pomerleau, M., Carpenter, S., Ciesielski, S. & Pusey-Reid, E. (2018). Building faculty capacity for healthcare quality and safety improvement using an educational microsystems approach: A pilot initiative. Manuscript submitted for publication.
  • Oliver, B.J., Potter, M., Pomerleau, M., Phillips, A., O'Donnell, M., Cowley, C. & Sipe, M. (2017). Rapid health care improvement science curriculum integration across programs in a school of nursing [Supplemental material]. Nurse Educator, 42, 38–43. doi:10.1097/NNE.0000000000000428 [CrossRef]
  • Penprase, B.B. & Harris, M.A. (2013). Accelerated second-degree nursing students: Predictors of graduation and NCLEX-RN first-time pass rates. Nurse Educator, 38, 26–29. doi:10.1097/NNE.0b013e318276df16 [CrossRef]
  • Quinn, J.B. (1992). Intelligent enterprise: A knowledge and service based paradigm for industry. New York, NY: The Free Press.
  • Serembus, J.F. (2016). Improving NCLEX first-time pass rates: A comprehensive program approach. Journal of Nursing Regulation, 6(4), 38–44. doi:10.1016/S2155-8256(16)31002-X [CrossRef]
  • Thies, K.M. & Ayers, L. (2007). Academic microsystems: Adapting clinical microsystems as an evaluation framework for community-based nursing education. Journal of Nursing Education, 46, 325–329.
  • Woo, A., Wendt, A. & Liu, W. (2009). NCLEX pass rates: An investigation into the effect of lag time and retake attempts. Journal of Nursing Administration's Healthcare Law, Ethics and Regulation, 11, 23–26. doi:10.1097/NHL.0b013e31819a78ce [CrossRef]

Aims and Measures

  • Global Aim: Improve the first-attempt NCLEX-RN pass rate for Bachelor of Science in Nursing (BSN) graduates of the Accelerated Bachelor of Science in Nursing (ABSN) program from the current rate of 77% to greater than 85% within 1 year.
    Measure (outcome): Proportion of students taking the NCLEX-RN who pass the examination on the first attempt.
  • Specific Aim #1: Increase the percentage of ABSN students completing practice examinations to greater than 95% by the close of the course offered by the educational microsystem (EM).
    Measure (process): Proportion of students in the course completing each of the full-length practice examinations.
  • Specific Aim #2: Increase the total number of practice questions completed per student to greater than 2,000 by the close of the course offered by the EM.
    Measure (process): Mean number of practice questions answered by students in the course.
  • Specific Aim #3: Increase the average practice examination proficiency of students participating in the course offered by the EM from the current level of 6.0 to greater than 7.0 (on an 8-point mastery scale).
    Measure (process): Mean global mastery level of students completing full-length practice examinations.

Plan-Do-Study-Act (PDSA) Cycles

PDSA Cycle 1: Spring 2015

Plan:
  • Syllabus changes requiring practice questions.
  • Process measures dashboard.
  • Establishment of roles for execution of plan and time line.
  • Student quality improvement action committee (QIAC) member advisement on feasibility of initial plan.
  • Check for alignment with charter and specific aims.
  • Pilot test of data collection plan.

Do:
  • Syllabus changes implemented at the beginning of the semester.
  • Data dashboard completed by midsemester.
  • Data collection complete and data available by close of semester.

Study:
  • Practice performance exceeded expectations.
  • Practice examination performance at expectations.
  • Mastery below expectations.
  • Data dashboard helpful.
  • Some students had trouble accessing the online practice questions and examinations.

Act:
  • Increase practice, examination, and mastery expectations for next cycle.
  • Expand data transparency to include students; real-time examination performance feedback on mastery.
  • Assistance to students having trouble accessing the practice platform.

PDSA Cycle 2: Summer 2015

Plan:
  • Syllabus changes increasing practice question and practice examination requirements.
  • Adjustment to practice platform to enable and automatically report mastery on practice examinations to students after each examination.
  • Student QIAC member advisement on feasibility of plans.
  • Student suggests qualitative feedback survey; group agrees. Student leads development with faculty assistance.
  • Establishment of roles for execution of plan and time line.

Do:
  • Syllabus changes implemented.
  • Mastery reporting enabled after practice examinations and emphasized by faculty.
  • Student feedback survey administered.
  • Data collection complete and data available by close of semester.

Study:
  • Practice examinations above new goal.
  • Practice questions above new goal.
  • Mastery increased above goal of 7.0.
  • Student feedback survey indicates appreciation of improvement efforts and increased overall confidence, and suggests that a better software platform is needed for practice.
  • Data dashboard gaining increased acceptance in the QIAC.
  • NCLEX-RN quarterly data show that pass rates are increasing.

Act:
  • Work to sustain and standardize gains; transition from QIAC project to ongoing work led by the course faculty and student representative.
  • Continued monitoring, including longitudinal outcomes (NCLEX-RN pass rate) for 2016.
  • Further develop and validate the student experience survey as a standard performance measure and dashboard component.
  • Explore options for a new online practice platform based on student feedback.
  • Present final findings to the faculty; consider replication in other settings or for other problems.
  • Disseminate via conference presentations and an academic journal publication.
Authors

Dr. Oliver is Assistant Professor, The Dartmouth Institute for Health Policy & Clinical Practice and the Geisel School of Medicine at Dartmouth, Hanover, New Hampshire; Dr. Pomerleau is Nurse Manager, Department of Obstetrics and Gynecology, Tufts Medical Center, Boston; and Dr. Potter is Clinical Professor, Dr. Phillips is Assistant Professor, Ms. Carpenter is Instructor, Mr. Ciesielski is Instructional Support Counselor, Dr. Pusey-Reid is Assistant Professor, Ms. Marcous is Graduate, ABSN Program, and Dr. Grobecker is Assistant Professor, School of Nursing, MGH Institute of Health Professions, Charlestown, Massachusetts.

The authors have disclosed no potential conflicts of interest, financial or otherwise.

Address correspondence to Brant J. Oliver, PhD, MS, MPH, Assistant Professor, The Dartmouth Institute for Health Policy & Clinical Practice and the Geisel School of Medicine at Dartmouth, Hanover, NH 03766; e-mail: brant.j.oliver@dartmouth.edu.

Received: February 28, 2017
Accepted: November 29, 2017

10.3928/01484834-20180420-03
