Peer review has long been the gold standard for evaluating the scientific merit of conference proposals, but inconsistent evaluation methods remain a challenge. Low interrater reliability between peer reviewers has weakened the peer-review process (Deveugele & Silverman, 2017). Concerns have been expressed about reviewing, the scoring process, and the peer-review method used to evaluate scientific merit.
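Interrater reliability is commonly quantified with chance-corrected agreement statistics such as Cohen's kappa, which compares observed agreement between two reviewers with the agreement expected by chance. As an illustration only (the reviewer decisions below are hypothetical, not data from this initiative), a minimal sketch:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: proportion of items both raters scored identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement, from each rater's marginal category frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical accept/reject decisions from two reviewers on ten abstracts.
reviewer_1 = ["accept", "accept", "reject", "accept", "reject",
              "accept", "reject", "reject", "accept", "accept"]
reviewer_2 = ["accept", "reject", "reject", "accept", "reject",
              "accept", "accept", "reject", "accept", "accept"]
print(round(cohens_kappa(reviewer_1, reviewer_2), 2))  # → 0.58
```

Kappa values near 1 indicate strong agreement; values near 0 indicate agreement no better than chance, which is the pattern a rubric and reviewer training are intended to correct.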
Although peer review has existed for more than 300 years, with professional organizations using peer reviewers since the 18th century, many organizations have yet to provide structured education and training for their peer reviewers (Pierson, 2016). This article discusses how one organization, Sigma Theta Tau International Honor Society of Nursing (Sigma), translated educational rubrics into an instrument for peer reviewers. Professional development educators could create a similar process to ensure that abstract reviews are consistent and streamlined.
Rubrics, sets of assessment criteria that articulate the quality of specific measures (Tenam-Zemach, 2015), have long been used to establish and communicate standards to students in the academic setting. Standards set by rubrics help ensure that educational objectives and nursing standards of practice are met. In the clinical practice setting, rubrics are used to evaluate clinical judgment and advance clinical practice (Bergum et al., 2017).
Feedback from conference abstract reviewers was a driving force in this initiative. Analysis of that feedback identified a need for greater consistency between abstract reviewers. A small group of reviewers, Sigma staff, and experts was assembled to evaluate the peer-review process. After reviewing the literature, the group developed a conceptual rubric to increase peer reviewer consistency, decrease subjective judgment between peer reviewers, and provide presenters with a framework for improving the quality of abstract submissions.
The newly developed abstract rubric was presented to abstract reviewers for a 6-week comment period. Alterations were made to the rubric based on peer reviewer feedback, and the revised rubric was then reevaluated. Based on the reevaluation comments, a final abstract rubric was developed. Upon final approval by Sigma's Conference Planning Task Force members, an asynchronous online educational activity was designed to clarify the rubric's role and purpose, decrease ambiguity in scoring and terminology, and establish objectivity and improve transparency for abstract reviewers.
In the educational activity, Task Force members used previously reviewed abstracts to highlight subjective differences between reviewers. The peer-review educational activity was disseminated to abstract reviewers through Sigma's online learning management system. Feedback after the educational activity indicated high peer reviewer satisfaction with the new abstract rubric. Abstract submitters also reported increased confidence in submitting abstracts to Sigma conferences after rubric implementation.
All peer reviewers were provided access to the peer-review educational activity. Peer reviewers who successfully completed the training participated in Sigma's next organizational event review cycle. The abstract rubric was also made available to abstract authors as a provider-directed, provider-paced educational activity. As noted in Figure 1, overall peer reviewer scores increased compared with previous review cycles. Author adherence to abstract submission guidelines also increased. Quantitative and qualitative data indicated increased usability with the new abstract rubric.
Figure 1. Abstract review scores pre- and postintervention. Note. x-axis = range of scores from 2.5 (poor) to 5 (excellent); y-axis = percentage of abstracts receiving a particular score.
Implications for Nursing Professional Development Practice
The use of an abstract rubric provides consistency in judging scientific abstracts and elevates the quality of abstracts submitted (Tenam-Zemach, 2015). Clinical nurses on quality, research, and evidence-based practice shared governance councils, as well as those reviewing abstracts for health care research symposiums, could benefit from a similar structure. Since the rubric's introduction, the rigor of abstracts submitted for educational activities has continued to increase. Setting clear expectations and providing education for both authors and reviewers may further the science of nursing. Table 1 describes the steps used to develop and implement the new rubric. Nursing professional development specialists can use these steps as a model to create meaningful and precise change within their organizations.
- Bergum, S. K., Canaan, T., Delemos, C., Gall, E. F., McCracken, B., Rowen, D., Salvemini, S., & Wiens, K. (2017). Implementation and evaluation of a peer review process for advanced practice nurses in a university hospital setting. Journal of the American Association of Nurse Practitioners, 29(7), 369–374. doi:10.1002/2327-6924.12471 PMID:28560763
- Deveugele, M., & Silverman, J. (2017). Peer-review for selection of oral presentations for conferences: Are we reliable? Patient Education and Counseling, 100(11), 2147–2150. doi:10.1016/j.pec.2017.06.007 PMID:28641993
- Pierson, C. A. (2016). Recognizing peer reviewers and why that matters. Journal of the American Association of Nurse Practitioners, 28(1), 5. doi:10.1002/2327-6924.12441 PMID:26751726
- Tenam-Zemach, M. (2015). Rubric nation: Critical inquiries on the impact of rubrics in education. Information Age Publishing.
Table 1. Rubric Development Steps
1. Review the current process.
   - Examine current processes; be open to a critical review of current methods.
   - Are they providing the desired outcome? Are they working efficiently? Is there variability in the process?
   - Determine what is already known (e.g., scoping review, literature review).
2. Identify qualified stakeholders to review the current process.
   - Ensure stakeholders are involved with the decision-making process.
   - Develop relationships with stakeholders through frequent and regular communication.
3. Develop a clear vision of the change, development, and implementation process.
   - Establish communication and time line expectations.
   - Establish a consensus on the deliverables and what success means.
4. Develop a strategy to disseminate the new/revised process to those involved.
   - Formulate a communication strategy for reviewers.
   - Allow time to receive feedback and make adjustments as needed.
5. Provide practical training.
   - Develop reviewer training content.
   - Construct a platform and strategy to disseminate the training.
6. Evaluate the training.
   - Request continuous feedback from participants and stakeholders.
   - Act on the feedback received if feasible or needed.
7. Make modifications as needed.
   - Be open to making adjustments based on the feedback received.
   - Communicate the changes that are made.
8. Conduct a final review with stakeholders.
   - Gather evidence gleaned from training.
   - Share the modified rubric and process with stakeholders and establish consensus.