Athletic Training and Sports Health Care

Professional Practice Supplemental Data

Evaluation of Athletic Trainer Performance in the Secondary School Setting: The Secondary School Athletic Trainer Evaluation Tool (SSATET)

Eric Nussbaum, MEd, LAT, ATC; Joanne Ploch, MS, LAT, ATC; Casey Christy, MA, LAT, ATC, CSCS; Shannon Tomasula, MS, LAT, ATC; Mark W. Bramble, MS, LAT, ATC; William J. von Leer, MS, LAT, ATC; Paula Sammarone Turocy, EdD, LAT, ATC

Abstract

A new evaluation tool was constructed to evaluate secondary school athletic trainers (SSATs) and to educate administrators on the performance expectations for SSATs. Content validation was provided using the Board of Certification Practice Analysis, 7th Edition. Performance criteria and standards were developed for each domain. Six Secondary School Athletic Trainer Evaluation Tool (SSATET) AT Performance Standards for Proficient Practice were evaluated for content and criterion-related validity. Validation was completed using face validity assessment by 15 experienced educational administrators. Based on feedback from this evaluation, the tool was revised to its final format and submitted for approval to the Executive Council of the Athletic Trainers' Society of New Jersey and the New Jersey Department of Education for review and approval. The SSATET represents a novel, validated approach toward performance evaluations of the SSAT. Although designed to meet the demands of a state mandate, it could be adapted and used for ATs in other settings. [Athletic Training & Sports Health Care. 2019;11(4):161–165.]

According to the National Athletic Trainers' Association (NATA), 18% (> 8,000) of its members currently work in the secondary school setting.1 School districts often require school administrators to conduct regular performance evaluations of school personnel to assess and improve performance (formative evaluations), as well as to determine continued employment and raises (summative evaluations). Unfortunately, unlike the medical model employed in other athletic training practice settings, where the administrator has athletic training or other medical training, secondary school administrators rarely have any medical background. The purpose of this project was to develop an evaluation tool that would allow school administrators without medical training to consistently and fairly evaluate secondary school athletic trainers (SSATs) against professional practice expectations that are apparent even to non-medical evaluators.

In the state of New Jersey and in many other states, SSATs employed by school districts must meet requirements similar to those for teachers and other credentialed professionals, including performance evaluations.2 Prior to the development of the Secondary School Athletic Trainer Evaluation Tool (SSATET) in New Jersey, clearly defined SSAT performance objectives were absent; as a result, administrators without medical training had no consistent or objective way to evaluate the performance of SSATs. Further complicating the evaluation process is the fact that many school administrators have little to no understanding of the knowledge, skills, and performance expectations of SSATs.

The key to a successful evaluation of the work and performance of an SSAT lies in the structure, function, and administration of an evaluation tool.3 That evaluation tool should be designed to target SSATs specifically and to objectively assess SSAT performance based on defined roles and responsibilities.4–6 SSATs who were members of the Athletic Trainers' Society of New Jersey (ATSNJ) came together to address these concerns and to develop an objective evaluation tool that would be acceptable to the state Department of Education and to the administrators of the state's school districts. Although this project was undertaken specifically to address a need in the state of New Jersey, the evaluation tool and school administrator training module were developed in such a manner that SSATs in other school districts where administrators are required to evaluate SSATs may also be able to use them.

Methods

A task force of six experienced SSATs (each with 18 to 32 years of experience in the secondary school setting, totaling 140 years of combined experience) was formed and charged with gathering the information needed to develop an evaluation tool to assess the performance of SSATs. The task force gathered information about the typical processes and procedures for performance evaluations of secondary school employees in general from school administrators who were involved in the evaluative process within their districts. Information on performance evaluations also was gathered through Ovid, CINAHL, PubMed, and ERIC database searches using the search terms “performance + appraisal,” “performance + review,” “education + performance,” and “athletic training + performance.” Further information was solicited from leaders of the NATA and the Board of Certification, Inc. (BOC) to help identify existing tools that could be adapted. Finally, existing educational evaluation models, including those used for secondary school teachers,7–9 were reviewed for their designs and processes to gain insight into the evaluation methods used to assess the performance of other professionals who are typically evaluated in the secondary school setting by administrators whose backgrounds and preparation differ from those of the people being evaluated. The existing educational tools7–9 were evaluated against four criteria: (1) the purpose of the performance evaluation, (2) the core elements of the performance evaluation, (3) the tools for evaluating performance, and (4) barriers to implementation of the performance evaluation. The information collected provided a framework for the structure and process used to develop the actual content of the SSATET.

The next step in the process was determining the content (criteria for evaluation) that would be included in the SSATET. Initially, the task force used the six domains of practice identified in the 5th BOC Role Delineation Study (BOCRDS). These domains of practice were later (post hoc) cross-referenced with the current BOC Practice Analysis, 7th Edition4 (BOCPA) to ensure consistency of content. The BOCPA uses five domains of practice compared with six in the BOCRDS, with the Administrative and Personal/Professional domains being combined in the BOCPA. The tasks associated with performance within those domains served as the constructs for the SSATET. The domains are (1) Injury and Illness Prevention and Wellness Promotion; (2) Examination, Assessment, and Diagnosis; (3) Immediate and Emergency Care; (4) Therapeutic Intervention; and (5) Healthcare Administration and Professional Responsibility. The content (criteria for evaluation) delineated under each construct of the SSATET included the tasks and the subsequent skills used to define and refine the content of the questions on the evaluation tool. For ease of administration, the content defined by the fifth domain of athletic training practice was split into two separate evaluation criteria for use in the SSATET: an Administration Standard and a Professional Responsibility Standard.

The task force then developed specific objective evaluation criteria, termed AT Performance Standards for Proficient Practice, under each of the evaluation criteria, with objective, measurable examples that would be apparent to secondary school administrators and other non-medical evaluators, regardless of their understanding of the scope of practice of an AT. When complete, the six SSATET AT Performance Standards for Proficient Practice were evaluated for content and criterion-related validity10 against the BOCPA. To ensure that the SSATET could be used consistently and objectively to evaluate the performance of SSATs, the task force followed the recommendations laid out in New Jersey state law (TEACHNJ),2 which stipulates that, prior to the completion of any evaluation, all non-tenured secondary school employees should be observed on at least three occasions to allow for a more comprehensive assessment. The task force felt that this was an appropriate recommendation for SSATs because it allows evaluators to assess their varied roles and functions, including those performed during practices, competitions, and daily activities (eg, conducting rehabilitation, interacting with parents, and patient education sessions).

The results of those observations would be recorded as formative assessments on an Observation Evaluation Form (OEF) (Figure A, available in the online version of this article), which contains the same constructs for evaluation as the SSATET and prompts the evaluator to consistently and objectively look for observable demonstrations of desired behavior (specific professional practice skills and performance) that could contribute to the final evaluation using the SSATET. The OEF also allows an evaluator to record objective evidence of performance for each evaluation criterion (individual job-related task). The SSATET Summative Performance Report (Figure B, available in the online version of this article) is then used to summarize the findings from the three OEFs; it rates the SSAT as highly effective, effective, partially effective, or ineffective on each specific evaluation criterion and requires documentation of specific descriptions of behavior that support each rating (an illustrative sketch of this structure follows Figure B).

Figure A:

Formal Observation – Licensed Athletic Trainer

Figure B:

Summative Performance Report
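
To illustrate this structure, the following is a minimal sketch in Python, assuming a simple representation of one OEF entry; the class and field names are hypothetical and are not part of the official SSATET forms. It models one rating on the four-level scale and a list of documented evidence for each evaluation criterion.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List

class Rating(Enum):
    """Four-level scale used on the OEF and the SSATET Summative Performance Report."""
    HIGHLY_EFFECTIVE = 4
    EFFECTIVE = 3
    PARTIALLY_EFFECTIVE = 2
    INEFFECTIVE = 1

# Evaluation criteria drawn from the BOCPA domains, with the fifth domain split
# into separate Administration and Professional Responsibility standards.
CRITERIA = [
    "Injury and Illness Prevention and Wellness Promotion",
    "Examination, Assessment, and Diagnosis",
    "Immediate and Emergency Care",
    "Therapeutic Intervention",
    "Administration",
    "Professional Responsibility",
]

@dataclass
class ObservationEvaluation:
    """One formal observation (OEF): a rating and supporting evidence per criterion."""
    observer: str
    date: str
    ratings: Dict[str, Rating] = field(default_factory=dict)      # criterion -> rating
    evidence: Dict[str, List[str]] = field(default_factory=dict)  # criterion -> documented evidence
```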

The task force recommended that if three or more “partially effective” ratings are consistently earned on one evaluation criterion, the resulting SSATET summative evaluation rating should be either “partially effective” or “ineffective.” Similarly, one “ineffective” rating on a specific performance criterion should result in an “ineffective” rating on the SSATET. An AT who receives a “partially effective” and/or “ineffective” rating on the SSATET should be given a Corrective Action Plan with specific recommendations and improvement tasks identified. The OEFs and/or SSATET could be used for subsequent remediation evaluations, which should continue until either the Corrective Action Plan requirements are met or the employer determines another course of action.
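
As a rough illustration of these recommendations (not an official scoring algorithm), the sketch below builds on the hypothetical ObservationEvaluation structure above and derives the ceiling that the three formal observations would place on the overall summative rating; the evaluator's documented judgment determines the final rating within that bound.

```python
from typing import List

def summative_rating_ceiling(observations: List[ObservationEvaluation]) -> Rating:
    """Upper bound on the overall SSATET summative rating implied by the task
    force's recommendations."""
    all_ratings = [r for obs in observations for r in obs.ratings.values()]
    # One "ineffective" rating on any specific criterion -> "ineffective" summative rating.
    if any(r is Rating.INEFFECTIVE for r in all_ratings):
        return Rating.INEFFECTIVE
    # Three or more "partially effective" ratings on a single criterion -> the
    # summative rating should be no better than "partially effective".
    for criterion in CRITERIA:
        hits = sum(
            obs.ratings.get(criterion) is Rating.PARTIALLY_EFFECTIVE
            for obs in observations
        )
        if hits >= 3:
            return Rating.PARTIALLY_EFFECTIVE
    # Otherwise these rules impose no ceiling on the summative rating.
    return Rating.HIGHLY_EFFECTIVE
```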

The OEF, SSATET, proposed evaluation timeline, and associated processes were critiqued by the task force members. The face validity10 of the evaluation tools and the associated processes and timeline was also evaluated by 15 school administrators with no medical background who were highly experienced in the teacher evaluation process in New Jersey. Based on their feedback and recommendations, the initial tools and processes were revised several times until the final evaluation tools were formalized. The input from the school administrators also contributed to the development of the online instructional presentation, which was compiled to ensure that all secondary school administrators responsible for the evaluation of SSATs understood not only the scope of practice of SSATs but also how best to use the SSATET and its associated supportive tools to conduct objective and consistent evaluations of SSATs.

Results

We searched for existing secondary school AT evaluation tools using the Ovid, CINAHL, PubMed, and ERIC databases; although our keyword searches returned many matches, no existing evaluation tools were discovered (Table 1). Therefore, we decided to develop and validate a new instrument to assess the performance of the SSAT.

Table 1:

Results of Literature Review

After significant review and validation, the SSATET was developed and submitted to the Executive Council of the ATSNJ for a final validity check and endorsement, which it ultimately received. Additional approval was received from the New Jersey Department of Education, which approved the SSATET for implementation as a statewide tool to appropriately evaluate the performance of the SSAT. Additionally, an online educational course was created for administrators to guide them on how to properly use the SSATET.

Discussion

The BOCRDS that was used as the framework for construction of this tool is updated every 5 years by the BOC. Since the creation of the SSATET, the name of the document has changed to the Board of Certification Practice Analysis (BOCPA) to reflect more current language. The number of domains decreased from six to five as the administrative and professional domains were combined and other domains were renamed. The task force and the administrators who critiqued the tool felt that separate evaluation of professional and administrative duties remained important for the SSAT, and these duties were therefore kept as separate criteria within the evaluation.

Although performance evaluations are common in the business world, it was evident from our search that no other athletic training performance evaluations have been formally validated and published. This evaluation tool offers a novel approach to evaluating the SSAT and is the first evaluative tool specific to athletic training to be validated and published.

From a practitioner perspective, performance evaluation can be an effective tool for providing objective feedback that validates skills and practice, facilitating corrective action when poor skills are demonstrated, and correcting or rewarding performance.11–14 It can also assist in identifying professional development needs and, in the case of New Jersey, in fulfilling obligations to the professional regulatory body. ATs should be educated on the goals, criteria, summative report, and appeal process to be properly prepared for this performance evaluation.

Objectives, procedures, materials (eg, training materials and interpretation guides), and premises for performance evaluation should be clearly identified and documented.9 Because the impetus for the construction of this evaluation tool was a state mandate for all secondary school certified employees, we thought that it was important to try to maintain some consistency in the evaluative process. Existing teacher evaluation tools7–9 used a performance evaluation based on an “effective” to “ineffective” rating scale; to maintain consistency, that same scale was used in the SSATET.

Evaluator training is also a key factor in conducting effective performance evaluations.7,13 Training has been reported to improve consistency and develop confidence in the use of an evaluation instrument.6 Evaluators, or anyone completing the measurement, must be instructed about the performance measurement process.10 This was considered a vitally important component of the SSATET development. Ideally, the person performing the evaluation should be a health care provider with some clinical understanding of what certified ATs do. There was concern among the athletic training community that ATs would potentially be evaluated by someone who did not fully comprehend the totality of an AT's job responsibilities. For this reason, we specifically identified key standards of practice, observable tasks, and documents for evaluators to look for when completing an AT observation. We also included an online educational course (available on the ATSNJ website: www.atsnj.org) for evaluators who will be using the SSATET.

Personality conflict between managers/supervisors and individual practitioners has also been identified as a major impediment to performance evaluation.11,15 There may also be resistance from practitioners who are skeptical about the validity and usefulness of performance evaluation data. It was the intention of the task force to make the tool well defined, based on highly respected guidelines, and easy to understand and interpret. In the event that an AT feels that he or she was inappropriately evaluated, the SSATET recommends that an appeal process be in place that allows for a review of the evaluation process by an unbiased third party.

The analysis of data and reporting of results should lead to the recognition of good performance, improvement of poor performance, and modification of the performance evaluation system, if required. A clear plan of action should be agreed on by relevant stakeholders for performance evaluation to be meaningful and worthwhile.

Implications for Clinical Practice

The SSATET represents a novel, validated approach toward performance evaluations of the SSAT. Although it was designed to meet the demands of a state mandate (specifically for SSATs), because the tool was built on the BOCPA it could be adapted and used for ATs in other settings. A process of three formal observations provides objective data for the completion of an annual summative review of the AT's job performance. Supportive education for the evaluator and the AT regarding purpose, expectations, criteria, and appeals is necessary for a functional review program.

References

  1. National Athletic Trainers' Association website. https://www.nata.org/about/athletic-training/job-setting
  2. Teacher Effectiveness and Accountability for the Children of New Jersey (TEACHNJ) Act, S. 2925, New Jersey Legislature, 214th sess. (2011).
  3. Grimmer-Somers K, Milanese S, Kumar S. Measuring the quality of allied health services in Australia: is it a case of the ‘more we learn, the less we know?’ J Healthc Leadersh. 2012;4:71–81. doi:10.2147/JHL.S33163
  4. Henderson J. The 2015 Athletic Trainer Practice Analysis Study. Omaha, NE: Board of Certification; 2015.
  5. Purbey S, Kampan M, Chandan B. Performance measurement system for health-care processes. Int J Prod Perform Manag. 2007;56:241–251. doi:10.1108/17410400710731446
  6. Lizarondo L, Grimmer K, Kumar S. Assisting allied health in performance evaluation: a systematic review. BMC Health Serv Res. 2014;14:572. doi:10.1186/s12913-014-0572-7
  7. Danielson C. Evaluations that help teachers learn. Educational Leadership. 2010;68(4):35–39.
  8. Marzano RJ. The Art and Science of Teaching: A Comprehensive Framework for Effective Instruction. 1st ed. Alexandria, VA: Association for Supervision and Curriculum Development; 2007.
  9. Stronge JH. Handbook on Teacher Evaluation: Assessing and Improving Performance. Florence, KY: Taylor & Francis; 2003.
  10. Turocy PS. Survey research in athletic training: the scientific method of development and implementation. J Athl Train. 2002;37(4 Suppl):S174–S179.
  11. Chandra A, Frank CD. Utilization of performance appraisal systems in health care organizations and improvement strategies for supervisors. Health Care Manag. 2004;23:25–30.
  12. Geraedts M, Selbmann H-K, Ollenschlaeger G. Critical appraisal of clinical performance measures in Germany. Int J Qual Health Care. 2003;15:79–85. doi:10.1093/intqhc/15.1.79
  13. Geddes L, Gill C. Annual performance appraisal: one organization's process and retrospective analysis of outcomes. Healthc Q. 2012;15:59–63. doi:10.12927/hcq.2012.22764
  14. Mant J. Process versus outcome indicators in the assessment of quality of health care. Int J Qual Health Care. 2001;13:475–480. doi:10.1093/intqhc/13.6.475
  15. Arnold E, Pulich M. Personality conflicts and objectivity in appraising performance. Health Care Manag. 2003;22:227–232.

Table 1: Results of Literature Reviewa

Search Keywords                               | Ovid        | CINAHL    | PubMed     | ERIC
Performance + Appraisal                       | 21,871 (0)  | 6,201 (0) | 6,393 (0)  | 881 (0)
Performance + Review                          | 395,512 (0) | 1,751 (0) | 74,867 (0) | 1,286 (0)
Education + Performance                       | 209,700 (0) | 1,268 (0) | 85,006 (0) | 5,296 (0)
Athletic Training + Performance               | 1,171 (0)   | 124 (0)   | 20,345 (0) | 9 (0)
Athletic Trainer + Job Performance            | 5 (0)       | 0         | 20 (0)     | 0
Athletic Trainer Performance + Appraisal      | 0           | 0         | 1 (0)      | 0
Athletic Trainer Performance                  | 712 (0)     | 14 (0)    | 1 (0)      | 2 (0)
Education + Performance + Athletic Training   | 419 (0)     | 1 (0)     | 10,010 (0) | 1 (0)
Athletic Trainer + Performance Evaluation     | 4 (0)       | 1 (0)     | 28 (0)     | 0
Athletic Trainer + Work Assessment            | 0           | 0         | 0          | 0
Athletic Trainer + Job Evaluation             | 0           | 0         | 0          | 0

aValues are total search results for each database, with the number of relevant AT evaluation tools identified shown in parentheses.
Authors

From Colts Neck High School, Colts Neck, New Jersey (EN); Vernon Township High School, Vernon, New Jersey (JP); Eastern High School, Voorhees, New Jersey (CC); Manalapan High School, Manalapan, New Jersey (ST); Marlboro High School, Marlboro, New Jersey (MWB); Lenape High School, Medford, New Jersey (WJV); and Duquesne University, Pittsburgh, Pennsylvania (PST).

The authors have no financial or proprietary interest in the materials presented herein.

Correspondence: Eric Nussbaum, MEd, LAT, ATC, Colts Neck High School, 59 Five Points Road, Colts Neck, NJ 07722. E-mail: nussatcjb@aol.com

Received: September 29, 2018
Accepted: April 03, 2019

10.3928/19425864-20190410-01
