Athletic Training and Sports Health Care

Guest Editorial

EBP: IMHO (Evidence-Based Practice: In My Humble Opinion)

Richard DeMont, PhD, CAT(C), ATC

From the Department of Exercise Science, Concordia University, Montreal, Quebec, Canada.

The author has no financial or proprietary interest in the materials presented herein.

Correspondence: Richard DeMont, PhD, CAT(C), ATC, Department of Exercise Science, Concordia University, SP165-25, 7141 Sherbrooke St. W, Montreal, Quebec H4B 1R6, Canada. E-mail: Richard.DeMont@Concordia.ca

The topic of evidence-based practice (EBP) is being discussed in various venues. Many sessions were offered at this year's Athletic Training Educators' Conference, the Board of Certification, Inc. is providing reminders to make sure we have met our requirement, and I have seen textbooks hot off the presses detailing the subject.

But what is evidence? The "Dictionary" app (Apple Inc., v2.2.1) on my laptop defines evidence as "(noun) the available body of facts or information indicating whether a belief or proposition is true or valid" and evidence-based as "(medical adjective) denoting disciplines of health care that proceed empirically with regard to the patient and reject more traditional protocols."

Indicating that a proposition, or in our case a test or treatment, is valid seems like a pretty good idea. I'm not sure the goal should be to "reject traditional protocols," but improving them through ongoing reassessment of data and outcomes seems like a pretty good idea, too.

How do we get there? Typically, we have a two-pronged approach involving two kinds of curious participants: researchers and clinicians. Often (and I have certainly experienced this) there is a disconnect between the two. Clinicians face pressing needs from their clients to provide up-to-date knowledge of the latest techniques; certainly in the athletic training profession, speed seems to be of the essence. The researcher is also interested in promoting up-to-date information, but the process of analysis is usually concept driven, built on a foundation of stepwise actions that ultimately create data to support a position; simply put, developing knowledge. Speed can often lose out to hurdles such as resource limitations, data collection concerns, or the peer-review process.

Sometimes there is a lack of respect or understanding between the two groups. Stereotypically, the clinician does not understand what the researcher does or the complications the researcher faces. The researcher, in turn, sees the clinician searching for assessment or treatment ideas grounded in reason, but finds the methodology for testing those ideas falling short (or, unfortunately, nonexistent). Either group could take an idea, and even results, to the abstract stage, but the work, although increasingly peer reviewed, would still lack suitable rigor to defend the result as "evidence based."

Despite this mutual underappreciation, we all learn about evidence during our professional education programs, or at least we should. For years we have had research methods courses in our programs. Some programs are at the Master's level, and certainly their institutions would make "evidence" a requirement. Incorporating research into all of our courses, bringing the latest information specific to what we are teaching into both the classroom and the clinic, should be a high priority.

At the Athletic Training Educators' Conference, presenters discussed synonyms for evidence and methods for following a system. Mentorship and transition to practice were key themes, but EBP was ever present, described as best practices, evidence-based education, critical appraisal, and critical thinking.

Included with our NATA membership over the past few years have been 10 "CEU bucks" to use toward maintenance of our Board of Certification credential. Accused of falling short last year, the NATA provided three courses related to EBP for those who registered for the conference, and I assume it will have more EBP programming to choose from at this year's Clinical Symposia.

So there is no lack of opportunity to meet the standard set by the Board of Certification. The framework we hear about increasingly, PICO(T) (Population, Intervention, Comparison, Outcome(s), and Time, which is not always included), can be employed by clinicians, researchers, and students alike. For example, a clinician might ask: in adolescent soccer players with an acute lateral ankle sprain (P), does balance training (I), compared with standard rehabilitation alone (C), reduce the rate of reinjury (O) over one season (T)? To realize the goal of increasing EBP information development, researchers and teachers should strive to give the best information to students, generate new information, and ultimately test practices through randomized controlled trials. Students, one hopes, learn the system and gain an appreciation for it as they transition to practice. Clinicians, in turn, need to employ techniques to avoid fads and fashions and to determine that the information they are obtaining is scientific. Clinicians who insist on their traditional protocols should find evidence to support those interventions.

The question that arises for me in all of this is: shouldn't we have been doing this all along? Programming for conferences and courses should have been approved only if it was shown, at some level, to rest on sound theory that had been tested and reviewed by independent, knowledgeable people. Perhaps this is more relevant to the "weekend course." The systems were in place. I'm not sure the new CEU system is needed, but perhaps the previous system needed more credible oversight.

10.3928/19425864-20150422-01
