Athletic Training and Sports Health Care

Original Research 

Test–Retest Reliability of the BrainFx 360® Performance Assessment

Chelsea Searles, MS, LAT, ATC; James L. Farnsworth II, PhD, ATC; Colby Jubenville, PhD; Minsoo Kang, PhD, FACSM; Brian Ragan, PhD, ATC†

Abstract

Purpose:

To examine the reliability of the BrainFx 360 (BrainFx, Toronto, Ontario, Canada) digital assessment tool for detection of mild-to-moderate symptoms following concussion.

Methods:

Fifteen healthy adults were administered the BrainFx 360 at two time points. Reliability was assessed using two-way random effects intraclass correlation coefficients (ICCs) for average measures.

Results:

Reliability was high for the overall performance score (ICC = .85). Reliability of individual tests ranged from low to high within each domain of cognitive function.

Conclusions:

The BrainFx 360 is a promising new instrument. Reliability of the scores for overall performance and performance categories were relatively high; however, reliability was much lower for the individual cognitive outcomes associated with each task. Application of advanced measurement models may be useful in identifying specific tasks with poor measurement properties.

[Athletic Training & Sports Health Care. 201X;X(X):XX–XX.]

Despite an abundance of research conducted over the past few decades, the amorphous nature of concussions makes management of concussive injuries a challenging task for athletic trainers.1 Experts recommend that athletic trainers employ a multidimensional assessment protocol that includes evaluation of clinical symptoms, physical signs, cognitive impairments, sleep disturbances, and neurobehavioral changes (eg, irritability).1,2

Clinical symptoms, physical signs, sleep disturbances, and neurobehavioral changes can be identified with relative ease by trained health care professionals such as athletic trainers through observation of injured athletes in combination with self-reported symptom checklists. However, cognitive impairments are often subtle and difficult to detect without the assistance of cognitive testing. A variety of cognitive tests, such as the Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT)3 and the Sport Concussion Assessment Tool 5 (SCAT5),4 have been developed to assist athletic trainers with evaluating cognitive impairments following concussions. Although the value of these assessments has been heavily debated in the literature,5,6 there is strong empirical evidence in support of cognitive testing to assist athletic trainers with concussion-related clinical decisions.7–9 Nevertheless, evidence has suggested that many of the instruments currently in use by athletic trainers and other health care professionals suffer from poor reliability, which could lead to potential misdiagnoses.9

The BrainFx 360 performance assessment (BrainFx, Toronto, Ontario, Canada) was designed in 2013 as a “comprehensive assessment that measures the cognitive, physical, and psychosocial areas of neurofunction through engaging and interactive activities delivered via tablet by a certified BrainFx administrator (CBA) that can be influenced by brain disorders” (CEO of BrainFx, personal communication, 2015). Examples of relevant brain disorders include: concussions, strokes, mental illnesses, neurodegenerative diseases (eg, dementia), or childhood diagnoses such as learning disabilities and attention-deficit/hyperactivity disorder. Although there are many digital cognitive assessment tools already on the market, the BrainFx 360 is distinctive among them because it uses “real-world” situations and tasks to assess cognitive performance.

The CBA certification requires the individual administering the examination to be a health care professional in good standing with his or her regulatory or certifying body and the completion of eight online training modules that describe the assessment procedures, how to use the tablet, how to administer the assessment, and the reasons and purposes for the assessment. Each training module includes a quiz to assess comprehension with a final examination once all modules have been completed. The entire training process takes approximately 4 to 6 hours to complete.10 Occasional mandatory software updates are provided by BrainFx, about which the CBA is notified ahead of time to minimize potential conflicts with injury assessment and use in the clinical setting.

The BrainFx 360 includes a total of 49 tasks that provide an overall performance score for each patient. In addition, scores are provided for each of five performance categories. These performance categories include: (1) sensory and physical skill performance, (2) social and behavioral skill performance, (3) foundational cognitive skill performance, (4) intermediate cognitive skill performance, and (5) complex cognitive skill performance. Each performance category consists of multiple cognitive outcomes (ie, visual skills, executive functioning and combined skills, working memory, and problem solving). A list of the cognitive outcomes (n = 26) associated with each performance category is provided in Table 1.

Table 1: BrainFx Performance Categories and Cognitive Outcomes

The specific nature of these tasks makes them more relevant for patients and clinicians as they attempt to assess real-life function. Some examples of tasks on the BrainFx 360 include: counting money, setting up and managing a daily schedule, listening to a lecture and recalling information, and dual-attention activities such as matching fruits/vegetables while ensuring a pot of water does not boil over. Some of these tasks are directly related to the types of activities that athletes would be expected to perform in daily life while recovering from a concussion. Thus, this instrument may be valuable for athletic trainers and other health care professionals to evaluate recovery patterns before engaging athletes in a standardized return-to-play activity progression.

There is little information available to date regarding the measurement properties of the BrainFx 360. A recent pilot study involving four patients diagnosed as having schizophrenia used the BrainFx 360 to evaluate changes in cognitive function following cognitive adaptation training.12 The results of the study were inconclusive. Unpublished data from York University indicated that the BrainFx 360 was able to identify concussions in 95% of cases.13 However, there have been no published studies investigating the reliability of the BrainFx 360. Understanding the reliability of this instrument is critical for determining its value in clinical practice. Therefore, the purpose of this study was to evaluate the reliability of the BrainFx 360.

Methods

Study Design

This study used a single cohort repeated-measures research design to evaluate the test–retest reliability of the BrainFx 360. The protocol for this study was approved by the university's institutional review board.

Participants

Fifteen healthy adults (male = 9, female = 6; mean age = 22.9 ± 2.4 years) volunteered to participate in this study. Healthy was defined as an individual who had not sustained a concussion or experienced residual concussion symptoms in the past 6 months, and reported no other health disorders, history of learning disability, seizure disorder, attention deficit disorder, or other mental/physical disability that could possibly affect his or her motor and/or cognitive performance.

Instrument

The BrainFx 360 is intended for use with children and adults aged 10 to 90 years with a suspected brain disorder. The test includes a detailed client report, a companion report, and the performance assessment. The BrainFx application kit includes a 10-inch Android tablet (software pre-loaded), stand, stylus, wireless keyboard, and noise-cancelling headphones. The BrainFx uses a monthly subscription-based pricing model (12-month commitment) with multiple options available depending on the number of clinicians and the anticipated number of test uses. The most expensive option ($125/month) includes the application kit, training, and unlimited use of both the BrainFx 360 and the BrainFx SCREEN (a brief 10-minute screening questionnaire). The BrainFx 360 assessment requires Internet access only at the beginning and end of the test and it takes approximately 1 hour to administer all 49 required tasks. A demonstration version of the software can be accessed on the company's web site, which allows individuals to experience a few of the tasks that will be completed by patients. The assessment report provided at the conclusion of the test is approximately 40 pages in length, which provides detailed information about overall scores and patient responses to each task. The assessment report includes scores for the overall performance, each performance category, and detailed information for each cognitive outcome. A sample image for one of the cognitive outcomes (ie, temporal awareness) has been provided in Figure 1. For each outcome, the scores are compared with normative values and color coded to indicate values that are within (yellow) and outside (green) of 1 standard deviation of the population averages. A brief description of all tasks is provided in Table 2.

Figure 1. BrainFx 360 (BrainFx, Toronto, Ontario, Canada) cognitive outcome score example.

Table 2: Tasks Included Within the BrainFx 360 Performance Assessment

Procedures

Participants were recruited for this study through personal e-mail communications. The e-mail briefly outlined the details for the study. Participants who agreed to complete the study met with the CBA in a quiet, locked office, where they read and signed an informed consent agreement. Participants were notified that they would be contacted for a second follow-up test 7 to 14 days following the initial testing. Patient demographic information and injury history were extracted from the BrainFx 360 client report. Prior to beginning the assessment, the CBA explained the details of the test to the participant. In addition, the CBA was available throughout the assessment to answer any questions about a particular task or to provide additional cues when necessary.

The retest of the BrainFx 360 was administered 7 to 14 days after the initial assessment. Both the initial and follow-up assessment were conducted by the same CBA. This time period was selected to reflect the typical testing cycle that is often used in concussion management protocols.1,2 In addition, 7 to 14 days is a sufficient length of time to decrease the effects of learning and familiarization of tasks and testing procedures.14,15 The procedures used during the retest period of data collection were the same as those used during the initial testing period.

Data Analysis

To evaluate the reliability of the BrainFx 360, two-way random effects intraclass correlation coefficients for average measures (ICC2,k) were calculated using SPSS software (version 21; SPSS, Inc., Chicago, IL). Separate ICCs were calculated for overall reliability and each outcome measure. Many different standards have been reported for acceptable levels of reliability, but a common standard suggests that coefficients between 0.70 and 0.79 are below average acceptable, 0.80 and 0.89 are average acceptable, and 0.90 to 1.0 are above average acceptable.14
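For readers unfamiliar with the computation, ICC2,k can be derived from the mean squares of a two-way ANOVA without replication (Shrout–Fleiss formulation). The following Python sketch is illustrative only; the study used SPSS, and the function name and toy data are ours:

```python
import numpy as np

def icc_2k(scores):
    """ICC(2,k): two-way random effects, absolute agreement, average
    measures. `scores` is an n_subjects x k_sessions array."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()

    # Sums of squares from a two-way ANOVA without replication
    ss_subjects = k * np.sum((x.mean(axis=1) - grand) ** 2)
    ss_sessions = n * np.sum((x.mean(axis=0) - grand) ** 2)
    ss_error = np.sum((x - grand) ** 2) - ss_subjects - ss_sessions

    ms_subjects = ss_subjects / (n - 1)
    ms_sessions = ss_sessions / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    return (ms_subjects - ms_error) / (
        ms_subjects + (ms_sessions - ms_error) / n
    )

# A constant shift between sessions (eg, a practice effect) lowers
# ICC(2,k) even when subjects keep their rank order, because the
# two-way random model counts session variance against agreement:
# icc_2k([[1, 2], [2, 3], [3, 4]]) -> 0.8
```

This session-variance penalty is one reason the two-way random effects model is preferred for test–retest designs over a simple Pearson correlation, which would report perfect reliability for the shifted data above.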

Results

Descriptive statistics for scores on the BrainFx 360 are provided in Table 3. Possible scores range from 0 to 100 for each outcome, with higher values indicating higher levels of cognitive function. Test–retest reliability for the BrainFx 360 overall performance score was relatively high (ICC2,k = .85). Reliability for all performance categories and respective cognitive outcomes for the BrainFx 360 is provided in Table 4. Reliability of scores for the performance categories of the BrainFx 360 was at least average acceptable for all but one performance category (complex cognitive skill performance), which had a reliability coefficient of 0. As expected, reliability coefficients of individual outcomes were lower, ranging from 0 to 0.97, with only 10 of 26 outcomes demonstrating at least below average acceptable reliability. Post-hoc power analysis revealed that the achieved power of this study was .85, indicating the sample size was sufficient for this design.

Table 3: Descriptive Statistics

Table 4: ICCs for Study Population

Discussion

BrainFx recognized the need for an assessment tool that is both comprehensive and applicable to real-life experiences. The BrainFx 360 performance assessment emphasizes evaluation of intermediate to complex cognitive skills using real-life activities and integrates the patient's medical history, symptoms, functional participation, and quality of life into a single report (CEO of BrainFx, personal communication, 2015).

One of the biggest concerns with cognitive testing is the inconsistent reliability estimates reported across studies.9 Reliability is important for cognitive testing because of its relationship to reliable change indices. When reliability for a test is low, a larger change in scores is necessary to determine meaningful differences in scores. Cognitive changes following a concussion can be minimal, providing support for the use of computerized cognitive testing, but when reliability is low the rates of false-positive and false-negative diagnoses will increase. In a prospective head-to-head study of three computerized neurocognitive assessment tools (Immediate Post-Concussion Assessment and Cognitive Testing [ImPACT], Axon, and Automated Neuropsychological Assessment Metrics [ANAM]), false-positive rates ranged from 24.4% to 37.2% in recently asymptomatic athletes.16 In addition, 8 days following injury positive predictive values were relatively low (< 50%) for both the Axon and ANAM computerized concussion tests.16
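The dependence of meaningful change on reliability can be made concrete with the Jacobson–Truax reliable change index (RCI). The sketch below uses hypothetical numbers (a baseline standard deviation of 10) rather than study data:

```python
import math

def reliable_change_index(baseline, retest, sd_baseline, reliability):
    """Jacobson-Truax reliable change index: how many standard errors of
    the difference separate two scores. |RCI| > 1.96 suggests change
    beyond measurement error at alpha = .05."""
    sem = sd_baseline * math.sqrt(1.0 - reliability)  # standard error of measurement
    se_diff = math.sqrt(2.0) * sem                    # SE of a difference score
    return (retest - baseline) / se_diff

# Hypothetical example: the same 10-point drop clears the 1.96 cutoff
# only when reliability is high.
print(round(reliable_change_index(80, 70, 10, 0.90), 2))  # -2.24
print(round(reliable_change_index(80, 70, 10, 0.60), 2))  # -1.12
```

As the example shows, at a reliability of .60 a 10-point decline cannot be distinguished from measurement error, which is how low reliability inflates false-negative (and, symmetrically, false-positive) rates.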

Although the consequences of returning an athlete to play too soon, before a concussion has properly healed, are not well understood, it is generally accepted that impairments in mental status and physical reflexes, which are expected following a concussion, place the athlete at risk for subsequent injuries. Therefore, it is important that athletes have sufficiently recovered from a concussion before being allowed to return to play. Commonly used cognitive tests are hindered by their low reliability.9

The results of this study suggest that the BrainFx 360 may have superior reliability when compared with commonly used computerized test batteries such as the ImPACT, ANAM, and Axon/CogSport tests.9 With the exception of complex cognitive skill performance, all performance categories for the BrainFx 360 demonstrated acceptable reliability (ICC ≥ .79). However, direct comparisons of cognitive outcomes between test batteries are challenging because of the nature of the assessments. Outcomes with similar names may not be assessing the same cognitive ability. For example, the Axon/CogSport test battery assesses an individual's processing speed and decision-making ability through the Detection Test, whereas the ImPACT test battery derives its processing speed composite score from the X's and O's and Three Letters tasks. Although both instruments reportedly assess the same cognitive domain (ie, processing speed), the manner in which it is assessed differs. Because the ImPACT's processing speed tasks are more complex than the Axon/CogSport tasks, multiple cognitive domains are likely incorporated into each assessment. For these reasons, although different tests may use the same terminology, the cognitive abilities being assessed may not be comparable.

BrainFx 360 Test Limitations and Concerns

One key difference between the BrainFx 360 and alternative testing options is the number of tasks included within each test battery. The BrainFx 360 includes 49 unique tasks, whereas the ImPACT and Axon/CogSport assessments contain 6 and 4 tasks, respectively. It is generally acknowledged that reliability increases as the length of a test increases,17 which may partially explain why the reliability coefficients for scores on the BrainFx 360 are generally higher than those reported for other instruments. However, the BrainFx 360 takes approximately 1 hour to administer, requiring a significant investment of time by both the patient and the clinician. In the clinical setting, most athletic trainers may not be able to dedicate a full hour to testing a single athlete. This issue is further complicated when considering that many concussion management protocols recommend routine evaluation to track changes in cognitive function. Although the scores for performance categories on the BrainFx 360 were relatively reliable, the administrative burden may be too high for this instrument to be clinically useful. Furthermore, the cognitive burden associated with such a long test may result in some athletes not maintaining sufficient focus throughout the test, leading to inaccurate results. A modified version of this instrument with a shorter test duration may be an attractive option for athletic trainers.
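The length–reliability relationship is conventionally quantified with the Spearman–Brown prophecy formula. The sketch below applies it to hypothetical numbers (an overall reliability of .85 and a cut from 49 to 12 tasks) to illustrate how a shortened battery's reliability might be projected; the actual effect would depend on which BrainFx tasks were retained:

```python
def spearman_brown(reliability, length_factor):
    """Spearman-Brown prophecy: predicted reliability after changing test
    length by `length_factor` (2.0 = doubled, 0.5 = halved), assuming the
    added or removed items are parallel to the originals."""
    return (length_factor * reliability) / (
        1.0 + (length_factor - 1.0) * reliability
    )

# Hypothetical: cutting a 49-task battery with reliability .85 to 12
# comparable tasks projects a reliability of about .58.
print(round(spearman_brown(0.85, 12 / 49), 2))  # 0.58
```

The projection is a trade-off estimate only: it assumes removed tasks are interchangeable with those kept, whereas selectively dropping the least reliable tasks (as suggested below) could shorten the test with a smaller reliability penalty than the formula predicts.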

Although most of the scores for the performance categories of the BrainFx 360 had acceptable reliability, the majority of the individual cognitive outcomes did not. The BrainFx 360 provided scores for 26 cognitive outcomes determined from participants' responses to the 49 tasks. More than half of the cognitive outcomes (n = 16) yielded reliability coefficients of 0.70 or less. Removing or modifying the tasks associated with these less reliable outcomes may help to significantly improve the overall quality of the test and reduce the test administration time, making the instrument more patient- and clinician-friendly.

For example, Category from Words is a task associated with both the Attention–Selective to Audio Distraction and the Abstract Reasoning cognitive outcomes. The reliability coefficients for Attention–Selective to Audio Distraction and Abstract Reasoning were 0 and .31, respectively. For this task, participants are asked to review a list of words and identify the commonality among the words. This task is later repeated with an audio distraction. The difficulty level of the words and commonalities was generally not consistent for this task. In many cases, participants would get an easier set of words during the first task (without distraction) and a more difficult list of words with the distraction present or vice versa. Modification of the algorithms used to select the words to make them more consistent could help to improve reliability of this section. Alternatively, the tasks could be removed to shorten the test administration time.

The test–retest interval in this study was selected to match the typical symptom resolution period for most athletes (eg, approximately 7 days).18 However, this testing window may not be appropriate in all cases. Future studies should examine alternative test–retest intervals that include both shorter and longer durations.

Technical limitations with the BrainFx 360 hardware also led to some issues and complications. Specifically, with the Rebuilding Pictures task, participants were required to use their finger to reorganize a puzzle that had been separated into nine pieces. Sometimes the final piece of the puzzle would not slide into place, instead returning to the outside of the puzzle. This delayed the participants' completion time and in some cases completely prevented them from finishing the puzzle.

An interesting observation during testing was that, in some instances, participants would skip an entire task on the initial assessment yet complete the task during the retest assessment. One explanation may be that participants did not fully understand how to complete the task because of its complexity or difficulty. This issue was apparent with both the Recall Sound and the Watched Pot tasks. For the Recall Sound task, participants were asked to double click a specific icon on-screen and type a sentence. However, most participants completely missed the recall sound, and others stated out loud the sentence they were supposed to remember and type. In the Watched Pot task, participants were instructed to multi-task by matching items while watching a pot of water on the stove. Any time the pot boiled, the participant was supposed to slide the pot from the stove to the sink and dump out the water. On the initial test, some participants ignored the matching task and only slid the pot to the sink when it boiled, whereas others only matched the items while ignoring the pot. On the retest, extra cues were needed to complete the task successfully, or the participant remembered to read the directions more carefully before beginning the task. Many of the tasks developed for the BrainFx 360 are unique to this assessment. Providing participants an option to practice the assessment could help to eliminate this issue, but given the already lengthy administration time, this may not be a practical solution for most individuals.

As a potential solution for this problem, the BrainFx manufacturer currently provides a shorter assessment program called the BrainFx SCREEN. The SCREEN takes approximately 10 to 15 minutes to complete and may be a more suitable option for athletic trainers where time is limited. The company indicates that this instrument may be used to establish baseline information for the use of monitoring changes in functional status or as a screening tool to determine whether a more thorough assessment is required. Similar to the BrainFx 360, however, there are currently no published measurement data available for the BrainFx SCREEN. When used as part of a comprehensive assessment program, the SCREEN may serve as an alternative for clinicians who do not have the time available to dedicate to the full BrainFx 360 assessment.

Implications for Clinical Practice

The BrainFx 360 is a promising new computerized concussion test battery that has the potential to improve concussion evaluation and management. Overall performance scores and all but one performance category demonstrated acceptable reliability; however, many of the individual cognitive outcomes yielded poor reliability coefficients. In addition, there are some technical issues and administrative concerns that need to be addressed to improve the clinical utility of the instrument. BrainFx 360 developers should consider revising or removing the tasks associated with poorly performing cognitive outcomes to improve the overall quality of the test. Application of advanced measurement models such as Item Response Theory or Generalizability Theory may be useful in identifying specific tasks with poor measurement properties or identifying additional sources of error. Furthermore, additional studies are needed to examine the sensitivity and specificity of this instrument for diagnostic purposes before its clinical value can be determined.

References

  1. McCrory P, Meeuwisse W, Dvorák J, et al. Consensus statement on concussion in sport—the 5th international conference on concussion in sport held in Berlin, October 2016. Br J Sports Med. 2017;51:838–847.
  2. Broglio SP, Cantu RC, Gioia GA, et al. National Athletic Trainers' Association position statement: management of sport concussion. J Athl Train. 2014;49:245–265. doi:10.4085/1062-6050-49.1.07 [CrossRef]
  3. Iverson GL, Lovell MR, Collins MW. Interpreting change on ImPACT following sport concussion. The Clinical Neuropsychologist. 2003;17:460–467. doi:10.1076/clin.17.4.460.27934 [CrossRef]
  4. Echemendia RJ, Meeuwisse W, McCrory P, et al. The Sport Concussion Assessment Tool 5th Edition (SCAT5): background and rationale. Br J Sports Med. 2017;51:848–850.
  5. Randolph C, McCrea M, Barr WB, Macciocchi SN. Is neuropsychological testing useful in the management of sport-related concussion? J Athl Train. 2005;40:139–152.
  6. McCrory P, Makdissi M, Davis G, Collie A. Value of neuropsychological testing after head injuries in football. Br J Sports Med. 2005;39(suppl 1):i58–i63. doi:10.1136/bjsm.2005.020776 [CrossRef]
  7. Kontos AP, Braithwaite R, Dakan S, Elbin R. Computerized neurocognitive testing within 1 week of sport-related concussion: meta-analytic review and analysis of moderating factors. J Int Neuropsychol Soc. 2014;20:324–332. doi:10.1017/S1355617713001471 [CrossRef]
  8. Iverson GL, Schatz P. Advanced topics in neuropsychological assessment following sport-related concussion. Brain Inj. 2015;29:263–275. doi:10.3109/02699052.2014.965214 [CrossRef]
  9. Farnsworth JL, Dargo L, Ragan BG, Kang M. Reliability of computerized neurocognitive tests for concussion assessment: a meta-analysis. J Athl Train. 2017;52:826–833. doi:10.4085/1062-6050-52.6.03 [CrossRef]
  10. BrainFx. The BrainFx Assessment Platform. 2017. http://www.brainfx.com/home-2/brainfxassessments-2/. Accessed November 19, 2017.
  11. BrainFx. BrainFx Administrator Certification Course. 2017. https://brainfx.litmos.com/.
  12. McWhinnie J, Kwan G, Hirano G. Measuring Change in Neurofunction with BrainFx 360 after Cognitive Adaptation Training in Schizophrenia. Evidence Based Practice Symposium Occupational Therapy Class of 2017; 2017; Hamilton, Ontario, Canada.
  13. Sergio L. BrainFx 360 Assessment: Clinical Validation. Toronto, Canada: York University; 2014.
  14. Baumgartner TA, Jackson AS, Mahar MT, Rowe DA. Reliability and Objectivity: Measurement For Evaluation in Kinesiology, 9th ed. Burlington, MA: Jones & Bartlett Learning; 2016:91–113.
  15. Ohl AM, Crook E, MacSaveny D, McLaughlin A. Test-Retest Reliability of the Child Occupational Self-Assessment (COSA). Am J Occup Ther. 2015;69:6902350010p6902350011–6902350014. doi:10.5014/ajot.2015.014290 [CrossRef]
  16. Nelson LD, LaRoche AA, Pfaller AY, et al. Prospective, head-to-head study of three computerized neurocognitive assessment tools (CNTs): reliability and validity for the assessment of sport-related concussion. J Int Neuropsychol Soc. 2016;22:24–37. doi:10.1017/S1355617715001101 [CrossRef]
  17. Morrow JR Jr, Mood D, Disch J, Kang M. Measurement and Evaluation in Human Performance, 5th ed. Champaign, IL: Human Kinetics; 2015.
  18. McCrea M, Guskiewicz KM, Marshall SW, et al. Acute effects and recovery time following concussion in collegiate football players: The NCAA Concussion Study. JAMA. 2003;290:2556–2563. doi:10.1001/jama.290.19.2556 [CrossRef]

BrainFx Performance Categories and Cognitive Outcomes

Performance Category: Sensory and Physical Skill
Cognitive Outcomes: Visual Skills; Fine Motor Coordination
Associated Tasks: Neglect; Visual Acuity; Color Blindness; Visual Perception Skills; Fine Motor Coordination

Performance Category: Social and Behavioral Skill
Cognitive Outcomes: Impulsivity; Emotion Recognition
Associated Tasks: Modified Stroop Test; Emotion Recognition

Performance Category: Foundational Cognitive Skill
Cognitive Outcomes: Memory–Immediate for Auditory; Memory–Immediate for Visual; Memory–Immediate for Complex, Visual, Novel; Temporal Awareness
Associated Tasks: Recall Items; Players on the Field; Listening to a Lecture; Time Elapsed; Managing Time; Time Awareness

Performance Category: Intermediate Cognitive Skill
Cognitive Outcomes: Attention–Selective to Visual Distraction; Attention–Selective to Auditory and Written; Memory–Delay Auditory and Written; Memory–Delay Written and Cued; Working Memory; Problem Solving; Constructive Ability; Route Finding; Sequencing
Associated Tasks: Matching; Math Questions with Audio Distraction; Items in a Category with Distraction; Commonality with Audio Distraction; Recall with a Delay; Recall Appointment; People on the Train; Answering Math Questions; Making Change; Puzzle; Route Finding; Sequencing

Performance Category: Complex Cognitive Skill
Cognitive Outcomes: Attention Divided; Memory–Delay for Faces and Names; Memory–Prospective Auditory Two Steps; Mental Flexibility; Abstract Reasoning; Judgment for Safety; Foresight for Safety; Comprehension and Humor Inferences with Distraction; Executive Functioning and Combined Skills
Associated Tasks: A Watched Pot; Recalling Faces; Recall a Task for Later; Thinking of Items in a Category; Thinking of a Category from a List of Items; Categories; Picture Interpretation; Recognizing Safety Hazards; The Conversation; Prioritizing; Online Banking; Fundraising Activity

Tasks Included Within the BrainFx 360 Performance Assessment

How Was I Feeling? – Patient is asked to respond to questions related to fatigue, sleep, mood, and pain.
Neglect – Patient is able to touch all objects presented on the tablet screen.
Visual Acuity – Patient is asked to read sentences in a variety of font sizes.
Color Blindness – Patient is able to identify the number in the color palette presented.
Visual Perception Skills – Patient is given a series of tasks: (1) Scanning – a two-digit number is highlighted and must be matched on screen among the same digits presented in a different order; (2) Visual Closure – a dotted drawing is shown and the patient chooses which of four solid drawings matches it; (3) Spatial – a shape is cut out of a solid rectangle and the patient chooses which of four shapes was the one cut out; (4) a word is presented and the patient selects which of four options the word fits inside; (5) Rotation – a drawing is presented and must be redrawn at a specified rotation; (6) Mirror Image – a drawing is presented and must be completed as a mirror image; (7) a drawing is presented and must be copied; (8) Draw a Clock – the patient is instructed to draw the face of a clock set to a specific time.
Fine Motor Coordination – Patient is asked to slide 5 items with the left hand only, slide 5 items with the right hand only, slide 10 items while alternating between left and right hands, and pinch and slide 10 items while alternating between left and right hands.
Modified Stroop Test – Patient is asked to ignore what the word reads and instead touch the color in which the word is written as quickly as possible. If instructions are understood, a low score may indicate impulsive responses.
Emotion Recognition – Patient is asked to identify and recognize different emotions in pictures of people.
Recall Items – Patient is asked to learn four items in context immediately and told to remember them for later (through free recall).
Players on the Field – Patient is asked to learn and recall the locations of specific players on a playing field.
Listening to a Lecture – Patient is asked to recall details from a 1-minute educational lecture.
Time Elapsed – Patient is asked to estimate the amount of time that has passed since starting the assessment.
Managing Time – Patient is asked to manage time to meet 20 people in 2 minutes.
Time Awareness – Patient is able to correctly determine what part of the day it is at the time of assessment.
Matching – Patient is asked to review and identify the differences between two pictures.
Math Questions With Audio Distraction – Patient is asked to complete a series of math questions while listening to a crowd cheering or background noise.
Items in a Category With Audio Distraction – Patient is asked to think of items in a category while listening to a crowd cheering or background noise.
Commonality With Audio Distraction – Patient is asked to think of a category from a list of items while listening to a crowd cheering or background noise.
Recall With a Delay – Patient is asked to recall the four items that were previously heard and written.
Recall Appointment – Patient is asked to list appointment details from a written note learned earlier in the assessment.
People on a Train – Patient is asked to remember how many people remain on the train as people walk on and off.
Answering Math Questions – Patient is asked to complete a series of math questions.
Making Change – Patient is asked to make change for various purchases at a cash register.
Puzzle – Patient is asked to view and reconstruct images after they are broken into pieces.
Route Finding – Patient is provided a street map and asked to trace the most direct route between several locations.
Sequencing – Patient is asked to put pictures or written steps into proper order.
A Watched Pot – Patient is asked to match objects in a kitchen scene while watching to move the pot to the sink before it boils over.
Recalling Faces – Patient is asked to recall faces and names learned previously in the assessment.
Recall a Task for Later – Patient is asked to remember to complete two actions when cued by an auditory reminder.
Thinking of Items in a Category – Patient is asked to list items within the specified category (eg, name items commonly found in a refrigerator or name types of balls that you would be surprised to see a golfer tee off with).
Thinking of a Category From a List of Items – Patient is asked to identify commonalities among a list of items by providing a category (eg, things you wash or things that fly).
CategoriesPatient is asked to sort items into household categories.
Picture InterpretationPatient is provided a picture and asked to interpret the picture out loud.
Recognizing Safety HazardsPatient is asked to identify safety hazards in a collection of pictures, understand what to do in response to a hazard that is presented, and understand the consequences of a hazard that is presented.
The ConversationPatient listens to 2 minutes of conversation at a loud party and then answers multiple-choice questions about details or inferences from the conversation.
PrioritizingTasks are presented in differing levels of priority and need to be placed into order of priority.
Online BankingPatient is asked to pay bills and transfer money according to certain rules.
Fundraising ActivityPatient is asked to schedule tasks according to certain rules.

Descriptive Statistics

BrainFx Outcomes | Trial 1 M (SD) | Min | Max | Trial 2 M (SD) | Min | Max
Overall Performance Score | 69.1 (5.7) | 54 | 77 | 72.1 (6.8) | 54 | 80
Sensory and Physical Skill Performance | 46.0 (3.7) | 40 | 52 | 47.5 (3.4) | 40 | 52
  Visual Skills | 73.5 (9.4) | 31 | 58 | 76.3 (8.2) | 62 | 89
  Fine Motor Coordination | 78.3 (9.0) | 61 | 92 | 80.7 (8.6) | 61 | 94
Social and Behavioral Skill Performance | 81.5 (18.4) | 36 | 98 | 81.3 (18.0) | 36 | 99
  Impulsivity | 72.9 (25.5) | 30 | 97 | 72.5 (22.7) | 34 | 99
  Emotion Recognition | 90.3 (25.7) | 27 | 100 | 90.3 (25.7) | 27 | 100
Foundational Cognitive Skill Performance | 76.1 (8.3) | 62 | 87 | 80.3 (9.7) | 62 | 91
  Memory–Immediate for Auditory | 95.9 (11.1) | 63 | 100 | 96.7 (8.8) | 75 | 100
  Memory–Immediate for Visual | 83.4 (8.7) | 66 | 94 | 84.3 (9.5) | 66 | 100
  Memory–Immediate for Complex, Visual, Novel | 60.3 (19.5) | 28 | 96 | 63.3 (22.8) | 28 | 96
  Temporal Awareness | 64.4 (18.6) | 30 | 96 | 77.0 (17.7) | 39 | 97
Intermediate Cognitive Skill Performance | 72.5 (8.0) | 57 | 85 | 75.6 (8.9) | 57 | 90
  Attention–Selective to Visual Distraction | 66.1 (15.7) | 35 | 91 | 70.0 (13.7) | 49 | 89
  Attention–Selective to Audio Distraction | 50.9 (8.2) | 36 | 69 | 51.0 (7.7) | 33 | 61
  Memory–Delay Auditory and Written | 77.7 (25.0) | 24 | 100 | 83.6 (17.7) | 47 | 100
  Memory–Delay Written and Cued | 79.2 (22.1) | 25 | 100 | 85.3 (19.4) | 49 | 100
  Working Memory | 73.9 (28.0) | 35 | 100 | 76.0 (26.0) | 35 | 100
  Problem Solving | 56.5 (16.1) | 25 | 76 | 60.3 (23.1) | 5 | 96
  Constructive Ability | 88.3 (21.6) | 30 | 100 | 83.9 (21.6) | 36 | 100
  Route Finding | 97.4 (5.4) | 87 | 100 | 96.7 (7.1) | 77 | 100
  Sequencing | 63.1 (13.6) | 42 | 87 | 71.5 (15.8) | 42 | 95
Complex Cognitive Skill Performance | 69.0 (5.9) | 61 | 80 | 76.6 (5.6) | 68 | 84
  Attention Divided | 68.7 (22.5) | 33 | 98 | 87.6 (16.0) | 48 | 99
  Memory–Delayed for Face and Names | 67.3 (12.2) | 47 | 85 | 76.6 (11.1) | 55 | 92
  Memory–Prospective Auditory 2 Steps | 73.9 (7.2) | 72 | 100 | 86.3 (13.9) | 72 | 100
  Mental Flexibility | 63.3 (25.1) | 16 | 98 | 74.4 (20.4) | 25 | 96
  Abstract Reasoning | 64.2 (9.3) | 47 | 78 | 67.1 (8.3) | 55 | 84
  Judgment for Safety | 71.5 (11.9) | 52 | 90 | 78.7 (12.7) | 41 | 94
  Foresight for Safety | 68.1 (22.8) | 20 | 99 | 75.3 (23.5) | 20 | 100
  Comprehension and Humor Inferences With Distraction | 71.1 (12.9) | 45 | 89 | 71.1 (15.4) | 43 | 98
  Executive Functioning and Combined Skills | 72.3 (5.0) | 65 | 80 | 73.1 (5.6) | 63 | 80

ICCs for Study Population^a

BrainFx Outcomes | ICC | 95% CI Lower | 95% CI Upper
Overall Performance Score | .85 | .56 | .95
Sensory and Physical Skill Performance | .84 | .53 | .95
  Visual Skills | .84 | .54 | .95
  Fine Motor Coordination | .81 | .44 | .94
Social and Behavioral Skill Performance | .82 | .48 | .94
  Impulsivity | .97 | .90 | .99
  Emotion Recognition | .62 | .00^b | .88
Foundational Cognitive Skill Performance | .79 | .40 | .93
  Memory–Immediate for Auditory | .69 | .12 | .90
  Memory–Immediate for Visual | .88 | .66 | .96
  Memory–Immediate for Complex, Visual, Novel | .84 | .54 | .95
  Temporal Awareness | .63 | .00^b | .88
Intermediate Cognitive Skill Performance | .86 | .58 | .95
  Attention–Selective to Visual Distraction | .84 | .54 | .95
  Attention–Selective to Audio Distraction | .00^b | .00^b | .64
  Memory–Delay Auditory and Written | .50 | .00^b | .83
  Memory–Delay Written and Cued | .65 | .00^b | .88
  Working Memory | .86 | .59 | .95
  Problem Solving | .78 | .67 | .93
  Constructive Ability | .47 | .00^b | .82
  Route Finding | .19 | .00^b | .73
  Sequencing | .75 | .26 | .91
Complex Cognitive Skill Performance | .00^b | .00^b | .48
  Attention Divided | .03 | .00^b | .67
  Memory–Delayed for Face and Names | .37 | .00^b | .79
  Memory–Prospective Auditory 2 Steps | .00^b | .00^b | .63
  Mental Flexibility | .42 | .00^b | .80
  Abstract Reasoning | .31 | .00^b | .76
  Judgment for Safety | .00^b | .00^b | .35
  Foresight for Safety | .90 | .72 | .97
  Comprehension and Humor Inferences With Distraction | .10 | .00^b | .70
  Executive Functioning and Combined Skills | .54 | .00^b | .84
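As a point of reference for the table above, the reliability statistic reported here is the two-way random effects, average measures ICC, commonly written ICC(2,k). The following Python sketch computes ICC(2,k) from a subjects-by-trials score matrix using the standard two-way ANOVA decomposition (Shrout and Fleiss); the data values in the example are hypothetical and are not scores from this study.

```python
def icc_2k(scores):
    """Two-way random effects, average measures ICC -- ICC(2,k).

    scores: one row per subject, one score per trial,
    e.g. [[trial1, trial2], ...].
    """
    n, k = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]

    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between trials
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_err = ss_total - ss_rows - ss_cols                    # residual

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    # ICC(2,k) = (MSR - MSE) / (MSR + (MSC - MSE) / n)
    return (ms_rows - ms_err) / (ms_rows + (ms_cols - ms_err) / n)


# Hypothetical two-trial scores for three subjects (not study data):
# a uniform +1 practice effect between trials pulls ICC(2,k) below 1.
print(icc_2k([[1, 2], [2, 3], [3, 4]]))  # 0.8
```

Because the between-trials mean square enters the denominator, a systematic shift between test and retest (for example, a practice effect) lowers ICC(2,k) even when subjects are rank-ordered identically on both trials.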
Authors

From Oxford High School and PT for Life, a PTSMC affiliate, Southbury, Connecticut (CS); the Department of Health and Human Performance, Texas State University, San Marcos, Texas (JLF); the Department of Health & Human Performance, Middle Tennessee State University, Murfreesboro, Tennessee (CJ, BR); and Health, Exercise Science, and Recreation Management, The University of Mississippi, University, Mississippi (MK).

†Deceased.

The authors have no financial or proprietary interest in the materials presented herein.

Correspondence: James L. Farnsworth, II, PhD, ATC, 601 University Drive, Department of Health and Human Performance, Texas State University, San Marcos, TX 78666-4616. E-mail: farnsworth@txstate.edu

Received: January 31, 2018
Accepted: August 08, 2018
Posted Online: November 27, 2018

10.3928/01913913-20181005-01
