Annals of International Occupational Therapy

Original Research Supplemental Data

Survey-Based Investigation of Practice Implications for Vision Assessment: An Initial Look at Eye-Hand Coordination Testing

Kimberly Hreha, EdD, OTR/L; John Ross Rizzo, MD, MSCI; Andrew Abdou, MD; Lyna Truong, BA; Jeffrey Wertheimer, PhD; Min Jeong Park Graf, MD; Jennifer Kaldenberg, DrPH, MSA, OTR/L, SCLV, FAOTA; Imelda Llanos, MS, OTR/L; Pamela Roberts, PhD, OTR/L, SCFES, FAOTA, CPHQ, FNAP

Abstract

Background:

Eye-hand coordination (EHC) is defined as visually guided movements of the eye and hand toward shared spatial targets. After an injury, it is necessary to assess for impairments in vision, including EHC. Occupational therapists (OTs) should routinely use standardized assessments that have good psychometric properties. This article reports findings from a survey of knowledge of EHC assessments.

Methods:

This study used an online, multiple-choice survey covering 19 EHC assessments.

Results:

Of the 54 respondents, 98% were women and 100% were OTs. More than half of the respondents reported treating visual impairment at least 50% of the time. Differences in frequency of use, reliability, and cost and burden were found between respondents who had practiced for less than 6 years and those who had practiced for 6 years or longer (p = .020; p = .029; p = .041, respectively). Differences in ratings were also found between respondents who had some form of post-degree training and those who had no post-degree training (p = .041).

Conclusion:

This study indicates the need for additional foundational work to improve knowledge among OTs of EHC assessments and the psychometric properties of these assessment tools. Limitations include the use of survey-based methods, which may limit the generalizability of the findings. In addition, participants were limited to a convenience sample of U.S. OTs. Recommendations include further research to test the psychometric properties of these assessments and to determine the clinical applicability of the findings. [Annals of International Occupational Therapy. 2020; 3(1):29–37.]

The fundamentals of eye-hand coordination (EHC) are based on temporally and spatially synchronized movements of the eye and hand toward shared spatial targets (Rizzo et al., 2017). Activities that require EHC include reaching, grasping, and object manipulation (e.g., pouring milk into a bowl of cereal). The complex relationship between the visual system and the manual motor system rests on the ability to visually decipher environmental details and integrate motor responses of the eye and hand to produce controlled, rapid, and accurate movements (Gao, Ng, Kwok, Chow, & Tsang, 2010; Rizzo et al., 2017). Encoding these visual details and directing goal-oriented hand movements is central to EHC and quintessential to functional independence (Bard, Fleury, & Hay, 1990; Ma-Wyatt & Renninger, 2011; Smeets, Hayhoe, & Ballard, 1996). Although ocular motor programming controls gaze, the foveated target and subsequent vision provide the primary sensory input, supporting the refinement of hand movement. Post-foveation fixations target key spatial positions and are contingent on the functional requirements of the task at hand, such as placement of the index finger for the dexterous manipulation of objects (Johansson, Westling, Bäckström, & Flanagan, 2001).

Deficits in EHC may occur after acquired brain injuries (Gao et al., 2010), including traumatic events (Brown, Dalecki, Hughes, Macpherson, & Sergio, 2015), or as a result of developmental disability (Caeyenberghs et al., 2009). Reports of EHC deficits found significant correlations with sensorimotor impairment (Gao et al., 2010), a lack of isolated movement (Zackowski, Dromerick, Sahrmann, Thach, & Bastian, 2004), muscle weakness (McCrea, Eng, & Hodgson, 2005), and abnormal synergistic arm patterns (Beer, Dewald, & Rymer, 2000). Regardless of the pathologic condition and related associations, EHC deficits can impair performance and profoundly affect activities of daily living (Gao et al., 2010; Land, Mennie, & Rusted, 1999).

Current EHC assessments have not undergone rigorous psychometric testing, and the testing that has been performed has been limited and often isolated to solitary tests with unidimensional constructs (i.e., test batteries or multidimensional testing algorithms have not been developed) (Table A, available in the online version of the article). For these reasons, clinical EHC assessments are often used incorrectly (Morley, 2014). Assessment of vision, including EHC deficits, is critical, particularly with comprehensive standardized measures administered at longitudinal time points along the recovery path. Such assessment would allow quantitative measurement of the prevalence of deficits and provide accurate documentation of client status and the effectiveness of treatment.

Table A. Eye-hand coordination tests: psychometric properties and corresponding references

The recent development of a conceptual model for vision rehabilitation, which provides a process to systematically integrate assessment and collaboration between vision specialists and non-vision specialists, established the necessary framework to integrate assessment of both visual function and functional vision (Roberts et al., 2016). Objective testing to quantify these deficits is largely underrepresented in the clinical health literature, specifically in occupational therapy, a discipline known to be instrumental in assessing and understanding EHC in patient care (Scheiman, 2011). Even when EHC testing is completed, it is often not routine, standardized, or comprehensive. For example, one study concluded that fewer than 50% of survey respondents believed that standardized measures were used routinely in the clinic (Cole, Finch, Gowland, & Mayo, 1994). Further, occupational therapists (OTs) often do not favor standardized assessment (even if it is functional) because of varying knowledge, limited skills in assessment, and concern that adopting "standard" practices will preclude a holistic approach (Jaeger Pedersen & Kaae Kristensen, 2016). In addition, OTs have reported that the use of assessments is too time-consuming (Piernik-Yoder & Beck, 2012). These findings are not specific to the United States or to OTs; similar trends in the inconsistent use of clinical information, including assessments and documentation, have been reported across multiple countries and specialty disciplines (Cole et al., 1994; Jaeger Pedersen & Kaae Kristensen, 2016; Morley, 2014; van Polanen & Davare, 2015).

Thus, a better understanding of current practice for vision rehabilitation is needed, specifically, knowledge of EHC assessments, the related psychometric properties, and the frequency of assessment. This article reports the findings of a recent survey of the content validity of EHC assessments that were deemed to have initial face validity, according to a consensus from a vision task force at a national meeting. Additional objectives included elucidating approaches commonly used by OTs to evaluate EHC and identifying common barriers to assessment. This report describes the perspectives of OTs, specifically those who reported that they were familiar with vision assessments and used them in practice. Given current evidence, we hypothesized that there would be a lack of standardized and routine use of vision assessment and a lack of understanding of EHC assessments because OTs were unfamiliar with the available tests (education), programmatic barriers were in place (lack of departmental support to mandate assessments), and there were few consistent systematic reviews and meta-analyses comparatively analyzing the respective tests (translating common data elements).

Methods

Procedures and Participants

The institutional review board at Cedars-Sinai approved this work. Because the study used only de-identified information, it was not considered regulated research, and consent forms were not required. An online platform was used to develop the survey, and the survey was available to respondents for 14 days. Participants were recruited through a link published on the American Congress of Rehabilitation Medicine's website, and the survey was disseminated through a national e-mail distribution to members of the American Congress of Rehabilitation Medicine and other lists, including multiple state discipline-specific member listservs (e.g., occupational therapy). Additionally, respondents were instructed that they could forward the e-mail invitation to anyone they thought would be appropriate to participate. The body of the e-mail explained that the goal of the survey was to better understand "the practice and professional reasoning for using specific vision assessments regarding EHC assessment." Respondents were informed that participation was voluntary and that compensation was not provided. The survey could be completed via a computer, tablet, or smartphone (Figure A, available in the online version of the article).

Figure A. The ACRM vision taskforce would like to thank you for completing this survey. It will take approximately 5 minutes to complete.

The survey was estimated to take no longer than 5 minutes to complete. The survey contained demographic questions in a multiple-choice format. Questions included gender, professional degree, specialization, current practice setting, years practicing since graduation, post-degree training, state of practice (geographic), and information on formal vision education. An "other" option was provided so that respondents could supply answers not listed. After the demographic questions, the survey asked respondents: "Do you use structured vision assessments to assess EHC impairment?" If the participant answered "no," the survey was terminated. If the participant answered "yes," the survey asked questions about vision assessments. As the survey progressed, participants were given the option to skip questions on assessments with which they were not familiar and to move on to the next assessment.

The survey included questions on 19 assessments. The selections were made by consensus, based on what the vision task force members deemed to adequately assess "eye" and "hand" function or "coordination." In other words, the survey assessments were chosen based on face validity, a subjective form of validation defined as the degree to which an assessment appears to measure what it is intended to measure (Mosier, 1947). Even though face validity has limitations, it is still used to inform clinical practice (Royal, 2016). Thus, the survey assessments were selected based on the topical knowledge of the multidisciplinary team (experts in the field of rehabilitation therapy, newer members of the task force, and new professionals). This technique allowed the survey to assess content validity, or the degree to which the items, taken as a whole, captured the entire construct being measured (Polit & Beck, 2006).
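Although this study established content validity through expert consensus and survey ratings rather than a formal index, content validity is commonly quantified with the content validity index (CVI) described by Polit and Beck (2006). The minimal Python sketch below illustrates that calculation for hypothetical expert relevance ratings; the rating values, the 1-to-4 relevance scale, and the five-expert panel are illustrative assumptions, not data from this study.

```python
# Illustrative content validity index (CVI) calculation in the style of Polit & Beck (2006).
# Hypothetical data: five experts rate each survey item's relevance on a 1-4 scale.
ratings = {
    "item_1": [4, 4, 3, 4, 3],
    "item_2": [4, 3, 4, 2, 4],
    "item_3": [2, 3, 2, 4, 3],
}

def item_cvi(item_ratings):
    """Item-level CVI: proportion of experts rating the item 3 or 4 (relevant)."""
    relevant = sum(1 for rating in item_ratings if rating >= 3)
    return relevant / len(item_ratings)

i_cvis = {item: item_cvi(r) for item, r in ratings.items()}
scale_cvi_average = sum(i_cvis.values()) / len(i_cvis)  # S-CVI/Ave: mean of item-level CVIs

for item, cvi in i_cvis.items():
    print(f"{item}: I-CVI = {cvi:.2f}")
print(f"S-CVI/Ave = {scale_cvi_average:.2f}")
```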

For each of the 19 assessments, questions were designed to probe the effectiveness of the tool's psychometric properties (reliability, validity), usability, cost and burden, and use as a repeated measure over time. The questions used a 5-point Likert scale. For example, the question on current use asked: "Are you currently using the Rapid Alternating Movements of the Hands?" (1 = never, 2 = very rarely, 3 = rarely, 4 = occasionally, 5 = frequently). Respondents who used a test were asked to continue to answer all questions about that test, based on empirical knowledge and best judgment. Finally, the survey allowed participants to comment on the psychometric properties of each tool in a free-text section.
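As an illustration of how responses on this 5-point scale can be tabulated, the short Python sketch below encodes hypothetical responses for a single assessment and summarizes them in the style of Table 2. The response values and the convention that any rating above "never" counts as current use are assumptions for illustration, not the study's actual coding rules.

```python
# Hypothetical tabulation of 5-point Likert usage responses for a single assessment.
LIKERT_LABELS = {1: "never", 2: "very rarely", 3: "rarely", 4: "occasionally", 5: "frequently"}

# Assumed responses from 10 hypothetical respondents (not study data).
responses = [1, 4, 5, 2, 1, 3, 5, 4, 1, 2]

mean_rating = sum(responses) / len(responses)

# Assumed convention for illustration: any rating above "never" counts as current use.
users = sum(1 for r in responses if r >= 2)
percent_use = 100 * users / len(responses)

print(f"Mean rating: {mean_rating:.2f}")
print(f"Reported current use: {users} of {len(responses)} ({percent_use:.1f}%)")
```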

Data Analysis

Data used for this analysis included information obtained from respondents who had a degree in occupational therapy and their responses regarding the six EHC assessments (Rapid Alternating Movements of the Hands, Box-and-Block Test, Finger-to-Nose Test, Finger-Tapping Test, 9-Hole Peg Test, and Test of Visual-Motor Skills) that had the highest response rates, which we interpreted to indicate familiarity with the assessments. Descriptive statistics summarized the demographic features (gender, specialization, percentage of time spent treating patients with visual impairment, post-degree training, formal vision education, years of practice, geographic location, and setting) and the use of specific EHC assessments. The Mann-Whitney U test was used to determine statistical differences between demographic groups in Likert scale responses on the attitudes of OTs toward the frequency of use, reliability, usability, and cost and burden of the six EHC assessments. Kruskal-Wallis one-way analysis of variance was used to compare practice settings on Likert scale score responses. Statistical significance was set at a two-sided p < .05, and the Bonferroni correction was used where relevant to avoid overreporting significance when multiple tests were conducted. Statistical analysis was performed with SPSS, version 24.
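The article reports that the analysis was run in SPSS; as a minimal sketch of the same class of comparisons, the Python example below applies the Mann-Whitney U test, the Kruskal-Wallis test, and a Bonferroni-adjusted threshold to hypothetical Likert ratings. The rating values, group sizes, the number of comparisons (k = 6), and the use of SciPy rather than SPSS are all illustrative assumptions, not the study's data or code.

```python
from scipy.stats import kruskal, mannwhitneyu

# Hypothetical 5-point Likert ratings for one assessment, grouped by years of practice.
less_than_6_years = [5, 4, 4, 5, 3, 4, 5]
six_or_more_years = [3, 3, 4, 2, 3, 4, 3, 2]

# Mann-Whitney U test comparing two independent groups (two-sided).
u_statistic, p_two_groups = mannwhitneyu(
    less_than_6_years, six_or_more_years, alternative="two-sided"
)

# Kruskal-Wallis one-way analysis of variance on ranks across three practice settings.
institution = [4, 5, 3, 4]
community = [3, 4, 4, 2, 3]
both_settings = [5, 4, 4]
h_statistic, p_three_groups = kruskal(institution, community, both_settings)

# Bonferroni correction: with k comparisons, each test is evaluated at alpha / k.
alpha = 0.05
k = 6  # for example, six assessments compared for one demographic variable
adjusted_alpha = alpha / k

print(f"Mann-Whitney U = {u_statistic:.1f}, p = {p_two_groups:.3f}")
print(f"Kruskal-Wallis H = {h_statistic:.2f}, p = {p_three_groups:.3f}")
print(f"Significant at the Bonferroni-adjusted alpha of {adjusted_alpha:.4f}? "
      f"{p_two_groups < adjusted_alpha}")
```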

Results

Demographic Features

A total of 54 surveys were collected. Demographic data of the respondents are shown in Table 1. Fewer than half of the respondents (42.6%) had been in practice for less than 6 years since graduation, 98.1% were women, and 51.9% were currently working on the U.S. East Coast. More than half of the respondents reported that they treat people with visual impairments at least 50% of the time, and 64.8% of therapists reported having some form of post-degree training in vision. In addition, 64.8% of the OTs specialized in neurological rehabilitation. Half of the survey respondents practiced in community settings, 27.8% practiced in an institutional setting, and 22.2% practiced in both institutional and community settings.

Table 1. Demographic Features of Occupational Therapy Survey Respondents

Eye-Hand Coordination Assessments

Frequencies for current usage of the 19 EHC assessments are shown in Table 2. The Finger-to-Nose Test was the most commonly used EHC assessment (75.9%), followed by the 9-Hole Peg Test (72.2%). The Purdue Pegboard Test, Motor Accuracy Test, Minnesota Handwriting Assessment, neurobehavioral evaluation, and Frostig Figure Ground assessment had the lowest frequencies of usage, with only 0% to 1.9% of respondents reporting current use.

Table 2. Use of Each Assessment

Comparing Findings of Eye-Hand Coordination Assessments With Demographic Features

Comparison analyses are shown in Tables B–F (available in the online version of the article).

Table B. Comparison of frequency of use of eye/hand coordination impairment assessments for occupational therapist respondents

Table C. Comparison of reliability ratings of eye/hand coordination impairment assessments for occupational therapist respondents

Table D. Comparison of validity ratings of eye/hand coordination impairment assessments for occupational therapist respondents

Table E. Comparison of user friendliness ratings of eye/hand coordination impairment assessments for occupational therapist respondents

Table F. Comparison of low cost and burden ratings of eye/hand coordination impairment assessments for occupational therapist respondents

Respondents who had a specialization in neurological rehabilitation reported more frequent use of the 9-Hole Peg Test compared with respondents who had no specializations and those who had specialization in other areas (p < .01). Those who had a specialization in neurological rehabilitation also rated the 9-Hole Peg Test as more reliable (p = .039) and more user friendly (p = .039) compared with those who had other specializations.

Ratings for frequency of use, reliability, and cost and burden showed statistically significant differences between respondents who had fewer than 6 years of practice and respondents who had 6 years or more of practice. Further, OTs who had practiced for fewer than 6 years reported using the Finger-to-Nose Test more frequently than those who had 6 years or more of practice (p = .020). Those who had fewer than 6 years of practice also rated the Finger-to-Nose Test as more reliable compared with the more experienced OTs (p = .029). Respondents who had 6 years or more of practice were significantly more likely to rate the Rapid Alternating Movements of the Hands as low in cost and burden compared with those who had fewer than 6 years of practice (p = .041).

Statistically significant differences were also found between respondents who had some form of post-degree training and those with no post-degree training. Respondents with no post-degree training rated the Box-and-Block Test as having higher validity compared with those who had any post-degree training (p = .041). Respondents who had no post-degree training were significantly more likely to rate the 9-Hole Peg Test as low in cost and burden compared with those who had any type of post-degree training (p = .036).

Compared with respondents who practiced on the East Coast, those from other states reported using the Box-and-Block Test significantly more frequently (p < .01). Respondents from the East Coast rated the 9-Hole Peg Test as having significantly higher validity compared with those from other states (p = .039). No differences in ratings of effectiveness were noted between groups who treated patients with visual impairment more than 50% of the time versus those who treated patients with visual impairment less than 50% of the time or between OTs who worked in different practice settings.

Discussion

To our knowledge, this study represents a novel area of research for vision rehabilitation. It is a preliminary attempt to understand OT behavior and attitudes toward the frequency of use, reliability, validity, user friendliness, and cost and burden of EHC assessments. However, inferential conclusions could be drawn for only 6 of the 19 assessments because respondents were not sufficiently familiar with the remaining assessments. In other words, participants were familiar enough to complete the questions for fewer than one third of the assessments. These assessments were: (a) Rapid Alternating Movements of the Hands, (b) the Box-and-Block Test, (c) the Finger-to-Nose Test, (d) the Finger-Tapping Test, (e) the 9-Hole Peg Test, and (f) the Test of Visual-Motor Skills. No respondents provided insights about assessments that they personally used but that were not included in the survey. Unfortunately, the study's hypotheses were supported.

Education

More than half of survey respondents claimed to have a specialization in neurological rehabilitation and reported treating people with visual impairments for at least half of their clinical work hours. Many (64.8%) reported having post-degree training in vision. However, the limited number of survey responses and the unremarkable results of this survey raise questions about the participants' comfort level with targeted assessments for vision deficits, as related to the intersection between vision or eye control and hand control or coordination, despite post-degree training.

Many survey respondents had fewer than 6 years of experience, and this group reported using the Finger-to-Nose Test to assess EHC impairment. Clinically, this assessment is used to test EHC. However, current research distinguishes the Finger-to-Nose Test from EHC because it is used to assess ataxic dysmetria (Rodrigues, Slimovitch, Chilingaryan, & Levin, 2017). Moreover, although the Finger-to-Nose Test is mediated by visual processing (i.e., seeing the target finger in front of the patient), this test was designed to assess voluntary motor functioning, particularly movement trajectory (disturbances in rate, range, or force of movement) and its correlation with cerebellar dysfunction. This concept also applies to the Rapid Alternating Movements of the Hands. Respondents with 6 or more years of experience gave this test a higher rating because they considered it low in cost and burden, and they also reported using it, which is concerning to the authors because this test is more often used to assess motor function related to temporal rhythmicity rather than EHC. We expected to see these tests receive low scores for assessing EHC, but we found the opposite. Additionally, of the six tests, three (Rapid Alternating Movements of the Hands, the Finger-to-Nose Test, and the Finger-Tapping Test) were considered by the authors to be assessments of cerebellar motor coordination. This finding suggests a lack of experience with, or knowledge of, essential aspects of vision assessment.

Programmatic Barriers

The use of mandated vision assessments may be affected by lack of departmental support as well as programmatic barriers. We found geographic differences in assessment use, suggesting regional, location-based trends among therapists. Respondents from states outside the East Coast reported using the Box-and-Block Test more frequently than those on the East Coast. The East Coast has many large cities with sizable hospital systems, and departmental culture, including leadership, could influence which assessments are available. For example, we would argue that the Box-and-Block Test can be thought of as an assessment of manual dexterity as opposed to EHC. However, when asked about its relevance to EHC, survey respondents with no post-degree training rated the validity of this test as strong. It is unclear how this information was determined or why this assessment is being used frequently, but we speculate that the assessment may be available in therapy gyms and used for visual assessment even though it is described in the clinical literature as a dexterity tool. These findings suggest the need for more uniformity in best practice approaches to clinical care assessment and in-service education that is concordant with these approaches. As increased attention is paid to heterogeneity in routine evaluation and interinstitutional disparities in best practice assessments, task forces should dedicate efforts and resources to meta-analyses and systematic reviews that yield objective results, translating findings into consensus statements and optimized clinical guidelines.

Translating Assessments Into Common Data Elements

Some assessments, such as the Peabody Developmental Motor Scales, directly assess EHC as a subskill area, and the literature supports their psychometric properties. For example, strong interrater agreement (intraclass correlation coefficient = 0.97) has been found for the EHC items of the Peabody Developmental Motor Scales, leading to the conclusion that the assessment should be recommended for OTs who treat children with disabilities (Gebhard, Ottenbacher, & Lane, 1994). However, in our study, only one respondent commented on the Peabody Developmental Motor Scales, which suggests a possible lack of understanding of the research on psychometric properties and how it relates to practice. It is also possible that respondents who were not practicing in a pediatric setting were unaware of this assessment.

Other assessments included in this study (e.g., the Frostig Figure Ground test) integrate EHC skills; these tests include additional or related assessment subcategories or applications, such as evaluation of adults with low vision (Quillman, Mehr, & Goodrich, 1981). Respondents did not mention this information, again potentially because of a lack of understanding of the validity of the tests in these clinical contexts. More simply, there may be an educational gap in test deployment. The broader concern is that an inability to translate these assessments into common data elements in either partial or complete administration (i.e., administering a tailored combination of tests or subtests for a given need or a particular study objective) may limit the benefit of potential clinical research.

Limitations

First, the study used a convenience sample of U.S. OTs, and many respondents were from the East Coast and clustered within the same hospitals. Also, the study included more women than men. Another limitation was that the survey design did not adequately match the analytic plan; this limitation was identified after the survey was administered. For example, permitting multiple answers to questions led to challenges in data interpretation, and we had to dichotomize variables instead. The small sample size produced a limited number of responses that were eligible for analysis. Thus, the survey-based method and the response pattern may limit generalizability. However, despite these limitations, the results are consistent with previous research about the difficulty of interpreting psychometric properties and the use of specific assessments (Coster, 2008).

Conclusion

The study results strongly suggest the need for foundational work to improve OT knowledge of EHC instruments and their related psychometric properties. Further action is needed to provide specific education on vision rehabilitation, particularly in the neurorehabilitation setting. This training should include background information on the assessments, including where to locate relevant literature, how to employ the assessments, and when to administer them appropriately. Also, clinicians need to know how to interpret findings from empirically based EHC assessments to make meaningful contributions toward each client's care plan. Infrastructural changes and systematic alterations to clinical documentation (medical records) may be required to distill the evidence on a common data element level and to ensure uniformity and consistency of care.

This study raises questions about the current EHC practice landscape among treating OTs in the United States. We conclude that OTs, even those who have post-degree training in vision and who currently work in neurological rehabilitation settings, do not have in-depth knowledge of EHC assessments. Given the state of knowledge assessed, most clinicians rely on only a few assessments, as shown by the more than 70% of survey respondents who reported using the same tests. Additional research is needed to determine whether standardized education on measurement tools for vision rehabilitation will advance and ensure best practices. These recommendations need not be specific to the United States.

The key finding of this study was that OTs require further training to adequately assess EHC. Knowledge is limited regarding which assessments to use, when to use them, and how to interpret psychometric data, and programmatic and facility barriers may affect which assessments are chosen. Further research is needed to develop gold standard visual and visuomotor assessments for neurological populations, especially for those with EHC impairment. Recommendations are needed for the use of assessments and for how measurement standards can be meaningfully and consistently integrated into practice to improve care.

References

  • Alt Murphy, M., Resteghini, C., Feys, P. & Lamers, I. (2015). An overview of systematic reviews on upper extremity outcome measures after stroke. BMC Neurology, 15, 29. doi:10.1186/s12883-015-0292-6 [CrossRef]25880033
  • Baker, K., Cano, S. J. & Playford, E. D. (2011). Outcome measurement in stroke: A scale selection strategy. Stroke, 42(6), 1787–1794. doi:10.1161/STROKEAHA.110.608505 [CrossRef]21566236
  • Bard, C., Fleury, M. & Hay, L. (Eds.). (1990). Development of eye-hand coordination across the life-span. Columbia, SC: University of South Carolina Press.
  • Beer, R. F., Dewald, J. P. & Rymer, W. Z. (2000). Deficits in the coordination of multijoint arm movements in patients with hemiparesis: Evidence for disturbed control of limb dynamics. Experimental Brain Research, 131(3), 305–319. doi:10.1007/s002219900275 [CrossRef]10789946
  • Boissy, P., Bourbonnais, D., Carlotti, M. M., Gravel, D. & Arsenault, B. A. (1999). Maximal grip force in chronic stroke subjects and its relationship to global upper extremity function. Clinical Rehabilitation, 13(4), 354–362. doi:10.1191/026921599676433080 [CrossRef]10460123
  • Brown, J. A., Dalecki, M., Hughes, C., Macpherson, A. K. & Sergio, L. E. (2015). Cognitive-motor integration deficits in young adult athletes following concussion. BMC Sports Science, Medicine and Rehabilitation, 7, 25. doi:10.1186/s13102-015-0019-4 [CrossRef]26491541
  • Brown, T. & Unsworth, C. (2009). Evaluating construct validity of the Slosson visual-motor performance test using the Rasch measurement model. Perceptual and Motor Skills, 108(2), 367–382. doi:10.2466/pms.108.2.367-382 [CrossRef]19552005
  • Buddenberg, L. A. & Davis, C. (2000). Test-retest reliability of the Purdue pegboard test. American Journal of Occupational Therapy, 54(5), 555–558. doi:10.5014/ajot.54.5.555 [CrossRef]11006818
  • Caeyenberghs, K., van Roon, D., van Aken, K., De Cock, P., Linden, C. V., Swinnen, S. P. & Smits-Engelsman, B. C. (2009). Static and dynamic visuomotor task performance in children with acquired brain injury: Predictive control deficits under increased temporal pressure. Journal of Head Trauma Rehabilitation, 24(5), 363–373. doi:10.1097/HTR.0b013e3181af0810 [CrossRef]19858970
  • Chen, H. M., Chen, C. C., Hsueh, I. P., Huang, S. L. & Hsieh, C. L. (2009). Test-retest reproducibility and smallest real difference of 5 hand function tests in patients with stroke. Neurorehabilitation and Neural Repair, 23(5), 435–440. doi:10.1177/1545968308331146 [CrossRef]19261767
  • Cole, B., Finch, E., Gowland, C. & Mayo, N. (1994). Physical rehabilitation outcome measures. Toronto, Canada: Canadian Physiotherapy Association.
  • Connell, L. A. & Tyson, S. F. (2012). Clinical reality of measuring upper-limb ability in neurologic conditions: A systematic review. Archives of Physical Medicine and Rehabilitation, 93(2), 221–228. doi:10.1016/j.apmr.2011.09.015 [CrossRef]22289230
  • Coster, W. J. (2008). Embracing ambiguity: Facing the challenge of measurement. American Journal of Occupational Therapy, 62(6), 743–752. doi:10.5014/ajot.62.6.743 [CrossRef]19024752
  • Croarkin, E., Danoff, J. & Barnes, C. (2004). Evidence-based rating of upper extremity motor function tests used for people following a stroke. Physical Therapy, 84(1), 62–74.14992677
  • Desrosiers, J., Bravo, G., Hébert, R., Dutil, E. & Mercier, L. (1994). Validation of the box and block test as a measure of dexterity of elderly people: Reliability, validity, and norms studies. Archives of Physical Medicine and Rehabilitation, 75(7), 751–755.8024419
  • Desrosiers, J., Hébert, R., Bravo, G. & Dutil, E. (1995). The Purdue pegboard test: Normative data for people aged 60 and over. Disability and Rehabilitation, 17(5), 217–224. doi:10.3109/09638289509166638 [CrossRef]7626768
  • Desrosiers, J., Rochette, A., Hébert, R. & Bravo, G. (1997). The Minnesota manual dexterity test: Reliability, validity and reference values studies with healthy elderly people. Canadian Journal of Occupational Therapy, 64(5), 270–276. doi:10.1177/000841749706400504 [CrossRef]
  • Folio, M. R. & Fewell, R. R. (2000). PDMS-2 Peabody developmental motor scales (2nd ed.). Austin, TX: Pro-Ed.
  • Gao, K. L., Ng, S. S. M., Kwok, J. W., Chow, R. T. & Tsang, W. W. (2010). Eye-hand coordination and its relationship with sensori-motor impairments in stroke survivors. Journal of Rehabilitation Medicine, 42(4), 368–373. doi:10.2340/16501977-0520 [CrossRef]20461340
  • Gebhard, A. R., Ottenbacher, K. J. & Lane, S. J. (1994). Interrater reliability of the Peabody developmental motor scales: Fine motor scale. American Journal of Occupational Therapy, 48(11), 976–981. doi:10.5014/ajot.48.11.976 [CrossRef]7530905
  • Graf, M. & Hinton, R. N. (1997). Correlations for the developmental visual-motor integration test and the Wechsler intelligence scale for children-III. Perceptual and Motor Skills, 84(2), 699–702. doi:10.2466/pms.1997.84.2.699 [CrossRef]9106866
  • Hines, M. & O'Connor, J. (1926). A measure of finger dexterity. Personnel Journal, 4, 379–382.
  • Jaeger Pedersen, T. & Kaae Kristensen, H. (2016). A critical discourse analysis of the attitudes of occupational therapists and physiotherapists towards the systematic use of standardised outcome measurement. Disability and Rehabilitation, 38(16), 1592–1602. doi:10.3109/09638288.2015.1107630 [CrossRef]
  • Johansson, R. S., Westling, G., Bäckström, A. & Flanagan, J. R. (2001). Eye-hand coordination in object manipulation. Journal of Neuroscience, 21(17), 6917–6932. doi:10.1523/JNEUROSCI.21-17-06917.2001 [CrossRef]11517279
  • Land, M., Mennie, N. & Rusted, J. (1999). The roles of vision and eye movements in the control of activities of daily living. Perception, 28(11), 1311–1328. doi:10.1068/p2935 [CrossRef]
  • Bear-Lehman, J. & Abreu, B. C. (1989). Evaluating the hand: Issues in reliability and validity. Physical Therapy, 69(12), 1025–1033. doi:10.1093/ptj/69.12.1025 [CrossRef]
  • Levine, A. J., Miller, E. N., Becker, J. T., Selnes, O. A. & Cohen, B. A. (2004). Normative data for determining significance of test-retest differences on eight common neuropsychological instruments. The Clinical Neuropsychologist, 18(3), 373–384. doi:10.1080/1385404049052420 [CrossRef]
  • Lezak, M. D., Howieson, D. B. & Loring, D. W. (2004). Neuropsychological assessment (4th ed.). New York, NY: Oxford University Press.
  • Mathiowetz, V., Volland, G., Kashman, N. & Weber, K. (1985). Adult norms for the box and block test of manual dexterity. American Journal of Occupational Therapy, 39(6), 386–391. doi:10.5014/ajot.39.6.386 [CrossRef]3160243
  • Ma-Wyatt, A. & Renninger, L. (2011). Eye-hand coordination in rapid, goal directed movements. Journal of Vision, 11, 946. doi:10.1167/11.11.946 [CrossRef]
  • McCrea, P. H., Eng, J. J. & Hodgson, A. J. (2005). Saturated muscle activation contributes to compensatory reaching strategies after stroke. Journal of Neurophysiology, 94(5), 2999–3008. doi:10.1152/jn.00732.2004 [CrossRef]16014786
  • Morley, M. (2014). Evidencing what works: Are occupational therapists using clinical information effectively? British Journal of Occupational Therapy, 77(12), 601–604. doi:10.4276/030802214X14176260335228 [CrossRef]
  • Mosier, C. I. (1947). A critical examination of the concepts of face validity. Educational and Psychological Measurement, 7(2), 191–205. doi:10.1177/001316444700700201 [CrossRef]20256558
  • Piernik-Yoder, B. & Beck, A. (2012). The use of standardized assessments in occupational therapy in the United States. Occupational Therapy in Health Care, 26(2–3), 97–108. doi:10.3109/07380577.2012.695103 [CrossRef]23899135
  • Platz, T., Pinkowski, C., van Wijck, F., Kim, I. H., di Bella, P. & Johnson, G. (2005). Reliability and validity of arm function assessment with standardized guidelines for the Fugl-Meyer test, action research arm test and box and block test: A multicentre study. Clinical Rehabilitation, 19(4), 404–411. doi:10.1191/0269215505cr832oa [CrossRef]15929509
  • Polit, D. F. & Beck, C. T. (2006). The content validity index: Are you sure you know what's being reported? Critique and recommendations. Research in Nursing & Health, 29(5), 489–497. doi:10.1002/nur.20147 [CrossRef]
  • Preda, C. (1997). Test of visual-motor integration: Construct validity in a comparison with the Beery-Buktenica developmental test of visual-motor integration. Perceptual and Motor Skills, 84(3), 1439–1443. doi:10.2466/pms.1997.84.3c.1439 [CrossRef]
  • Quillman, R. D., Mehr, E. B. & Goodrich, G. L. (1981). Use of the Frostig figure ground in evaluation of adults with low vision. American Journal of Optometry and Physiological Optics, 58(11), 910–918. doi:10.1097/00006324-198111000-00002 [CrossRef]7315942
  • Reddon, J. R., Gill, D. M., Gauk, S. E. & Maerz, M. D. (1988). Purdue pegboard: Test retest estimates. Perceptual and Motor Skills, 66(2), 503–506. doi:10.2466/pms.1988.66.2.503 [CrossRef]3399326
  • Rizzo, J. R., Hosseini, M., Wong, E. A., Mackey, W. E., Fung, J. K., Ahdoot, E. & Hudson, T. E. (2017). The intersection between ocular and manual motor control: Eye-hand coordination in acquired brain injury. Frontiers in Neurology, 8, 227. doi:10.3389/fneur.2017.00227 [CrossRef]28620341
  • Roberts, P. S., Rizzo, J. R., Hreha, K., Wertheimer, J., Kaldenberg, J., Hironaka, D. & Colenbrander, A. (2016). A conceptual model for vision rehabilitation. Journal of Rehabilitation Research and Development, 53(6), 693–704. doi:10.1682/JRRD.2015.06.0113 [CrossRef]27997671
  • Rodrigues, M. R., Slimovitch, M., Chilingaryan, G. & Levin, M. F. (2017). Does the finger-to-nose test measure upper limb coordination in chronic stroke? Journal of Neuroengineering and Rehabilitation, 14(1), 6. doi:10.1186/s12984-016-0213-y [CrossRef]28114996
  • Royal, K. (2016). “Face validity” is not a legitimate type of validity evidence! American Journal of Surgery, 212(5), 1026–1027. doi:10.1016/j.amjsurg.2016.02.018 [CrossRef]27255779
  • Scheiman, M. (2011). Understanding and managing vision deficits: A guide for occupational therapists (3rd ed.). Thorofare, NJ: Slack.
  • Smeets, J. B., Hayhoe, M. M. & Ballard, D. H. (1996). Goal-directed arm movements change eye-head coordination. Experimental Brain Research, 109(3), 434–440. doi:10.1007/BF00229627 [CrossRef]8817273
  • Sunderland, A., Tinson, D., Bradley, L. & Hewer, R. L. (1989). Arm function after stroke: An evaluation of grip strength as a measure of recovery and prognostic indicator. Journal of Neurology, Neurosurgery, and Psychiatry, 52(11), 1267–1272. doi:10.1136/jnnp.52.11.1267 [CrossRef]2592969
  • Tavasoli, A., Azimi, P. & Montazari, A. (2014). Reliability and validity of the Peabody developmental motor scales-second edition for assessing motor development of low birth weight preterm infants. Pediatric Neurology, 51(4), 522–526. doi:10.1016/j.pediatrneurol.2014.06.010 [CrossRef]25266615
  • van Polanen, V. & Davare, M. (2015). Interactions between dorsal and ventral streams for controlling skilled grasp. Neuropsychologia, 79(Pt B), 186–191. doi:10.1016/j.neuropsychologia.2015.07.010 [CrossRef]26169317
  • Wang, H. H., Liao, H. F. & Hsieh, C. L. (2006). Reliability, sensitivity to change, and responsiveness of the Peabody developmental motor scale-second edition for children with cerebral palsy. Physical Therapy, 86(10), 1351–1359. doi:10.2522/ptj.20050259 [CrossRef]17012639
  • Wang, Y. C., Magasi, S., Bohannon, R. W., Reuben, D. B., McCreath, H. E., Bubela, D. J. & Rymer, W. Z. (2011). Assessing dexterity function: A comparison of two alternatives for the NIH toolbox. Journal of Hand Therapy, 24(4), 313–321. doi:10.1016/j.jht.2011.05.001 [CrossRef]21798715
  • Wuang, Y. P., Su, C. Y. & Huang, M. H. (2012). Psychometric comparisons of three measures for assessing motor functions in preschoolers with intellectual disabilities. Journal of Intellectual Disability Research, 56(6), 567–578. doi:10.1111/j.1365-2788.2011.01491.x [CrossRef]
  • Zackowski, K. M., Dromerick, A. W., Sahrmann, S. A., Thach, W. T. & Bastian, A. J. (2004). How do strength, sensation, spasticity and joint individuation relate to the reaching deficits of people with chronic hemiparesis? Brain, 127(5), 1035–1046. doi:10.1093/brain/awh116 [CrossRef]14976070

Table 1. Demographic Features of Occupational Therapy Survey Respondents

Values are presented as n (%).

Gender
  Female: 53 (98.1)
  Male: 1 (1.9)
Specialization
  Neurorehabilitation: 35 (64.8)
  All others: 19 (35.2)
Percentage of time treating visual impairments
  < 50%: 24 (44.4)
  ⩾ 50%: 30 (55.6)
Post-degree training
  Any training: 35 (64.8)
  No training: 19 (35.2)
Years of practice
  < 6 years: 23 (42.6)
  ⩾ 6 years: 31 (57.4)
Region
  East Coast: 28 (51.9)
  All others: 26 (48.1)
Current practice setting
  Institution: 15 (27.8)
  Community: 27 (50)
  Both: 12 (22.2)

Table 2. Use of Each Assessment

Each assessment is listed with its score for use, n (%):
1. Finger-to-Nose Test: 41 (75.9)
2. 9-Hole Peg Test: 39 (72.2)
3. Rapid Alternating Movements of the Hands: 27 (50)
4. Box-and-Block Test: 21 (38.9)
5. Finger-Tapping Test: 9 (16.7)
6. Test of Visual-Motor Skills: 17 (31.5)
7. Minnesota Manual Dexterity Test: 16 (29.6)
8. Grooved Pegboard Test: 13 (24.1)
9. Beery-Buktenica Test of Visual Motor Integration: 8 (14.8)
10. Overhead beanbag throw: 5 (9.3)
11. Stick catching test: 5 (9.3)
12. Peabody Developmental Motor Scales: 4 (7.4)
13. O'Connor Dexterity Battery: 3 (5.6)
14. Movement Assessment Battery for Children: 1 (1.9)
15. Motor Accuracy Test: 1 (1.9)
16. Frostig Figure Ground: 1 (1.9)
17. Neurobehavioral evaluation: 0 (0)
18. Purdue Pegboard Test: 0 (0)
19. Minnesota Handwriting Assessment: 0 (0)

Table A. Eye-hand coordination tests: psychometric properties and corresponding references

Assessments (columns, in order): Peabody Developmental Motor Scale | Box and Blocks Test | Test of Visual Motor Skills | Minnesota Manual Dexterity Test | Purdue Pegboard | 9-Hole Peg | O'Connor Dexterity Battery | Grooved Pegboard
Psychometric Properties
ValidityTavasoli et al., 2014; Wuang, et al., 2012Boissy et al., 1999; Desrosiers et al., 1994; Platz et al., 2005Brown & Unsworth, 2009; Graf & Hinton, 1997; Preda, 1997Desrosiers et al., 1997Wang et al., 2011Chen, et al., 2009; Sunderland et al., 1989; Croarkin et al., 2004;Wang et al., 2011
Intra-rater/test-retest reliabilityTavasoli et al., 2014; Wuang, et al., 2012Mathiowetz et al., 1985Desrosiers et al., 1997Reddon et al., 1988; Bear-Lehman & Abreu, 1989; Desrosiers et al., 1995; Buddenberg & Davis, 2000; Lezak et al., 2004; Wang et al., 2011Croarkin et al., 2004Hines & O'Connor, 1926Levine et al., 2004; Wang et al., 2011
Inter-rater reliabilityMathiowetz et al., 1985Croarkin et al., 2004
ResponsivenessWang at al., 2006; Wuang et al., 2012
MCIDCroarkin et al., 2004
MDC/SDD/SRDWang at al., 2006; Wuang et al., 2012Mathiowetz et al., 1985; Chen et al., 2009Mathiowetz et al., 1985; Croarkin et al., 2004; Chen et al., 2009
Clinical UtilityFolio et al., 2000Mathiowetz et al., 1985; Murphy et al., 2015; Baker et al., 2011; Connell et al., 2012Brown & Unsworth, 2009Mathiowetz et al., 1985; Croarkin et al., 2004

Table B. Comparison of frequency of use of eye/hand coordination impairment assessments for occupational therapist respondents

Assessments (in order), each reported as n, mean rating, p: Rapid Alternating Hand Movements | Finger to Nose | Box and Blocks | Finger Tapping | 9-Hole Peg Board | Test of Visual Motor Skills
Specialization
  Neuro163.190.433263.880.764164.190.78743.250.418274.480.001*103.100.304
  All Others113.73153.7354.2053.00123.2572.57
Percentage of Treating Visual Impairment Patients
  <50%143.430.784203.850.78364.330.61073.141.000124.000.46772.570.304
  >=50%133.38213.81154.1323.00274.15103.10
Post Degree Training
  Any Training173.410.857253.760.621164.190.78782.880.088274.070.946153.070.041**
  No Training103.40163.9454.2015.00124.1721.50
Years Practiced
  <6 Years133.690.332194.260.020*84.500.52833.670.256184.110.77852.600.657
  ≥6 Years143.14223.45134.0062.83214.10123.00
State
  East Coast173.590.520273.960.23453.200.008*53.000.418183.780.07972.710.681
  All Others103.10143.57164.5043.25214.38103.00
Setting
  Institution52.800.44993.330.236103.800.20333.000.985114.730.09053.200.421
  Community153.40224.0064.5023.50183.8342.25
  Both73.86103.9054.6043.00103.9083.00

Table C. Comparison of reliability ratings of eye/hand coordination impairment assessments for occupational therapist respondents

Assessments (in order), each reported as n, mean rating, p: Rapid Alternating Hand Movements | Finger to Nose | Box and Blocks | Finger Tapping | 9-Hole Peg Board | Test of Visual Motor Skills
Specialization
  Neuro143.500.320263.190.206163.750.96543.500.076273.670.039*103.700.956
  All Others113.09152.8053.8051.80123.0873.71
Percentage of Treating Visual Impairment Patients
  <50%133.310.907203.000.81463.830.70872.430.544123.580.89373.710.913
  >=50%123.33213.10153.7323.00273.44103.70
Post Degree Training
  Any Training163.311.000253.000.777163.560.05782.750.229273.370.227153.800.314
  No Training93.33163.1354.4011.00123.7523.00
Years Practiced
  <6 Years133.540.233193.370.029*84.000.39432.330.789183.670.35953.600.859
  ≥6 Years123.08222.77133.6262.67213.33123.75
State
  East Coast173.470.237273.000.62054.000.53652.000.205183.390.73273.860.442
  All Others83.00143.14163.6943.25213.57103.60
Setting
  Institution43.500.12493.000.497103.600.032*33.000.189113.910.036*54.000.062
  Community143.57223.1864.5021.00183.5042.75
  Both72.71102.8053.2043.00103.0084.00

Table D. Comparison of validity ratings of eye/hand coordination impairment assessments for occupational therapist respondents

Assessments (in order), each reported as n, mean rating, p: Rapid Alternating Hand Movements | Finger to Nose | Box and Blocks | Finger Tapping | 9-Hole Peg Board | Test of Visual Motor Skills
Specialization
  Neuro143.070.908262.770.573163.000.47843.250.544273.300.450103.500.253
  All Others112.91152.9353.4042.75123.0874.00
Percentage of Treating Visual Impairment Patients
  <50%132.920.752202.850.74963.500.21063.170.483123.500.21074.000.194
  >=50%123.08212.81152.9322.50273.11103.50
Post Degree Training
  Any Training163.060.811252.760.385162.880.041*83.00--273.070.092153.600.153
  No Training92.89162.9453.800--123.5824.50
Years Practiced
  <6 Years133.000.977193.000.21583.000.75622.500.483183.440.15853.800.823
  ≥6 Years123.00222.68133.1563.17213.05123.67
State
  East Coast172.940.645272.810.86053.600.16942.750.544183.560.039*74.140.069
  All Others83.13142.86162.9443.25212.95103.40
Setting
  Institution43.000.86092.780.913102.900.44333.670.377113.180.30053.800.912
  Community143.07222.8663.5012.00183.4443.50
  Both72.86102.8053.0042.75102.9083.75

Table E. Comparison of user friendliness ratings of eye/hand coordination impairment assessments for occupational therapist respondents

Assessments (in order), each reported as n, mean rating, p: Rapid Alternating Hand Movements | Finger to Nose | Box and Blocks | Finger Tapping | 9-Hole Peg Board | Test of Visual Motor Skills
Specialization
  Neuro144.640.845264.420.640164.690.60144.250.786274.670.039*103.800.748
  All Others114.55154.5354.6054.40124.2573.57
Percentage of Treating Visual Impairment Patients
  <50%134.460.244204.300.30064.670.84474.290.871124.500.57173.710.668
  >=50%124.75214.62154.6724.50274.56103.70
Post Degree Training
  Any Training164.500.266254.320.135164.560.11784.500.134274.440.351153.800.325
  No Training94.78164.6955.0013.00124.7523.00
Years Practiced
  <6 Years134.690.497194.530.42584.750.31334.000.391184.560.59953.200.105
  ≥6 Years124.50224.41134.6264.50214.52123.92
State
  East Coast174.631.000274.290.13754.560.11754.000.223184.330.11173.700.789
  All Others84.59144.56165.0044.60214.78103.71
Setting
  Institution45.000.17094.440.661104.700.80235.000.227114.550.57554.000.017**
  Community144.43224.3664.6724.00184.5042.75
  Both74.71104.7054.6044.00104.6084.00

Table F. Comparison of low cost and burden ratings of eye/hand coordination impairment assessments for occupational therapist respondents

Assessments (in order), each reported as n, mean rating, p: Rapid Alternating Hand Movements | Finger to Nose | Box and Blocks | Finger Tapping | 9-Hole Peg Board | Test of Visual Motor Skills
Specialization
  Neuro144.790.492264.690.158164.500.14444.751.000274.520.944103.500.290
  All Others114.64154.8054.0054.20124.5073.00
Percentage of Treating Visual Impairment Patients
  <50%134.621.000204.650.40564.500.54774.430.423124.580.57673.140.560
  >=50%124.83214.81154.3324.50274.48103.40
Post Degree Training
  Any Training164.810.722254.760.969164.250.14484.880.034**274.370.036*153.330.686
  No Training94.56164.6954.8011.00124.8323.00
Years Practiced
  <6 Years134.460.041*194.580.26384.380.93633.330.034**184.440.60553.000.253
  ≥6 Years125.00224.86134.3865.00214.57123.42
State
  East Coast174.590.144274.740.81154.800.14454.201.000184.670.24473.140.560
  All Others85.00144.71164.2544.75214.38103.40
Setting
  Institution45.000.16894.780.441104.400.87535.000.392114.360.73153.600.059
  Community144.50224.6464.3323.00184.5642.50
  Both75.00104.9054.4044.75104.6083.50
Authors

Dr. Hreha is Research Assistant Professor, Division of Rehabilitation Sciences, School of Health Professions, University of Texas Medical Branch, Galveston, Texas. Dr. Rizzo is Director of Innovation & Technology, Director of Research (Interim), Director, Visuomotor Integration Laboratory (VMIL), Director, Rehabilitation Engineering Alliance and Center Transforming Low Vision (REACTIV Lab), and Assistant Professor, Departments of Rehabilitation Medicine, Neurology, and Biomedical Engineering, NYU School of Medicine and Tandon School of Engineering, New York, New York. Dr. Abdou is Neurorehabilitation Post-doctoral Fellow, Center for Stroke Rehabilitation Research, Kessler Foundation, West Orange, New Jersey. Ms. Truong is Applications Specialist, Enterprise Information Services, Cedars-Sinai, Los Angeles, California. Dr. Wertheimer is Clinical Neuropsychologist, Department of Physical Medicine & Rehabilitation, Cedars-Sinai, Los Angeles, California. Dr. Graf is Assistant Professor, Department of Rehabilitation Medicine, University of Minnesota Medical School, and Medical Director, Traumatic Brain Injury Outpatient Program, Hennepin Healthcare, Minneapolis, Minnesota. Dr. Kaldenberg is Clinical Assistant Professor, Department of Occupational Therapy, Boston University, College of Health and Rehabilitation Sciences, Boston, Massachusetts. Ms. Llanos is Coordinator, Polytrauma Research Executive Committee, and Program Manager, Polytrauma/TBI Vision Rehabilitation Clinic, James A. Haley Veterans Administration Medical Center, Tampa, Florida. Dr. Roberts is Executive Director, Academic and Physician Informatics, Professor and Executive Director, Physical Medicine and Rehabilitation, and Co-Director, Division of Informatics, Department of Biomedical Sciences, Cedars-Sinai, Los Angeles, California.

The authors have no relevant financial relationships to disclose.

Address correspondence to John Ross Rizzo, MD, MSCI, Assistant Professor, NYU School of Medicine, 240 E. 38th St., Office 1776, New York, NY 10016; e-mail: johnross.rizzo@nyulangone.org.

Received: April 08, 2019
Accepted: July 22, 2019
Posted Online: September 23, 2019

10.3928/24761222-20190910-03
