Orthopedics

Feature Article 

Accuracy and Reliability of Visual Inspection and Smartphone Applications for Measuring Finger Range of Motion

Hannah H. Lee, MD, PhD; Kwesi St. Louis, MD; John R. Fowler, MD

Abstract

Measurement of finger range of motion is critical in clinical settings, especially for outcome analysis, clinical decision making, and rehabilitation/disability assessment. Although goniometer measurement is clinically considered the gold standard, its accuracy compared with the true radiographic measurements of the joint angles remains questionable. The authors compared 3 smartphone applications and visual inspection measurements of the finger joints with the radiographic measurements and determined interrater reliability for these measurement tools. A finger was held in place using an aluminum-alloy splint, and a fluoroscopic image was acquired by a mini C-arm. An independent observer measured each joint flexion angle on the fluoroscopic image using a universal handheld goniometer, and this was used as the reference. Finger joint flexion angles were then independently measured by 3 observers using 3 different smartphone applications. In addition, visual inspection was used to estimate the flexion angles of the finger joints. The results of this study suggest that all 3 smartphone measurement tools, as well as visual inspection, agree and correlate well with the reference fluoroscopic image measurement. Average differences between the fluoroscopic image measurements and the angles measured with the tools studied ranged from 9.4° to 12.2°. The mean correlation coefficients for each smartphone application exceeded 0.7. Overall interrater reliabilities were similar, with the intraclass correlation coefficient being greater than 0.9 for all of the measurement tools. These data suggest that new smartphone applications hold promise for providing accurate and reliable measures of range of motion. [Orthopedics. 201x; xx(x):xx–xx.]


Reliable and accurate measurement of range of motion (ROM) of a joint is an integral part of the physical examination in hand surgery. Handheld goniometer measurements are often considered the gold standard.1–4 Goniometers are the most widely used, most often studied, economical, and portable devices for measuring joint ROM.5 However, neither goniometer measurement nor visual inspection has shown high levels of accuracy compared with radiographic measurements in finger and wrist joints.2

Recently, several new smartphone-based applications (apps) allowing measurement of ROM have been introduced. With the widespread use of smartphones, these apps may offer new means of providing accurate and reliable measures of ROM, especially in clinical situations where the standard goniometer and/or radiographs may be unavailable. If these apps were proven to be reliable and accurate, apps could be designed to automate the recording of these measurements and possibly even place them into the electronic medical record, improving efficiency for both therapists and clinicians.

The purposes of this study were to compare different smartphone apps and visual inspection measurements of the finger joints with the goniometer measurements of radiographs and to determine interrater reliability for these measurement tools.

Materials and Methods

Finger flexion at the metacarpophalangeal joint (MCPJ), proximal interphalangeal joint (PIPJ), and distal interphalangeal joint (DIPJ) were independently measured by a hand fellowship–trained surgeon (J.R.F.) and 2 residents (H.H.L., K.S.L.). The index finger of a healthy subject with no known pathology was held in place using an aluminum-alloy splint (Figure 1A). A fluoroscopic image was acquired by a mini C-arm (Figure 1B). An independent observer measured each joint flexion angle on the fluoroscopic image using a universal handheld goniometer. This measurement was used as the reference standard.

Figure 1:

Finger flexion angle reference measurements. The index finger of a healthy subject with no known pathology was held in place by an aluminum-alloy splint (A). A fluoroscopic image was acquired by a mini C-arm. An independent observer measured each joint flexion angle using a universal handheld goniometer (B).

Three different iPhone 6 (Apple, Cupertino, California) apps were used to measure the flexion angles of the finger according to manufacturer instructions: Goniometer (June Infrastructure Pvt Ltd, Viman Nagar, Pune, India), iPhone Compass (Apple), and PT-Tools Suite (by David Raney). Each observer measured the MCPJ, PIPJ, and DIPJ of the finger held in the aluminum-alloy splint once. In addition, visual inspection was used to estimate the flexion angles of the MCPJ, PIPJ, and DIPJ. A total of 9 trials were performed, each with a varied finger flexion configuration selected by the independent observer. Ease of use was rated by each observer.

Statistical analyses were performed using SPSS software (IBM, Armonk, New York). The Bland–Altman method was used to evaluate the agreement of each measurement tool with the standard measurement.6 Comparisons of each of the measurement tools against the standard measurement were performed via 2-tailed Pearson correlation, as well as one-way analysis of variance and the Kruskal–Wallis test for parametric and nonparametric data, respectively. Interrater reliability was quantified using the intraclass correlation coefficient.
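The Bland–Altman limits of agreement and the Pearson correlation used above can be sketched as follows. The angle values are hypothetical illustrations, not the study's data:

```python
import numpy as np

def bland_altman(reference, measured):
    """Return the mean difference, SD of the differences, and the
    95% limits of agreement (mean difference +/- 1.96 SD)."""
    diffs = np.asarray(measured, dtype=float) - np.asarray(reference, dtype=float)
    mean_diff = diffs.mean()
    sd_diff = diffs.std(ddof=1)  # sample standard deviation
    return mean_diff, sd_diff, (mean_diff - 1.96 * sd_diff,
                                mean_diff + 1.96 * sd_diff)

# Hypothetical joint angles (degrees): fluoroscopic reference vs. an app's readings
xr  = [30, 45, 60, 75, 90, 40, 55]
app = [28, 48, 57, 80, 92, 38, 60]

mean_diff, sd_diff, (lower, upper) = bland_altman(xr, app)
r = np.corrcoef(xr, app)[0, 1]  # Pearson correlation coefficient
print(f"bias {mean_diff:.1f} deg, limits ({lower:.1f}, {upper:.1f}), r = {r:.3f}")
```

A reading counts as "outside the agreement range" when its difference from the reference falls beyond the lower or upper limit.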

Results

The “user-friendliness” of each measurement tool is presented in Table 1. All 3 observers found visual inspection to be the easiest.

Table 1:

Ease of Use of Measurement Methods

Figure 2 documents all measured readings of finger joints for all 3 observers. Each data point represents an observer's measured angle of the MCPJ, PIPJ, or DIPJ using visual inspection, Goniometer, iPhone Compass, or PT-Tools Suite apps vs the measurement of fluoroscopic images of each joint.

Figure 2:

Measured angles vs the fluoroscopic image (XR) reference standard. Visual inspection (A), Goniometer (June Infrastructure Pvt Ltd, Viman Nagar, Pune, India) (B), iPhone Compass (Apple, Cupertino, California) (C), and PT-Tools Suite (by David Raney) (D).

Overall agreement of the smartphone apps as well as visual inspection measurements of the finger joints with the fluoroscopic image measurement, as assessed by the Bland–Altman method, is illustrated in Figure 3 and Table 2. Visual inspection had 3 cases (3.7%), iPhone Compass had 4 cases (4.9%), and Goniometer and PT-Tools Suite had 5 cases (6.2%) outside of the 95% agreement range. Goniometer and iPhone Compass had mean differences from the fluoroscopic image standard close to zero, and Goniometer had the smallest standard deviation of differences. When a subanalysis was performed for each joint, the MCPJ had the greatest agreement, with Goniometer and PT-Tools Suite having no cases outside of the agreement range (data not shown).

Figure 3:

Bland–Altman plot for agreement with fluoroscopic image (XR) reference standard. Visual inspection (A), Goniometer (June Infrastructure Pvt Ltd, Viman Nagar, Pune, India) (B), iPhone Compass (Apple, Cupertino, California) (C), and PT-Tools Suite (by David Raney) (D).

Table 2:

Bland–Altman's Limits of Agreement

Average differences between the fluoroscopic image measurements and the angles measured with the tools studied were 10.8° for visual inspection, 9.4° for Goniometer, 10.4° for iPhone Compass, and 12.2° for PT-Tools Suite. When the allowed difference between the measured angles and the fluoroscopic image measurements was restricted to ±10°, visual inspection had 65.4%, Goniometer had 59.3%, iPhone Compass had 49.4%, and PT-Tools Suite had 51.9% of cases within the range. No statistical differences were observed among the 4 measurement tools for either average differences (P=.214) or the percentage of measures within ±10° of the fluoroscopic image standard (P=.159).

The Pearson r2 coefficients were 0.8149 for visual inspection (range, 0.7435–0.8660), 0.7061 for Goniometer (range, 0.6463–0.7388), 0.7201 for iPhone Compass (range, 0.5875–0.8283), and 0.7118 for PT-Tools Suite (range, 0.6481–0.7503) (Table 3). The data were subdivided into each joint using each of the measurement tools. The PIPJ was found to have the strongest correlation when compared with the other joints, except when measured with the iPhone Compass app.

Table 3:

Pearson r2 Coefficients for Measurement Methods

Overall interrater reliability was excellent: the intraclass correlation coefficient was 0.950 (range, 0.931–0.964). Interrater reliability was further analyzed for each measurement tool as well as each joint (Table 4). Visual inspection and the Goniometer app had the highest correlation coefficients.
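The intraclass correlation coefficient can be computed from a two-way ANOVA decomposition of a measurements-by-raters table. The exact ICC model used in the study is not stated; the sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single rater) as one common choice, with made-up ratings:

```python
import numpy as np

def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `ratings` is an n_targets x k_raters array of measurements."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    # Two-way ANOVA sums of squares: targets (rows), raters (columns), residual
    ss_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum()
    ss_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum()
    ss_err = ((Y - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)              # mean square, targets
    msc = ss_cols / (k - 1)              # mean square, raters
    mse = ss_err / ((n - 1) * (k - 1))   # mean square, error
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical: 4 finger configurations each rated by 3 observers (degrees)
ratings = [[30, 32, 31],
           [55, 54, 57],
           [72, 70, 71],
           [90, 93, 92]]
print(f"ICC(2,1) = {icc2_1(ratings):.3f}")
```

Values above 0.9 are conventionally read as excellent reliability; perfect agreement among raters yields an ICC of exactly 1.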

Table 4:

Interrater Reliability of Measurement Methods

Discussion

The purposes of this study were to (1) compare different smartphone apps as well as visual inspection measurements of the finger joints with goniometer measurements of the fluoroscopic image and (2) determine interrater reliability for these measurement tools. The results of this study suggest that all 3 smartphone measurement tools, as well as visual inspection, agree and correlate well with the reference standard fluoroscopic image measurement. Average differences between the fluoroscopic image measurements and the angles measured with the tools used in this study ranged from 9.4° to 12.2°, with 49.4% to 65.4% of measurements within ±10° of the reference. The mean correlation coefficients for each iPhone app exceeded 0.7, signifying strong correlation.7 The strongest correlations were between visual inspection and the fluoroscopic image reference standard. However, this was not the case for the Bland–Altman agreement. Overall interrater reliabilities were similar, with the intraclass correlation coefficient being greater than 0.9 for all of the measurement tools. Visual inspection and the Goniometer app had the highest correlation coefficients.

Although there was a strong correlation between visual inspection and the fluoroscopic image reference standard, agreement and accuracy between the 2 were not as strong. Rose et al8 estimated the angles of the MCPJ, PIPJ, and DIPJ of a resin cast of an adult hand and reported 25% inaccuracy using visual estimate. However, visual estimates of the surgeons were better than those of the physiotherapists. In addition, both hand surgery experience and stated interest in hand surgery correlated with the accuracy, which may be due to improvement with time and practice. Nonetheless, Rose et al8 concluded that visual assessment alone is imprecise and recommended that goniometers be used in evaluating finger angles.

McVeigh et al2 found that neither visual inspection nor handheld goniometer measurements were consistently accurate when compared with radiographic measurement. This is similar to the current study's findings when the allowed difference between the measured angles and the fluoroscopic image reference was restricted to ±10°. In addition, McVeigh et al2 found no difference in the accuracy of visual inspection between hand surgeons and therapists. The accuracies of the wrist and MCPJ angles were similar between visual estimate and goniometer measurement, but the PIPJ angle was more accurate with the goniometer measurement. They also found a high degree of interobserver variability in measurements. In the current study, the overall strong interrater reliability and significant correlation between visual inspection and the fluoroscopic image reference standard may be explained by all observers being orthopedic surgeons and one having hand fellowship training.

Among the finger joints, the MCPJ showed the strongest agreement, whereas the PIPJ showed the strongest correlation. Ellis et al3 measured the finger joint angles using a goniometer and wire tracings. They showed that the MCPJ had the greatest repeatability when compared with the more distal joints, with the DIPJ having the greatest variability. They postulated that this was due to the shorter lever arms to align the measuring tools, a less definable joint line, and lack of a distal bony landmark for alignment. A study by Macionis9 confirmed this finding.

This study had several limitations. First, only 3 observers were included. Other studies have included a greater number of examiners, and this could have affected the interrater reliability and the strength of correlation. In addition, the subject had no known finger pathology, as this was a feasibility study of whether smartphone apps can be used as additional tools to measure ROM of healthy finger joints. How pain and pathology may affect these measurements is unknown. Furthermore, only flexion/extension angles of the finger joints were measured, without fully accounting for the hand's overall position in space; wrist inclination and supination/pronation, which could have affected the smartphone app measurements, were not controlled. Finally, the reference standard was the actual bony joint flexion angles as measured by radiographs, whereas the app measurements as well as visual inspection were applied over the soft tissue. This is an inherent shortcoming of clinical measurement of ROM, as frequent radiographic measurement can be cost prohibitive.

Conclusion

Measurement of finger ROM is critical in clinical settings, especially for outcome analysis, clinical decision making, and rehabilitation/disability assessment. Although goniometer measurement is clinically considered the gold standard, its accuracy compared with the true radiographic measurements of the joint angles remains questionable.2 With measurement of fluoroscopic image joint angles as the reference standard, the current data suggest that new smartphone apps hold promise for providing measures of ROM as accurate and reliable as those of the standard goniometer and visual inspection.

References

  1. Armstrong AD, MacDermid JC, Chinchalkar S, Stevens RS, King GJ. Reliability of range-of-motion measurement in the elbow and forearm. J Shoulder Elbow Surg. 1998; 7(6):573–580. doi:10.1016/S1058-2746(98)90003-9 [CrossRef]
  2. McVeigh KH, Murray PM, Heckman MG, Rawal B, Peterson JJ. Accuracy and validity of goniometer and visual assessments of angular joint positions of the hand and wrist. J Hand Surg Am. 2016; 41(4):e21–e35. doi:10.1016/j.jhsa.2015.12.014 [CrossRef]
  3. Ellis B, Bruton A, Goddard JR. Joint angle measurement: a comparative study of the reliability of goniometry and wire tracing for the hand. Clin Rehabil. 1997; 11(4):314–320. doi:10.1177/026921559701100408 [CrossRef]
  4. Meislin MA, Wagner ER, Shin AY. A comparison of elbow range of motion measurements: smartphone-based digital photography versus goniometric measurements. J Hand Surg Am. 2016; 41(4):510–515. doi:10.1016/j.jhsa.2016.01.006 [CrossRef]
  5. Lea RD, Gerhardt JJ. Range-of-motion measurements. J Bone Joint Surg Am. 1995; 77(5):784–798. doi:10.2106/00004623-199505000-00017 [CrossRef]
  6. Bland JM, Altman DG. Measuring agreement in method comparison studies. Stat Methods Med Res. 1999; 8(2):135–160. doi:10.1177/096228029900800204 [CrossRef]
  7. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977; 33(1):159–174. doi:10.2307/2529310 [CrossRef]
  8. Rose V, Nduka CC, Pereira JA, Pickford MA, Belcher HJ. Visual estimation of finger angles: do we need goniometers? J Hand Surg Br. 2002; 27(4):382–384. doi:10.1054/jhsb.2002.0782 [CrossRef]
  9. Macionis V. Reliability of the standard goniometry and diagrammatic recording of finger joint angles: a comparative study with healthy subjects and non-professional raters. BMC Musculoskelet Disord. 2013; 14:17. doi:10.1186/1471-2474-14-17 [CrossRef]

Ease of Use of Measurement Methods

Measurement Method    Rating^a
                      Observer 1   Observer 2   Observer 3
Visual inspection         4            4            4
Goniometer^b              3            2            3
iPhone Compass^c          2            1            2
PT-Tools Suite^d          1            3            1

Bland–Altman's Limits of Agreement

                                        Standard Deviation   95% Limits of Agreement
Measurement Method    Mean Difference   of Difference        Minimum     Maximum
Visual inspection         −7.778            12.455           −32.189      16.634
Goniometer^a               0.494            11.994           −23.014      24.002
iPhone Compass^b           0.765            13.200           −25.106      26.637
PT-Tools Suite^c          −7.370            13.450           −33.733      18.992

Pearson r2 Coefficients for Measurement Methods

Measurement Method    Observer 1   Observer 2   Observer 3   Average
Visual inspection       0.8660       0.8352       0.7435      0.8149
  MCPJ                  0.8800       0.8200       0.7300      0.8100
  PIPJ                  0.8700       0.8900       0.7700      0.8433
  DIPJ                  0.8660       0.8614       0.7920      0.8398
Goniometer^a            0.6463       0.7388       0.7331      0.7061
  MCPJ                  0.5700       0.5300       0.5300      0.5433
  PIPJ                  0.7600       0.8700       0.9400      0.8567
  DIPJ                  0.8244       0.9677       0.8773      0.8898
iPhone Compass^b        0.8283       0.7446       0.5875      0.7201
  MCPJ                  0.8800       0.7500       0.6700      0.7667
  PIPJ                  0.7700       0.8700       0.5600      0.7333
  DIPJ                  0.8845       0.8416       0.6856      0.8039
PT-Tools Suite^c        0.7370       0.7503       0.6481      0.7118
  MCPJ                  0.6200       0.6400       0.4800      0.5800
  PIPJ                  0.8900       0.9500       0.8700      0.9033
  DIPJ                  0.7969       0.7826       0.4740      0.6845

Interrater Reliability of Measurement Methods

Measurement Method      Average   Range
Overall                  0.950    0.931–0.964
  Visual inspection      0.966    0.935–0.983
    MCPJ                 0.975    0.921–0.994
    PIPJ                 0.935    0.798–0.984
    DIPJ                 0.978    0.931–0.995
  Goniometer^a           0.967    0.938–0.984
    MCPJ                 0.958    0.870–0.990
    PIPJ                 0.979    0.934–0.995
    DIPJ                 0.962    0.883–0.991
  iPhone Compass^b       0.913    0.834–0.957
    MCPJ                 0.915    0.733–0.979
    PIPJ                 0.905    0.702–0.977
    DIPJ                 0.925    0.764–0.981
  PT-Tools Suite^c       0.933    0.872–0.967
    MCPJ                 0.893    0.666–0.974
    PIPJ                 0.948    0.838–0.987
    DIPJ                 0.940    0.813–0.985
Authors

The authors are from the Department of Orthopedic Surgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania.

The authors have no relevant financial relationships to disclose.

Correspondence should be addressed to: John R. Fowler, MD, Department of Orthopedic Surgery, University of Pittsburgh Medical Center, 3471 Fifth Ave, Ste 911 Kaufmann Bldg, Pittsburgh, PA 15213 (fowlerjr@upmc.edu).

Received: July 05, 2017
Accepted: November 30, 2017
Posted Online: January 08, 2018

10.3928/01477447-20180103-02
