Orthopedics

Feature Article 

Compromise of Radiology Studies From Nonstandardized Viewing Platforms

Paul M. Lichstein, MD, MS; Scott C. Wilson, MD; William G. Ward Sr, MD

Abstract

Proprietary radiology viewing software platforms are not standardized, so digital radiographic imaging studies recorded on compact discs (CDs) often cannot be viewed consistently at subsequent institutions. Primary concerns include the following: (1) a large number of image viewing software platforms with a wide variety of features making familiarity with use difficult, (2) an inordinate amount of time required to load imaging data, (3) imaging data may not upload or be viewed with the care center's picture archiving and communication system, (4) navigation through imaging studies is inconsistent and tedious, and (5) image viewing requires additional software downloads. Additionally, images generated from “outside CDs” are frequently of low quality and resolution, eliminating the ability to render a reliable diagnosis. The authors sought to determine the frequency and extent of these functional problems by analyzing a sample of 50 consecutive radiology CDs containing imaging studies referred to a university orthopedic oncology practice. Eighteen different viewing software platforms were encountered. Only 24 (48%) of the CDs met all optimal system criteria. Mean time required to load the studies was 3.4 seconds using the picture archiving and communication system and 37.9 seconds using the proprietary viewing software (P<.001). Fifteen (30%) of the CDs did not upload to the institution's picture archiving and communication system, and 18 (36%) required additional downloads and/or license agreements. Four CDs did not contain Digital Imaging and Communications in Medicine images. Physicians using radiology studies on CDs encounter numerous difficulties in evaluating patients' imaging data because of the plethora of viewing software platforms. These difficulties add time and cost and compromise patient care. [Orthopedics. 2018; 41(1):e136–e141.]


The ongoing success of the picture archiving and communication system (PACS) according to Digital Imaging and Communications in Medicine (DICOM) standards is reliant on the ability of the imaging software to generate a functional image with diagnostic capabilities for the clinician and radiologist. However, the imaging examinations of patients referred from outside institutions are often not electronically available and are transferred via compact disc (CD). Imaging studies may then be evaluated with the vendor viewing platform contained on the CD or, if compatible, uploaded to an institutional enterprise viewing system. However, there is no standardization of the vendor viewing platforms employed by referral institutions that allows reliable and consistent review at other care centers. Ideally, these platforms would access and display all images from the original examination with a level of quality sufficient to make accurate diagnoses and to plan appropriate therapeutic interventions. This is often not the case and may lead to duplication of radiographic evaluation, repeat exposure to ionizing radiation, delayed treatment, inaccuracy in diagnosis, and increased overall cost of administering patient care.1–5

There is a plethora of proprietary vendor viewing platforms using a wide variety of technologies that differ greatly in quality, reliability, functionality, intuitiveness, and appearance (Figure). Practitioners at tertiary care and referral centers are faced with imaging examinations presented on varied platforms that often require training and experience for proficient use. There are 5 major areas of concern: (1) a large number of image viewing software platforms with a wide variety of features making familiarity with use difficult, (2) an inordinate amount of time required to load imaging data, (3) imaging data may not upload or be viewed with the care center's PACS, (4) navigation through imaging studies is inconsistent and tedious, and (5) image viewing requires additional software downloads. Additionally, images generated from “outside CDs” are frequently of low quality and resolution, eliminating the ability to render a reliable diagnosis. The practical result of these problems is suboptimal radiographic information transfer to the clinician that compromises patient care. To the authors' knowledge, no previous studies have assessed and compared the functionality and intuitive nature of these various platforms. Therefore, to determine the frequency and extent of functionality problems in the transfer of radiographic information to tertiary care institutions, the authors analyzed a series of 50 sequential CDs containing radiographic examinations from outside institutions that accompanied patients referred to an orthopedic oncology clinic.

Figure: Examples of the variety of software platforms and presentation formats encountered.

Materials and Methods

Following institutional review board approval, a systematic evaluation was performed of 50 individual CDs containing digital radiographic imaging studies of patients referred to the orthopedic oncology clinic of the senior author (W.G.W.). The studies were a chronologically sequential series of 50 CDs derived from the CD files of the orthopedic oncology clinic. Before evaluation of the digital imaging data on each CD, study personnel recorded patient demographics, including age, sex, type of study (plain radiograph, computed tomography scan, magnetic resonance image, bone scan, and so on), date of study, and anatomic site under review.

The 50 radiographic CDs selected for this study were prospectively gathered from consecutive patients who brought radiographic studies on CDs from outside institutions; all patients had been referred to the senior author's orthopedic oncology clinic. CDs with obvious physical damage, CDs that no testing computer could recognize or that triggered error messages indicating possible defects, and CDs lacking any detectable written data were excluded. A sample was defined as 1 CD containing digital imaging data for 1 patient referred to the orthopedic oncology clinic. Anatomic sites, imaging modalities, software platforms, and primary diagnoses are provided in Tables 1–4.

Table 1: Number of Imaging Studies According to Anatomic Site

Table 2: Distribution of Imaging Modalities

Table 3: Imaging Software Platforms Encountered

Table 4: Primary Diagnoses

Digital imaging data were evaluated using a single independent password-protected computer in the orthopedic oncology clinic representing computer configurations encountered in clinical practices. The oncology clinic's designated standard radiology imaging viewing computer (Optiplex 745 [Dell, Round Rock, Texas] with the following specifications: Core 2 central processing unit [Intel, Santa Clara, California] @ 2.13 GHz, 1.00 GB memory, and network speed of 100 MB/sec) served as the accepted standard for clinician evaluation of the imaging data. Each CD was evaluated using 5 criteria: (1) presence of study images that could be uploaded and viewed with the clinic's PACS (IMPAX CS5000; AGFA, Mortsel, Belgium), which was already loaded on the radiology viewing computer described above and contains programming to view outside CD-based studies; (2) time required to load a functional image with each CD using the clinic's PACS software and time required using the proprietary software; (3) navigation and image advancement requirements within each series using the proprietary software contained on each CD (with the accepted optimal standard defined as the ability to “scroll” through each series using a mouse wheel or mouse movement); (4) management of customized window settings and their consistent application throughout each series using proprietary software platforms contained on each CD; and (5) any additional image viewing requirements, such as license agreements, disclaimers, or adjunct software permissions or downloads, of the proprietary software employed for data viewing. Radiology technicians performed further evaluation of each CD to determine the presence of high-fidelity DICOM images vs only compressed image data.

All studies on the CD were evaluated. In the case of multiple studies of the same format (bone scan, magnetic resonance image, and so on), only the most recent study in each format was evaluated. Only magnetic resonance images and computed tomography scans were used for evaluating window-setting applications within series and image advancement within series.

Following evaluation of each study contained on an individual CD, the CD was ejected and all data were deleted before inserting another CD. Each CD was evaluated by the first author (P.M.L.), and all CDs that displayed compromised features (poor/missing scroll function, failure of window settings to apply to all images in a series) were verified by the second (S.C.W.) and senior (W.G.W.) authors.

Results were examined with descriptive statistics. Recorded “time to load” values for the clinic's PACS and each proprietary system were assessed for normality and compared using the Wilcoxon signed rank test.
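In practice a paired comparison such as this would be run with scipy.stats.wilcoxon, but the statistic itself is simple to state. Below is a minimal, standard-library-only sketch of the W+ statistic (the sum of the ranks of the positive paired differences) underlying the test; the function name is illustrative, not from any library.

```python
def wilcoxon_signed_rank(xs, ys):
    """Return W+, the sum of ranks of positive paired differences (x - y)."""
    diffs = [x - y for x, y in zip(xs, ys) if x != y]  # zero differences are dropped
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        # find the run of tied |d| values and assign each the average rank
        j = i
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return sum(r for r, d in zip(ranks, diffs) if d > 0)
```

The p value is then obtained from the null distribution of W+ (exact for small samples, normal approximation for larger ones), which is the part a statistics package handles.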

Results

Problems were common; 18 different viewing platforms were encountered (Table 3). Of the 50 CDs, 15 (30%) contained digital images that could not be uploaded or viewed with the clinic's PACS. Mean time required to load studies was 3.4 seconds using the PACS and 37.9 seconds using the proprietary viewing software (z test statistic based on positive ranks=6.126, P<.001). Excluding 6 CDs that contained only plain radiographs or simple bone scans (ie, no series to test for generalized window-setting application or image advancement functionality), 8 (18%) of the remaining 44 CDs contained proprietary image viewing software platforms that did not allow efficient image advancement and 5 (11%) of 44 contained proprietary image viewing software platforms that failed to apply user-customized window settings throughout the entire series. Three (7%) of the 44 CDs were capable of applying user-customized window settings throughout the series only after the viewer expended additional time to access the nonintuitive function. Eighteen (36%) of the 50 CDs were encoded with proprietary image viewing software platforms that required additional downloads and/or license agreements. Further analysis by radiology technicians found that 4 (8%) CDs contained only compressed image data without accompanying high-fidelity DICOM image data. Only 24 (48%) CDs contained images and software platforms that met all optimal system criteria (Table 5).

Table 5: CD Performance

Discussion

The recent evolution from conventional hard-copy imaging studies viewed with a lighted viewbox to digital images rendered according to PACS and DICOM standards has offered substantial benefits to physicians and patients, perhaps most evident on the front lines of the emergency department and specialist clinics. The CD remains the most prevalent modality for transferring medical imaging data, with hard copy and network transfers being significantly less prevalent.6 Previous investigations have highlighted the increased efficiency, productivity, and financial savings, improved physician communication, and overall improved patient care afforded by incorporating PACS software into clinical practice and effectively realizing its capabilities.7–10 Images are now stored and archived via electronic databases, thus minimizing the loss of hardcopy data, and digitally archived imaging study “soft copies” are accessible for evaluation by multiple users simultaneously and around the clock. Additionally, digital image transfer via web-based and CD-based software platforms has greatly decreased the costs inherent in generating image copies and transferring such studies between and among care centers, individual physicians, and patients.11 Advances in and availability of digital imaging technology have spurred a multitude of vendor-developed proprietary software platforms employed for data encoding, retrieval, display, and navigation of imaging examinations. As a consequence, available systems are diverse in their operation, constantly evolving, and nonstandardized in their display and navigation functions. Problems arise because of a lack of standards incorporated into methods used to accomplish these tasks. Most investigations into the effect of such outside CDs have focused on emergency department transfers and trauma referrals and have not attempted to evaluate vendor software characteristics. 
Therefore, the purpose of this study was to critically evaluate a series of image viewing platforms contained on CDs to determine the frequency and extent of functionality problems in the transfer of radiographic information to tertiary care institutions.

A typical scenario at the authors' clinic would involve a referred patient arriving accompanied by a CD containing recent pertinent imaging studies. While the patient is escorted into the examination room, the physician inserts the CD into a relatively specialized computer and awaits display of the digital images. Frequently, data are not accessible by the clinic's PACS, no images are generated, and repeat imaging may be necessary. Sodickson et al1 retrospectively reviewed 1487 consecutive emergency department patients and found that CD import was unsuccessful in 22%, whereas successful importation was associated with a 17% reduction in mean rates of all subsequent diagnostic imaging and a 16% reduction in subsequent computed tomography use. Similarly, Flanagan et al12 observed a 17% repeat imaging rate even when using a combination of Internet and CDs to transfer images between care facilities.

Occasionally, it is possible to access a root folder displaying all files contained on the CD. These files are frequently insufficiently labeled for clear identification, may contain file extensions that are not accessible by enterprise computers, and may not be effectively uploaded. In 2006, the German Radiology Association found that as many as 70% of CDs used for the transfer of imaging studies contain discrepancies between data structure and content.13 Additionally, although the necessary imaging studies may have been conducted, and the raw data exist written to a CD, access and use are key factors. Lu et al2 revealed that 52% of patients accompanied by outside images underwent repeat imaging if those images, although available and potentially navigated via a proprietary viewing platform, were not successfully imported.
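Even when file names and extensions are uninformative, files written according to DICOM Part 10 can be identified by their magic bytes: a 128-byte preamble followed by the ASCII characters “DICM.” A minimal sketch of this screening step, using only the Python standard library (function names are illustrative):

```python
from pathlib import Path

def is_dicom_part10(path):
    """True if the file carries the DICOM Part 10 magic:
    a 128-byte preamble followed by the ASCII characters 'DICM'."""
    try:
        with open(path, "rb") as f:
            header = f.read(132)
    except OSError:
        return False
    return len(header) == 132 and header[128:132] == b"DICM"

def find_dicom_files(root):
    """Walk a mounted CD and collect files bearing the magic, regardless of extension."""
    return sorted(p for p in Path(root).rglob("*") if p.is_file() and is_dicom_part10(p))
```

A check like this distinguishes true DICOM objects from viewer executables and compressed screen captures before any import attempt is made; actually parsing the images would require a DICOM library such as pydicom.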

At this juncture, the clinician may choose to access the study data using the proprietary software platform accompanying the CD. However, the sheer variety of software platforms ensures that no individual clinician is facile with the use of all available platforms. Clinicians will often encounter a platform with which they have little to no experience; a clinician facile with 18 different viewing platforms would be exceptional. Additionally, CDs may not contain high-fidelity images and software platforms may lack capacity to display DICOM images, causing clinicians to instead rely on compressed data, for which there is a high degree of variability and no clear standardization.14

These imaging challenges lead to excessive time consumption and profound inefficiencies in patient care. Duplication of imaging services and exposure to additional ionizing radiation and contrast often occur, resulting in frustration, dissatisfaction, and ultimately treatment delays and diagnostic errors. Sung et al5 reviewed 425 computed tomography scans accompanying patients transferred from the emergency department. They observed that 35% of repeat examinations were performed because of inadequate imaging, CD operation failure, or inability to upload images to the hospital's PACS. The ability to access and effectively use outside imaging data presents the opportunity to greatly reduce the need for repeat imaging, especially in patient populations exquisitely sensitive to ionizing radiation. Tepper et al15 reviewed exposure records for 75 pediatric trauma patients and found that patients transferred to tertiary care centers endured significantly higher radiation exposure compared with patients evaluated directly.

The current results indicated a slightly greater than 10-fold increase in time required (37.9 vs 3.4 seconds) to generate a viewable image when the clinician is forced to use the proprietary viewing platform rather than the institution's PACS optimized for viewing outside CDs. Clinically, the additional time expended is much greater in those cases wherein the clinician must resort to the proprietary platform following failed attempts to use the vendor platform (Table 5). Compact discs that are capable of being viewed and navigated within a PACS offer superior time-saving advantages, functionality standards, and diagnostic capability.

This study had some limitations. Not all CDs contained studies of the same modality. Some studies consisted of only a single modality, such as plain radiograph, that was unable to be analyzed in terms of all potential software features (ie, window-setting applications throughout series, scroll function, and so on). Additionally, variations in clinician familiarity with different platforms could have influenced the evaluations of software functions. However, such variability in experience does not excuse the unacceptable variability in viewing platform functionality.

Conclusion

The authors believe that steps must be taken to provide an acceptable set of universal standards for radiographic image transfer and presentation on CDs. Despite advancements in the creation of regional PACS networks among care facilities and direct links between community hospitals and referral centers, the CD remains the most frequently used transfer vehicle. The vendor viewing platforms contained within CDs require some uniform access and presentation features. Ideally, transferred images would upload and be viewable on all host platforms, with a standardized base level of functionality and intuitiveness required. A base level of quality images at the DICOM level adequate for diagnostic and treatment purposes should accompany each patient transfer. Although all features of individual software platforms need not conform, an acceptable minimum level of functions is suggested. The authors recommend establishment of standards whereby (1) all CDs or other transfer vehicles include the true noncompressed DICOM images; (2) these full-fidelity images download easily and render images capable of review for primary diagnosis on basic computer viewing platforms; and (3) imaging data are presented in a format whereby the essential software functions of image advancement, window-setting applications, and additional software downloads or requirements are consistent. Adoption of these basic minimum standards will allow each software manufacturer to continue to develop signature features yet retain a baseline level of function with which clinicians may become skilled. Undoubtedly, such action will greatly improve physician efficiency, decrease costs, decrease duplication of services, increase satisfaction among patients and physicians, and improve the quality of care.
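The first recommended standard (true noncompressed DICOM images) is machine-checkable: each DICOM file's meta header carries a transfer syntax UID that states how the pixel data are encoded. Reading that UID requires a DICOM parser such as pydicom; the sketch below shows only the classification step, using well-known UIDs from the DICOM standard (function names are illustrative).

```python
# Transfer syntax UIDs from the DICOM standard (PS3.5/PS3.6).
UNCOMPRESSED = {
    "1.2.840.10008.1.2",    # Implicit VR Little Endian
    "1.2.840.10008.1.2.1",  # Explicit VR Little Endian
    "1.2.840.10008.1.2.2",  # Explicit VR Big Endian
}
LOSSY = {
    "1.2.840.10008.1.2.4.50",  # JPEG Baseline
    "1.2.840.10008.1.2.4.51",  # JPEG Extended
    "1.2.840.10008.1.2.4.91",  # JPEG 2000 (lossy permitted)
}

def is_uncompressed(uid):
    """True if the pixel data are stored without compression."""
    return uid in UNCOMPRESSED

def classify(uid):
    """Rough triage of pixel-data encoding for a transfer screening checklist."""
    if uid in UNCOMPRESSED:
        return "uncompressed"
    if uid in LOSSY:
        return "lossy"
    return "compressed (lossless or unknown)"
```

A receiving institution could run such a check at import time and flag CDs that carry only lossy-compressed data, the situation found on 4 (8%) of the CDs in this series.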

References

  1. Sodickson A, Opraseuth J, Ledbetter S. Outside imaging in emergency department transfer patients: CD import reduces rates of subsequent imaging utilization. Radiology. 2011; 260(2):408–413. doi:10.1148/radiol.11101956 [CrossRef]
  2. Lu MT, Tellis WM, Fidelman N, Qayyum A, Avrin DE. Reducing the rate of repeat imaging: import of outside images to PACS. AJR Am J Roentgenol. 2012; 198(3):628–634. doi:10.2214/AJR.11.6890 [CrossRef]
  3. Chwals WJ, Robinson AV, Sivit CJ, Alaedeen D, Fitzenrider E, Cizmar L. Computed tomography before transfer to a level I pediatric trauma center risks duplication with associated increased radiation exposure. J Pediatr Surg. 2008; 43(12):2268–2272. doi:10.1016/j.jpedsurg.2008.08.061 [CrossRef]
  4. Haley T, Ghaemmaghami V, Loftus T, Gerkin RD, Sterrett R, Ferrara JJ. Trauma: the impact of repeat imaging. Am J Surg. 2009; 198(6):858–862. doi:10.1016/j.amjsurg.2009.05.030 [CrossRef]
  5. Sung JC, Sodickson A, Ledbetter S. Outside CT imaging among emergency department transfer patients. J Am Coll Radiol. 2009; 6(9):626–632. doi:10.1016/j.jacr.2009.04.010 [CrossRef]
  6. Robinson JD, McNeeley MF. Transfer patient imaging: a survey of members of the American Society of Emergency Radiology. Emerg Radiol. 2012; 19(5):447–454. doi:10.1007/s10140-012-1047-y [CrossRef]
  7. Passadore DJ, Isoardi RA, Ariza PP, Padín C. Use of a low-cost, PC-based image review workstation at a radiology department. J Digit Imaging. 2001; 14(2)(suppl 1):222–223. doi:10.1007/BF03190346 [CrossRef]
  8. Arenson RL, Andriole KP, Avrin DE, Gould RG. Computers in imaging and health care: now and in the future. J Digit Imaging. 2000; 13(4):145–156. doi:10.1007/BF03168389 [CrossRef]
  9. Mansoori B, Erhard KK, Sunshine JL. Picture Archiving and Communication System (PACS) implementation, integration & benefits in an integrated health system. Acad Radiol. 2012; 19(2):229–235. doi:10.1016/j.acra.2011.11.009 [CrossRef]
  10. Nitrosi A, Borasi G, Nicoli F, et al. A filmless radiology department in a full digital regional hospital: quantitative evaluation of the increased quality and efficiency. J Digit Imaging. 2007; 20(2):140–148. doi:10.1007/s10278-007-9006-y [CrossRef]
  11. Mehta A, Dreyer K, Thrall J. Enhancing availability of the electronic image record for patients and caregivers during follow-up care. J Digit Imaging. 1999; 12(2)(suppl 1):78–80. doi:10.1007/BF03168762 [CrossRef]
  12. Flanagan PT, Relyea-Chew A, Gross JA, Gunn ML. Using the Internet for image transfer in a regional trauma network: effect on CT repeat rate, cost, and radiation exposure. J Am Coll Radiol. 2012; 9(9):648–656. doi:10.1016/j.jacr.2012.04.014 [CrossRef]
  13. Mildenberger P, Kotter E, Riesmeier J, et al. The DICOM-CD-Project of the German Radiology Association: an overview of the content and results of a pilot study in 2006 [in German]. Rofo. 2007; 179(7):676–682. doi:10.1055/s-2007-963122 [CrossRef]
  14. Koff DA, Shulman H. An overview of digital compression of medical images: can we use lossy image compression in radiology? Can Assoc Radiol J. 2006; 57(4):211–217.
  15. Tepper B, Brice JH, Hobgood CD. Evaluation of radiation exposure to pediatric trauma patients. J Emerg Med. 2013; 44(3):646–652. doi:10.1016/j.jemermed.2012.09.035 [CrossRef]

Number of Imaging Studies According to Anatomic Site

Anatomic Site: No. of Imaging Studies
Femur: 10
Knee: 7
Humerus: 6
Foot: 6
Tibia: 5
Upper extremity: 4
Hip: 3
Lower extremity: 2
Ankle: 2
Fibula: 2
Pelvis: 2
Sacrum: 1

Distribution of Imaging Modalities

Imaging Modality: No.
Magnetic resonance image: 29
Plain radiograph: 28
Computed tomography scan: 15
Bone scan: 7

Imaging Software Platforms Encountered

Software Program: No.
eFilm Lite^a: 11
eFilm Lite 2.1^a: 9
Horizon MI^b: 5
View Capsule V3.4d.002.421^c: 4
IDX Imagecast^d: 3
NovaPACS^e: 3
Syngo^f: 2
Image Viewer^g: 2
Viztek^h: 2
Centricity DICOM 2.1^d: 1
DR Media Ambassador^i: 1
eFilm Lite 2.0^a: 1
FilmX DICOM Litebox^j: 1
Centricity DICOM 3.0^d: 1
iSite CD Direct 3.5.56^k: 1
Neurostar Lite^l: 1
Image Viewer R 10.2^k: 1
SIENET Sky^f: 1

Primary Diagnoses

Primary Diagnosis: No.
Enchondroma: 8
Cyst: 7
Ganglion: 5
Avascular necrosis: 3
Nonossifying fibroma: 3
Gout: 2
Heterotopic ossification: 2
Osteochondroma: 2
Osteoid osteoma: 2
Osteosarcoma: 2
Melorheostosis: 2
Adipose tissue: 1
Adipose tissue irritation: 1
Aneurysmal bone cyst: 1
Chondromalacia: 1
Eosinophilic granuloma: 1
Leiomyosarcoma: 1
Lipoma: 1
Metastatic breast carcinoma: 1
Osteoarthritis: 1
Osteophyte: 1
Pleomorphic sarcoma: 1
Rotator cuff tendinopathy: 1
Rotator cuff tendinopathy1

CD Performance

Problem Encountered: No. of CDs
Navigation issues (n=44): 8 (18%)
Window issues (n=44): 5 (11%)
Additional downloads (n=50): 18 (36%)
Compressed images (n=50): 4 (8%)
Optimal (n=50): 24 (48%)
Authors

The authors are from the Department of Orthopaedic Surgery (PML), Brigham and Women's Hospital–Harvard Medical School, Boston, Massachusetts; and the Department of Orthopaedic Surgery (SCW), Wake Forest University Health Sciences, and Novant Health (WGW), Forsyth Medical Center, Winston Salem, North Carolina.

The authors have no relevant financial relationships to disclose.

Correspondence should be addressed to: Paul M. Lichstein, MD, MS, Department of Orthopaedic Surgery, Brigham and Women's Hospital–Harvard Medical School, 75 Francis St, Boston, MA 02115 ( paulmlichstein@gmail.com).

Received: August 18, 2017
Accepted: November 13, 2017
Posted Online: December 19, 2017

10.3928/01477447-20171213-03
