Higher adenoma detection rates in screening colonoscopy were associated with up to 50% to 60% lower lifetime risk for colorectal cancer incidence and death, without incurring higher overall costs, according to new research data.
“The adenoma detection rate, or ADR, is a recommended colonoscopy quality indicator,” Reinier G. S. Meester, MSc, from Erasmus MC University Medical Center in the Netherlands, told Healio Gastroenterology. “However, it is still unclear how ADR relates to the long-term benefits and costs of colonoscopy-based colorectal cancer screening programs.”
Aiming to estimate the lifetime benefits, complications and costs of an initial colonoscopy screening program at different levels of adenoma detection, Meester and colleagues performed microsimulation modeling using data on variations in ADR and cancer risk from the Kaiser Permanente Northern California health care system. The data included 57,588 patients examined by 136 physicians from 1998 through 2010.
The researchers used modeling to compare no screening with colonoscopy screening initiated at ages 50, 60 and 70 years, based on quintiles of physician ADRs:
- 15.32% (range, 7.35%-19.05%) for quintile one;
- 21.27% (range, 19.06%-23.85%) for quintile two;
- 25.61% (range, 23.86%-28.4%) for quintile three;
- 30.89% (range, 28.41%-33.5%) for quintile four; and
- 38.66% (range, 33.51%-52.51%) for quintile five.
Outcomes included colorectal cancer incidence and death, years of life lost, number of colonoscopies, complications and screening and treatment costs, all of which were discounted to 2010 at a fixed annual rate of 3% and reported with uncertainty ranges.
Lifetime risk for colorectal cancer incidence was 34.2 per 1,000 (95% CI, 25.9-43.6) and risk for mortality was 13.4 per 1,000 (95% CI, 10-17.6) in unscreened patients. Simulated lifetime incidence for screened patients was inversely related to ADR, ranging from 26.6 per 1,000 (95% CI, 20-34.3) for quintile one to 12.5 per 1,000 (95% CI, 9.3-16.5) for quintile five. The trend for mortality was similar, ranging from 5.7 per 1,000 (95% CI, 4.2-7.7) for quintile one to 2.3 per 1,000 (95% CI, 1.7-3.1) for quintile five. Compared with quintile one, simulated lifetime incidence and mortality were on average 11.4% (95% CI, 10.3%-11.9%) and 12.8% (95% CI, 11.1%-13.7%) lower, respectively, for every five percentage point increase in ADR.
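As a rough consistency check, the quintile-five figures can be approximated from the quintile-one figures and the quintile mean ADRs reported above, if the per-five-point reductions are read as linear reductions relative to quintile one (an interpretation, not a detail stated explicitly in the article):

```python
# Back-of-the-envelope check of the reported per-5-point reductions.
# Assumption: the "X% lower per 5 percentage points of ADR" figures are
# linear reductions relative to quintile one.

Q1_ADR, Q5_ADR = 15.32, 38.66    # mean ADRs (%) for quintiles one and five
steps = (Q5_ADR - Q1_ADR) / 5    # number of 5-point increments (~4.67)

q1_incidence = 26.6              # per 1,000 screened, quintile one
q1_mortality = 5.7               # per 1,000 screened, quintile one

q5_incidence = q1_incidence * (1 - 0.114 * steps)  # 11.4% lower per 5 points
q5_mortality = q1_mortality * (1 - 0.128 * steps)  # 12.8% lower per 5 points

print(round(q5_incidence, 1))  # ~12.4, close to the reported 12.5 per 1,000
print(round(q5_mortality, 1))  # 2.3, matching the reported value
```

Under this reading, the roughly 4.7 five-point increments separating the quintile means account for the 50% to 60% lower risk cited in the headline finding.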
Of the 2,777 (95% CI, 2,626-2,943) colonoscopies in quintile one, there were 6 (95% CI, 4-8.5) complications, compared with 8.9 (95% CI, 6.1-12) complications among the 3,376 (95% CI, 3,081-3,681) colonoscopies in quintile five. The simulated risk for complications was on average 9.8% (95% CI, 7.5%-13.2%) higher for every five percentage point increase in ADR.
Estimated net screening costs were $2.1 million (95% CI, $1.8-$2.4 million) in quintile one compared with $1.8 million (95% CI, $1.3-$2.3 million) in quintile five, owing to averted cancer treatment costs. Estimated net screening costs were on average 3.2% (95% CI, 0.8%-6.4%) lower for every five percentage point increase in ADR.
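The same linear reading relative to quintile one (again an assumption, not confirmed by the article) also reproduces the cost figure:

```python
# Back-of-the-envelope check of the net screening cost reduction.
# Assumption: the 3.2%-per-5-points figure is a linear reduction
# relative to quintile one.

Q1_ADR, Q5_ADR = 15.32, 38.66    # mean ADRs (%) for quintiles one and five
steps = (Q5_ADR - Q1_ADR) / 5    # ~4.67 five-point increments

q1_cost = 2.1e6                              # net screening cost, quintile one ($)
q5_cost = q1_cost * (1 - 0.032 * steps)      # 3.2% lower per 5 points

print(round(q5_cost / 1e6, 1))  # 1.8 -- matches the reported $1.8 million
```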
“Our results confirm the belief that ADRs are likely an important colonoscopy quality measure,” Meester said. “Our results further suggest that efforts to improve the detection and removal of pre-cancerous polyps will likely support current quality improvement efforts, help patients, and be cost-effective. A goal now is to evaluate methods for increasing detection and to see if these improve patient outcomes.” – by Adam Leitenberger
Disclosure: The researchers report no relevant financial disclosures.