Higher rates of adenoma detection may cut the lifetime risk for colorectal cancer incidence and death in half, according to a new study.
In a microsimulation model of a US population cohort, designed to estimate the lifetime benefits, complications, and costs of an initial colonoscopy screening program at different levels of adenoma detection, the lifetime risks for colorectal cancer incidence and mortality among unscreened patients were 34.2 per 1000 (95% confidence interval [CI], 25.9 - 43.6) and 13.4 per 1000 (95% CI, 10.0 - 17.6), respectively. Among screened patients, the corresponding risks averaged 19.1 per 1000 (95% CI, 14.3 - 24.8) and 3.8 per 1000 (95% CI, 2.7 - 5.2).
"The modeled risks were inversely related to the level of adenoma detection," Reinier G. S. Meester, MSc, from Erasmus MC University Medical Center, Rotterdam, the Netherlands, and colleagues report in an article published in the June 16 issue of JAMA. Specifically, the simulated lifetime risks of colorectal cancer and of cancer death per 1000 patients were 26.6 (95% CI, 20.0 - 34.3) and 5.7 (95% CI, 4.2 - 7.7), respectively, among patients of physicians whose screening colonoscopies fell in the lowest (first) adenoma detection rate (ADR) quintile, compared with 12.5 (95% CI, 9.3 - 16.5) and 2.3 (95% CI, 1.7 - 3.1) among patients of physicians in the highest (fifth) quintile.
"The simulated lifetime risk of colorectal cancer death was on average 12.8% lower (95% CI, 11.1%-13.7%) for every 5 percentage-point increase in physician ADRs," the authors write.
The microsimulation model used data from a community-based healthcare system on ADR variation and cancer risk among 57,588 patients examined by 136 physicians from 1998 through 2010. After excluding patients with insufficient follow-up data, the analysis included 179,682 person-years of follow-up. The interval colorectal cancer incidence per 100,000 person-years ranged from 66.6 (95% CI, 43.2 - 97.0) in quintile 1 to 39.0 (95% CI, 22.7 - 62.4) in quintile 4 and 49.7 (95% CI, 27.8 - 81.9) in quintile 5, according to the authors.
The model's total estimated number of colonoscopies per 1000 patients increased an average of 4.6% (95% CI, 3.6% - 5.7%) for every 5-percentage-point increase in ADR, and the simulated risk for colonoscopy complications was an average of 9.8% (95% CI, 7.5% - 13.2%) higher for every 5-percentage-point increase in ADR, they note.
Screening costs were inversely related to ADRs, the authors report: estimated net screening costs were on average 3.2% (95% CI, 0.8% - 6.4%) lower for every 5-percentage-point increase in ADRs. "For higher ADR quintiles, estimated colonoscopy costs were higher, but estimated treatment costs were lower," they write.
The findings were stable across multiple sensitivity analyses.
"Our results suggest that higher adenoma detection rates may be associated with up to 50% to 60% lower lifetime colorectal cancer incidence and mortality without higher net screening costs despite a higher number of colonoscopies and polypectomy-associated complications," the authors write.
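The "50% to 60%" figure can be sanity-checked against the quintile-level risks quoted earlier in this article. The short Python snippet below is only a back-of-envelope arithmetic check on the reported numbers, not part of the study's methodology:

```python
# Rough check of the reported 50%-60% reduction, using the modeled
# lifetime risks per 1000 patients quoted in the article.
incidence_q1, incidence_q5 = 26.6, 12.5  # lowest vs. highest ADR quintile
mortality_q1, mortality_q5 = 5.7, 2.3

incidence_reduction = (incidence_q1 - incidence_q5) / incidence_q1
mortality_reduction = (mortality_q1 - mortality_q5) / mortality_q1

print(f"Incidence reduction, Q1 -> Q5: {incidence_reduction:.0%}")  # 53%
print(f"Mortality reduction, Q1 -> Q5: {mortality_reduction:.0%}")  # 60%
```

The reduction in modeled lifetime incidence (about 53%) and mortality (about 60%) between the lowest and highest ADR quintiles is consistent with the authors' stated range.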
"Higher ADRs were associated in the model with up to 34.4 additional life-years saved per 1000 patients, which represents about 10 years per prevented cancer death, 2 weeks per average patient, and one-third of the maximum potential mortality benefit derived from screening," the authors explain.
The findings are consistent with those of prior studies that have shown an inverse relationship between ADR level and colorectal cancer risk.
"Future research is needed to assess why adenoma detection rates vary and whether increasing adenoma detection would be associated with improved patient outcomes," the authors conclude.
The study was supported by grants from the National Cancer Institute, National Institutes of Health. The authors have disclosed no relevant financial relationships.
JAMA. 2015;313:2349-2358.