Feature

‘Reproducibility crisis’ in radiation biology research could negatively impact clinical care advances

The scientific method is the cornerstone of all scientific advancement.

By following clearly outlined methods used during an investigator’s research, other researchers can reproduce the results and, hopefully, establish a baseline from which scientific investigation can move forward.

Many radiation biology studies, however, lack adequate reporting of irradiation methodology, which makes replication difficult, according to results of a large-scale systematic review published in International Journal of Radiation Oncology Biology Physics, also known as the Red Journal.

Emily Draeger, PhD, of the division of translational radiation sciences in the department of radiation oncology at University of Maryland School of Medicine, and colleagues evaluated 1,758 peer-reviewed studies in preclinical radiobiology research between 1997 and 2017, and they found wide variation in the reporting of experimental methods.

For example, 13.8% of the studies, published among 469 journals, did not clearly specify the radiation source. When broken down into reported physics and dosimetric experimental details, 92.5% reported the energy source, 81.4% indicated the energy used, and 64.8% provided the manufacturer and model of the equipment used.

Study authors reported absolute dosimetric details even less frequently — only 1.2% detailed the protocol for machine calibration and 15.9% listed the equipment used to measure the absorbed dose of radiation.

Draeger and colleagues found that the type of radiation and the absorbed dose were “nearly universally reported” but that “very rarely are details concerning the irradiation geometry or irradiator calibration reported.”

Moreover, the analysis showed that journals with higher impact factors — those whose articles are more widely cited — published reports with fewer experimental details.

The negative effect this inconsistent reporting could have on translating basic science into clinical applications is clear, according to Draeger and colleagues.

“This endemic failure to report basic experimental details ... can have farther-reaching implications,” they wrote. “Because plausible but misreported quantities are undetectable, reporting of experimental details is no guarantee of their accuracy, and only a comprehensive description of the irradiation protocol can ultimately be used to evaluate and reproduce it.”

The ‘reproducibility crisis’


According to an earlier analysis by Steinberg and colleagues published in the Red Journal, most research funding for radiation oncology is dedicated to radiation biology research, Yannick Poirier, PhD, assistant professor in the department of radiation oncology at University of Maryland School of Medicine and a co-author of the Draeger study, told HemOnc Today.

“Despite these several hundred million research dollars spent over the last couple of decades, as clinical radiation physicists or clinicians, we see relatively few of these studies being translated into daily patient care,” he said. “It has always been suspected that a critical aspect of this lack of reproducibility was that irradiation protocols were incompletely described in radiation biology research.”

Poirier said previous studies on the topic were limited by the number of publications or time frame studied. His group sought to revisit the topic in a broad and systematic fashion to determine the extent of underreporting of irradiation protocols.

“A concerningly large portion of radiation biology publications don't report the most basic experimental details of the irradiation protocol necessary to understand and, more importantly, to replicate the study,” Poirier said.

“It is very difficult to replicate preclinical research from one laboratory to the next; and even more so from a preclinical setting to a human clinical trial,” he added. “The time, effort and resources expended on such irreproducible studies are a tremendous lost opportunity to improving cancer care for patients.”

The lack of experimental details appeared especially evident in what Poirier called “generalist” publications compared with “specialist” journals. The worst offenders were journals with high impact factors, which are more widely cited.

“We did not expect more highly cited publications in high impact factor journals to be less well-described,” Poirier said. “After all, publications such as Nature or Science take the reproducibility crisis very seriously.”


However, to call this a crisis "is perhaps a too provocative and emotional description," Brian Marples, PhD, professor and director of radiobiology in the department of radiation oncology at University of Miami Miller School of Medicine, wrote in an editorial that accompanies the study by Draeger and colleagues. Nevertheless, Marples said the report does highlight the need for reporting improvements.

For instance, the study extends previous research and “highlights that the reporting of irradiation conditions and parameters in radiation studies has declined and should return to prior standards,” Marples wrote. This, he added, could help overcome the “crisis” of disparities among published data.

However, Marples warned that solving issues with irradiation conditions will not automatically lead to greater reproduction of preclinical research.

“The issue of reporting irradiation methods is one possible reason, but it is not the only reason for any lack of reproducibility in preclinical research,” Marples told HemOnc Today. “Factors such as the heterogeneity of tumor models and experimental endpoints, among others, also have an impact.”

‘A bedrock principle’

According to an opinion article by Hans E. Plesser, PhD, head of the data science section at Norwegian University of Life Sciences, reproducibility means obtaining consistent results using the same methods and analysis. Replicability, by contrast, means obtaining consistent results across studies that address the same scientific question but each follow their own methodology.

Being able to reproduce results has been cited as a hallmark of good science, and achieving replication first requires reproducibility.


“The main takeaway [of the analysis by Draeger and colleagues] is that there is room for improvement in documenting methods of irradiation in radiation biology studies,” Todd Pawlicki, PhD, FASTRO, professor and vice chair for medical physics in the department of radiation medicine and applied sciences at University of California, San Diego, told HemOnc Today.

Pawlicki also is a member of the board of directors of American Society for Radiation Oncology, of which the Red Journal is the official publication. He recommended the inclusion of expert radiation biologists and physicists in all experimental studies that involve ionizing radiation to ensure adequate documentation of experimental methods.

He noted the close relationship in science between theory and experimentation, with theories validated by experience in the laboratory and experimental results leading to more theories.

“A bedrock principle of experimental science is reproducibility of results,” Pawlicki told HemOnc Today. “If experimental results are not able to be readily reproduced, then the results should be carefully examined. It doesn’t immediately mean that the results are wrong, just that more scrutiny is required to validate them.”

A new focus on quality


Poor documentation of experimental methods has impeded progress in the field of radiation oncology and raised questions about the validity of studies, according to John H. Suh, MD, FASTRO, FACR, chairman of the department of radiation oncology and associate director of the Gamma Knife Center in the Brain Tumor and Neuro-Oncology Center at Cleveland Clinic, and a HemOnc Today Editorial Board Member.

“Given the importance of in vitro and in vivo studies to design and execute clinical studies, any difficulties with validating published radiobiology studies will squander valuable resources and increase the risk for negative studies,” Suh told HemOnc Today.

Suh said the lack of detail and consistency in reporting basic experimental methods surprised him “given the expectation that this would be standardized.” He added that factors significant to physicists — including irradiation geometry or irradiator calibration — were not usually reported.

“Given the importance of preclinical research to develop clinical applications, the inability to reproduce and replicate published preclinical radiation biology studies is concerning,” Suh said. “The design, documentation and validation of future radiation biology studies should be done with radiation biologists and radiation physicists working together.”

Suh then highlighted the efforts of the American Association of Physicists in Medicine (AAPM) Task Group 319, which is considering how to standardize future research in the field of preclinical radiation biology. The task group was mentioned in the original study by Draeger and colleagues and highlighted by both Pawlicki and Poirier.

“We hope they will underline the importance of better physics reporting of irradiation protocols to funding agencies that make this preclinical research possible,” Poirier said. “We hope this will spur funding agencies and journals to verify and demand a greater level of detail in the description of the irradiation protocol in a radiation biology study.”

Poirier added that the task group will soon produce guidelines for accurate dosimetry in radiation biology studies and that his group’s study should “serve as a snapshot of the current state of the field” to aid in the development of those guidelines.

“Once completed, this report should help standardize reporting of methods for radiation biology studies,” Pawlicki said.

Overall, the concerns of Draeger and colleagues are valid, and better dose reporting is warranted, Marples said.

“We also need to focus on multidisciplinary solutions to reproducibility issues, however, and continue recent progress such as AAPM Task Group No. 319’s work on dosimetry guidelines for radiobiology experiments,” he said.

Suh said the process to rectify the “reproducibility problem” and advance more research toward clinical applications must begin with the same approach toward quality that occurs in clinical practice.

“In clinical radiation oncology practice, the continuous focus on quality, safety and standardization has had a positive impact on clinical outcomes and expectations,” Suh told HemOnc Today. “A similar focus on radiation quality assurance in preclinical radiobiology studies should rectify some of the concerns.”

Poirier called upon clinicians and researchers to come together to advance the science of radiation oncology for those for whom it matters most — the patients.

“A long time ago, the distinction between radiation physics and radiation biology was much more blurred than it is today,” he said. “As the fields solidified and specialized, physicists have been increasingly drawn to the clinic, and biologists to basic and preclinical science; consequently, they interact much more seldom than they used to. If we want to ensure that studies are properly documented, we must find ways to break down silos and increase cooperation between these two disciplines.” – by Drew Amorosi

References:

Draeger E, et al. Int J Radiat Oncol Biol Phys. 2019;doi:10.1016/j.ijrobp.2019.06.2545.

Marples B. Int J Radiat Oncol Biol Phys. 2019;doi:10.1016/j.ijrobp.2019.10.025.

Plesser HE. Front Neuroinform. 2018;doi:10.3389/fninf.2017.00076.

Steinberg M, et al. Int J Radiat Oncol Biol Phys. 2013;doi:10.1016/j.ijrobp.2013.01.030.

For more information:

Brian Marples, PhD, can be reached at University of Miami, Department of Radiation Oncology, 1475 NW 12th Ave., Miami, FL 33136; email: brian.marples@med.miami.edu.

Todd Pawlicki, PhD, FASTRO, can be reached at Department of Radiation Medicine and Applied Sciences, Moores Cancer Center, 3855 Health Sciences Drive, #0843, La Jolla, CA 92093-0843; email: tpawlicki@ucsd.edu.

Yannick Poirier, PhD, can be reached at University of Maryland School of Medicine, 22 South Greene St., GGJ12A, Baltimore, MD 21201; email: yannick.poirier@umm.edu.

John H. Suh, MD, FASTRO, FACR, can be reached at Cleveland Clinic, 9500 Euclid Ave., CA-50, Cleveland, OH 44195; email: suhj@ccf.org.

Disclosures: Marples is an ASTRO Board member and vice chair of its Science Council; he reports research funding from NIH/NCI and Live Like Bella Award (from the state of Florida). Suh reports a consultant role with AbbVie. Pawlicki and Poirier report no relevant financial disclosures.
