A minimally invasive procedure to determine radiation toxicity in therapy patients or victims of nuclear accidents.



The Need

The lack of a reliable radiation biodosimeter has been a major barrier to medical decision-making regarding the triage and treatment of persons who may be at risk of developing acute radiation syndromes (ARS) following nuclear events (e.g., reactor accidents, “dirty bomb” attacks). ARS is an acute illness caused by exposure of the entire body to high-dose, penetrating radiation. Radiation exposure affects the hematopoietic, gastrointestinal, and central nervous systems; which syndrome manifests clinically depends on the exposure level. Rapid diagnosis is critical for emergency management, as demonstrated by the dramatic increase in survival rates when countermeasures and mitigators are administered within the first two days.
Currently, three criteria are used to determine radiation exposure levels: clinical observation, lymphocyte depletion kinetics, and the dicentric chromosome assay. However, patient-to-patient variability greatly reduces the accuracy of clinical diagnosis; lymphocyte depletion kinetics requires three days of analysis for a crude dosimetry reading and 11 days for an accurate one; and the dicentric chromosome assay is highly technical and labor-intensive. By the time the radiation dose and extent of injury can be identified, significant and irreversible damage has already occurred in exposed individuals. New methods for early quantitation of absorbed radiation dose and for prediction or detection of injury are needed.

The Technology

Researchers at The Ohio State University, led by Dr. Naduparambil Jacob, have identified multiple biomarkers for use in radiation biodosimetry. This minimally invasive technique detects early acute radiation syndromes (ARS) affecting the hematological or gastrointestinal systems. Unlike current radiation detection tests, these biomarkers are responsive 24–48 hours after exposure and remain stable for several days. They can effectively separate those with minimal exposure (the “worried well”) from those with substantial but treatable exposure; the latter group can be triaged with a resolution of 0.5 gray (Gy), allowing an optimal choice of treatment regimen. This early, accurate detection of dose and severity allows timely administration of countermeasures to mitigate acute toxicities and reduce late effects. The multi-panel markers provide the scientific basis for developing field-deployable devices for rapid triage in radiological events.

This procedure can also be used in clinical radiation oncology to evaluate toxicities in patients undergoing radiation therapy. A significant application is the diagnosis of radiation-induced lung injury, a major issue for cancer patients (e.g., breast, lung, lymphoma) receiving lung-directed radiation therapy. Clinical manifestation of radiation injury is delayed, appearing 1–6 months after treatment (pneumonitis) or 6–24 months after treatment (lung fibrosis), by which time it is often too late for therapy to preserve lung function. The OSU biomarker panels can detect such lung injury within two weeks of exposure, allowing effective, early treatment long before clinical disease expression.

This technology has been established in pre-clinical models, with plans to optimize the methods in clinical trials in the near future.

Commercial Applications

  • Triage and treatment selection for individuals exposed to a nuclear event
  • Early evaluation of radiation toxicity allowing effective treatment measures in patients undergoing radiation therapy

Benefits/Advantages

  • High sensitivity
  • Rapid response
  • Minimally invasive procedure – requires only a blood draw from the patient
  • Proof-of-concept established in pre-clinical models
  • Allows repeated assessment
