In times of war, radiological/nuclear emergency scenarios have become a reemphasized threat. However, transferring whole-blood samples to laboratories for specialized RNA-based diagnostics poses challenges. This project aims to miniaturize unwieldy conventional RNA extraction, with its stationary technical equipment, using a microfluidic-based slide (MBS) for point-of-care diagnostics. The MBS is intended as a preliminary step toward the development of a so-called lab-on-a-chip microfluidic device. An MBS would enable early and rapid field care combined with gene expression (GE) analysis for the prediction of hematologic acute radiation syndrome (HARS) severity or identification of RNA microbes. Whole blood samples from ten healthy donors were irradiated with 0, 0.5 and 4 Gy, simulating different HARS severity degrees. RNA quality and quantity obtained with a preliminary MBS were compared with a conventional column-based (CB) RNA extraction method. GE of four HARS severity-predicting radiation-induced genes (FDXR, DDB2, POU2AF1 and WNT3) was examined employing qRT-PCR. Compared to the CB method, the MBS extracted twice as much total RNA from whole blood (12.0 ± 5.8 µg vs. 6.6 ± 3.2 µg) in half the extraction time, and all MBS RNA extracts appeared DNA-free, whereas 30% of the CB extracts were contaminated with DNA. With the MBS, RNA quality values [RNA integrity number equivalent (RINe)] were about threefold lower (3.3 ± 0.8 vs. 9.0 ± 0.4), nominally indicating severe RNA degradation, while the expected high-quality RINe ≥ 8 was found with the CB method. However, normalized cycle threshold (Ct) values as well as radiation-induced GE fold-changes were comparable for all genes with both methods, indicating that no functionally relevant RNA degradation took place. In summary, the preliminary MBS showed promising features: 1. halving of the RNA extraction time without the burden of heavy technical equipment (e.g., a centrifuge); 2. absence of DNA contamination, in contrast to CB RNA extraction; 3. a reduction in the blood volume required, owing to the twofold RNA yield; and 4. GE performance equal to the CB method, thus increasing its appeal for later semi-automatic parallel field applications.
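The fold-change comparison between the two extraction methods rests on relative quantification from normalized Ct values. The abstract does not spell out the calculation pipeline; as an illustration only, a minimal sketch of the standard 2^-ΔΔCt approach, with hypothetical Ct values and a hypothetical housekeeping-gene normalization, might look like this:

```python
# Minimal sketch of relative gene expression via the 2^-ddCt method.
# All Ct values below are hypothetical placeholders, not study data.

def fold_change(ct_gene_irr, ct_ref_irr, ct_gene_ctrl, ct_ref_ctrl):
    """Return the radiation-induced fold-change of a target gene,
    normalized to a reference (housekeeping) gene."""
    d_ct_irr = ct_gene_irr - ct_ref_irr      # normalize irradiated sample
    d_ct_ctrl = ct_gene_ctrl - ct_ref_ctrl   # normalize unexposed control
    dd_ct = d_ct_irr - d_ct_ctrl             # difference relative to control
    return 2.0 ** (-dd_ct)

# Example: FDXR after 4 Gy, hypothetical Ct values for MBS and CB extracts.
for method, ct_fdxr_4gy, ct_ref_4gy, ct_fdxr_0gy, ct_ref_0gy in [
    ("MBS", 24.1, 20.0, 27.3, 20.1),
    ("CB",  24.3, 20.2, 27.4, 20.2),
]:
    fc = fold_change(ct_fdxr_4gy, ct_ref_4gy, ct_fdxr_0gy, ct_ref_0gy)
    print(f"{method}: FDXR fold-change after 4 Gy ~ {fc:.1f}")
```

In this toy example both methods yield comparable fold-changes, which is the kind of agreement the abstract reports between MBS and CB extracts.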
After nuclear scenarios, combined injuries of acute radiation syndrome (ARS) with, e.g., abdominal trauma will occur and may require contrast-enhanced computed tomography (CT) scans for diagnostic purposes. Here, we investigated the effect of iodinated contrast agents on radiation-induced gene expression (GE) changes used for biodosimetry (AEN, BAX, CDKN1A, EDA2R, APOBEC3H) and for hematologic ARS severity prediction (FDXR, DDB2, WNT3, POU2AF1), and on the induction of double-strand breaks (DSBs) used for biodosimetry. Whole blood samples from 10 healthy donors (5 males, 5 females, mean age: 28 ± 2 years) were irradiated with X rays (0, 1 and 4 Gy) with and without the addition of iodinated contrast agent (0.016 ml contrast agent/ml blood) to the blood prior to exposure. The amount of contrast agent was set to be equivalent to the blood concentration of an average patient (80 kg) during a contrast-enhanced CT scan. After irradiation, blood samples were incubated at 37°C for 20 min (DSB) and 8 h (GE, DSB). GE was measured employing quantitative real-time polymerase chain reaction. DSB foci were revealed by γH2AX + 53BP1 immunostaining and quantified automatically in >927 cells/sample. Radiation-induced differential gene expression (DGE) and DSB foci were calculated using the respective unexposed sample without contrast agent as the reference. Neither the GE changes nor the number of DSB foci were significantly (P = 0.07-0.94) altered by the contrast agent. However, some GE and DSB comparisons with/without contrast agent showed weakly significant differences (P = 0.03-0.04) without an inherent pattern, which are therefore likely due to inter-individual variation. In nuclear events, the diagnostics of combined injuries may require the use of an iodinated contrast agent, which, according to our results, does not alter radiation-induced GE changes or the quantity of DSB foci. Therefore, the gene expression and γH2AX focus assays can still be applied for biodosimetry and/or hematologic ARS severity prediction in such scenarios.
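The with/without contrast agent comparison is a donor-paired design. The abstract does not state which statistical test produced the quoted P values; as a hedged illustration, a paired non-parametric comparison (Wilcoxon signed-rank) over donor-matched fold-changes could be sketched as follows, with all numbers being placeholders:

```python
# Sketch of a donor-paired comparison of radiation-induced fold-changes
# with vs. without contrast agent. The test choice (Wilcoxon signed-rank)
# and all values below are illustrative assumptions, not the study's data.
from scipy.stats import wilcoxon

# Hypothetical FDXR fold-changes after 4 Gy for 10 donors.
fc_without_contrast = [12.1, 9.8, 14.3, 11.0, 10.5, 13.2, 9.1, 12.7, 11.8, 10.9]
fc_with_contrast    = [11.7, 10.2, 13.9, 11.4, 10.1, 13.5, 9.4, 12.2, 12.0, 10.6]

stat, p_value = wilcoxon(fc_without_contrast, fc_with_contrast)
print(f"Wilcoxon signed-rank: W = {stat:.1f}, P = {p_value:.2f}")
```

A non-significant P value in such a paired test corresponds to the abstract's conclusion that the contrast agent does not measurably shift the radiation-induced response.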
Despite the large variety of high-voltage semiconductor components for medium- and high-voltage switching and pulse-forming applications, as well as for high-power high-frequency generation, the use of vacuum electron tubes still prevails to a considerable degree. Due to the common design incorporating a high-energy electron beam that is finally dumped into an anode or a resonator cavity, these tubes are also potential sources of X rays produced as bremsstrahlung and characteristic radiation, referred to as parasitic X rays. Here, three types of vacuum electron tubes with glass housings (diode, tetrode and thyratron) are investigated. They are predominantly operated in the high-voltage range below 30 kV and are not subject to licensing laws. The measurements of dose rate and X-ray spectra were performed in the laboratory without the complex electrical circuitry usually used in practical measurements for occupational radiation protection. For the diode tube, where parasitic X-ray emission is observed only during reverse operation as a blocking diode, a broad distribution of dose rates among electrically equivalent specimens was observed; this is attributed to field emission from the electrodes. For the tetrode and thyratron tubes, field emission from the electrodes is likewise identified as the dominant mechanism for the generation of parasitic X rays. Thus, technical radiation protection must focus on shielding of the glass tube rather than on optimization of the electrical circuitry.
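For orientation, the hardest parasitic photon such a tube can emit is bounded by the operating voltage (Duane-Hunt limit): an electron accelerated through V kilovolts cannot radiate a photon of more than V keV. The following back-of-the-envelope sketch is not from the paper; it simply evaluates that bound for the sub-30 kV operating range quoted above:

```python
# Duane-Hunt limit: the maximum bremsstrahlung photon energy equals the
# electron's kinetic energy e*V; lambda_min is the corresponding wavelength.
H = 6.62607015e-34          # Planck constant, J*s
C = 2.99792458e8            # speed of light, m/s
E_CHARGE = 1.602176634e-19  # elementary charge, C

def duane_hunt(voltage_kv):
    """Return (E_max in keV, lambda_min in nm) for a given tube voltage."""
    e_max_kev = voltage_kv                  # numerically, E_max [keV] = V [kV]
    e_max_j = e_max_kev * 1e3 * E_CHARGE    # same energy in joules
    lambda_min_nm = H * C / e_max_j * 1e9   # shortest emitted wavelength
    return e_max_kev, lambda_min_nm

for v_kv in (10, 20, 30):   # the tubes here operate below ~30 kV
    e_kev, lam_nm = duane_hunt(v_kv)
    print(f"{v_kv} kV: E_max = {e_kev} keV, lambda_min = {lam_nm:.4f} nm")
```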
With the current volatile geopolitical climate, the threat of nuclear assault is high. Exposure to ionizing radiation from nuclear incidents or radiological accidents often leads to major harmful consequences to human health. Depending on the absorbed dose, the symptoms of acute radiation syndrome and the delayed effects of acute radiation exposure (DEARE) can appear within hours, weeks or months. The lung is a relatively radiosensitive organ, with radiation pneumonitis manifesting as an acute effect, followed by apparent fibrosis within weeks or even months. A recently developed, first-of-its-kind murine partial-body irradiation (PBI) model with 2.5% of the bone marrow spared from radiation exposure (BM2.5-PBI), which can be used to test potential countermeasures against multi-organ damage such as to the gastrointestinal (GI) tract and lungs, was used for irradiation. Long-term radiation damage to the lungs was evaluated using µ-CT scans, pulmonary function testing, histopathological parameters and molecular biomarkers. Pulmonary fibrosis was detected as ground-glass opacity in µ-CT scans of male and female C57BL/6J mice 6-7 months after BM2.5-PBI. Lung mechanics assessments pertaining to peripheral airways indicated fibrotic lungs, with stiffer parenchymal lung tissue and reduced inspiratory capacity in irradiated animals 6-7 months after BM2.5-PBI. Histopathological evaluation of the irradiated lungs revealed focal and diffuse pleural and parenchymal inflammatory and fibrotic lesions. Fibrosis was confirmed by elevated levels of collagen compared to the lungs of age-matched naïve mice. These findings were corroborated by elevated levels of pro-fibrotic biomarkers and a reduction in anti-inflammatory proteins. In conclusion, a long-term model of radiation-induced pulmonary fibrosis was established, in which countermeasures can be screened for survival and for protection, mitigation or recovery from radiation-induced pulmonary damage.
The increased risk of acute large-scale radiological exposure for the world's population underlines the need for optimal radiation biomarkers. Ionizing radiation triggers a complex response by the genome, proteome and metabolome, all of which have been reported as suitable indicators of radiation-induced damage in vivo. This study analyzed peripheral blood samples from total-body irradiation (TBI) leukemia patients by mass spectrometry (MS) to identify and quantify differentially regulated plasma proteins before and after irradiation. In brief, samples were taken from 16 leukemic patients prior to and 24 h after TBI (2 × 2.0 Gy), processed with a Tandem Mass Tag isobaric labelling kit (TMTpro 16-plex), and analyzed by MS. In parallel, label-free relative quantification was performed on a RP-nanoLC-ESI-MS/MS system with a Q-Exactive mass spectrometer. Protein identification was done on the Proteome Discoverer v.2.2 platform (Thermo). Data are available via ProteomeXchange with identifier PXD043516. Using the two methods, we acquired two datasets of plasma proteins up-regulated (ratio ≥ 1.2) or down-regulated (ratio ≤ 0.83) 24 h after irradiation, identifying 356 and 346 proteins, respectively, in the TMT 16-plex analysis and 285 and 308, respectively, in the label-free analysis (P ≤ 0.05). Combining the two datasets yielded 15 candidates with a significant relation to gamma-radiation exposure. The majority of these proteins were associated with the inflammatory response and lipid metabolism. Of these, five proteins showed the strongest potential as radiation biomarkers in humans (C-reactive protein, Alpha amylase 1A, Mannose-binding protein C, Phospholipid transfer protein, and Complement C5). These candidate biomarkers might have implications for practical biological dosimetry.
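The candidate selection described above amounts to thresholding post/pre-irradiation abundance ratios in each dataset and intersecting the results. A minimal sketch of that filter-and-intersect logic is given below; the protein names, ratios and P values are hypothetical placeholders, while the thresholds (≥ 1.2 up, ≤ 0.83 down, P ≤ 0.05) are those quoted in the abstract:

```python
# Keep proteins whose 24 h / pre-TBI abundance ratio crosses the stated
# thresholds in BOTH quantification methods (TMT and label-free).
UP, DOWN = 1.2, 0.83

def regulated(ratios, p_values, alpha=0.05):
    """Return the set of proteins significantly up- or down-regulated."""
    return {
        protein
        for protein, ratio in ratios.items()
        if p_values[protein] <= alpha and (ratio >= UP or ratio <= DOWN)
    }

# Hypothetical example values for four proteins.
tmt_ratios = {"CRP": 2.4, "AMY1A": 0.55, "PLTP": 1.3, "ALB": 1.02}
tmt_p      = {"CRP": 0.001, "AMY1A": 0.01, "PLTP": 0.04, "ALB": 0.60}
lf_ratios  = {"CRP": 2.1, "AMY1A": 0.60, "PLTP": 1.1, "ALB": 0.98}
lf_p       = {"CRP": 0.002, "AMY1A": 0.02, "PLTP": 0.20, "ALB": 0.70}

candidates = regulated(tmt_ratios, tmt_p) & regulated(lf_ratios, lf_p)
print(sorted(candidates))   # -> ['AMY1A', 'CRP'] in this toy example
```

Requiring agreement between two independent quantification methods, as in the abstract, reduces the chance that a candidate reflects a method-specific artifact.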
The purpose of this investigation was to characterize the natural history of a murine total-abdominal-irradiation exposure model for measuring gastrointestinal acute radiation injury. Male CD2F1 mice, 12 to 15 weeks old, received total-abdominal irradiation using 4-MV linear accelerator X rays at doses of 0, 11, 13.5, 15, 15.75 and 16.5 Gy (2.75 Gy/min). Daily cage-side (i.e., in the animal housing room) observations of clinical signs and symptoms, including body weight, were recorded for all animals up to 10 days after exposure. Jejunum tissues from cohorts of mice were collected at 1, 3, 7 and 10 days after exposure, and radiation injury was assessed by histopathological analyses. Results showed time- and dose-dependent loss of body weight [for example, at 7 days: 0.66 (±0.80)% loss for 0 Gy, 6.40 (±0.76)% at 11 Gy, 9.43 (±2.06)% at 13.5 Gy, 23.53 (±1.91)% at 15 Gy, 29.97 (±1.16)% at 15.75 Gy, and 31.79 (±0.76)% at 16.5 Gy]. Apart from body weight changes, negligible clinical signs and symptoms of radiation injury were observed up to 10 days after irradiation with doses of 11 to 15 Gy, whereas progressive increases in the severity of clinical signs and symptoms were found after doses >15 Gy. Jejunum histology showed a progressive, dose-dependent increase in injury. For example, at 7 days postirradiation, the percentage of crypts, compared to controls, decreased to 82.3 (±9.5), 69.2 (±12.3), 45.4 (±11.9), 18.0 (±3.4), and 11.5 (±1.8) as doses increased from 11 to 16.5 Gy. A mucosal injury scoring system was used that focused mainly on damage to villus morphology (i.e., subepithelial spaces near the tips of the villi with capillary congestion, and significant epithelial lifting along the length of the villi with a few denuded villus tips). Peak effects of total-abdominal irradiation on the mucosal injury score were seen 7 days after irradiation for doses ≥15 Gy, with a trend toward decline after 7 days. A murine multiple-parameter gastrointestinal acute-radiation-syndrome severity-scoring system was established based on clinical signs and symptoms that included measures of appearance (i.e., hunched and/or fluffed fur), respiratory rate, general behavior (i.e., decreased mobility), provoked behavior (i.e., subdued response to stimulation), weight loss, and a feces/diarrhea score, combined with the jejunum mucosal-injury grade score. In summary, the natural-history radio-response of murine partial-body irradiation exposures is important for establishing a well-characterized radiation model system; here we established a multiple-parameter gastrointestinal acute-radiation-syndrome severity-scoring system that provides a gastrointestinal tissue-based assessment of radiation injury.
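Two of the quantities above lend themselves to a short illustration: the percent body-weight loss relative to baseline and the composite multi-parameter severity score. The sketch below is not the published rubric; the weight-loss cut-offs and the assumption that each observational parameter is graded 0-3 are placeholders chosen only to show how such a composite score is assembled:

```python
# Illustrative sketch: percent body-weight loss and a toy multi-parameter
# GI-ARS severity score. Grades, cut-offs and example weights are assumptions.

def pct_weight_loss(baseline_g, current_g):
    """Percent of baseline body weight lost (positive = loss)."""
    return 100.0 * (baseline_g - current_g) / baseline_g

def severity_score(weight_loss_pct, appearance, behavior, diarrhea):
    """Sum per-parameter grades (each assumed 0-3) into a composite score.
    Categorical inputs are already graded 0-3 by the observer."""
    if weight_loss_pct < 5:
        weight_grade = 0
    elif weight_loss_pct < 15:
        weight_grade = 1
    elif weight_loss_pct < 25:
        weight_grade = 2
    else:
        weight_grade = 3
    return weight_grade + appearance + behavior + diarrhea

# Example: a mouse that dropped from 28.0 g to 21.5 g by day 7.
loss = pct_weight_loss(28.0, 21.5)
print(f"weight loss: {loss:.1f}%  composite score: {severity_score(loss, 2, 1, 2)}")
```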
High-LET-type cell survival curves have been observed in cells allowed to incorporate 125I-UdR into their DNA. Incorporation of tritiated thymidine into the DNA of cells has also been shown to increase relative biological effectiveness in cell survival experiments, but the increase is smaller than that observed after incorporation of 125I-UdR. These findings are explained in the literature by the overall complexity of the induced DNA damage resulting from the energies of the electron(s) ejected during the decay of 3H and 125I. Chromosomal aberrations (CA) are defined as morphological or structural changes of one or more chromosomes and can be induced by ionizing radiation. Whether the number of CA is associated with the linear energy transfer (LET) of the radiation and/or the actual complexity of the induced DNA double-strand breaks (DSB) remains elusive. In this study, we investigated whether DNA lesions induced at different cell cycle stages and by different radiation types [Auger electrons (125I), β particles (3H), or γ radiation (137Cs)] affect the number of CA induced after induction of the same number of DSB, as determined by the γ-H2AX foci assay. Cells were synchronized and pulse-labeled in S phase with low activities of 125I-UdR or tritiated thymidine. For decay accumulation, cells were cryopreserved either after pulse-labeling in S phase or after progression to G2/M or G1 phase. Experiments with γ irradiation (137Cs) were performed with synchronized, cryopreserved cells in S, G2/M or G1 phase. After thawing, a CA assay was performed. All experiments were performed after a similar number of DSB had been induced. Measured in G2/M cells, CA induction after incorporation of 125I-UdR was 2.9-fold and 1.7-fold greater than after exposure to γ radiation and to radiation from incorporated tritiated thymidine, respectively. In addition, CA induction after incorporation of 125I-UdR was 2.5-fold greater in G2/M cells than in G1-phase cells. In contrast, no differences were observed between the three radiation qualities for exposure after cryopreservation in S or G1 phase. The data indicate that the 3D organization of replicated DNA in G2/M cells is more sensitive to the induction of complex DNA lesions than the DNA architecture in S or G1 cells. Whether this is due to the DNA organization itself or to differences in DNA repair capability remains unclear.
The Radiation and Nuclear Countermeasures Program (RNCP) at the National Institute of Allergy and Infectious Diseases (NIAID), National Institutes of Health (NIH) was established to facilitate the development of medical countermeasures (MCMs) and diagnostic approaches for use in a radiation public health emergency. Approval of MCMs can be very challenging but is made possible under the United States Food and Drug Administration (FDA) Animal Rule, which is designed to enable licensure of drugs or biologics when clinical efficacy studies are unethical or unfeasible. The NIAID portfolio includes grants, contracts, and inter-agency agreements designed to span all aspects of drug development, from basic research through FDA approval. In addition, NIAID manages an active portfolio of biodosimetry approaches to assess injuries and absorbed radiation levels to guide triage and treatment decisions. NIAID, together with grantees, contractors, and other stakeholders with promising products, works to advance candidate MCMs and biodosimetry tools through an established product development pipeline. Beyond managing grants and contracts, NIAID tests promising candidates in its established preclinical animal models, and NIAID Program Officers work closely with sponsors as product managers to guide them through the process. A further benefit for stakeholders is working with the NIAID Office of Regulatory Affairs, through which NIAID coordinates with the FDA to facilitate interactions between sponsors and the agency. Activities funded by NIAID include basic research (e.g., library screens to discover new products, determine early efficacy, and delineate mechanisms of action) and the development of small and large animal models of radiation-induced hematopoietic, gastrointestinal, lung, kidney, and skin injury, radiation combined injury, and radionuclide decorporation. NIAID also sponsors Good Laboratory Practice product safety, pharmacokinetic, pharmacodynamic, and toxicology studies, as well as efficacy and dose-ranging studies to optimize product regimens. For later-stage candidates, NIAID funds large-scale manufacturing and formulation development. The program also supports Phase 1 human clinical studies to establish human safety and to bridge pharmacokinetic, pharmacodynamic, and efficacy data from animals to humans. To date, NIAID has supported >900 animal studies and one clinical study, evaluating >500 new or repurposed radiation MCMs and biodosimetry approaches. NIAID sponsorship led to the approval of three of the six drugs for acute radiation syndrome under the FDA Animal Rule, five Investigational New Drug applications, and 18 additional submissions for Investigational Device Exemptions, while advancing 38 projects to the Biomedical Advanced Research and Development Authority for follow-on research and development.
In the current geopolitical climate there is an unmet need to identify and develop prophylactic radiation countermeasures, particularly to ensure the well-being of warfighters and first responders who may be required to operate in radiation-contaminated fields during operational or rescue missions. Currently, no countermeasure has been approved by the U.S. FDA for prophylactic administration. Here we report on the efficacy of FSL-1 (a toll-like receptor 2/6 agonist) in protecting against acute radiation syndrome (ARS) in a murine total-body irradiation (TBI) model. A single dose of FSL-1 was administered subcutaneously in mice. The safety of the compound was assessed in non-irradiated animals, its efficacy was assessed in animals exposed to TBI in the AFRRI Co-60 facility, the dose of FSL-1 was optimized, and common hematological parameters [complete blood cell (CBC) counts, cytokines, and bone marrow progenitor cells] were assessed. Animals were monitored for up to 60 days after exposure and radiation-induced damage was evaluated. FSL-1 was non-toxic when administered to non-irradiated mice at doses up to 3 mg/kg. The window of efficacy extended from 24 h before to 24 h after TBI: FSL-1 administration resulted in significantly increased survival when given either 24 h prior to or 24 h after exposure to supralethal doses of TBI. The optimal dose of FSL-1 was determined to be 1.5 mg/kg when administered prior to irradiation. Finally, FSL-1 protected the hematopoietic system (recovery of CBC counts and bone marrow CFU). Taken together, the increased survival and accelerated recovery of hematological parameters suggest that FSL-1 should be developed as a novel radiation countermeasure for soldiers and civilians, to be used either before or after irradiation in the aftermath of a radiological or nuclear event.
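The central efficacy endpoint above is 60-day survival after supralethal TBI. The abstract does not state how survival was analyzed; a hedged sketch of one conventional approach (Kaplan-Meier estimation plus a log-rank test, here via the lifelines package) is shown below, with all durations and event indicators being illustrative placeholders rather than study data:

```python
# Hedged sketch of a 60-day survival comparison (vehicle vs. FSL-1 given
# 24 h before TBI). Analysis choice and all values are assumptions.
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Day of death, or 60 if the animal survived the observation window.
days_vehicle = [12, 14, 15, 16, 17, 18, 19, 60, 60, 60]
event_vehicle = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0]    # 1 = died, 0 = censored
days_fsl1 = [17, 21, 60, 60, 60, 60, 60, 60, 60, 60]
event_fsl1 = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]

kmf = KaplanMeierFitter()
kmf.fit(days_fsl1, event_observed=event_fsl1, label="FSL-1")
print(f"FSL-1 survival at day 60: {kmf.survival_function_at_times(60).iloc[0]:.0%}")

result = logrank_test(days_vehicle, days_fsl1,
                      event_observed_A=event_vehicle,
                      event_observed_B=event_fsl1)
print(f"log-rank P = {result.p_value:.3f}")
```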