Optimum Filter Selection for Dual-Energy X-ray Applications through Analytical Modeling

2015 
In this simulation study, an analytical model was used to determine the optimal acquisition parameters for a dual-energy breast imaging system. The modeled detector consisted of a 33.91 mg/cm² Gd2O2S:Tb scintillator screen placed in direct contact with a high-resolution CMOS sensor. Tungsten-anode X-ray spectra, filtered with various filter materials and thicknesses, were examined for both the low- and high-energy beams, resulting in 3375 combinations. The filters were selected on the basis of their K absorption edge (K-edge filtering). The calcification signal-to-noise ratio (SNRtc) and the mean glandular dose (MGD) were calculated, with the total MGD constrained to remain within acceptable levels. Optimization was based on maximizing the SNRtc/MGD ratio. The results showed that the optimum spectral combination was a 40 kVp beam with 100 μm of added Ag filtration for the low energy and a 70 kVp beam filtered with 1000 μm of Cu for the high energy. The minimum detectable calcification size was 150 μm. The simulations demonstrate that this dual-energy X-ray technique could enhance breast calcification detection.
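The abstract does not give the search procedure in detail, but the described optimization amounts to an exhaustive evaluation of filter/kVp combinations under a dose constraint. The sketch below is a minimal illustration of that figure-of-merit search; the function bodies, filter lists, thickness values, and the MGD_LIMIT constant are hypothetical placeholders, not the paper's analytical model.

```python
from itertools import product

def dual_energy_calc_snr(low_beam, high_beam):
    # Placeholder: the real model derives SNRtc from the calcification signal
    # and noise in the dual-energy subtracted image. A dummy value keeps the
    # sketch runnable.
    return 1.0

def mean_glandular_dose(beam):
    # Placeholder (mGy): the real model computes MGD from the filtered
    # tungsten-anode spectrum and the assumed breast composition.
    return 1.0

# Candidate beams: (kVp, K-edge filter material, filter thickness in μm).
# Materials and thicknesses are illustrative, not the paper's full 3375-point grid.
LOW_BEAMS  = [(40, mat, t) for mat in ("Ag", "Rh", "Mo") for t in (25, 50, 100)]
HIGH_BEAMS = [(70, mat, t) for mat in ("Cu", "Sn", "Ag") for t in (250, 500, 1000)]
MGD_LIMIT  = 3.0  # mGy; assumed acceptable total-dose constraint

def optimise():
    best, best_fom = None, float("-inf")
    for low, high in product(LOW_BEAMS, HIGH_BEAMS):
        total_mgd = mean_glandular_dose(low) + mean_glandular_dose(high)
        if total_mgd > MGD_LIMIT:
            continue  # discard combinations exceeding the dose constraint
        fom = dual_energy_calc_snr(low, high) / total_mgd  # figure of merit: SNRtc/MGD
        if fom > best_fom:
            best, best_fom = (low, high), fom
    return best, best_fom

if __name__ == "__main__":
    print(optimise())
```

With the analytical SNRtc and MGD models substituted in, this kind of constrained grid search would reproduce the reported optimum (40 kVp with 100 μm Ag and 70 kVp with 1000 μm Cu) as the combination maximizing SNRtc/MGD.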