Deeply buried underground nuclear explosions used in the recovery of minerals and natural gas can have a positive impact on the environment. To put this on a quantitative basis, one can compute emissions from a hypothetical 1000-MW(e) plant using coal, oil, or nuclearly stimulated gas and examine the relative effects downwind from the plant. The tradeoffs between SO2 emissions from coal and oil and tritium and krypton emissions from the nuclearly stimulated gas can then be evaluated under identical conditions. Using natural gas, the plant energy requirement of 90 billion ft3/year can be met by a field development consisting of 15 wells the first year, decreasing to 2 wells per year after five years. Four 100-kt explosives are assumed to be needed to stimulate each well. Tritium and 85Kr concentrations are computed to decrease from first-year values of 10 pCi/cm3 and 52 pCi/cm3, respectively, to 1.4 and 7.5 pCi/cm3 after five years, as new formation gas replaces the original chimney gas and the number of new wells decreases. Under the reasonable meteorological conditions assumed to remain constant, the maximum effluent concentration occurs 4.3 km from the plant, where the ground-level SO2 values for coal, oil, and natural gas use are 0.18, 0.004, and 0.00002 ppm, respectively. Converting the radionuclide concentration at the same location to dose shows that the whole-body tritium dose decreases from 0.14 mrem/year for the first year to 0.018 mrem/year after six years, and that the whole-body 85Kr dose decreases from 0.009 to 0.001 mrem/year over the same time span. These doses can be compared with those from natural and man-made radioactive sources. The maximum annual dose from a power plant using nuclearly stimulated natural gas is comparable to that from TV sets and luminous-dial watches.
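The downwind concentrations cited above follow from a Gaussian-plume dispersion calculation. As a rough illustration of how such a ground-level maximum is located, the Python sketch below evaluates the standard ground-level centerline plume formula with Briggs open-country (class D) dispersion coefficients; the emission rate, wind speed, and effective stack height are hypothetical placeholders, not the values or meteorological assumptions used in the original study.

```python
import math

def briggs_sigmas_class_d(x_m):
    """Briggs open-country dispersion coefficients for neutral (class D) stability.
    One common parameterization; the original study's coefficients are not stated."""
    sigma_y = 0.08 * x_m / math.sqrt(1.0 + 0.0001 * x_m)
    sigma_z = 0.06 * x_m / math.sqrt(1.0 + 0.0015 * x_m)
    return sigma_y, sigma_z

def ground_level_centerline_conc(q, u, h, x_m):
    """Ground-level centerline concentration (g/m^3) at downwind distance x_m
    for a continuous point source: q emission rate (g/s), u wind speed (m/s),
    h effective stack height (m)."""
    sy, sz = briggs_sigmas_class_d(x_m)
    return (q / (math.pi * sy * sz * u)) * math.exp(-h**2 / (2.0 * sz**2))

# Illustrative inputs (assumed, not taken from the paper): SO2 from a coal plant.
q_so2 = 1.0e4   # g/s emission rate
u = 5.0         # m/s wind speed
h = 150.0       # m effective stack height

# Scan downwind distances to locate the ground-level concentration maximum.
x_max, c_max = max(
    ((x, ground_level_centerline_conc(q_so2, u, h, x)) for x in range(200, 20001, 100)),
    key=lambda pair: pair[1],
)
print(f"max ground-level SO2 ~ {c_max:.2e} g/m^3 at {x_max / 1000:.1f} km downwind")
```

Because the dispersion coefficients grow with distance while the stack-height exponential decays, the product passes through a maximum a few kilometers downwind; this is the character of the 4.3-km maximum reported above, though its exact location depends on the stack and weather parameters assumed.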