Natural gas pipelines form a vital part of the energy infrastructure of the United States. To overcome head losses as natural gas moves from one region of the country to another, large compressors are needed to pressurize the gas. For decades, the most efficient and cost-effective way to compress the gas has been the integral compressor engine. Pipeline companies therefore have a strong financial incentive to keep these engines in service, but increasingly stringent emissions regulations threaten their continued operation.
In this study, this problem was addressed by developing a zero-dimensional thermodynamic cycle simulation to predict NOx emissions from a large-bore, single-cylinder, naturally aspirated, two-stroke natural gas engine. Excellent agreement was obtained between experimental measurements and the simulation's predictions of the average exhaust NOx concentration.
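To make the zero-dimensional (single-zone) approach concrete, the sketch below integrates the closed-system first law for an ideal gas over crank angle, using slider-crank kinematics for cylinder volume and a Wiebe function for heat release. All geometry, burn parameters, and gas properties here are illustrative placeholders, not the engine modeled in this study, and no NOx chemistry is included.

```python
import math

# --- Hypothetical engine geometry and gas properties (placeholders) ---
BORE = 0.36      # m
STROKE = 0.36    # m
CONROD = 0.72    # m, connecting-rod length
CR = 8.0         # geometric compression ratio
GAMMA = 1.3      # effective ratio of specific heats
R_GAS = 287.0    # J/(kg K), air-like working fluid

V_DISP = math.pi / 4 * BORE**2 * STROKE
V_CLEAR = V_DISP / (CR - 1)

def volume(theta_deg):
    """Cylinder volume from slider-crank kinematics (0 deg = TDC)."""
    th = math.radians(theta_deg)
    a = STROKE / 2
    x = a * (1 - math.cos(th)) + CONROD - math.sqrt(
        CONROD**2 - (a * math.sin(th))**2)
    return V_CLEAR + math.pi / 4 * BORE**2 * x

def wiebe(theta, soc=-10.0, dur=50.0, a=5.0, m=2.0):
    """Cumulative mass fraction burned (Wiebe function)."""
    if theta < soc:
        return 0.0
    frac = min((theta - soc) / dur, 1.0)
    return 1 - math.exp(-a * frac**m)

def simulate(q_total=1.0e5, p0=1.0e5, t0=330.0, dtheta=0.5):
    """March pressure from -180 to +120 deg; return peak gas temperature (K).

    Closed-system first law for an ideal gas:
        dp = ((gamma - 1) * dQ - gamma * p * dV) / V
    """
    theta = -180.0
    v = volume(theta)
    p = p0
    mass = p0 * v / (R_GAS * t0)   # trapped mass from ideal gas law
    t_peak = t0
    while theta < 120.0:
        v_new = volume(theta + dtheta)
        dq = q_total * (wiebe(theta + dtheta) - wiebe(theta))
        p += ((GAMMA - 1) * dq - GAMMA * p * (v_new - v)) / v
        v = v_new
        theta += dtheta
        t_peak = max(t_peak, p * v / (mass * R_GAS))
    return t_peak
```

With `q_total=0` the loop reduces to a polytropic motoring trace; adding heat release raises the peak burned-gas temperature, which is the quantity the NOx submodel of a full simulation would act on.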
Once the simulation was validated by experimental data, a sensitivity analysis was conducted to determine the response of NOx emissions to changes in three factors: trapped equivalence ratio (TER), burned gas fraction (xb), and stuffing box temperature (SBT). This study sought to identify the fundamental thermodynamic reasons that NOx varied with each factor, and to quantify their respective effects.
It was found that a change in each factor produced a linear change in the combustion temperatures, which in turn produced a linear change in the rate constant of the first reaction of the extended Zeldovich mechanism and, ultimately, an exponential change in the NOx emissions. NOx was directly related to TER and SBT, and inversely related to xb.
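The temperature sensitivity at the heart of this chain can be sketched with the Arrhenius rate constant of the first (rate-limiting) Zeldovich reaction, O + N2 -> NO + N. The pre-exponential factor and activation temperature below are representative literature values used only for illustration; they are not the constants fitted in this study.

```python
import math

def k1_forward(t_kelvin):
    """Forward rate constant of O + N2 -> NO + N, cm^3/(mol s).

    Representative Arrhenius parameters (illustrative, not from this study):
    k1 ~ 7.6e13 * exp(-38000 / T).
    """
    return 7.6e13 * math.exp(-38000.0 / t_kelvin)

# Near 2200 K, a modest 100 K rise multiplies the rate constant by roughly
# exp(38000 * 100 / 2200**2), i.e. about a factor of two -- the strong
# temperature sensitivity that underlies the exponential NOx response.
ratio = k1_forward(2300.0) / k1_forward(2200.0)
```

Because NO formation is dominated by this reaction at burned-gas temperatures, any factor that shifts combustion temperature linearly (TER, xb, SBT) is amplified into an exponential-looking shift in NOx.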