The theme day on air quality modeling, organized by FIMEA and IMT Lille Douai on June 8, gave researchers in this field an opportunity to exchange views on existing methods. Modeling makes it possible to identify the links between pollution sources and receptors, and these models help us understand atmospheric processes and prevent air pollution.
What will the pollution be like tomorrow? Only one tool can provide an answer: modeling. But what is modeling? It all depends on the area of expertise. In the field of air quality, this method involves creating computer simulations to represent different scenarios. For example, it makes it possible to simulate pollutant emissions before building a new highway. Just as meteorological models predict rain, an air quality model predicts pollutant concentrations. Modeling also provides a better understanding of the physical and chemical reactions that take place in the atmosphere. “There are models covering smaller and larger areas, which make it possible to study air quality for a continent, a region, or even a single street,” explains Stéphane Sauvage, a researcher with the Atmospheric Sciences and Environmental Engineering Department (SAGE) at IMT Lille-Douai. How are these models developed?
Models, going back to the source
The first approach involves identifying the sources that emit the pollutants via field observations, an area of expertise at IMT Lille-Douai. Sensors located near the receptors (individuals, ecosystems) measure the compounds in the form of gases or particles (aerosols). The researchers refer to certain detected compounds as tracers, because they are representative of a known emission source. “Several VOCs (volatile organic compounds) are emitted by plants, whereas others are typical of road traffic. We can also identify an aerosol’s origin (natural, wood combustion…) by analyzing its chemical composition,” Stéphane Sauvage explains.
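As a toy illustration of matching a measured chemical composition to a known source signature (real receptor models rely on far richer statistics, such as positive matrix factorization; every tracer name and ratio below is invented):

```python
# Hypothetical tracer signatures for two emission sources.
# All species names and fractions are invented for illustration.
SOURCE_SIGNATURES = {
    "road traffic": {"benzene": 0.6, "toluene": 0.4},
    "vegetation":   {"isoprene": 0.9, "benzene": 0.1},
}

def closest_source(sample):
    """Attribute a sample to the source whose signature is nearest
    in squared-difference distance over all species."""
    def distance(sig):
        species = set(sig) | set(sample)
        return sum((sig.get(s, 0.0) - sample.get(s, 0.0)) ** 2
                   for s in species)
    return min(SOURCE_SIGNATURES,
               key=lambda name: distance(SOURCE_SIGNATURES[name]))

# A sample dominated by isoprene points to a biogenic origin.
print(closest_source({"isoprene": 0.8, "benzene": 0.2}))  # vegetation
```

This nearest-signature idea is only a sketch of the reasoning; operational source apportionment also accounts for measurement uncertainty and many more species.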
The researchers study the hourly, daily, and seasonal variability of the tracers through statistical analysis. These variations are combined with models that trace the path air masses followed before reaching the observation site. “Through this temporal and spatial approach, we can identify the potential areas of origin. We observe ‘primary’ pollutants, which are directly emitted by the sources and measured at the receptors. But secondary pollutants also exist: the result of chemical reactions that take place in the atmosphere,” the researcher adds. To identify the sources of this second category of pollutants, researchers identify the reactions that could take place between chemical compounds. This is a complex process, since the atmosphere is a genuine reactor in which different species are constantly being transformed. The researchers therefore formulate hypotheses that enable them to trace back to the sources. Once these models are functional, they are used as decision-making tools.
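The hourly-variability analysis mentioned above can be sketched in a few lines: averaging a tracer’s concentration by hour of day reveals, for instance, a traffic-like rush-hour pattern. The data below are synthetic and the pattern is invented for illustration:

```python
def hourly_profile(timestamps_hours, concentrations):
    """Mean concentration for each hour of day (0-23)."""
    sums = [0.0] * 24
    counts = [0] * 24
    for h, c in zip(timestamps_hours, concentrations):
        sums[h % 24] += c
        counts[h % 24] += 1
    return [s / n if n else 0.0 for s, n in zip(sums, counts)]

# Synthetic week of hourly data with morning/evening rush-hour peaks,
# mimicking a traffic tracer (values are arbitrary units).
hours = list(range(7 * 24))
conc = [1.0 + (2.0 if h % 24 in (7, 8, 9, 17, 18, 19) else 0.0)
        for h in hours]

profile = hourly_profile(hours, conc)
peak_hour = max(range(24), key=lambda i: profile[i])
print(peak_hour)  # 7 (first morning rush hour)
```

A profile peaking at commuting hours hints at road traffic, while a midday peak in summer would point toward biogenic or photochemical origins.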
Models focused on receptors
A second approach, referred to as “deterministic” modeling, focuses on the receptors. Based on what is known about the sources (industrial emissions, road traffic…), the researchers use models of air mass movement and diffusion to visualize the impact these emissions have on the receptor. To accomplish this, the models integrate meteorological data (wind, temperature, pressure…) and the equations of the chemical reactions taking place in the atmosphere. These complex tools require a comprehensive knowledge of atmospheric processes and high levels of computing power.
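A minimal example of such a source-to-receptor calculation is the classic Gaussian plume formula, which estimates the concentration downwind of a point source such as a chimney. The linear growth of the dispersion coefficients below is a deliberate simplification (operational models use stability-class curves and full chemistry), and all parameter values are illustrative:

```python
import math

def gaussian_plume(Q, u, x, y, z, H, a=0.08, b=0.06):
    """Gaussian plume concentration (g/m^3) with ground reflection.
    Q: emission rate (g/s), u: wind speed (m/s), x: downwind distance (m),
    y: crosswind distance (m), z: height (m), H: effective stack height (m).
    sigma_y and sigma_z grow linearly with x -- a toy simplification."""
    sigma_y = a * x
    sigma_z = b * x
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level concentration 500 m downwind of a 30 m stack,
# on the plume axis and 50 m to the side (all values hypothetical).
c_near_axis = gaussian_plume(Q=10.0, u=5.0, x=500.0, y=0.0, z=0.0, H=30.0)
c_off_axis = gaussian_plume(Q=10.0, u=5.0, x=500.0, y=50.0, z=0.0, H=30.0)
```

The concentration falls off as the receptor moves away from the plume axis, which is exactly the kind of source-to-receptor relationship these models quantify, here without the chemistry that full deterministic models add.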
These models are used for forecasting purposes. “Air pollution control agencies use them to inform the public of pollutant levels in a given area. If necessary, the prefecture can impose driving restrictions based on the forecasts these models provide,” explains Stéphane Sauvage. This modeling approach also makes it possible to carry out environmental impact assessments for industrial sites.
Complementary methods
Both methods have their limits and involve uncertainties. The models based on observations are not comprehensive. “We do not know how to observe all the species. In addition, this statistical approach requires a large number of observations before a reliable and robust model can be developed. The hypotheses used in this approach are simplistic compared to those of the receptor-focused models,” Stéphane Sauvage adds. The other type of model also relies on estimations. It uses data that can be uncertain, such as estimates of the sources’ emissions and the weather forecasts.
“We can combine these two methods to obtain more effective tools. The observation-based approaches yield information about the sources that is useful for the deterministic models. The deterministic models are validated by comparing their predictions with the observations. But we can also integrate the observed data into the models to correct them,” the researcher adds. This combination limits the uncertainties involved and supports the identification of links between sources and receptors. The long-term objective is to offer decision-making tools for policies aimed at effectively reducing pollutants.
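The correction step the researcher describes can be illustrated in its simplest possible form: subtracting the model’s mean past error against observations from a new forecast. Real forecasting systems use full data assimilation rather than this one-line bias correction, and every number below is invented:

```python
def bias_corrected_forecast(model_forecasts, observations, new_forecast):
    """Correct a forecast by the mean model-minus-observation bias
    over a past period (a toy stand-in for data assimilation)."""
    bias = sum(m - o for m, o in zip(model_forecasts, observations)) \
        / len(observations)
    return new_forecast - bias

# Hypothetical past forecasts vs. measurements (e.g. PM10 in ug/m^3):
# the model systematically overestimates by 2 units.
past_model = [12.0, 11.0, 13.0]
past_obs = [10.0, 9.0, 11.0]
corrected = bias_corrected_forecast(past_model, past_obs, 14.0)
print(corrected)  # 12.0
```

Even this crude feedback loop shows the principle: observations pull the deterministic model back toward reality, reducing the uncertainty in the source-to-receptor link.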