COST ES1006 Background and Justification Document


COST Action ES1006
Evaluation, improvement and guidance for the use of local-scale emergency prediction and response tools for airborne hazards in built environments

May 31st, 2012

Legal notice by the COST Office
Neither the COST Office nor any person acting on its behalf is responsible for the use which might be made of the information contained in the present publication. The COST Office is not responsible for the external web sites referred to in the present publication.

Contact:
Dr. Basak Kisakurek, Science Officer
COST Office
Avenue Louise, Brussels, Belgium
Tel: Fax:
Basak.Kisakurek@cost.eu

Edited by the Editorial Board Members of COST Action ES1006: Spyros Andronopoulos, Patrick Armand, Kathrin Baumann-Stanzer, Steven Herring, Bernd Leitl, Tamir Reisin, Silvia Trini Castelli

Contributing authors: Marton Balczo, Silvana Di Sabatino, Jörg Franke, Marko Grebec, Ari Karpinnen, Ernst Meijer, Jacques Moussafir, Bjorn Petterson Reif, Gianni Tinarelli, Inge Trijssenaar-Buhre

COST Office, 2012. No permission to reproduce or utilize the contents of this book by any means is necessary, other than in the case of images, diagrams or other material from other copyright holders. In such cases permission of the copyright holders is required.

This book may be cited as: COST ES1006 Background and Justification Document, COST Action ES1006, May 2012. ISBN: X

Distributed by the University of Hamburg, Meteorological Institute, Bundesstraße 55, D Hamburg, Germany

Contents

1. Introduction: the concern
   What are the challenges?
   What actions are required?
   What is meant by local-scale dispersion?
   Structure
2. Current or future threats and dispersion modelling challenges
   Overview
   The prevailing threats
   Terms to define a hazardous reality
   Origins of hazardous releases
   The notion of CBRN agents
   Two categories of hazardous events
   Releases of gases or particles
   Passive and non-passive releases
   Threat summary
   Modelling and operational challenges
   Mapping the challenges
   The source and source term
   A complex chain of physical processes
   The meteorological data
   The physics of aerosols
   The challenge of the computation time
   The challenge to build an operational computational tool
3. Approaches and tools
   Concepts of tools and models
   Short-range and local scale modelling
   Flow models and dispersion models
   Proposed classifications for flow models

   Classification of flow models by dimension
   Classification of flow models by the type of equations resolved
   Proposed classification for dispersion models
   Concepts of use: When to use what
   Preparedness phase
   Response phase
   Recovery and analysis phase
   Input and output requirements
   Examples of existing tools and models for CBRN releases
4. Dispersion modelling for emergency planning and response
   Modelling challenges
   Overview of the involved physical processes
   Input to dispersion flow modelling
   Dispersion modelling
   Output of dispersion modelling
   Impact assessment
   Modelling need in case of a CBRN release
   The principle of down-scaling
   Meso-scale modelling
   Local scale modelling
   Bridging the gap
   Impact assessment
   Future needs for model development
5. Quality assurance of local-scale hazmat dispersion models
   Specific requirements
   Data sets
   Evaluation methodologies
6. Summary

1976, Seveso, northern Italy, 12:37 on 10 July. A failure occurs in the control system of the ICMESA chemical plant, which produces trichlorophenol, a key constituent of weed killers. The failure allows the temperature of the production process to exceed the safe limits, leading to chemical reactions that produce large amounts of the highly toxic dioxin 2,3,7,8-tetrachlorodibenzodioxin (TCDD). The explosion of the reaction chamber is avoided by opening relief valves, but a toxic cloud is emitted; it rises to a height of 50 m and spreads south, heavily contaminating an area up to 10 km downwind. In the days that follow the release, dermal lesions affect people exposed to the cloud and a large number of courtyard animals and birds die in the affected area. Ten days after the accident ICMESA informs the health authorities that different types of samples taken around the plant have revealed the presence of TCDD. The population is finally alerted to the severity of the accident and on the 26th of July the evacuation procedures begin. The area affected by TCDD contamination is then divided into three zones, A, B and R, in decreasing order of TCDD soil concentration. In zone A the buildings are demolished, a layer of soil is removed, the population is evacuated and all access is prevented. In zones B and R monitoring procedures are established and access is controlled. Farming and stockbreeding are forbidden in all zones. In 1977 a programme of monitoring the presence of TCDD in atmospheric dust throughout the affected area is started and remains in place until 1979 (Di Domenico et al., 1980a,b,c,d,e,f).

Map indicating the area affected by the release of TCDD at Seveso in Italy (left); local-scale plot of TCDD deposition determined from soil samples in Zone A near the source (right, elaborated from Di Domenico et al., 1980b).

2005, Sarpsborg, Norway, May. The most severe outbreak of legionellosis ever recorded in Norway occurs in the Sarpsborg/Fredrikstad area. More than 100 people living within a 12 km radius of a biological treatment plant are infected, and 10 subsequently die. The authorities conclude that the outbreak has resulted from the aerial dispersion of Legionella-containing aerosol from the biological treatment plant. This conclusion is supported by results obtained from a simple Gaussian transport and dispersion model (Nygård et al., 2008). This modelling approach is chosen because of its short run time. Later, a more comprehensive study is initiated with the aim of increasing understanding of the physical and biological processes that led up to the legionellosis outbreak (Blatny et al., 2008, 2011). This study is based on detailed Computational Fluid Dynamics (CFD) simulations that are validated against wind tunnel experiments. The study reveals that the Gaussian model had significantly over-predicted Legionella concentrations downwind of the plant, because the Gaussian model had not represented the effects of large building structures in the immediate vicinity of the source(s). The results from the detailed, but time-consuming, CFD simulations not only increase understanding of the processes involved, but lead to the conclusion that more than one source in the Sarpsborg/Fredrikstad area must have been responsible for the outbreak of legionellosis.

Numerical simulation of aerosol dispersion from a biological treatment plant (Fossum et al., 2012).

A weekday morning in the central business district of a major European city. A loud explosion is heard. The fire brigade, civil defence and police observe the blast damage and examine the scene of the explosion with the aid of instrumented, remotely controlled vehicles. These detect radiation; it is concluded that a dirty bomb has exploded, and the local authorities alert radiological response teams. The radiological response teams model the release using generic source term information and the available meteorological data. Within thirty minutes of the release occurring, dispersion plots are produced from which the best approach to the explosion scene can be determined.

One hour after the explosion. Ordnance experts assess the amount and type of explosives used in the attack. This enables new dispersion predictions to be produced, based on better source term information and local meteorological data.

Four hours after the explosion. The radiological response teams begin surveying the area around the site of the explosion. They recommend initial protective actions for the population, including evacuation and sheltering where appropriate. The dispersion modelling and monitoring data are used to define the dose rates and stay times required by first responders. Measurements reported by the radiological response teams are fed into further model simulations, which enable the possible locations of hot, warm and cold radioactivity spots to be identified, and the recommended protective action strategy to be refined and implemented.

In the two days following the explosion. Risk assessment reports are issued periodically, based on dispersion model results and information gathered by the radiological response teams.


1. Introduction: the concern

Incidents similar to those outlined above are provoking growing concern in western societies with regard to the effects that even small accidental or deliberate releases of hazardous materials could have in populated areas. This is due to the potentially large scale of casualties and/or damage to ecosystems and infrastructure that releases from chemical plants, industrial sites, nuclear power stations, transportation accidents involving chemicals or nuclear materials and Chemical, Biological, Radiological or Nuclear (CBRN) terrorist attacks could cause. The ability of the emergency services and authorities to deal with hazardous material releases relies upon access to a fast and accurate emergency response tool. While advances in computational and information technology have enabled sophisticated emergency response management systems to be produced, accurate prediction of dispersion at the local scale within complex environments remains a serious scientific challenge. This challenge must be addressed: if the dispersion of material cannot be predicted with any certainty, then the threat remains unknown, all subsequent emergency response steps and management actions quickly become questionable, and the lives of first responders may be put at risk.

1.1. What are the challenges?

The identification of actions to take to mitigate the impact of hazardous material releases is innately challenging due to the complex stochastic nature of the atmospheric dispersion process. This means that first responders must deal with a difficult situation that unfolds over an uncertain timeframe. The situation is typically complicated by the following factors:

- The duration of the release is often very short (minutes at most);
- The emission characteristics of the source (amount and type of material released, for example) are only partially known, if at all;
- The response time in which to mitigate the effects of a release is short (typically less than an hour);
- The local meteorological conditions driving the spread of the contaminant are not readily available at the desired level of accuracy, and are subject to constant change;
- The release occurs in a complex industrial or urban environment, where the release of even small amounts of hazardous material can instantly pose a severe threat to the surrounding population.

Although the releases in the scenarios depicted above are quite different, they present common challenges to first responders. The most important of these is that the development of a reliable strategy and actions to protect the population relies upon an accurate prediction of how the airborne contaminants will be transported and diluted.

Figure 1: The principal components of a hazardous release emergency management tool: source term, meteorology, atmospheric dispersion model (ADM), toxicology, risk assessment, and the display and command system.

An emergency management tool that could assist in the response to releases of hazardous materials might be expected to incorporate the functional modules shown in Figure 1. A variety of emergency response tools either already exist, or are being developed, in different European countries. These tools take the form of fully integrated management systems, or modular concepts with interfaces between the individual components, but they all provide the means to:

- Characterise potential hazards;
- Manage the logistical aspects of emergency incident response;
- Account for different types of release;
- Document the decisions and actions taken during an incident to facilitate comprehensive post-incident analysis.

Dispersion models, combined with sensors that detect and measure hazardous material concentrations, are the backbone of any comprehensive emergency management system. The quality of these components dictates the range and likely efficacy of the protective actions that can be recommended. For example, to describe a threat in terms of commonly accepted exposure limit values, the spread of the contaminant must be known. Depending upon the particular exposure limit used, this spread may be described either in terms of a probability of occurrence, or as a time series at a particular location. Ultimately, it is only through knowing where and when a given exposure level will be exceeded that first responders and higher level decision makers can plan their actions; without this knowledge, even the most comprehensive emergency management tools are of little value. However, providing reliable dispersion predictions at the very local scale, where the related risks and threats are extremely high, is extremely challenging.

A number of different approaches are available for predicting the atmospheric dispersion of material. These range from simple parametric methods to advanced dispersion models and sophisticated methods based on fast data access to detailed pre-computed solutions. The development of sophisticated numerical models that can provide detailed predictions of how hazardous materials disperse at local and neighbourhood scales has been facilitated by the continuous increases in computing power that have taken place over the past two decades. Nevertheless, at present the various methodologies have specific advantages and disadvantages regarding the efficiency, quality and reliability with which they may provide results in any given situation.

The variety of dispersion prediction tools used by the different organizations involved in emergency response management suggests that a variety of different answers can be given to emergency response personnel, even in a well-defined release scenario. To provide users with confidence in the tools they are using, and an understanding of the limitations of the outputs, it is evident that any airborne hazard dispersion model should be comprehensively validated to identify its strengths and weaknesses when applied in any given scenario. To achieve this, and to improve local-scale dispersion modelling techniques and emergency response procedures, there is a clear need to share expertise and integrate development efforts where possible within and beyond Europe.
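As a concrete illustration of the simplest class of parametric methods mentioned above (for example, the kind of Gaussian plume screening calculation used in the Sarpsborg case), the hedged sketch below computes a ground-level concentration and a crude time-integrated dosage. The dispersion-parameter coefficients and all numerical values are illustrative assumptions, not recommendations from this document.

```python
import numpy as np

def gaussian_plume_glc(Q, u, y, H, sigma_y, sigma_z):
    """Ground-level concentration [kg/m^3] of a continuous point release.

    Classical Gaussian plume with full ground reflection:
      Q                 source strength [kg/s]
      u                 mean wind speed at release height [m/s]
      y                 crosswind distance from the plume axis [m]
      H                 effective release height [m]
      sigma_y, sigma_z  dispersion parameters at the downwind distance of interest [m]
    """
    return (Q / (2.0 * np.pi * u * sigma_y * sigma_z)
            * np.exp(-0.5 * (y / sigma_y) ** 2)
            * 2.0 * np.exp(-0.5 * (H / sigma_z) ** 2))

# Illustrative power-law sigma curves (placeholder coefficients, not a
# recommended parameterisation) and a crude time-integrated dosage.
x = 500.0                                        # downwind distance [m]
sigma_y, sigma_z = 0.08 * x**0.9, 0.06 * x**0.85
C = gaussian_plume_glc(Q=0.1, u=3.0, y=0.0, H=10.0,
                       sigma_y=sigma_y, sigma_z=sigma_z)
dosage = C * 600.0                               # exposure to the plume for 600 s
print(f"centreline concentration ~ {C:.2e} kg/m^3, 10-min dosage ~ {dosage:.2e} kg s/m^3")
```

A formulation of this kind runs in milliseconds, which explains its operational appeal, but it contains no representation of buildings or other obstacles, which is precisely the limitation highlighted by the Sarpsborg example.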

1.2. What actions are required?

It is generally agreed that the quality and reliability of the results obtained from the tools currently used for simulating local-scale airborne hazards are overestimated. It is also generally agreed that users at the operational level are unaware of how the tools available to them could be improved by incorporating recent advances in modelling. A survey of European local-scale dispersion modelling research activities indicates that these are generally undertaken as part of national research programmes. Consequently, considerable potential exists for improving local-scale modelling capabilities and providing more practical guidance by coordinating and consolidating research activities at a European level.

COST Action ES1006 aims to address this need by facilitating the integration of national research activities that support the emergency services in responding to short duration, local-scale releases of hazardous materials. The Action aims to achieve this by establishing a dedicated interdisciplinary and inter-institutional forum for scientific information exchange, consensus building and model improvement. The non-competitive nature of the Action provides an environment in which:

1. The limitations and uncertainties associated with current approaches to modelling for local-scale emergency response will be articulated;
2. The most effective directions for future developments will be identified;
3. A common strategy for improving the performance of modelling tools will be developed;
4. High quality test data obtained in national research projects will be made available for validating and quantifying the uncertainties in models;
5. The benefits from the research expertise available will be maximised and recommendations made from a broader scientific basis than national efforts alone could provide.

A particular goal of the Action is to bring together scientists with emergency response specialists in evaluating and further developing the local-scale hazard prediction tools available for use in emergency response systems. This goal is reflected in the organisation of the Working Groups and in the restriction of the scope of interest to the local scale. In the field of air pollution modelling, model inter-comparisons have been made over a range of different scales. These comparisons have involved rigorous statistical analyses of the model outputs to evaluate the sensitivity of different modelling approaches to:

- The quality of the input data;
- The definition of boundary conditions;
- The details of the meteorological input;
- The level of physics implemented in the model (including the parameterisations selected, and the choice of equations and numerical solution method used).

In comparison, relatively little inter-comparison work has been undertaken on models for predicting the local-scale effects of hazardous material releases. This is partly because conducting such an exercise is non-trivial, due to the need to reconcile different levels of complexity, such as relating relatively simple outputs to different modelling techniques that aim to capture very complex processes. A comprehensive study made in Austria clearly demonstrated that the selection of model input data and choice of model, as well as the model performance, were crucial in being able to react to local-scale, short-term threats. It is also anticipated that further information in this area will be gained in due course from the following studies. A number of working groups have been established in France to investigate the significant discrepancies found between model results from established emergency response tools and those from more sophisticated tools when applied to similar problems. A pilot study has recently begun in Germany to scientifically and practically evaluate a new approach to modelling local-scale airborne hazards reliably (Schatzmann et al., 2011).

Model evaluation is complicated by the fact that model performance may be expressed in terms of a range of fundamentally different parameters. These range from simple yes/no criteria relating to the prediction of threat zones, to probabilistic evaluations of the hazard exceeding health risk criteria based on dosage, or the accuracy of the outputs for use in source reconstruction analyses.

A promising route for future developments is the combination of results from pre-computed high-resolution dispersion simulations with application-oriented post-processing. This potentially enables the results of complex, computationally intensive simulations to be made available to assist in local-scale emergency response and disaster management with minimal time lag. The implication of such approaches for model validation and assessment is that there is a need to distinguish between the reliability of the pre-computed flow and dispersion modelling, and the accuracy of the model prediction constructed in the software tool used by first responders.

In addition to model evaluation, improvement and development, an important goal of the Action is to provide clear guidance to practitioners, stakeholders and end-users on how to use existing tools to achieve the best results in local-scale emergency response modelling.

This task will be addressed by identifying the limitations of current models by means of examples, and by defining a quality standard through developing a consensus of opinion regarding local-scale emergency response dispersion modelling. Data of known quality, and in a standardised form, need to be available for evaluating model performance. These need to include well-known test cases and also unknown test cases, so that 'blind benchmarking' of models and tools can be conducted under realistic conditions. This suggests that a standard quality assurance procedure should be defined for local-scale airborne hazard modelling, such as that which exists, for example, in the field of air quality modelling (Britter and Schatzmann, 2007). Access to such quality assured data will provide confidence in the subsequent model evaluations, and is an important part of the quality assurance required to justify the use of local-scale dispersion models in emergency response management systems.

1.3. What is meant by local-scale dispersion?

Existing atmospheric dispersion models provide a cost-effective way to analyse air pollution impact over extended domains, and are used in air quality management systems. Given this, similar models are regularly used as the core component of local-scale emergency response tools. Atmospheric dispersion models for emergency response are generally designed to describe the rate of spread of hazardous substances, and to provide predictions of ground-level concentration or dosage and deposition with time. The models apply various mathematical and numerical techniques to simulate the main physical and chemical processes driving transport, dispersion, chemical transformation and decay of contaminants in the atmosphere. Sophisticated dispersion models take account not only of meteorological conditions, but also of topography and obstacle geometry, as well as release information such as the type of emission source (point, linear, areal, instantaneous or continuous), emission rates and different types of contaminants. Chemistry models may be added that enable the concentrations of secondary contaminants, formed as a result of complex chemical reactions within the atmosphere, to also be estimated.

At very short ranges, and particularly in the context of emergency response needs, the performance of a given dispersion model depends on:

- The uncertainty in the release and material parameters, including the chemical/radiological composition, physical properties (e.g. substance physical state, temperature, pressure, environmental conditions, etc.), emission rate quantification, the type of release (buoyant/non-buoyant, toxic/non-toxic, flashing/non-flashing, exploding/non-exploding, reacting/non-reacting) as well as the space-time scales of possible chemical transformations and reactions;

- The degree of uncertainty in the meteorological inputs driving the transport and dispersion predictions (noting that neither local forecast nor observation data are generally immediately available, and even if they are, they may be of limited accuracy and representativeness);
- The complexity of the topography and geometry of the local environment, and the availability of this information for use in modelling systems;
- The compromises made between providing a physical modelling approach that is sufficiently sophisticated to provide reliable results and providing a fast-response tool;
- The ease with which the tool can be handled and applied by non-scientists, providing outputs and information that can be easily interpreted without error.

Based on the above factors, different dispersion models have a role to play in handling different types of incident and in different stages of an incident. A fitness-for-purpose based selection is required that takes account of:

- The quality and quantity of output information required;
- The reliability and robustness of the model predictions;
- The computational demands;
- The rapidity with which the results are produced;
- The input data dependencies.

In addition, the ease with which different dispersion models may be integrated into emergency response management systems is also an important consideration.

The main focus of COST Action ES1006 is to improve the quality and robustness of local-scale predictions of airborne hazard dispersion from accidental or deliberate releases in complex urban and industrial environments. The Action aims to establish a scientific and methodological reference for local-scale airborne hazard modelling through:

1. Improving the scientific basis behind local-scale dispersion modelling;
2. Developing an inventory of models and modelling systems;
3. Developing comprehensive practical guidance for using models to track and predict the dispersion of airborne hazards.

In addition, the Action will enable more accurate representations of source terms in complex, heterogeneously structured environments to be developed. This will lead to an improvement in the quality of dispersion modelling at regional and larger scales by providing more reliable initial dispersion conditions. The major tasks of the Action are to:

- Review the current tools and models used in characterising hazard dispersion and examine how these are applied operationally in emergency response efforts;

- Identify the deficiencies in tools and models that limit their effectiveness and operational use in emergency situations;
- Identify the critical input data that must be available to use the tools and models effectively;
- Identify ways to improve the accuracy of tools and models.

To measure the quality of model results and identify ways to improve them, a task-specific validation procedure will be adopted. This will be based on a structured set of local threat scenarios, defined in terms of the model capabilities they require. The final evaluation of the health and environmental effects defines one of the bounds of the Action. Publicly available exposure indices, such as those defined by the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC), will be considered and applied in the Action within the limits of the information provided. However, it is beyond the scope of the Action to consider the detailed health and environmental impacts of materials. This field of research requires specialist input from epidemiologists, toxicologists, physicians and biologists, and a much wider interdisciplinary context than the existing Action can provide.

1.4. Structure

COST Action ES1006 aims to address issues related to the applicability and improvement of emergency response tools. Particular effort will be devoted to addressing the limitations, in terms of timely response and reliability, of current approaches to local-scale dispersion modelling. This document is the first scientific deliverable of the Action, and presents a state-of-the-art review of the local-scale threat scenario modelling capability available to first responders and higher level decision makers.

Chapter 2 identifies and discusses the present and potential future threats, and the challenges they raise in terms of emergency response. After defining the basic threats, detailed descriptions are given of the threat scenarios and particular source terms defined by the different agencies involved in local-scale emergency response, including civil protection, homeland security and industrial safety. Critical and challenging situations that might arise are described.

In Chapter 3 the range of modelling approaches and tools currently in use, or under development, is reviewed. The limitations of both simple and advanced models, and their consequent applicability to different scenarios, are addressed. A preliminary analysis of the known discrepancies is offered and well-known limitations and deficiencies of current emergency response systems are discussed.

In Chapter 4 the general analysis presented in the previous two chapters is related to the specific problems of performing dispersion modelling to support emergency planning and response. The particular challenges associated with local-scale dispersion modelling are presented and discussed. The needs for future model development are addressed, and the issue of uncertainties, their treatment and their interpretation is raised.

Chapter 5 presents an analysis of the present status of the evaluation process for local-scale dispersion models. It then goes on to discuss how the quality assurance needs for local-scale dispersion models should be addressed, assessing the specific requirements, the datasets available and the evaluation methodologies.

Finally, Chapter 6 reviews the key tasks tackled, presents the first conclusions and the plan for subsequent phases of the Action. Consideration is given to the practical constraints and potential legal issues associated with the use of emergency response tools. The need for interaction between scientists and model developers on the one hand, and end-users and decision makers on the other, is also discussed.

References

Blatny, J. M., Pettersson Reif, B. A., Skogan, G., Andreassen, Ø., Høiby, E. A., Ask, E., Waagen, V., Aanonsen, D., Aaberge, I. S. & Caugant, D. A. (2008): Tracking Airborne Legionella and Legionella pneumophila at a Biological Treatment Plant. Environmental Science and Technology, 42.

Blatny, J. M., Tutkun, M., Fossum, H., Ho, J., Skogan, G., Fykse, E. M., Andreassen, Ø., Waagen, V. & Pettersson Reif, B. A. (2011): Assessment of the dispersion of Legionella-containing aerosols from a biological treatment plant. In print: Frontiers in Bioscience (Elite Ed.), 3.

Croddy, E., Perez-Armendariz & Hart, J. (2002): Chemical and Biological Warfare: A Comprehensive Survey for the Concerned Citizen. Copernicus Books, New York.

Di Domenico, A., Silano, V., Viviano, G. & Zapponi, G. (1980a): Accidental Release of 2,3,7,8-Tetrachlorodibenzo-para-dioxin (TCDD) at Seveso, Italy. 1. Sensitivity and Specificity of Analytical Procedures Adopted for TCDD Assay. Ecotoxicology and Environmental Safety, Vol. 4, Issue 3.

Di Domenico, A., Silano, V., Viviano, G. & Zapponi, G. (1980b): Accidental Release of 2,3,7,8-Tetrachlorodibenzo-para-dioxin (TCDD) at Seveso, Italy. 2. TCDD Distribution in the Soil Surface Layer. Ecotoxicology and Environmental Safety, Vol. 4, Issue 3.

Di Domenico, A., Silano, V., Viviano, G. & Zapponi, G. (1980c): Accidental Release of 2,3,7,8-Tetrachlorodibenzo-para-dioxin (TCDD) at Seveso, Italy. 3. Monitoring of Residual TCDD Levels in Reclaimed Buildings. Ecotoxicology and Environmental Safety, Vol. 4, Issue 3.

Di Domenico, A., Silano, V., Viviano, G. & Zapponi, G. (1980d): Accidental Release of 2,3,7,8-Tetrachlorodibenzo-para-dioxin (TCDD) at Seveso, Italy. 4. Vertical Distribution of TCDD in Soil. Ecotoxicology and Environmental Safety, Vol. 4, Issue 3.

Di Domenico, A., Silano, V., Viviano, G. & Zapponi, G. (1980e): Accidental Release of 2,3,7,8-Tetrachlorodibenzo-para-dioxin (TCDD) at Seveso, Italy. 5. Environmental Persistence of TCDD in Soil. Ecotoxicology and Environmental Safety, Vol. 4, Issue 3.

Di Domenico, A., Silano, V., Viviano, G. & Zapponi, G. (1980f): Accidental Release of 2,3,7,8-Tetrachlorodibenzo-para-dioxin (TCDD) at Seveso, Italy. 6. TCDD Levels in Atmospheric Particles. Ecotoxicology and Environmental Safety, Vol. 4, Issue 3.

Fossum, H. E., Pettersson Reif, B. A., Tutkun, M. & Gjesdal, T. (2012): On the Use of Computational Fluid Dynamics to Investigate Aerosol Dispersion in an Industrial Environment: A Case Study. Boundary-Layer Meteorology (Online First).

Nygård, K., Werner-Johansen, Ø., Rønsen, S., Caugant, D. A., Simonsen, Ø., Kanestrøm, A., Ask, E., Ringstad, J., Ødegård, R. & Jensen, T. (2008): An outbreak of Legionnaires' disease caused by long-distance spread from an air scrubber in Sarpsborg, Norway. Clinical Infectious Diseases, 46.

2. Current or future threats and dispersion modelling challenges

2.1. Overview

The accident and terrorist attack scenarios described at the beginning of Chapter 1 represent severe events that society cannot ignore, as they constitute latent or hypothetical threats that could occur at any time. These tragic events, and a far greater range of possible events, are characterised by the release of hazardous material (gaseous or particulate) into the atmosphere. Such accidental or malevolent releases lead to the dispersion and deposition of material that is potentially harmful to human health and the environment.

It can be said that scenarios such as those presented in the introduction are extremely rare, and it is true that they are beyond the experience of most people. But this is not the case for the personnel involved in rescue teams and for local, regional or national public authorities, some of whom have to make decisions when confronted with these types of emergency situation on a regular basis. Indeed, everyone is aware that industrial accidents and terrorist attacks present actual and recurring threats. Situations in which hazardous materials are released and disperse through the atmosphere have to be dealt with, as they threaten the human population, cause severe disruption to society, and endanger flora and fauna.

While there are many important requirements for an emergency response tool to assist in handling hazardous atmospheric releases, dispersion modelling has two major roles to play:

1. To bring reliable help to the rescue teams and decision makers;
2. To be available to provide quick and precise answers throughout the crisis.

Section 2.2 describes the present and potential future threats that could result in the liberation of hazardous gaseous or particulate species into the atmosphere. Section 2.3 presents the scientific challenges involved in producing dispersion models to respond to these threats.

2.2. The prevailing threats

2.2.1. Terms to define a hazardous reality

One of the most commonly shared feelings among the general population and the ruling classes in the West is that we live in a dangerous world. As a result, the adjectives hazardous and harmful are used extensively and recurrently in this chapter. The appellation hazardous material, or hazmat, may refer to materials that are flammable or explosive (in a concentration range depending on the material), that create anoxia, and/or that are characterised by their toxicity for the environment and/or human beings. As explained in sub-section 2.2.3, this toxic or noxious effect can vary greatly, as it may result from exposure to radioactivity, chemicals or pathogenic bio-agents, with consequences ranging from mild transient effects to death.

The word threat is often used when discussing modelling and simulation of the atmospheric dispersion of hazardous materials. The Cambridge Dictionary Online (Cambridge, 2012) defines a threat as "a possible danger constituting a common menace for a whole part of the society". It is also interesting to note the different meanings of the following terms:

- A danger represents the exposure or vulnerability to harm or risk. It can also be a source or an instance of risk or peril.
- A risk designates the possibility of suffering harm or loss. It is a factor, thing, element, or course involving uncertain danger or hazard. This means that a risk consists in the combination of a danger and an associated probability.
- A threat is an expression of an intention to inflict pain, injury, evil, or punishment. It can also be an indication of impending danger or harm, or it identifies something that is regarded as a possible danger or menace.

In other words, a threat may be defined as an established risk that can endanger the lives of citizens.

2.2.2. Origins of hazardous releases

Within this report the focus is on threats that result in the release of hazardous materials into the atmosphere. The releases may have far-field as well as near-field consequences, although the latter are the primary focus here. The origin of the releases can be natural, anthropogenic, or a mix of both:

- Natural emissions are exemplified by volcanic eruptions, forest fires of various extents, etc.

- Anthropogenic emissions may be classified in two sub-categories, as they are either planned (occurring in normal operations) or unplanned (due to accidents or to malevolent or terrorist actions).
- By mixed emission, we allude, for example, to the natural re-suspension of radionuclides (especially 137Cs) deposited after the anthropogenic Chernobyl nuclear accident in April 1986, produced by the forest fires in Russia in August.

It is worth noting the duality of accidents and terrorist attacks, which, although distinctly different in initiation, may have the same result in terms of potentially harmful atmospheric releases. This report is essentially devoted to threatening anthropogenic releases of hazardous material. Figure 2 illustrates some potential release scenarios associated with accidents or, possibly, terrorist attacks.

Figure 2: Examples of accidental or, possibly, malevolent hazardous releases: jet-type release with pool evaporation; moving continuous source; release during a severe fire; release with explosion blast.

2.2.3. The notion of CBRN agents

Many words, such as species, substance, etc., are used to refer to materials that are released into the atmosphere. As the term agent is nowadays often used when discussing accidental or deliberate releases, it is the most appropriate one for this report and is adopted hereafter.

The range of hazardous agents that could be transported and dispersed in the atmosphere is extremely diverse in physical, chemical and even biological nature. But while individual agents have diverse characteristics, groups can be identified that share common features. One group of agents is formed by radioactive elements that emit alpha, beta or gamma radiation. These are called radionuclides. Some radionuclides are also fissile or fusible, which means that under certain circumstances they undergo nuclear fission or fusion reactions. Together, the radioactive (R) and nuclear (N) agents are termed radiological agents. Another group of agents is defined by chemical species that react chemically with each other or with the atmosphere. Finally, agents can also be living entities such as bacteria, rickettsiae, viruses or spores, which may be considered to behave as particles with sizes ranging from 0.1 µm (spores) to 10 µm (bacteria). The main difference in the behaviour of the Chemical, Biological, Radiological or Nuclear (CBRN) agents during atmospheric dispersion is the way in which their nature changes with time, i.e.:

- The decay chains and daughter products formed by R-N agents;
- The chemical reactions or transformations of C agents;
- The decay in lethality of B agents.
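The time-dependence of R-N agents noted above can be made concrete with the standard first-order decay law; the relations below are generic textbook expressions quoted for illustration, not formulas taken from this document:

\[
N(t) = N_0\, e^{-\lambda t}, \qquad \lambda = \frac{\ln 2}{T_{1/2}}, \qquad A(t) = \lambda\, N(t),
\]

where N is the number of atoms of the radionuclide, T_1/2 its half-life and A its activity (in Bq). For a long-lived radionuclide such as 137Cs (half-life of about 30 years) the activity is essentially constant over the minutes-to-hours time scale of a local-scale dispersion calculation, whereas short-lived daughters, reactive C agents and the loss of viability of B agents typically require an explicit loss term, often of the same exponential form, inside the dispersion model.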

2.2.4. Two categories of hazardous events

The conditions in which disastrous releases of agent may occur cover a large range of situations that cannot be described exhaustively. However, two categories may be clearly distinguished:

- Accidental events that occur at an industrial facility, or during the transport of a hazardous material:
  - For C releases, there is a history of numerous chemical accidents, some of which have had terrible environmental or sanitary consequences. Particular examples are Seveso, Italy (1976), Bhopal, India (1984), AZF Toulouse, France (2001), and Buncefield, UK (2005).
  - For R-N releases, there have been a number of major accidents with extreme consequences. These have included Three Mile Island, USA (1979), Chernobyl, Ukraine (1986), and Fukushima Dai-ichi, Japan (2011).
  - B releases can happen in laboratories working on living pathogenic materials, raising new bio-safety issues (e.g. Sverdlovsk, 1979). An example of a biological release in Norway was given in the introduction to this report.
- Activities with criminal intent involving the dispersion of CBRN agents using a Radiological Dispersal Device (RDD, or dirty bomb) or an Improvised Nuclear / Chemical / Biological Device (IND / ICD / IBD).

By definition, a RDD is a mixture of explosive and radioactive materials (possibly with a spectrum of radionuclides). In a RDD, radionuclides (e.g. 60Co or 137Cs) are dispersed into the atmosphere following detonation of the explosive (without any nuclear reaction). In an IND, the radionuclides are fissionable and the explosion initiates nuclear reactions to produce more devastating effects.

In other circumstances, accidental releases of CBRN agents may be associated with violent fires or explosions; this is referred to as the E threat or F threat. Both these processes:

- Have a significant influence on the initial phase of the atmospheric dispersion;
- Produce severe damage to built structures;
- May produce significant numbers of casualties, irrespective of any dispersal of agent.

It is evident that atmospheric dispersion is generally not simply the passive emission of substances into the air, but is associated with many other processes. However, while accidents and criminal activities are obviously different events, it is important to note that the atmospheric dispersion is similar no matter which CBRN agent is released.

2.2.5. Releases of gases or particles

An important property of an atmospheric release is the physical state, or phase, of the substances emitted into the air. This leads to the consideration of two major groups: gases and airborne particles, called aerosols. Gases and aerosols are present in the atmosphere independently of any emission following an accidental or deliberate release. The natural composition of the atmosphere is variable in space and time and is determined by complex chemical processes. The physical and chemical phenomena that occur in the air are complicated further when a hazardous emission adds gaseous or particulate species that interact chemically with the atmosphere (particularly near the source).

By definition, an aerosol is a colloidal suspension of fine solid particles or liquid droplets in a gas (in our case, the air). Aerosol particles play an important role in the precipitation process, providing the nuclei upon which condensation and freezing take place. They also participate in atmospheric chemical processes. Aerosol particles are characterised by their density and geometric dimensions, or by an equivalent aerodynamic diameter, which is a combination of the previous parameters. They are said to be monodisperse if they have a similar aerodynamic diameter, or polydisperse if they present a granulometric (i.e. size) spectrum. The diameters of aerosol particles range from a few nanometres to about 10 microns (µm).
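The equivalent aerodynamic diameter mentioned above can be written, under the common assumption of roughly spherical particles and neglecting shape-factor corrections, as

\[
d_a = d_g \sqrt{\frac{\rho_p}{\rho_0}},
\]

where d_g is the geometric (physical) diameter, rho_p the particle density and rho_0 a unit reference density of 1000 kg/m3. In other words, the aerodynamic diameter is the diameter of the unit-density sphere that settles at the same terminal velocity as the actual particle.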

The smallest particles generally follow the streamlines of laminar or turbulent flow and have turbulence properties similar to those of the air that carries them. The largest particles are subject to inertial effects and gravitational settling and exhibit a turbulence spectrum different from that of the air that carries them. The physics of aerosols is closely related to atmospheric transport and dispersion, but its complexity makes it a domain of study in itself. Further consideration is given to aerosols in sub-section 2.3.5.

2.2.6. Passive and non-passive releases

An important distinction between releases is how they are initially emitted into the atmosphere. In general terms, the main release categories are: passive releases, buoyant releases (due to density differences), releases with initial momentum (typically, a jet), flashing and/or evaporating releases, and chemically reactive releases. In practice, the conditions for a passive release are not generally satisfied, as a passive release means that the hazardous material is left in the carrier flow (the air) without producing any initial effect. More precisely, a passive emission implies that the release does not experience any change in its composition; does not interact with the carrier air flow in terms of ambient air mean velocity or turbulent properties; and does not exchange any heat with the surrounding ambient air (exothermic or endothermic transfer).

In the real world, there are many situations in which releases of hazardous materials are not dynamically, chemically and/or thermodynamically passive. Some examples are:

- Industrial accidents involving breaches in pressurised and liquefied gas containers and releases from storage containers (bottles, tanks and wagons) resulting in a jet. In these situations, the release may be a monophasic gas, emitted with the same velocity as the surrounding air flow. More often, it is multiphasic, with e.g. a thermodynamic flash (a mixture of gas and aerosol droplets) and an evaporating pool.
- A release accompanied by, or induced by, a fire or an explosion, in which case the initial release of energy associated with the atmospheric emission cannot be ignored. It is then necessary to take account of the initial energetic phase of the release before it becomes a passive one.
- A release that is buoyant or heavier than air, whose dispersion behaviour may be significantly different from that of a passive release, especially for heavy gases, which may lead to situation-specific consequences.

In these situations, after a first stage in which the releases strongly interact with the surrounding atmosphere, the emitted gases or particles are finally transported and dispersed in the air as tracers (i.e. passive releases). To account for the initial interaction, the alternative solutions are either to model the initial physical phenomena in detail, or, in a simpler and more operational way, to take account of them through an adapted source term, as explained in sub-section 2.3.2.
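A quick way to judge whether the passive-release assumption is even plausible is to compare the density of the released gas with that of ambient air. The sketch below uses the ideal gas law and is an illustrative screening check only; the tolerance threshold and the chlorine example are assumptions, not values from this document.

```python
R_GAS = 8.314          # J/(mol K), universal gas constant
M_AIR = 0.02896        # kg/mol, mean molar mass of dry air

def gas_density(molar_mass, temperature, pressure=101325.0):
    """Ideal-gas density in kg/m^3 (molar_mass in kg/mol, temperature in K)."""
    return pressure * molar_mass / (R_GAS * temperature)

def buoyancy_class(molar_mass, release_temp, ambient_temp=288.15, tol=0.05):
    """Crude screening of the initial behaviour of a gaseous release."""
    rho_release = gas_density(molar_mass, release_temp)
    rho_air = gas_density(M_AIR, ambient_temp)
    ratio = rho_release / rho_air
    if ratio > 1.0 + tol:
        return "dense (heavier than air)", ratio
    if ratio < 1.0 - tol:
        return "buoyant (lighter than air)", ratio
    return "approximately neutral / passive", ratio

# Example: cold chlorine vapour (M ~ 0.0709 kg/mol) released at 250 K
print(buoyancy_class(0.0709, 250.0))   # -> dense, density ratio ~ 2.8
```

Such a check says nothing about momentum, flashing or heat release, so in an operational tool it could only serve as a first filter before selecting a dense-gas, jet or passive source-term treatment.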

2.2.7. Threat summary

In conclusion, European societies face a large spectrum of potential threats connected to the release of hazardous agents into the atmosphere. These threats include the release of:

- Toxic chemical products, as gases or particles;
- Pathogenic biological entities;
- Radioactive or nuclear materials.

These CBRN threats might be realised as the result of accidents, or as a consequence of criminal or malevolent activities. Although these events are quite different in nature, effective emergency response to them requires similar atmospheric dispersion modelling and health impact assessment tools. Figure 3 attempts to illustrate the continuum of relative probabilities and magnitudes of the threats posed by the release of various hazardous materials.

Figure 3: The continuum of CBRN release probabilities and effects.

2.3. Modelling and operational challenges

2.3.1. Mapping the challenges

The CBRN threats described in Section 2.2 evolve through a complex atmospheric dispersion process that produces a distribution of agent that varies in space and time. This section describes the challenges that must be addressed to model these processes, and how they are magnified by the needs of emergency responders.

It is important to note at the outset that mathematical models may attempt to capture the physics of dispersion with various degrees of complexity. In addition, once a set of atmospheric dispersion equations has been derived, they are solved (in all but the simplest models) over a discretised space and time. Because of this discretisation, the choice of numerical algorithm may be crucial, as it may dictate the temporal and spatial precision of the solution. It should also be noted that the reliability of atmospheric dispersion computations is highly dependent on the quality of the input data relating to:

- The description of the source term (see sub-section 2.3.2);
- The meteorology at the scale of the event, which may often be determined from what happens at larger scales (see sub-section 2.3.4 and Chapter 4).

Finally, the output of the dispersion modelling of CBRN agents has to be translated into results that may be used for impact assessments, to meet the needs of first responders, public authorities and higher level decision makers. This is the other side of the challenge, beyond making local-scale dispersion modelling tractable within the time constraints of emergency response. It requires that the space and time distribution of the hazardous agent(s) be transformed into the appropriate radiological exposure, chemical dose or biological contamination information. It is important to note that similar concentrations of different agents at a given point may lead to extremely different health effects. These may range from immediate to delayed effects, and from no effect to lethal effect, depending on the toxicological properties of the particular CBRN agent. In other words, information about concentration is not of direct interest to emergency responders or decision makers; it has to be converted into practical expected health effects. The impact assessment is essential for identifying the potential consequences for the population, but also for identifying the risk to the emergency response or rescue teams involved in handling the crisis.

The real challenge in developing a computational tool is actually to meet the practical needs of the on-site operational teams and of the authorities having to make urgent and critical decisions. This is discussed further in sub-section 2.3.7.
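As an illustration of how a predicted concentration history is converted into an expected health effect for toxic chemicals, consequence-assessment practice commonly uses a toxic load and a probit relation of the following generic form; the constants a, b and n are substance-specific values taken from toxicological databases, not from this document:

\[
L = \int_0^{T} C(t)^{\,n}\, dt, \qquad \Pr = a + b \ln L,
\]

where C(t) is the concentration at the receptor, T the exposure duration, L the toxic load and Pr a probit value that maps onto a probability of a given effect (e.g. lethality) through the cumulative normal distribution. For radiological agents the analogous step is the conversion of time-integrated air concentration and deposition into dose via dose coefficients.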

The preceding paragraphs have shown that there are many challenges to meet in providing a decision-support system to assist in handling an emergency situation. But the importance of the quality and reliability of the dispersion modelling outputs in determining the quality and reliability of the health impact assessment has also been identified. This is why the focus of the Action is on dispersion modelling.

2.3.2. The source and source term

Source and source term are words with quite different definitions, depending on the communities in which they are used. For example, in fluid mechanics, source and sink terms describe the addition or removal of fluid material through the governing fluid dynamic equations. In this section, the use of these words is limited to practical considerations. Here the term source refers to the location from which the release originates, e.g.:

- A stack emitting gases or particles;
- A bag from which a powder is poured;
- A bottle, container, tank or item of process equipment which is suddenly opened or ruptured;
- A storage tank from which a leak occurs;
- A liquid pool from which a vapour evaporates;
- A storage container which explodes and instantaneously liberates a large amount of its content.

These examples show that the source geometry may vary greatly. It may range from a very small area that may be considered to be a single point, to a surface (e.g. a pool), or a volume. The nature of the source also depends on the adopted point of view. For example, the exhaust of a stack is a surface, but can be considered to be a point at a certain distance from it. Other considerations are that the height of the release may be important, and that a release may be made from a moving vehicle, creating a moving source.

The term source term is taken here to be defined by the following:

1. Nature: C, B or R agents;
2. Composition: e.g. a spectrum of radionuclides or a list of chemicals;
3. Quantity: expressed in various units and sub-units, depending on the nature of the particular agent (Bq, kg, etc.);
4. Kinetics: the distribution in time of the release, from its beginning to its end.

Points 1 and 2 designate the properties of the material contained in the source, while points 3 and 4 define the characteristics of the release. It should be noted that the kinetic parameters may be extremely diverse: the release may be short or long; consist of only one release or a series of successive puffs; or consist of an initially large emission followed by a long tail release, etc. It is also important to understand that the definitions of the source and the source term are closely connected to the dispersion modelling itself.
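The four-part definition above maps naturally onto a simple data structure. The sketch below is one possible illustrative representation; all class and field names are our own and are not taken from any existing emergency-response tool.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class SourceTerm:
    """Minimal description of a release, following the four elements above."""
    nature: str                               # "C", "B", "R" or "N"
    composition: Dict[str, float]             # species -> mass or activity fraction
    quantity: float                           # total amount released
    quantity_unit: str                        # e.g. "kg" or "Bq"
    kinetics: List[Tuple[float, float]]       # (time since start [s], release rate)
    location: Tuple[float, float, float] = (0.0, 0.0, 0.0)   # x, y, z of the source [m]

    def total_duration(self) -> float:
        """Duration of the emission, taken from the kinetics profile."""
        return self.kinetics[-1][0] if self.kinetics else 0.0

# Example: a short two-puff chlorine release (values purely illustrative)
st = SourceTerm(
    nature="C",
    composition={"chlorine": 1.0},
    quantity=500.0,
    quantity_unit="kg",
    kinetics=[(0.0, 5.0), (60.0, 0.0), (120.0, 3.0), (180.0, 0.0)],
)
print(st.total_duration())   # -> 180.0 s
```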

For example, consider an explosion and a fire:

1. The source term can be defined as the initial material before it burns or is blown up, releasing gases and particles into the air.
2. The source term can be seen as the result of the initial dispersion of the particles produced by the energetic event, taking account of the aerosol distribution and plume rise until the plume is stabilised and becomes a passive release.

In the first case, the source may be considered to be restricted to a point location. In the second case, the source term is distributed vertically and horizontally over a volume. The geometry and detailed composition of the source term must be pre-computed, as these parameters are inputs to the dispersion model. Depending on the initial physical state and content of the material, and on the strength of the fire or explosion, the nature and quantities of the generated gases and aerosols, the size distribution of the aerosol, etc. may be very different (as may, for example, the maximum altitude reached by the plume). This illustrates why modelling the initial dispersion of a hazardous plume, which may well be significant at the local scale, can become very complicated and generally requires special attention.

2.3.3. A complex chain of physical processes

The previous section described how the source term may become complex when the release is not passive. It also illustrated how the modelling and assumptions behind the source term definition become important inputs into the atmospheric dispersion simulation in complex situations (where the source is located inside a building or an industrial facility, for example). Meteorology is another critical but complicated input required by dispersion models. Determining it at the local scale, in complex built environments (an industrial site, an urban district or the inside of buildings), introduces substantial challenges, in terms of capturing and modelling the physics and of implementing local-scale meteorological modelling in simulation systems.

If a release is located inside a building, it might be necessary to compute the internal flows through the building (including the influence of the ventilation) and to couple this with an external flow computation. In the external environment, the local-scale flow should at least take account of the effects of the topography and of the obstacles (buildings), and possibly of moving vehicles. To be complete, the modelling would also account for vehicles inside structures, including cars in tunnels and train traffic in underground networks.

To make the process efficient, it may be necessary to model dispersion at various scales or precision levels. This might involve carrying out detailed computations near the source and less refined calculations at some distance from the source (where the impact assessment predicts low exposure levels).
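For the plume rise mentioned above in connection with fires and other buoyant releases, a widely used screening relation (generic, not specific to this document) is the Briggs formula for the transitional rise of a buoyant plume in near-neutral conditions:

\[
\Delta h(x) = \frac{1.6\, F_b^{1/3}\, x^{2/3}}{u}, \qquad
F_b = g\, v_s\, r_s^{2}\, \frac{T_s - T_a}{T_s},
\]

where x is the downwind distance, u the wind speed, and the buoyancy flux parameter F_b is built from the exit velocity v_s, source radius r_s, effluent temperature T_s and ambient temperature T_a. Such relations give only a rough estimate of the effective release height; for explosions and intense fires, dedicated source-term models of the kind discussed above are needed.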

The evolution of the agents with time requires specific attention in modelling, to account for: changes in phase, aerosol genesis and evolution, chemical reactions, radioactive decay and the formation of daughter products, and the degradation of biological agents that takes place as a result of the meteorological or other environmental conditions. A further modelling requirement is to account for the deposition of hazardous materials. Deposition has two forms: dry deposition, which occurs in all weathers, and wet deposition, which is associated with precipitation. Deposition occurs not only on the ground and vegetation, but more generally on all accessible horizontal or vertical surfaces, including building roofs and façades. A complication is that deposition depends on both the nature of the airborne gases or particles and the surface materials. A final consideration is that previously deposited particles may be re-suspended, forming a further source term for dispersion modelling.

2.3.4. The meteorological data

The air flows in the atmosphere and within buildings obey the same laws of fluid mechanics and thermodynamics. These laws are expressed by the mass, momentum and energy conservation equations, and a supplementary equation of state for the particular fluid. The atmospheric flow is extremely complex, as it comprises multiple species with multiple phases and is compressible, unsteady and turbulent. The general equations for fluid flow, involving all the terms associated with all physical phenomena, are applicable whatever the problem, but they are complex and it is not usually feasible to solve them. The degree to which the general equations may be simplified and approximated for solution, by making reasonable and acceptable assumptions, is closely dependent on the characteristic scales of the problem.

The unsteady, stochastic nature of the atmosphere means that all its physical properties (pressure, temperature, velocity components, etc.) intrinsically have a degree of unpredictability associated with them. This means that it is not possible to ascertain the true state of the atmosphere at any particular time; Chapter 3 will explain how the general fluid flow equations are averaged to obtain a statistical description of the mean atmospheric flow and to characterise its turbulence.

Meteorological data are an essential input to any dispersion modelling activity. They may be obtained either from measurements or from calculations, each with challenging requirements:

- If measurements (also called observations) are chosen, they must be representative of the site where the release occurs and the plume of hazardous material disperses. The number of measurements necessary to achieve this depends on the extent of the simulation domain; the more measurements available, the better. Measurements should be provided as vertical profiles from ground stations.

The more measurements available, the better. Measurements should be provided as vertical profiles from ground stations.
- If computations are carried out, it is important to check that, depending on the scale of the problem, an appropriate level of detail for the site is taken into account in the modelling. For example, if one needs the distribution of a chemical inside an industrial site or an urban district, it is advisable not to neglect the effects of buildings on the atmospheric flow and dispersion.

Agents released into the atmosphere are advected by the wind. They disperse due to the effect of turbulence, which originates from the wind shear (i.e. the velocity gradient produced as a result of friction at the ground) and from the thermal stratification of the atmosphere. They are then cleared from the air by dry and wet deposition. Representing these processes requires that the meteorological data provided for characterising the dispersion conditions within the area comprise the following as a minimum:
- Transport: mean wind speed and direction;
- Dispersion: turbulence-related quantities derived from e.g. the temperature gradient (difference of temperature between two altitudes) and/or solar radiation and/or cloud cover;
- Deposition: precipitation type (rain, snow, ...) and rate of precipitation.

The physics of aerosols

Many agent releases will result in particles which join the background atmospheric aerosol. The components of the background aerosol have many characteristics and undergo complex physical processes. Some elements of the aerosol are described here, as it is necessary to take them into account in dispersion modelling.

An aerosol component can take numerous and varied forms, structures and sizes, depending on its origin and on the chemical, physical and thermodynamic transformations that occur during atmospheric transport from its source to the place where it settles. The size of particles in the atmospheric aerosol covers five orders of magnitude, varying from a few nanometres to some tens of microns. In terms of mass concentration (particle mass per unit volume of air), the aerosol also covers a very large range, increasing by several orders of magnitude from ultra-clean rooms, through the ambient atmospheric aerosol and stack releases, to the plume above a fire. The physical processes governing the evolution and transfer mechanisms within the background aerosol vary with particle size. Large particles are well described by the classical laws of continuum physics; small particles are in the molecular domain and follow gas kinetic theory; intermediate-size particles are described by either of these two approaches as they tend towards their respective limits. The atmospheric aerosol is often represented by a three-mode distribution. The existence and prevalence of the modes depend on the main sources and on the chemical ageing and microphysics of the aerosol.
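A common mathematical representation of such a multi-modal size distribution, given here only as the standard textbook form (e.g. Seinfeld and Pandis, 1998) rather than as a requirement on any particular model, is a sum of lognormal modes:

$$ n(\ln d_p) = \sum_{i} \frac{N_i}{\sqrt{2\pi}\,\ln \sigma_{g,i}} \exp\!\left[-\frac{\left(\ln d_p - \ln d_{g,i}\right)^2}{2 \ln^2 \sigma_{g,i}}\right], $$

where, for each mode $i$, $N_i$ is the number concentration, $d_{g,i}$ the geometric mean diameter and $\sigma_{g,i}$ the geometric standard deviation.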

The so-called coarse mode corresponds to particles whose size is equal to or greater than 2.5 µm, the accumulation mode corresponds to particles in the range 0.08 to 2.5 µm, and the nucleation mode to particles with diameters less than 0.08 µm. The accumulation and nucleation modes constitute the fine and ultra-fine aerosols.

Dry deposition and wet deposition are processes leading to a decrease of the particle population in an aerosol; they are sinks for the atmospheric aerosol. Dry deposition has two forms:
- settling due to gravity, or sedimentation (efficient for large particles);
- diffusion, i.e. a particle flux generated by a concentration gradient (efficient for small particles).

The dry deposition velocity is smallest for particles of intermediate size, as the deposition mechanisms are most effective for either very small or very large particles. Very large particles settle out quickly through sedimentation or impaction processes, while Brownian diffusion has the greatest influence on very small particles, which readily coagulate until they reach a diameter of about 0.3 µm. Figure 4 illustrates the shape of the deposition velocity curve according to the aerodynamic diameter of the particles, in different flows. Particles of both 10 nm and 10 µm diameter have comparatively high deposition velocities, while the minimum velocity is obtained in the transition zone, for particles between 0.1 and 1 µm, as can be observed in Fig. 4.

Figure 4: Deposition velocity (in cm s⁻¹) of particles as a function of their aerodynamic diameter (in µm), for different flow conditions characterised by the friction velocity u*, the roughness length z₀ and the near-surface wind speed (data of Sehmel, 1980, and Moller and Schumann, 1970). Adapted from Seinfeld and Pandis (1998).
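For the gravitational-settling branch of the curve in Figure 4, a first estimate is often obtained from the Stokes settling velocity with the Cunningham slip correction. The short sketch below is purely illustrative; the particle density and the other numerical values are assumptions and are not taken from this document.

```python
import math

def settling_velocity(d_p, rho_p=1000.0, mu=1.8e-5, mfp=6.5e-8, g=9.81):
    """Stokes settling velocity (m/s) of a sphere of diameter d_p (m) in air.

    rho_p : particle density (kg/m^3), assumed value
    mu    : dynamic viscosity of air (Pa s)
    mfp   : mean free path of air molecules (m)
    Valid for small particle Reynolds numbers (roughly d_p below ~20 micron).
    """
    # Cunningham slip correction, important for sub-micron particles
    kn = 2.0 * mfp / d_p                      # Knudsen number
    cc = 1.0 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))
    return rho_p * d_p**2 * g * cc / (18.0 * mu)

if __name__ == "__main__":
    for d in (0.1e-6, 1e-6, 10e-6):           # 0.1, 1 and 10 micron
        print(f"d = {d*1e6:4.1f} um  ->  v_s = {settling_velocity(d):.2e} m/s")
```

Such an estimate covers only the sedimentation contribution; near the minimum of the deposition-velocity curve, Brownian diffusion and turbulent transfer dominate and a fuller, resistance-based treatment is required.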

Wet deposition has two main forms:
- rainout, i.e. removal of aerosol acting as cloud condensation nuclei (aerosol particles act as nuclei for the condensation of cloud droplets; some of these droplets grow so large that they fall to the surface as rain drops);
- washout, i.e. removal of aerosol by droplets below the cloud (aerosol is incorporated into an already existing drop beneath the cloud, and that drop grows large enough to fall as rain).

The calculation of an impact assessment for particles differs from that for gases, especially when evaluating radiological exposure. This is because when particles are inhaled with the ambient air, many of them settle in the respiratory tract, depending on their size, density, shape, charge and surface properties, and on the breathing pattern of the individual. They penetrate to different depths in the respiratory tract. From the toxicological point of view, only particles smaller than 10 µm in diameter have the potential to be biologically active in susceptible individuals.

The challenge of the computation time

The acceptable duration of computations places significant restrictions on the atmospheric dispersion modelling and simulation techniques that may be employed during an emergency, or for developing an understanding of the dispersion in specific situations. For computations outside of an emergency situation, for example to better understand a past event or to prepare intervention plans for rescue teams, the time taken for computations is not the greatest concern, and detailed phenomenological simulations can be favoured. But since a benefit of numerical simulation is that it provides the opportunity to perform parametric or ensemble studies that could not otherwise be carried out, the CPU time per case should still be reasonable. For computations made during the course of an atmospheric dispersion emergency, it is extremely important for the credibility of the method that results are available very rapidly. The ideal would be to have both the most validated, most refined and most detailed physics and modelling and the shortest response time. In the real world, a compromise between precision and simulation duration has to be accepted. In this context, it is interesting to note that continuously and automatically simulating the atmospheric, inner-city or even inner-building flow could be part of the solution. In this situation, if an accidental or malevolent dispersion of CBRN agents occurs in a domain for which the flow simulation is already available, one is ready to compute the atmospheric dispersion and the toxicological or radiological health assessment directly. This strategy should be feasible within the typical time allowed to produce a dispersion simulation.

Finally, the impressive advances made in High Performance Computing (HPC), and the associated development of parallel versions of most fluid mechanics codes, are a critical factor in meeting the challenges in atmospheric dispersion. It is only through access to increasing computational capabilities that advanced physical models can be used in emergency response situations.

The challenge to build an operational computational tool

Atmospheric dispersion modelling forms an essential part of a complete decision-support system, as a precise and reliable impact assessment for hazardous materials releases essentially depends on the dispersion model output. A computational tool devoted to dispersion and impact evaluation should desirably fulfil the following challenging requirements:
- It should be user-friendly for a broad range of users, e.g. an operator belonging to the rescue services or a dispersion expert urgently requested to evaluate an emergency situation.
- It should include three essential modules dedicated to: input data (meteorology and source term), dispersion computation, and health consequences assessment.
- It should be thoroughly verified and validated against experimental results obtained in wind tunnels or from field trials.
- It should be fully documented:
  - All the methods and equations the computational tool is based on should be presented in the reference manual. This should also indicate the possible range of use and the limitations of the code.
  - The implementation, the computational details and the installation of the tool should be described in a dedicated document.
  - An exhaustive user guide explaining how to use the computational tool should be made available, possibly with a full version and a short version containing the main applicable guidelines. It is essential that there is a thorough explanation of the simulation outputs, i.e. the interpretation of the graphics and the meaning of the numbers.
- It should give a response in a time consistent with an emergency situation, providing quick and precise dispersion computations and output adapted to the needs of rescue teams and decision makers.

The real challenge is to obtain all of this together.

In conclusion, all countries have to face various threats from releases of hazardous materials, many of them involving atmospheric CBRN releases. Whatever accidental or malevolent origin these releases may have, predicting the dispersion of hazardous materials leads to numerous modelling, computational and operational challenges. The most prominent of these challenges is to provide rescue teams and decision makers with reliable, precise and quick answers in response to requests for information, to help them at any stage before a crisis (preparedness), during a crisis (counter-measures) or after the crisis (experience feedback).

At present, often only the most elementary dispersion calculations can be performed during an emergency. The Action will try to identify and design more realistic, reliable and informative systems for crisis responders. This intent is illustrated by Figure 5, in which the same hypothetical radiological release at Bastille square in Paris is modelled, on the left with a Gaussian model neglecting the presence of the buildings, and on the right with an advanced Lagrangian model taking account of the buildings. The Gaussian model requires only the wind speed and direction and information on the atmospheric stratification, and provides results almost instantaneously. In contrast, the Lagrangian model applied in this example utilized a continuously and automatically calculated prediction of the local wind flow in the urban district. While the time taken to compute the advanced solution depends on the number of Lagrangian particles released and the computational resources available, the dispersion and impact assessment results may be obtained within 5 minutes.

Figure 5: Total Effective Dose Equivalent (in mSv) resulting from the atmospheric dispersion of a radiological threat agent (3 TBq of ¹³⁷Cs), as seen by a simple model and by a more comprehensive model taking the buildings into account (Armand et al., 2011).
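For reference, the simple model on the left of Figure 5 is of the Gaussian type. For a continuous point source over flat, unobstructed terrain, the classical Gaussian plume expression (the generic textbook form, not a description of the specific code used in the figure) reads:

$$ C(x,y,z) = \frac{Q}{2\pi\, \bar{u}\, \sigma_y \sigma_z}\, \exp\!\left(-\frac{y^2}{2\sigma_y^2}\right) \left[\exp\!\left(-\frac{(z-H)^2}{2\sigma_z^2}\right) + \exp\!\left(-\frac{(z+H)^2}{2\sigma_z^2}\right)\right], $$

where $Q$ is the emission rate, $\bar{u}$ the mean wind speed, $H$ the effective release height and $\sigma_y(x)$, $\sigma_z(x)$ dispersion coefficients depending on downwind distance and atmospheric stability. The modest input requirements are evident, as is the absence of any building effect, which is precisely the limitation illustrated in Figure 5.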

Finally, Table 1 gives an overview of typical scenarios which should be dealt with in the scope of this Action. It encompasses various situations, consisting of accidents and malevolent actions, that are characterized by the release of potentially hazardous CBRN agents. Each scenario is described in terms of the location, the agent class (C: chemical, R: radiological/nuclear, B: biological), the likely source term, the required computation domain and the expected deliverables.

Scenario: Hazardous chemical materials released following a transport accident
- Location: country-side road, city streets, railway station
- Agent class and source term: C; probably a puff or short continuous release
- Computation domain: local to regional, strongly depending on the quantity and emission mode

Scenario: Industrial incident (e.g. refinery); the release does not exceed the limits of the site
- Location: industrial districts
- Agent class and source term: C; puff or continuous release
- Computation domain: local

Scenario: Explosion in a chemical factory
- Location: industrial districts
- Agent class and source term: C; most probably a puff release; if a fire develops, a continuous release is possible
- Computation domain: local to regional, depending on the quantity and emission mode

Scenario: Chemical (malevolent) dispersion in a railway / metro station / wagon
- Location: railway / metro station or line
- Agent class and source term: C; puff or short release; the specific chemicals unknown at the beginning
- Computation domain: local

Deliverables for the chemical scenarios: chemical dosage; contaminated area; population exposed.

Scenario: Radioactive materials released following an accident at a Nuclear Power Plant (NPP); the release is not contained within the plant's borders
- Location: outside urban areas, in the vicinity of water sources
- Agent class and source term: R; probably a continuous release lasting several hours
- Computation domain: local to global scale, depending on the specific event
- Deliverables: concentrations on the ground and in the air; concentrations in the food chain; expected exposure of the population

Scenario: Radiological Dispersal Device or "dirty bomb"
- Location: populated area (business or commercial districts)
- Agent class and source term: R; puff release; the specific radionuclides unknown at the beginning
- Computation domain: mainly local
- Deliverables: recommendations for sheltering or evacuation; clean-up recommendations for the contaminated area

Scenario: Accident at a biotech company / institution
- Location: industrial / hi-tech districts
- Agent class and source term: B; puff release
- Computation domain: mainly local

Scenario: Malevolent spraying of a biological agent (e.g. by an Unmanned Aerial Vehicle)
- Location: a city
- Agent class and source term: B; short to continuous release; the specific bio-agents unknown at the beginning
- Computation domain: local to regional, depending on the affected area

Scenario: Silent (malevolent) dispersal of a biological agent
- Location: populated area (business or commercial districts)
- Agent class and source term: B; short to continuous release; the specific bio-agents unknown at the beginning
- Computation domain: local to regional, depending on the affected area

Deliverables for the biological scenarios: contaminated areas; expected morbidity; possible contagion.

Table 1: Characteristic real past situations or assumed events accompanied by potentially hazardous CBRN releases and their dispersion in the atmosphere.

References

Armand, P., Duchenne, C., Oldrini, O., Olry, C., Moussafir, J. (2011): Application of PMSS, the parallel version of MSS, to the micro-meteorological flow field and deleterious dispersion inside an extended simulation domain covering the whole Paris area. 14th International Conference on Harmonisation within Atmospheric Dispersion Modelling for Regulatory Purposes, HARMO'14, Oct. 2-6, Kos, Greece.

Seinfeld, J.H., Pandis, S.N. (1998): Atmospheric Chemistry and Physics. Wiley-Interscience.

3. Approaches and tools

3.1. Concepts of tools and models

This chapter is devoted to reviewing and categorizing the existing tools and computational models applied during the management of emergencies involving the release of hazardous substances into the air. It is emphasised that the present chapter is not intended to provide an inventory of models, as this will be an output of the Action. The review will focus on the computational approaches adopted to: (a) simulate the dispersion of hazardous materials in the air within short distances of the emitting source, and (b) calculate the flow field (wind and turbulence) and the other meteorological data required by (a).

The review makes a distinction between tools and models as follows:
- The word tool is used to describe a complete system that is used by competent people or authorities to take decisions during the management of an emergency situation caused by the release of hazardous substances into the atmosphere. Tools usually consist of subsystems (or modules) that include: graphical user interfaces, databases, geographical information systems, diagnostic or prognostic meteorological models, atmospheric dispersion models, consequence assessment models, countermeasures models, and post-processing and presentation facilities.
- The word model is used to describe the different computational approaches or codes that form components of the tools. Numerical models may provide descriptions of the atmospheric flow and dispersion, for example, but are only components of integrated tools or complete systems.

Two examples of comprehensive tools that originated in the US are the CAMEO system, for handling chemical releases that result in toxic gas dispersions, fires or explosions, and the HPAC system, for handling chemical, biological, radiological and nuclear material releases. Similar tools developed in Europe to assist in nuclear emergencies include the RODOS and ARGOS systems, the CERES system in France, and the DISMA tool in Germany.

The next part of the review focuses on the different types of atmospheric flow and dispersion models available. The details of complete tools are not discussed.

3.2. Short-range and local scale modelling

The use of the term "short-range" is closely connected to the notion of length scales used in the field of fluid dynamics, and consequently to details of the meteorology and the dispersion process. From the physical point of view, the term short-range is related to the spatial (and therefore also the temporal) extent of the different fluid motions, while from the modelling point of view it is related to the motions that a specific model actually resolves or averages. In practice the term short range may include distances from the immediate vicinity of the release up to a maximum radius of a few kilometres (e.g. 10 km). In terms of the spatial scales that are of interest in predicting flow and dispersion in urban areas, short-range includes the building/street and neighbourhood scales (Britter and Hanna, 2003) and extends into the lower meteorological meso-scale (see Figure 6). The term "short range" is mostly used in describing dispersion, and is considered equivalent to the terms "local scale" or "near field" when discussing flow phenomena. These are the meanings assumed during the rest of the review.

Figure 6: Scales of atmospheric phenomena (from turbulence, thermals and building wakes up to fronts, cyclones and the global circulation) and the way they are accounted for in meso-scale and micro-scale models, either directly simulated, parametrised or imposed through boundary values. Adapted from Schlünzen et al. (2011).

The modelling of short-range atmospheric dispersion of hazardous pollutants is mainly of interest for assessing the consequences of releases that have occurred inside urban environments or industrial plants, close to the release point.

However, there is also an interest in predicting the short-range dispersion of hazardous pollutants inside enclosed spaces such as buildings, offices, shopping malls, airports, railway stations and metro stations, either due to indoor emissions or due to the infiltration of hazardous substances from the outside air. This review will mainly focus on outdoor releases and their subsequent dispersion.

Special attention is given to the atmospheric dispersion of hazardous pollutants in urban environments, due to the large numbers of people that may need to be protected. In addition, the urban environment is characterised by buildings (i.e. vertical surfaces with varying geometric characteristics) and street canyons that introduce complex mechanical and thermal forcing to the wind flow at the local scale. This produces a range of flow features that affect the local-scale dispersion, including separation and stagnation zones around buildings, turbulent wakes and vortices, interacting wake regions from neighbouring structures, and street canyon channelling. The analysis of the meteorology, local air flow and dispersion of material from localised sources in urban areas, over short and long time periods, is a prerequisite for emergency management. Extensive research and development has therefore been, and still is, undertaken in the field of urban pollution dispersion, including both physical (wind tunnel and field trial) studies and mathematical modelling. It is recognised (e.g. U.S. Government Accountability Office, 2008) that research still needs to be performed to formulate accurate and computationally fast approaches for predicting dispersion in urban areas that are also suitable for use by first responders, especially during the early stages of an emergency.

3.3. Flow models and dispersion models

The current approaches for modelling or estimating the dispersion of a pollutant range from empirical rule-of-thumb methods to complex mathematical models. The simplest rule-of-thumb method is to define an angle to the right and left of a wind vector passing through the emitting source, delineating an area downwind of the source where measures should be considered; a minimal illustrative sketch of this sector approach is given at the end of this section. Beyond such empirical methods are the mathematical atmospheric flow and dispersion models that are the subject of the present chapter.

An important issue that must be considered is whether there is an interaction between the existing atmospheric flow and the hazardous substance that is released into it. Two distinct cases can be identified:
1. The released substance does not influence the existing atmospheric flow. This happens when, for example, the released substance is a gas of similar molecular weight to that of ambient air, or when it is gaseous or particulate but in such a quantity that it does not produce a change of density when it is mixed with the ambient air. This is also the case when the temperature of the released substance is similar to that of the ambient air and when no phase changes occur. In such cases the released substance is considered to be passive, and its dispersion is fully controlled by the ambient flow.

The majority of real-world releases fall into this category, at least beyond a distance downwind of the source at which the cloud or plume has been adequately diluted to behave passively. In these cases the computations of flow and dispersion can be decoupled. This allows different models to be used sequentially for the flow modelling and the dispersion modelling, i.e. atmospheric flow models may be run before the dispersion modelling is considered.
2. The released substance influences the existing atmospheric flow. This happens when the released substance has a molecular weight different to that of the ambient air, or when its quantity and/or release conditions (temperature, pressure, phase changes) produce changes in the effective density of the ambient air. In this case the ambient flow conditions (both mean and turbulent quantities) are affected by the presence of the dispersing substance, and the computation of flow and dispersion should be performed simultaneously. Only the most advanced models, such as Computational Fluid Dynamics (CFD) or micro-meteorological models, implement a coupled framework for computing flow and dispersion, i.e. a single model computes both processes simultaneously and the interaction between the released substance and the flow is taken into account.

An additional point to be considered is that the complexity of the dispersion process is highly dependent on the complexity of the flow. Models that simulate the fate of hazardous releases therefore have to be complex when the flow is complex. In a uniform flow without obstacles over homogeneous terrain, the calculation of transport is almost trivial and the dispersion process is simple. In these situations simple rule-of-thumb models and analytic Gaussian models may be applied. When the flow and dispersion calculations can be decoupled, there are various types of flow models that can be used. The following sections therefore present proposed classifications for flow models and for dispersion models.
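Before turning to these classifications, the sector rule of thumb referred to at the beginning of this section can be sketched in a few lines of code. The example below is purely illustrative: the half-angle of 30 degrees, the maximum range and all other values are assumptions made for the sketch, not recommendations of this document.

```python
import math

def in_downwind_sector(src, rec, wind_dir_deg, half_angle_deg=30.0, max_range_m=2000.0):
    """Return True if receptor 'rec' lies in the downwind sector of source 'src'.

    src, rec       : (x, y) coordinates in metres (x east, y north)
    wind_dir_deg   : direction the wind blows FROM, in degrees (meteorological convention)
    half_angle_deg : assumed half-opening of the hazard sector
    max_range_m    : assumed maximum downwind extent of the sector
    """
    dx, dy = rec[0] - src[0], rec[1] - src[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0 or dist > max_range_m:
        return dist == 0.0          # the source point itself is "affected"; beyond range is not
    # Unit vector pointing downwind (i.e. where the wind blows TO)
    to_rad = math.radians((wind_dir_deg + 180.0) % 360.0)
    ux, uy = math.sin(to_rad), math.cos(to_rad)
    # Angle between the downwind direction and the source-to-receptor vector
    cos_angle = (dx * ux + dy * uy) / dist
    return cos_angle >= math.cos(math.radians(half_angle_deg))

# Example: wind from the west (270 deg); a receptor 500 m east of the source is downwind
print(in_downwind_sector((0.0, 0.0), (500.0, 0.0), wind_dir_deg=270.0))  # True
```

Despite its crudeness, such a sector delineation is sometimes all that can be produced in the very first minutes of an event, before any model output is available.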

3.4. Proposed classifications for flow models

An atmospheric flow model answers the question: what are the wind, temperature, humidity, turbulence and solar radiation conditions that will govern the transport, diffusion and physical transformations of material released into the atmosphere? It provides a representation of the flow conditions (wind vector, temperature, humidity, turbulence, solar radiation) over a given domain (generally a three-dimensional domain in which the dispersion will be studied), with or without taking into account the complexities of this domain (complex terrain, variable roughness, the presence of buildings, trees, etc.).

Classification of flow models by dimension

To clarify what is meant by this, a number of types of flow models are defined as a function of their physical dimension:
- Uniform flow model: all variables (e.g. wind velocity, temperature, turbulence variables) are uniform. This is the solution used by most rule-of-thumb methods and by the simplest dispersion models.
- Profile flow model: vertical profiles of the above variables are used, and the vertical variation is specified through prescribed universal functions or computed using a one-dimensional numerical model. The flow variables are considered to be uniform in any horizontal plane. This solution is used quite often, sometimes implicitly, in simple and more complex dispersion models when predicting dispersion over flat terrain.
- Three-dimensional flow model: whatever the method used to compute them, the flow variables are ultimately given on a 3D grid to describe the variation both with height and in the horizontal plane; the variables are all 3D fields. There are several types of models able to produce 3D representations of the flow field. These include crude analytical functions, very elaborate methods based on solving the conservation equations, and other methods of intermediate complexity based on solving simplified or linearized conservation equations. These flow models need to be used in cases where the dispersion takes place over complex topography or in built-up areas.

The above classification helps to identify which flow model can be used as an input to different dispersion models. Some dispersion models will be compatible with 3D flow models, others will not. Conversely, several dispersion models of different types may use the output of the same 3D flow model to drive their simulation of the dispersion processes.

Classification of flow models by the type of equations resolved

The most advanced 3D flow models solve the conservation equations of fluid dynamics (fluid mechanics). These equations are complex (as explained in chapter 2), and their solution is highly dependent on the CPU power available. These models are either Computational Fluid Dynamics (CFD) or micro-meteorological prognostic models, and several well-known models of this type have been developed. The capabilities of these models should not hide the fact that the range of turbulence models governing their behaviour remains a very active area of research, and that their applicability must be assessed for each specific problem. Another important issue to be considered is that CFD or micro-meteorological models are driven by the initial and boundary conditions of their computational domain, and, to reflect reality, the boundary conditions should vary with time.
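A simple and widely used way of prescribing such inflow conditions, and the typical content of the "profile flow model" category defined above, is a surface-layer similarity profile of the mean wind. In its neutral form (quoted only as an illustration) it reads:

$$ u(z) = \frac{u_*}{\kappa}\, \ln\!\left(\frac{z}{z_0}\right), $$

where $u_*$ is the friction velocity, $\kappa \approx 0.4$ the von Kármán constant and $z_0$ the aerodynamic roughness length; stability corrections and a time variation of $u_*$ can be added when such a profile is used as a boundary condition.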

The most advanced and physically correct approach to defining the initial and boundary conditions is through nesting or downscaling from larger-scale meteorological models. Alternatively, the boundary conditions can be defined by the user. A further, recently developed approach is to use advanced mathematical techniques that allow the assimilation of available meteorological measurements into the CFD or micro-meteorological models, to improve the quality of their prognoses.

Another class of 3D flow models are the diagnostic meteorological models, which use meteorological measurements from a range of locations and heights. These interpolate (or extrapolate) the data to the computational grid of the atmospheric dispersion model. The same interpolation (or extrapolation) technique can also be applied to gridded results of prognostic meteorological models that need to be transferred to the different computational grid of the dispersion model. It is also possible to combine both (i.e. meteorological measurements and the results of prognostic models) in an optimal way, using data assimilation methods. In other words, the diagnostic models produce snapshots of gridded meteorological fields based on measurements, prognostic results, or both. In addition to the interpolation step, diagnostic meteorological models may include mass-consistent wind flow models that adjust the calculated wind field to satisfy the mass-continuity constraint for the specific topography or geometry. This process accounts for the presence of obstacles through parameterisations that define influence zones around them. In addition to the mass conservation constraint, some mass-consistent wind flow models include a simplified form of the energy conservation equation or other physical constraints; in this respect they could be considered to be simplified CFD or prognostic models.

In conclusion, the user can select from a hierarchy of flow models based on (a) the geometrical complexity of the case in hand, and (b) the requirements for and availability of computational resources (CPU time and hardware).

3.5. Proposed classification for dispersion models

The main categorisation of atmospheric dispersion models (ADM) is based on the frame of reference that describes the dispersion process. There are two such reference frames: Eulerian and Lagrangian. In the Eulerian frame the pollutant dispersion is expressed through the conservation equation for the pollutant quantity, which is numerically solved on a computational mesh that discretises the flow domain in a fixed ("Earth-based") coordinate system. In the Lagrangian frame the pollutant dispersion is expressed through the movement of fluid parcels, each of which initially contains a specific quantity of pollutant. The movement (and possibly the deformation) of the parcels is controlled by the underlying flow variables (i.e. the mean velocity and turbulence).

Models adopt two types of averaging of the conservation equations that describe the flow and dispersion:
- ensemble averaging (or, equivalently, time averaging);
- spatial averaging.

The averaging is necessary because the turbulent atmospheric flow contains eddies characterised by a wide range of spatial and temporal scales, and current computational resources impose practical limits on the range of scales that can be resolved. The limit on the range of scales resolved by a given model is set by the type and degree of averaging that is performed on the basic Eulerian flow and dispersion equations to derive the final model equations. The averaging therefore defines the model's complexity, the complexity of the meteorological data needed by the model, and also the interpretation of the model's results.

Ensemble averaging transforms the equations from a set describing a single episode of a turbulent dispersion problem to one describing the average of a large number of episodes (formally called realisations) of the problem (U.S. NRC, 2003). The oldest and simplest ensemble-averaged ADMs are the Gaussian plume models. Gaussian plume models were derived to predict the dispersion of continuous releases over flat terrain driven by a constant wind velocity. They assume that the pollutant concentration downwind of the source (averaged over a large number of realisations of the given dispersion problem) has a Gaussian, or normal, distribution in the vertical and lateral directions. The amplitude and width of the distributions are determined analytically from the rate of emission, the mean wind speed and direction, the atmospheric stability, the release height and the distance from the release. Gaussian plume ADMs require very limited meteorological input data: wind speed and direction at a single location and some measure of atmospheric stability. There are also Gaussian models for short-duration ("instantaneous") releases that describe the dispersion in the form of a Gaussian puff in an ensemble-averaged wind field.

The most complex ensemble-averaged models are those based on the Reynolds-Averaged Navier-Stokes (RANS) equations. These models numerically solve the momentum, species and energy conservation equations by integrating them on a 3D grid. Turbulence is parameterized by several different sub-models, ranging from semi-empirical relationships to those based on Reynolds stresses calculated by solving additional conservation equations. Similar parameterizations are used for the turbulent fluxes in the species and energy equations; all these parameterisations calculate ensemble-averaged turbulent fluxes. RANS models have the ability to simulate flow and dispersion in complicated geometrical configurations. Usually flow and dispersion are calculated simultaneously, which enables RANS models to simulate the dispersion of non-passive pollutants (e.g. buoyant or multi-phase materials). The input required by RANS models includes, besides a detailed geometrical description of the case, detailed initial and boundary conditions for the computational domain. RANS models have a broad range of applicability, but require substantial computational resources.
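To make the nature of these ensemble-averaged models more concrete, the species conservation equation they solve can be written, under the common gradient-diffusion (eddy-diffusivity) closure, in the generic form (not the formulation of any specific code):

$$ \frac{\partial \bar{c}}{\partial t} + \bar{u}_j \frac{\partial \bar{c}}{\partial x_j} = \frac{\partial}{\partial x_j}\!\left[\left(D + \frac{\nu_t}{Sc_t}\right)\frac{\partial \bar{c}}{\partial x_j}\right] + S, $$

where $\bar{c}$ is the ensemble-mean concentration, $\bar{u}_j$ the mean velocity components, $D$ the molecular diffusivity, $\nu_t$ the eddy viscosity supplied by the turbulence model, $Sc_t$ a turbulent Schmidt number and $S$ the source term; deposition, decay and chemical transformation enter as additional source or sink terms.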

If the dispersing substance is passive, a RANS model can be used to solve only the species conservation equation(s), using a previously calculated input flow field. The wind field may be provided by a diagnostic or prognostic meteorological model and may be constant or variable in space and time. The advantage of this approach is that less computational time is required. A potential disadvantage arises if the flow field is variable in time and the dispersion calculation (i.e. the solution of the species conservation equation) is performed with a different, usually larger, time step than the time variation of the wind field, as this will result in a loss of accuracy.

Spatial averaging of the conservation equations produces an equation set that describes a coarser-grained version of a dispersion realisation. The fields resulting from the solution of the equations retain their turbulent character at scales larger than that of the spatial averaging. The most familiar examples of spatially-averaged Eulerian models are the Large Eddy Simulation (LES) models. These models numerically solve the 3D, time-dependent turbulent flow using a spatial resolution sufficient to resolve the largest turbulent eddies; sub-grid scale turbulence is parameterised by semi-empirical relations. The output of spatially-averaged dispersion models should be interpreted as one of the possible solutions under the specified conditions. LES models require very fine spatial resolution, which increases both the computational resources and the execution time required. For urban-scale flows the spatial resolution used in practical applications is of the order of 10 metres or less; this is sometimes called coarse or under-resolved LES. However, the application of LES modelling is constantly increasing with the advances in computer technology.

Lagrangian ADMs can be further sub-divided into Lagrangian puff and Lagrangian particle models. Both puffs and particles initially contain a specific amount of pollutant when they are emitted from the source. Lagrangian puff models describe dispersion as a series of puffs which, after their emission from the source, are transported by the mean wind velocity. The puffs then grow in size depending on the atmospheric stability conditions and their travel time. The pollutant concentration distribution inside each puff follows a prescribed function, usually Gaussian or uniform. Concentration is calculated at specified locations in the domain by adding the contributions from all the puffs present in the domain; this is known as the kernel method for calculating concentration. Some Lagrangian puff models add a random component to the puff movements, either in the displacement or in the wind velocity, to further simulate turbulent diffusion. Lagrangian particle models describe dispersion through the movement of a large (usually very large) number of virtual particles (of the order of 10⁴ to 10⁶) emitted from the source. The particles are transported by the wind velocity field at the point where they are located at each time instant, and may also perform random motions to simulate turbulent diffusion. Concentration is calculated by dividing the computational domain into cells, summing the particles in each cell, and dividing the total pollutant quantity in the cell by the cell's volume.
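A minimal sketch of the particle approach just described is given below. It is purely illustrative: a zero-order random walk with a uniform wind, a constant eddy diffusivity and arbitrary parameter values chosen only to keep the example short, whereas operational Lagrangian particle models use far more sophisticated turbulence formulations driven by a 3D flow model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed, illustrative parameters (not taken from this document)
n_particles = 50_000          # number of virtual particles
q_total = 1.0                 # released mass (kg), shared equally between particles
u, v = 3.0, 0.0               # uniform mean wind (m/s)
k_h = 5.0                     # horizontal eddy diffusivity (m^2/s)
dt, n_steps = 1.0, 600        # time step (s) and number of steps (10 minutes)

# All particles start at the source (origin); each carries q_total / n_particles
pos = np.zeros((n_particles, 2))
for _ in range(n_steps):
    # Advection by the mean wind plus a random displacement representing turbulence
    pos[:, 0] += u * dt + rng.normal(0.0, np.sqrt(2.0 * k_h * dt), n_particles)
    pos[:, 1] += v * dt + rng.normal(0.0, np.sqrt(2.0 * k_h * dt), n_particles)

# Concentration on a grid: count particles per cell and divide the mass by the cell volume
cell, depth = 50.0, 10.0      # horizontal cell size (m) and assumed mixing depth (m)
edges = np.arange(-500.0, 3001.0, cell)
counts, _, _ = np.histogram2d(pos[:, 0], pos[:, 1], bins=[edges, edges])
conc = counts * (q_total / n_particles) / (cell * cell * depth)   # kg/m^3
print("Peak cell-averaged concentration: %.2e kg/m^3" % conc.max())
```

The random displacement with standard deviation $\sqrt{2K\Delta t}$ is the simplest representation of turbulent diffusion; real models replace it with velocity-based (Langevin-type) formulations that use the local turbulence supplied by the flow model, which is what allows the simulated plume to respond to vertical shear, obstacles and other local flow details.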

Lagrangian particle models provide a more realistic representation of atmospheric dispersion than puff models or simpler approaches, because different parts of the emitted plume can interact with different details of the atmospheric flow. This permits more realistic simulations to be made of complex meteorological conditions, including for example: the presence of vertical shear, low wind speeds, temperature inversions, flow over complex topography and around obstacles, and the presence of topographical discontinuities such as sea-land or town-countryside boundaries. Their drawback is that they require a large number of particles, which increases the requirements for computational resources; however, this may be overcome through the use of parallel computing. Lagrangian dispersion models have the advantage over Eulerian models that they do not suffer from numerical diffusion in the immediate vicinity of a point source. They are therefore most commonly used in safety- or security-related applications. Lagrangian dispersion models are frequently combined with CFD RANS, LES or meteorological diagnostic and prognostic models that calculate the flow (i.e. de-coupled flow and dispersion calculations), especially to model scenarios involving complex geometrical features such as urban or industrial sites. If the flow and dispersion computations can be de-coupled, then the combination of a diagnostic meteorological flow model and a Lagrangian dispersion model represents a powerful and fast compromise between the need to consider complex phenomena and a reasonable response time.

3.6. Concepts of use: When to use what

Models that simulate the atmospheric dispersion of hazardous substances are used by emergency management authorities for planning purposes, for training personnel, for managing emergency situations, for guiding first responders and for identifying what short-term countermeasures should be taken to protect the population. They may also be used in the recovery phase, for efficiently planning and organising the restoration of the area. The appropriateness of a certain model type depends on: (a) the time phase in relation to the emergency situation, (b) the spatial scales and other specific characteristics of the dispersion situation, and (c) the availability of computational resources. It is apparent that the spatial scales are closely related to the time scales through the prevailing wind speed driving the plume transport. In the following sections the concepts of use for the ADMs are analysed firstly in relation to the time phases, and secondly in relation to the spatial scales.

According to the U.S. NRC (2003), it is customary to distinguish three phases in emergency management situations following the release of hazardous substances into the atmosphere:

1. The preparedness phase;
2. The response phase;
3. The analysis and recovery phase.

The requirements for the dispersion models used in each of the above phases are different in terms of computational speed, accuracy and output.

Preparedness phase

During the preparedness phase the computational speed of models is not important. To assess the possible consequences of accidental releases, numerical simulations are not required in real time, but they need to cover all the possible types of accident, meteorological scenarios and consequences (CERC, 2003). This is required in order to formulate responses to all the likely eventualities, especially with regard to human health or arrangements for evacuation. Usually site-specific meteorological conditions are used to drive probability-based dispersion simulations for specific release scenarios. In particular, ADM systems are (or can be) used for the following purposes:
1. Contribution to the design / optimisation / enhancement of monitoring networks, in terms of monitoring locations and spacing, as well as defining sensor sensitivity or sampling requirements, to achieve maximum coverage and effectiveness;
2. Risk assessments for defined threat scenarios, such as terrorist attacks against particular targets, or accidents at industrial sites (such as chemical or nuclear plants) that would cause atmospheric releases of harmful agents. Advanced modelling capabilities are needed to analyse scenarios in urban environments or complex topographies, where higher spatial and temporal resolution is required;
3. Planning and training of personnel, especially for site-specific or event-specific training, and identification of high-vulnerability areas to enable monitoring resources and personnel to be allocated in an optimal way in specific conditions and scenarios.

During this phase wind tunnel modelling studies can be performed, and RANS CFD or LES models, as well as Lagrangian particle models, can be applied.

Response phase

During the response phase, i.e. when hazardous substances have been released to the atmosphere, ADM systems are used to support the decision-making process of the early responders. Computational speed is therefore very important, even at the cost of some accuracy in the predictions. Models used in this phase should be designed for use by non-specialised personnel, with minimal input data requirements.

The use of dispersion models depends on the specific conditions and on the time elapsed after the release. If the release is unknown or only partially known (which is very probable in security-related applications), ADM systems can be used to determine the time, location and magnitude of the release by tracing the transport of airborne CBRN substances, using data assimilation and inverse modelling. To this end, measurement data are used that may originate from an existing monitoring network, or from observations and measurements performed by the first responders in the area of the event. The measurement data are combined with an ADM (of varying complexity) through several possible methods: Bayesian inference, variational analysis, Kalman filtering, adjoint equations, etc.

For the first 1-2 hours immediately following the event, a very simple ADM tool can be used that requires a low level of expertise, operates with limited and basic input data (meteorological and source term) and has a very short execution time, such as a Gaussian plume or puff model. Based on the results, the emergency managers can decide on the type of protective equipment to be used by the on-site responders and on the sheltering of the population or the evacuation of impacted areas. Over the next several to 12 hours, as new data and information become available, the evaluations and predictions can be refined by using more complex modelling tools (such as a Lagrangian puff model), leading to a more accurate representation of the affected zones and to decisions on countermeasures in more distant areas.

Recovery and analysis phase

During the recovery phase, the accuracy of model predictions is more important than their computational speed. At later times after the release, ADM systems are used to provide accurate predictions of the contamination evolution, as well as exposure assessments. These predictions can be used to assess the need for responses and to allocate the appropriate resources. ADM system predictions may also be used to identify hot spots, i.e. areas in which high accumulations or persistence of hazardous substances are expected. Depending on the spatial scales of interest, CFD RANS or LES models, as well as Lagrangian particle models, may be applied in this phase. Figure 7 illustrates the range of spatial scales for which the modelling techniques presented in the previous section are appropriate, according to U.S. NRC (2003). Britter and Hanna (2003) proposed a similar separation of the scales relevant to flow and dispersion in the urban environment (building / street scale, neighbourhood scale and meso-scale), and analysed the corresponding physical mechanisms and the most appropriate modelling methodologies.

Another important application of ADMs connected to emergency response is the optimisation of monitoring networks. This application concerns the emergency response planning (or preparedness) phase, and involves determining the optimum number and location of sensors to ensure that unknown releases are captured, as well as identifying hot-spots or the most seriously affected areas. The aim is to combine the best coverage with minimum detection times. A well-planned monitoring network would also greatly improve the performance of inverse modelling techniques to locate unknown sources of hazardous substance releases.
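As an indication of what the simplest form of such inverse modelling looks like, the sketch below estimates an unknown, constant release rate from a few sensor readings by linear least squares. All names and numerical values are hypothetical and chosen only for illustration; operational source-term estimation relies on the Bayesian, variational or Kalman-filter methods mentioned earlier in this section and also retrieves the release location and timing.

```python
import numpy as np

# Assumed source-receptor coefficients a_i (s/m^3): the concentration each sensor
# would see per unit emission rate, precomputed with a dispersion model.
a = np.array([2.1e-6, 8.5e-7, 3.0e-7])

# Hypothetical measured concentrations (g/m^3), including some noise
c_obs = np.array([1.1e-3, 4.0e-4, 1.5e-4])

# Least-squares estimate of the release rate q (g/s) from c_obs ~ a * q
q_hat, *_ = np.linalg.lstsq(a.reshape(-1, 1), c_obs, rcond=None)
print(f"Estimated release rate: {q_hat[0]:.0f} g/s")
```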

Figure 7: Spatial range of application of the dispersion model categories (wind tunnel modelling for planning purposes, CFD RANS or LES, Gaussian plumes or puffs, Lagrangian puffs or particles depending on the wind field model, and Eulerian grid models) as a function of distance from the source, from the local and urban scales to the regional, continental and global scales (adapted from NRC, 2003).

For the sake of completeness, it is noted here that at any given time the majority of the population may be indoors. An important issue is therefore to be able to assess the exposure of the population indoors to hazardous substances that have been released either indoors or outdoors. For substances released outdoors, infiltration into the buildings has to be estimated. Indoor dispersion models are used to assess the indoor exposure of the population to contaminants originating either from outdoor releases that subsequently infiltrate into the building interior, or from releases that occurred indoors. Infiltration is calculated by combining outdoor dispersion models with building air exchange rates. Depending on the building size and internal complexity, but also on the available computational resources, simple box models, multi-zone models or CFD models may be utilized. Box models assume well-mixed internal air and a uniform contaminant concentration. Multi-zone models divide the building into a sequence of well-mixed zones between which there are steady air flows. The use of complex CFD RANS and LES models is increasing, taking advantage of the growing computational power; such models are used to simulate transient phenomena and to calculate complex indoor concentration patterns.
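The single-zone box model mentioned above reduces to one ordinary differential equation for the indoor concentration, driven by the outdoor concentration and the air exchange rate. The sketch below integrates it for an assumed outdoor plume passage; the air exchange rate and all other numerical values are illustrative assumptions only.

```python
import numpy as np

# Assumed inputs: a 2-hour outdoor concentration time series (mg/m^3) and an
# air exchange rate of 1.0 per hour for a naturally ventilated building.
dt_h = 1.0 / 60.0                          # time step: 1 minute, in hours
t = np.arange(0.0, 2.0, dt_h)              # time axis (h)
c_out = np.where((t > 0.25) & (t < 0.75), 5.0, 0.0)   # rectangular outdoor "plume"
lam = 1.0                                  # air changes per hour (ACH), assumed

# Box model: dC_in/dt = lam * (C_out - C_in), integrated with forward Euler
c_in = np.zeros_like(t)
for i in range(1, t.size):
    c_in[i] = c_in[i - 1] + lam * (c_out[i - 1] - c_in[i - 1]) * dt_h

print(f"Peak indoor / outdoor concentration ratio: {c_in.max() / c_out.max():.2f}")
print(f"Indoor concentration at the end of the period: {c_in[-1]:.2f} mg/m^3")
```

Even this simple balance shows the effect that sheltering advice exploits: the indoor peak is substantially lower than the outdoor peak, although contaminated air lingers indoors after the outdoor plume has passed.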

3.7. Input and output requirements

The basic data groups that are imported into an ADM modelling system are:
1. Emission data (the "source term"), including the emission rate as a function of time, the type(s) and properties of the released material(s), the source location, the source conditions, and in general the information describing the state of the released material at the time it is emitted;
2. Wind and other meteorological data: measured wind speed and direction at specific locations, or fields calculated by a meteorological model on a grid, as well as other meteorological variables characterising the mixing ability of the atmosphere (stability category, mixing layer height, heat fluxes, temperature gradients, etc.). The complexity of the input meteorological fields depends on the type and complexity of the dispersion model;
3. Topography and geometry of the area (e.g. description of buildings), including surface properties that are important for calculating the deposition of substances;
4. (Optionally) real-time meteorological and/or pollution measurements for data assimilation, in order to improve flow field predictions and produce dispersion simulations that are closer to the real situation.

Based on this information, the dispersion modelling system produces results related to risk assessment and the identification of appropriate responses, such as:
- instantaneous values of concentration in the air (i.e. snapshots of the expected cloud or plume) at different times;
- time-integrated values of concentration in the air (i.e. dosages);
- maximum expected concentrations, and radioactivity-related parameters for radioactive pollutants (such as gamma radiation dose rates);
- deposition of the pollutant(s) on surfaces (streets or building walls);
- determination of affected areas.

Dispersion models used for emergency planning and response should also provide confidence estimates within which prescribed concentrations should not be exceeded outside of the predicted hazard zones. This requires that the ADMs provide some measure of the possible event-to-event variability about the average in a given situation. They should also provide an estimate of the concentration fluctuations due to turbulence, which are especially important for toxic or flammable releases. It is recognised that even a good atmospheric transport model may have single-event errors of more than a factor of ten when its results are compared to a specific experiment. In determining evacuation zones based upon estimates of lethal dosage, fluctuations of this magnitude represent substantial human health risks. It is therefore important that atmospheric models applied to individual atmospheric releases provide predictions with clearly stated uncertainties.
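Two of the time-integrated outputs listed above are worth making explicit, because they respond differently to the fluctuations just discussed (the formulation is generic and not tied to any specific tool). The dosage at a location is the time integral of the concentration, while for many toxic chemicals the health impact correlates better with a non-linear "toxic load":

$$ D(\mathbf{x}) = \int_0^T C(\mathbf{x},t)\,dt, \qquad \mathrm{TL}(\mathbf{x}) = \int_0^T C(\mathbf{x},t)^{\,n}\,dt, $$

where $T$ is the exposure duration and $n \geq 1$ a substance-specific exponent. Because of the exponent, concentration fluctuations can change the estimated toxic load appreciably even when the mean dosage is unchanged, which reinforces the requirement for fluctuation and confidence estimates stated above.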

The actual results required from dispersion models depend on the particular application and the needs of the end-users, which in turn depend on the phase of emergency response in which the model is applied. Different dispersion modelling methods are appropriate to the preparedness, response and recovery stages of CBRN events. For the preparedness stage, an accurate model capable of providing confidence-level estimates is desired, but model execution time is not important. For the response stage, accuracy can be compromised to obtain timely predictions, but the dispersion model must still provide some confidence-level estimates. For the recovery stage, model execution time is not important, but an accurate model reconstruction of the plume concentration distribution over time is desired.

3.8. Examples of existing tools and models for CBRN releases

Comprehensive reviews of ADM systems used for emergency management of accidental or intentional releases of hazardous materials can be found in Borysiewicz and Borysiewicz (2006), Sohn et al. (2004), Sugiyama et al. (2004) and U.S. NRC (2003). In the following section a number of modelling systems are listed and briefly described. It is important, however, to note that:
1. This list is not exhaustive, and is only meant to provide examples of existing systems.
2. The appearance in, or omission of, models and tools from the list is no indication of any relative ranking.

Examples of current models in the USA that are (intended to be) used for CBRN emergencies include: HYSPLIT and CAMEO/ALOHA produced by NOAA; the suite of dispersion models used by DOE/LLNL/NARAC; HPAC produced by the Defense Threat Reduction Agency (DTRA); VLSTRACK produced by the Navy; MIDAS-AT produced by the Marines; and several CFD models used by different groups. Other examples of emergency response systems for chemical releases are the proprietary codes SAFER and EPICode, which include Gaussian plume or puff dispersion models.

The DTRA Hazard Prediction and Assessment Capability (HPAC) is an example of an integrated tool. The tool accesses weather data from in-house numerical weather prediction models and military providers. It uses the SCIPUFF and UDM Gaussian puff dispersion models to predict ensemble-average concentrations and concentration variances from the local up to regional scales, and the Micro-Swift-Spray (MSS) modelling system (Moussafir et al., 2004), which consists of a diagnostic wind field model coupled with a Lagrangian particle model, to directly account for the effects of buildings on dense or lighter-than-air emissions at the micro-scale (Anfossi et al., 2010).

If the released gas is denser than air, it uses a modification of the DEGADIS dense gas dispersion model.

The National Release Advisory Center (NARAC) uses a suite of dispersion models that range from a simple Gaussian puff model (INPUFF), to the Lagrangian particle model LODI, which is driven by the diagnostic meteorological model ADAPT, to CFD approaches (FEM). For nuclear emergencies, and for predicting dispersion over unobstructed terrain under simple meteorological conditions, NARAC uses the standalone Gaussian plume model HOTSPOT.

Los Alamos National Laboratory (LANL) has developed two dispersion models designed to predict hazardous concentrations in the urban environment. One is a Lagrangian particle dispersion model (QUIC-PLUME) coupled with a diagnostic wind field model (QUIC-URB); the other is a CFD-LES model named HIGRAD. The QUIC dispersion modelling system produces a three-dimensional wind field around buildings, accounts for building-induced turbulence, and incorporates a graphical user interface to facilitate set-up, running and visualization.

UDM (Urban Dispersion Model) is a Gaussian puff model for predicting the dispersion of atmospheric pollutants in the range of 10 m to 25 km, and particularly within the urban environment. It was developed by the Defence Science and Technology Laboratory for the UK Ministry of Defence. It handles instantaneous, continuous and pool releases, and can model gases, particulates and liquids. The model has a three-regime structure: single building (area density < 5%), urban array (area density > 5%), and open terrain.

The PHAST consequence prediction and hazard analysis tool, developed by DNV (Det Norske Veritas), is a software package that contains both release and dispersion models and has dedicated damage models for fires and explosions. The dispersion model can handle both neutral and dense gas dispersion, and is based on DNV's proprietary Unified Dispersion Model. PHAST is used mainly for assessing the potential consequences of chemical material accidents. PHAST is delivered with the DIPPR chemical database of 1500 materials, which includes toxic probits, LEL and UEL values, and all the temperature-dependent physical properties required during calculations. The database also includes IDLH thresholds for chemicals.

EFFECTS is a consequence modelling tool developed and distributed by TNO. The program is based on the theoretical models described in the Yellow Book and the Green Book, two official TNO publications written at the request of the Dutch Committee for the Prevention of Disasters. EFFECTS is used for assessing the potential consequences of hazardous material accidents and during the management of such accidents. TNO EFFECTS is currently used by both Dutch and Italian regional fire brigades. The typical users are specialist officers in the emergency response teams.

For dense gas dispersion, EFFECTS uses the open-source SLAB model, produced by the US Lawrence Livermore National Laboratory. EFFECTS is TNO's tool for consequence analysis of accidental releases of hazardous materials. The software contains a large number of different calculation models, including release models, dispersion models and damage models. The software comes with a complete chemical database (the DIPPR database), containing toxicity and flammability values and all thermodynamic properties as functions of temperature or pressure.

HAZMAT Responder is a commercial, dedicated emergency response tool distributed by SAFER Systems. The program assesses the potential consequences of hazardous material releases, and is used by several civil defence / fire-fighting organisations and environmental institutes. The tool can take in data from sensors (either concentration samplers or meteorological sensors), which allows a source term estimation (location of release and initial conditions) to be made.

SAM-S is a commercially produced specialist emergency response tool distributed by Ingenieurbuero Lohmeyer. It enables the dispersion of air pollutants to be simulated during and after an accidental release from stationary or mobile sources. SAM-S calculates the current concentration of the airborne pollutant, as well as a 2-hour forecast, in an area of up to approximately 30 km x 30 km, which it renders on a geographically referenced concentration contour map for visualisation, for example, in Google Earth. The diagnosis is updated every 3 minutes. SAM-S also includes a source-term estimation module that uses concentration measurements.

RIMPUFF is a local- and regional-scale real-time puff diffusion model developed by Risø National Laboratory for Sustainable Energy, Technical University of Denmark (Risø DTU). RIMPUFF is an operational emergency response model for assisting emergency management organisations in dealing with chemical, nuclear, biological and radiological releases into the atmosphere. RIMPUFF is in operation in several European national emergency centres for preparedness and prediction of the effects of accidental nuclear and chemical gas releases.

DIPCOT (DIsPersion over COmplex Terrain) is a Lagrangian model developed by the Environmental Research Laboratory of the Institute of Nuclear Technology and Radiation Protection of NCSR Demokritos in Greece. DIPCOT can function as a Gaussian puff model or as a Lagrangian particle model incorporating a random-particle-movement method. The model can be used to calculate the dispersion and deposition of chemical and nuclear / radiological substances. It also calculates radiation doses from dispersed radionuclides. A data assimilation scheme for the assessment of an unknown source emission rate has recently been added to the model.

Both RIMPUFF and DIPCOT are integrated into the European RODOS (Real-time On-line DecisiOn Support) system for nuclear emergencies. RIMPUFF is also the main atmospheric dispersion model of the ARGOS decision support system.

In addition to dedicated hazmat dispersion models and tools, CFD models are used for emergency response pre-planning, and for post-accident planning and analysis.

53 ADREA-HF is a CFD code developed by the Environmental Research Laboratory of the Institute of Nuclear Technology and Radiation Protection of NCSR Demokritos in Greece. The code can handle a wide range of contaminants, including 2-phase releases, jets, buoyant gases etc. It has been recently parallelized to increase the computational efficiency. An inverse modelling / data assimilation algorithm has also recently been implemented in the code, to allow source term estimates to be made. Examples of CFD models that can be used for simulating airflow and dispersion, both indoors and outdoors are: Flovent (Flometrics Inc.), Airpak (ANSYS), Star CD (Adapco Group), CFX (ANSYS), Flow3D (Flow Science Inc.), Phoenics (Cham Ltd), Fluent (ANSYS), and FEM3MP (LLNL). These can be applied for spatial scales ranging from a few meters (i.e. room scales) up to about 1 km. Settles (2006) presented an overview of potential use of fluid dynamics in security-related applications, while Li et al. (2006) presented applications of CFD modelling in the study of wind flow and pollutant dispersion in urban areas. The results of several CFD investigations have been published that have examined the dispersion of hazardous substances from accidents or terrorist acts in urban environments. The Joint Urban 2003 field study conducted in Oklahoma City, Oklahoma, in July 2003, is a widely used case for the evaluation of both complex (Chan and Leach, 2007, Flaherty et al. 2007) and simple models (Burrows et al. 2007, Hendricks et al., 2007). An interesting CFD based approach to predicting dispersion has been proposed by Patnaik and Boris (2007). Their approach is incorporated into a tool called CT-Analyst which is an urban-oriented emergency assessment tool for airborne Chemical, Biological, and Radiological (CBR) threats. The tool is based on detailed, 3D CFD LES computations for a given area, including, solar heating, buoyancy, complete building geometry specification, trees, and wind fluctuations. The method uses high performance computing (HPC) parallel platforms to perform a number of detailed, high resolution 3D simulations ahead of time. These results are then accessed for operational use with no sensible delay for integration of even simple models. The computations are performed for different wind directions and speeds, different sources and source locations using a new data structure called Dispersion Nomografs. References Anfossi D., Tinarelli G., Trini Castelli S., Nibart M., Olry C., Commanay J. (2010): A new Lagrangian particle model for the simulation of dense gas dispersion, Atmospheric Environment, Volume 44, Issue 6, February 2010, Britter R.E., Hanna S.R. (2003): Flow and dispersion in urban areas Ann. Rev. Fluid Mech. 35,

54 Borysiewicz M.J., Borysiewicz M.A. (2006): Atmospheric dispersion modelling for emergency management. Centre of Excellence for Management of Health and Environmental Hazards (CoE MANHAZ), Institute of Atomic Energy. Burrows D.A., Hendricks D.A., Diehl S.R., Keith R. (2007): Modeling Turbulent Flow in an Urban Central Business District, Journal of Applied Meteorology and Climatology, 46, pp CERC (2003): Dispersion from Accidental Releases in Urban Areas. Final Report prepared for ADMLC by J.C.R. Hunt, D.J. Carruthers, R.E. Britter. Chan S.T. and Leach M.J. (2007): A validation of FEM3MP with Joint Urban 2003 data. Journal of Applied Meteorology and Climatology, 46, pp Flaherty J.E., Stock D., Lamb B. (2007): Computational Fluid Dynamic Simulations of Plume Dispersion in Urban Oklahoma City. Journal of Applied Meteorology and Climatology, 46, pp Hendricks D.A., Diehl S.R., Burrows D.A., Keith R. (2007): Evaluation of a Fast-Running Urban Dispersion Modeling System Using Joint Urban 2003 Field Data. Journal of Applied Meteorology and Climatology, 46, pp Moussafir J., Oldrini O., Tinarelli G, Sontowski J, Dougherty C. (2004): A new operational approach to deal with dispersion around obstacles : the MSS (Micro-Swift-Spray) software suite. 9 th International Conference on Harmonisation within Atmospheric Dispersion Modelling for Regulatory Purposes, Garmisch, 1-4 June Patnaik, G., Boris, J.P. (2007): Fast and accurate CBR defense for homeland security: Bringing HPC to the first responder and warfighter. Department of Defense - Proceedings of the HPCMP Users Group Conference 2007; High Performance Computing Modernization Program: A Bridge to Future Defense, DoD HPCMP UGC, art. no , pp Schlünzen, K.H., Grawe D., Bohnenstengel S.I., Schlüter I., Koppmann R. (2011): Joint modelling of obstacle induced and mesoscale changes current limits and challenges. J. Wind Eng. Ind. Aerodyn. 99, , doi: /j.jweia Settles, G.S. (2006): Fluid Mechanics and Homeland Security. Annual Review of Fluid Mechanics, 38, pp Sohn, C.W., Solberg A., Gonsoulin T. (2004): Analysis of Numerical Models for Dispersion of Chemical/Biological Agents in Complex Building Environments. Report No. ERDC/CERL TR-04-25, U.S. Army Engineer Research and Development Center (ERDC), Construction Engineering Research Laboratory (CERL). Sugiyama, G., Nasstrom J.S., Baskett R.L. (2004): Operational Systems for Emergency Preparedness and Response. Report UCRL-JC U.S. Government Accountability Office (2008): First Responders' Ability to Detect and Model Hazardous Releases in Urban Areas Is Significantly Limited. Report to Congressional Requesters, GAO U.S. National Research Council (2003), Tracking and predicting the atmospheric dispersion of hazardous materials releases Implications for Homeland security. National Academy of Sciences, USA. 46

4. Dispersion modelling for emergency planning and response

4.1. Modelling challenges

This section presents the numerous conceptual and practical issues that immediately appear when thinking about producing a dispersion modelling system dedicated to emergency preparedness and response.

Overview of the involved physical processes

The physical processes and phenomena that need to be modelled can be summarised as follows:

Input to dispersion flow modelling

To carry out dispersion computations, it is first necessary to characterise the external and, sometimes, the internal flow conditions:
- Meteorological data are needed, often at a range of different scales, including the influence of topography and buildings;
- In the case of indoor / outdoor contaminant transfers, it may be necessary to compute the flow inside buildings, taking account of, for example:
  - the ventilation in the buildings (natural and HVAC),
  - the influence of moving vehicles (cars and trucks in a tunnel, or a train in the underground network);
- If necessary, coupling between the atmospheric environment and the indoor air flows.

Dispersion modelling

The potential dispersion modelling issues that have to be considered include:
- Dispersion over distances close to the source, where special treatment may be required because the release is multiphase or energetic (as a result of production from a fire, deflagration or detonation, for example);
- Depending on the magnitude of the release, dispersion modelling at various scales, with detailed computations near the source and less refined

56 calculations at some distance from the source (where impact assessment results in low doses); - Transformations of the emitted substances such as species phase changes, aerosol formation and evolution, chemical reactions, radioactive decay and formation of daughter products, fate of biological agents (degradation or disappearance due to the meteorological conditions) etc.; - Dry and wet deposition, noting that deposition does not only occur on the ground or vegetation, but more generally on all accessible horizontal or vertical surfaces (such as the roofs or façades of buildings), and that the level of deposition depends of the nature of the airborne species and surface materials; - Re-suspension (or take-off of previously deposited particles) Output of dispersion modelling Impact assessment In order to evaluate the practical consequences of a potentially hazardous dispersion, one has to consider: - An impact assessment based on different methods depending on the nature of the CBRN agent (radioactive species, toxic chemical, bio-aerosol); - The countermeasures that might be recommended depending upon the nature of the emitted substances, the relevant regulations, and ranking of countermeasures according to a cost-benefit analysis; - The fate of deposited substances, such as their migration into the food chain or ground water Modelling need in case of a CBRN release Local scale simulation is certainly the most relevant, or even mandatory, modelling choice when a CBRN threat is faced. In practice releases in a complex built environment are the most likely events, whether the origin is accidental or malicious. A general characteristic is that it is not obvious what the extent of the area will be that could suffer environmental or sanitary consequences. But, in all situations, from a very small emission into the atmosphere, to a release that has a significant impact over a large area of a city, first responders require a rough estimate of the total area affected, and if possible, a detailed description of the situation near the source location. Implementation of a local scale approach is very challenging, as not only does it require modelling at this scale, but difficulties arise in linking this to the larger scales. Two cases can be identified: 1. To study past or academic dispersal events by reconstructing real or typical meteorological situations, a simplified CFD diagnostic model or a full CFD model may be operated. A diagnostic model needs data inside the simulation

domain, which are either real in situ measurements or academic inputs. A CFD model uses initial and boundary conditions (profiles) for the wind, turbulence, temperature, and relative humidity. 3D data given by a meso-scale model may also be used, but in this case it is not mandatory to make the micro-scale interact with the meso-scale.

2. To predict the dispersion and consequences of a hazardous release when it happens, it is necessary to have the weather forecast at the (local or meso) scale of interest. This means that it is necessary to compute the species dispersion in advance (for the forthcoming minutes and hours). This case is certainly the most relevant for a CBRN release, as it is critical for the site operator (in the event of an industrial accident), rescue teams, and public authorities. It is also the most challenging for the model developers and users. In this situation, a micro-meteorological forecast is needed that is consistent with meso-scale weather predictions, but issues of down-scaling and scale interaction must be addressed.

The principle of down-scaling

There are many issues relating to undertaking flow modelling at the various scales introduced in Chapter 3. Down-scaling is a major one, as down-scaling of a meteorological forecast from the meso-scale to the local scale may be requested to compute the dispersion of hazardous species from accidental or malevolent releases. This issue is briefly discussed below.

Meso-scale modelling

Meso-meteorological modelling has benefited from intense research efforts over several decades. Many countries have developed their own national numerical weather prediction systems, which are more or less open (for free, or with payment) to the international scientific community. For example, among the well-known modelling chains dedicated to the reconstruction and forecasting of 3D meteorological fields, we can cite:
- The Meso-scale Model 5 (MM5) and its successor, the Weather Research and Forecasting (WRF) model (developed by American universities, especially Pennsylvania State University, NCAR National Center for Atmospheric Research and NCEP National Centers for Environmental Prediction, USA);
- The Regional Atmospheric Modelling System RAMS (developed at Colorado State University and Atmet, USA);
- The COSMO model (DWD, Germany, together with a European consortium of several countries);
- BOLAM and MOLOC (CNR-ISAC, Italy);

- ALADIN (CNRM - Météo France, France);
- The Unified Model developed by the UK Met Office.

Meso-scale modelling is necessary when it is used to initiate and guide / force local-scale meteorological modelling. Moreover, meso-scale models have recently been improved to run at resolutions down to 1 km, and have often been the basis for the development of micro-meteorological models. Meso-scale models can profitably be used to provide inputs into mass-consistent models driving ADMs, or to directly drive higher-resolution micro-meteorological codes.

Local scale modelling

There are numerous methods adapted to local scale modelling:
- For the flow, the most frequent approaches are 3D mass-consistent diagnostic modelling or RANS and LES numerical modelling (or even DNS within the research field). At micro-scale, CFD modelling is considered the reference approach, as it is able to deal with complicated unsteady, multiphase, reactive, turbulent flows around complex geometries;
- For the species dispersion, the alternatives are Lagrangian, Eulerian, or Gaussian modelling. The latter has significant limitations, as its assumptions are not well-suited to simulating dispersion over short distances (less than 200 m) in the built environment, or distances exceeding 20 km. These methods are often used in crisis management as they are expected to be conservative, but this may be a false assumption, as shown in Figure 5 in Chapter 2. The desire would be to replace such methods by those that provide a rigorous treatment of the turbulent flow and dispersion. This said, Gaussian approaches are widely used and have been extended with recent developments that enable them to take account of heavy gases, for example. At present, the Lagrangian approach has difficulties handling chemical reactions, which are more easily accounted for by Eulerian modelling.

Bridging the gap

Modelling at various scales implies defining domains at different resolutions. At the meso-scale, flow modelling is associated with nested domains with typical resolutions from 50 km (~0.5° at mid-latitude) to 1 km. Even if there is further research to reduce the finest mesh size, the usual practice with existing parameterizations (and the way the meteorological equations are solved) appears to limit the minimum grid to around one kilometre resolution. At the local scale, flow modelling needs a small mesh size to predict processes inside the urban canopy or between the buildings of an industrial site. In general, it is necessary to compute the concentration field as precisely as possible to give the most

59 reliable information to rescue teams and answer questions such as, for example, where to securely install a first-aid area for the casualties of a toxic dispersion. This suggests that the targeted resolution should be around 1 m (also useful in case of a source located inside a building with indoor / outdoor dispersion). This metric resolution can be reached with both prognostic and diagnostic (mass-consistent) flow models if the simulation domain has a limited extent, of the order of some hundreds of meters. This would typically encompass an industrial site or an urban district. The paragraphs above reveal the gap between the most resolved meso-scale domain at ~1 km resolution and the ~1 m resolution simulation domain required to resolve the dispersion in proximity of the release. To bridge this gap is extremely challenging, as many issues must be addressed. Some examples are: As, typically, the smallest meso-scale domain dimension is ca km, and the local scale domain with an explicit resolution of the buildings ca km, the dimensional ratio of simulation domains is around The mesh size ratio is of the same order of magnitude (i.e. mesh size from 1 km to 1 10 m). Is it thus possible to use the meso-scale vertices to nudge the micro-scale flow (boundary or inner domain conditions)? Or, should one use intermediate imbricated domains to gradually increase the resolution from ~1 km to 1 10 m? At meso-scale, a weather forecast code has many different options available to evaluate the turbulent characteristics of the atmosphere. At local scale, the obstacles, especially the buildings must be taken into account. The turbulence has a double origin, as it results from the atmospheric conditions and also from the mechanical and thermal effects of the obstructions on the flow. At this scale, the total turbulence may be regarded as the addition / superposition of local and background contributions. This leads to complex modelling issues, but they are more tractable than solving the turbulence issue at intermediate scales. Should, for example, the turbulence in the successive nested domains be introduced using boundary conditions? At the local scale, the buildings are in general explicitly taken into account. At meso-scale, there are now urbanized versions of WRF for example, which incorporate modelling of the global city drag effect on the flow in the ABL (Atmospheric Boundary Layer) and / or the heat island intensity in the atmospheric heat budget for large agglomerations. In RAMS, the obstacles are effectively included through the Adaptative Aperture Cartesian coordinate system (ADAP, Walko and Tremback, 2002). Should one use such an urbanized meso-scale model? And what happens in the intermediate nested domains from the meso-scale to the local scale? Should, for example, a porosity variable be introduced according to the mesh size, to gradually take account of the presence of buildings or vegetation obstruction and their influence on the air flow and species dispersion? 51
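To make the orders of magnitude behind these questions concrete, the short Python sketch below (not taken from any of the tools discussed in this document) computes the chain of grid spacings obtained when nesting with a fixed refinement ratio from a 1 km meso-scale mesh down to a metric local-scale mesh, and illustrates one possible reading of the "superposition" of background and building-induced turbulence, namely adding velocity variances. The refinement ratio of 3, the target spacing of 2 m and the example fluctuation values are illustrative assumptions only.

```python
# Minimal sketch, assuming a fixed nest refinement ratio and uncorrelated
# background / building-induced turbulence contributions.
import math

def nest_resolutions(dx_meso=1000.0, dx_local=2.0, ratio=3):
    """Grid spacings (m) of successive nests, from the meso-scale spacing
    down to (or just below) the target local-scale spacing."""
    n_steps = math.ceil(math.log(dx_meso / dx_local, ratio))
    return [dx_meso / ratio**k for k in range(n_steps + 1)]

def total_turbulence(sigma_background, sigma_building):
    """Combine background and building-induced velocity fluctuations (m/s),
    assuming for this sketch that their variances simply add."""
    return (sigma_background**2 + sigma_building**2) ** 0.5

if __name__ == "__main__":
    print([round(dx, 1) for dx in nest_resolutions()])      # e.g. 1000, 333, 111, ...
    print(round(total_turbulence(0.8, 1.2), 2), "m/s")      # hypothetical sigma values
```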

60 Impact assessment Impact assessment in the sense of simulating the impact of the crisis is, of course, of major interest in the emergency response, as it gives a practical and objective view of the consequences of the dispersion. In a full modelling system, the output of the dispersion simulation must be used to assess, forecast, or anticipate the effects of the CBRN agent: On the health of the impacted population and rescue responders. This is the immediate need in the event of an emergency; On the environment in the post-crisis phase, in order to plan for the remediation of the area affected by the dispersion. Here, the environment may be an industrial site in case of an accident, or a town district in the case of a bomb attack. In general, it is natural or built, and may include rivers, lakes, sea, tunnels, underground tunnels, etc. The space and time computation of the CBRN agent distribution is the fundamental step in any response tool, as the dispersion results determine the outputs from the following evaluation of consequences. Within this framework, the most relevant input data choices should be made, and the most detailed and practically usable modelling suites and numerical tools used, depending on the phase of the crisis, the amount of time available for calculations, and the computing resources available Future needs for model development Uncertainties / variability / confidence levels: Methodologies should be developed to assess the (bounds of) uncertainties in model predictions. These might be presented as a range of concentration values with respective probabilities, or as confidence estimates that prescribed concentrations will not be exceeded outside of predicted hazard zones. Models should provide not only average concentrations, but also an indication of event-to-event variability for a given release. A suggested method for achieving this is to use ensembles of model solutions for the urban area in question. In this case the level of confidence could be quantified in the form of a number of ensemble members (or their percentage in the total number) predicting a certain level of concentration U.S. NRC (2003). A disadvantage of this approach is the increased computational load that is involved. Nevertheless, it has been adopted in the ENSEMBLE initiative, which is concerned with the dispersion of pollutants over regional scales, and in which a large number of organizations are participating from Europe and elsewhere. 52
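As an illustration of the ensemble idea outlined above, the following sketch expresses the confidence that a concentration threshold is exceeded at a set of receptors as the fraction of ensemble members predicting an exceedance. The member results, the number of receptors and the threshold are invented for the example; they do not come from any particular modelling system.

```python
# Hedged sketch: exceedance probability from an ensemble of dispersion results.
import numpy as np

def exceedance_probability(member_concentrations, threshold):
    """Fraction of ensemble members exceeding the threshold at each receptor."""
    c = np.asarray(member_concentrations)        # shape (n_members, n_receptors)
    return (c > threshold).mean(axis=0)

rng = np.random.default_rng(0)
members = rng.lognormal(mean=0.0, sigma=1.0, size=(20, 5))   # 20 members, 5 receptors
print(exceedance_probability(members, threshold=1.5))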

61 Data assimilation: The data assimilation methods currently applied in ADMs should be further developed in terms of: 1. Their effectiveness / speed of execution to be fully operational in real-time. 2. Range of applications, specifically for multiple pollutants and for source identification, including strength, time variation and location. Data assimilation should include: 1. Meteorological data, such as measurements of wind speed and direction, and data appropriate for the characterisation of the atmospheric turbulence and stability; 2. Data from CBRN sensors. The purpose of this is to correct the model predicted plume and to carry out source term estimation. References Walko, R., and Tremback, C. (2002): The Adaptive Aperture (ADAP) Coordinate. 5th RAMS Workshop and Related Applications, Santorini, Greece. 53


63 5. Quality assurance of local-scale hazmat dispersion models As highlighted and discussed in the previous chapters, the dispersion models implemented in emergency response systems/tools are the key element for predicting danger zones and health effects. Quality assurance of the predictive capabilities of dispersion models is necessary. Furthermore, it should be conducted in a transparent and generally accepted way, to demonstrate that the specific emergency response system/tool is apt to give reliable information for the defined application (accidental releases in build-up environment in this case) to decision makers. Nevertheless, even with a perfect model, and users applying it correctly, uncertainties remain due to the lack of information about the release, as well as due to the complex flow conditions in the built environment. The focus of the following sections is on the quality assurance of hazmat dispersion models only. Previous comparisons of emergency response tools commonly used to assess the consequences of accidental gas releases, revealed that large variances in the model outputs resulted from differences in source term modelling (Baumann-Stanzer and Stenzel, 2011). It is also known that the derivation of toxicity from the outcomes of the dispersion models may be responsible for wrong predictions of the health impacts (Hanna et al., 2008). This has to be taken into account when the overall quality of emergency response systems/tools are analysed. The evaluation of the quality of technical models comprises in general (Britter, 1991): 1. Assurance of correct coding of the algorithms; 2. A statistical model validation entails comparison with experimental data sets (considerable care is required in performing such a validation. In particular, in determination of an appropriate protocol to ensure unambiguous conclusions can be drawn from the validation); 3. A model assessment including a scientific review. Chang et al. (2003) add that measuring the performance of a transport and dispersion model should comprise an operational assessment in addition to the statistical and scientific evaluation. In a statistical evaluation, the model can be treated as a black box in which model outputs are examined to see how well they match observations. It is sometimes possible for a model to give the right answers, but only as a result of 55

64 compensating errors. In a scientific evaluation, the model algorithms, physics and assumptions are examined in detail for their consistency, accuracy, efficiency and sensitivity. In an operational evaluation, issues related to the model s user-friendliness are considered, such as the user guide, the user interface, error checking of input data, internal model diagnostics, and output display. Error checking of input data may include different levels of sophistication to validate the range of input data. Internal model diagnostics may include procedures to check the reasonableness of intermediate results Specific requirements In an emergency situation, the model must be able to predict the transient dispersion process associated with a short duration (or puff) release. In an urban or industrial environment the dispersion of the agent is strongly influenced by small scale obstacles, and the resulting concentration distribution is therefore highly inhomogeneous in space and intermittent in time. Even for nominally steady boundary conditions the instantaneous or short time averaged concentrations display a large scatter, and can therefore only be treated as a stochastic process. Figure 8 (left) shows results recorded by Harms et al. (2011) obtained from a wind tunnel model of Oklahoma City at one measurement location, under statistically steady approach flow conditions. The strongly varying and highly intermittent concentrations recorded for different short term releases are clearly visible. The distribution of these results led to the rather wide distribution of peak times, shown on the right. The peak time was defined as the time interval after release when the peak concentration was observed. The cumulative or probability density function provides the most general description for such a stochastic process. Simulation results should therefore at least include the statistics of the extreme values and the dosage. Most simulations, however, only provide ensemble mean values, i.e. the lowest moments of the probability distributions. Estimations of the peak concentrations are then based on correlations using mean values. Although the need for a stochastic treatment of atmospheric dispersion has been recognised for a long time (Fox, 1984), quality assurance procedures taking distribution functions into account are only just beginning to emerge. These are discussed further in section 5.3. The stochastic nature of dispersion must be reflected in the intended evaluation methodology applied to dispersion models used in emergency response tools. Effort should be put on properly communicating the need for and the benefits of uncertainties and confidence intervals to decision makers and first responders. In addition, transparent and manageable ways of dealing with threshold values have to be proposed, which can help to remove obstacles that exist in present legislation that prevent the use of probabilistic simulation results with known or estimated confidence levels. 56
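The following minimal Python sketch illustrates, under clearly synthetic assumptions, how the stochastic character of short-duration releases discussed above can be summarised: for an ensemble of repeated release records it extracts the per-release peak concentration, peak time and dosage, and reports percentiles of their distributions rather than a single ensemble mean. The Gaussian-shaped toy signals merely stand in for repeated wind-tunnel or model records.

```python
# Sketch of ensemble statistics for repeated short-duration releases (synthetic data).
import numpy as np

def release_statistics(times, series):
    """series: array (n_releases, n_times) of concentration records."""
    dt = times[1] - times[0]
    peaks = series.max(axis=1)                    # peak concentration per release
    peak_times = times[series.argmax(axis=1)]     # time of the peak per release
    dosages = series.sum(axis=1) * dt             # time-integrated concentration
    return peaks, peak_times, dosages

rng = np.random.default_rng(1)
t = np.arange(0.0, 300.0, 1.0)
# toy puff signatures with random arrival time, spread and amplitude
arrival = rng.uniform(40, 120, size=30)[:, None]
width = rng.uniform(10, 40, size=30)[:, None]
amp = rng.lognormal(0.0, 0.5, size=30)[:, None]
c = amp * np.exp(-0.5 * ((t - arrival) / width) ** 2)

peaks, peak_times, dosages = release_statistics(t, c)
print("peak conc. P50/P95:", np.percentile(peaks, [50, 95]))
print("peak time  P50/P95:", np.percentile(peak_times, [50, 95]))
print("dosage     P50/P95:", np.percentile(dosages, [50, 95]))
```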

65 Figure 8: Left: Typical concentration time series for seven short duration releases in an urban wind tunnel model, where the vertical black lines indicate release time and duration. Right: Histogram of peak times at one measurement station, where the peak time is the time after release when the peak concentration is observed (Harms et al., 2011). Comprehensive experiments and simulations using advanced computational models based on LES modelling have revealed the common special features of urban dispersion at the neighbourhood and street scale. These include how the mean transport direction differs from the mean flow above the buildings (Dejoan et al. 2010); the rapid vertical and horizontal spreading of plumes in wakes and canyons; the more complete mixing produced by interacting wakes (resulting in lower relative levels concentration fluctuation intensity when compared to dispersion in rural terrain); the transition in dispersion behaviour from below the average building height to above them; and the variations in concentration recorded downwind of the source, upwind of the source and within the buildings. These important effects can only be expected to be reproduced by models that explicitly take buildings and other obstacles into account. The more advanced, building accommodating models are generally more computationally expensive, which contradicts the need for emergency response dispersion modelling to be very quick. Depending on the phase of an emergency situation identified in Chapter 3.6, a clear definition of the time delay that can be accepted to obtain the results is necessary. This time delay may be linked with the scale (short range predictions must be obtained quicker than long range predictions), and with practical issues (the time needed to reach the location of the accident). The desperate need in emergency response scenarios is for short computing times. If the aim of the activity is planning or post-accident investigation, then the requirements (and thus the type of models that can be used) may be completely different. Whatever the case, the strategy for assessing the performance and reliability of tools applied for emergency response management should be the same as far as possible. The strategy should build on previous work in dispersion modelling 57

66 and risk assessment, but also adopt procedures from more distant but related application areas. The final requirement is that the quality assurance procedure should be general enough to be applied to models of varying complexity. The definition of the quality assurance procedure has direct consequences for the experimental data that are indispensable in the quality assurance of models. Data sets to be used for the validation of hazmat dispersion models have to be specifically prepared for different phases of emergency response, as well as for pre- and postaccident analysis. A review and preparation of the available data sets for the quality assurance of hazmat dispersion models is required as the data bases are sometimes too big to enable them to be used easily in validation exercises. Within the present COST Action ES1006 therefore a consensus is developed on which data are important for the validation of hazmat dispersion models and in what form they are most useful. Then data sets that fulfil the abovementioned requirements will be identified and put into an easy-to-use data base. Possible data sets are presented and discussed in the next section Data sets General requirements for data sets to be used for validation were formulated by Schatzmann and Leitl (2011). The data should have a high representativeness in space and time, and detailed information on the external conditions and, if necessary, initial conditions in the simulations. Furthermore, the experimental data should be repeatable with known uncertainty, expressed as a confidence interval. As Schatzmann and Leitl (2011) demonstrated, these requirements are very hard to fulfil by field measurements, but can be easily obtained in well conducted wind tunnel experiments. These wind tunnel experiments can also help to assess the uncertainty in field data (Schatzmann et al., 2010). Hunt et al. (2004) presented an extensive list of data sets available for model evaluations with a focus on instantaneous releases in urban environments. Field experiments have been conducted in real cities including Birmingham 1999/2000 (Britter et al., 2000; data-set archive: Cooke et al., 2000), Salt Lake City 2000 (Allwine et al., 2002; Chang et al., 2005), Los Angeles 2001 (Rappolt, 2001; description of data in Hanna et al., 2003), Barrio Logan (Venkatram et al., 2002), London DAPPLE program 2003, Oklahoma City JUIT experiment 2003, Manhattan 2005 (Hanna and Chang, 2012). In addition to flow and dispersion measurements in deep urban street canyons, outdoor-indoor-subway exchange mechanisms were investigated in Midtown Manhattan (MID05) experiment (Hanna and Chang, 2012). Field experiments with artificial structures (in most experiments arrays of cubes) include the Kit Fox experiments in Nevada (Hanna and Chang, 2001; Hanna and Steinberg, 2001), Cardington UK 1993 (Davidson et al., 1995), UMIST Environmental Technology Centre Dispersion Test Site (Macdonald, 1998), Dugway Experiment 2001 (Venkatram et al., 2002). In addition, there are a number of data sets available from wind tunnel 58

67 experiments including realistic city models, like Nantes (Kastner-Klein et al., 2000), or artificial structures (e.g. Davidson et al, 1996; Macdonald et al. 1998). The post-event analysis of real accidents involving hazardous releases also provides valuable information on model performance and short-comings as demonstrated by Hanna et al. (2008), who compared the results of six widely used models that included dense gas algorithms (TRACE, PHAST, CAMEO/ALOHA, HGSYSTEM, SLAB, and SCIPUFF) for three typical chlorine railcar release scenarios, based on data from real accidents. The authors concluded that the estimation of the source or release term was important for reliable results, since the calculated chlorine concentrations were approximately proportional to the mass release rate. It is believed that most current models generally over-predict dense gas concentrations because they do not account for the removal of material by dry deposition at the surface or by chemical reactions in the plume. There is only limited data available that is particularly suited for validating emergency response models. Therefore other data, mostly from measurements of air quality have been used in the past for validating the numerical dispersion models implemented in emergency response tools. The corresponding validation results must therefore be used with caution when related to emergency response in general, especially in the response phase (see chapter 3.4). For scenarios involving releases over very short time scales (i.e. instantaneous releases), standard air quality measurements are not suitable for assessing the validity of the atmospheric dispersion model predictions Evaluation methodologies As stated in the introduction to this chapter, model evaluation comprises a scientific evaluation of the adequacy of the mathematical model to describe the dispersion process in urban or industrial environments. The next step is a demonstration that the mathematical model is correctly implemented as a computational model, i.e. that the executable computer code is free of logical, algorithmic or programming errors. This step is called code verification and should be performed by systematically comparing the computational results with analytical solutions of the mathematical model. Because analytical solutions can in general only be obtained for very simple problems, the currently recommended method for code verification is the Method of Manufactured Solutions, MMS (Roache and Steinberg, 1984; Roache, 2002; Knupp and Salari, 2003; Roy et al., 2004; Eça and Hoekstra, 2009). Here an analytical solution of arbitrary complexity is prescribed first. By inserting this solution in the mathematical model, analytical residuals remain that have to be implemented as source terms in the computational model. When the implementation of the entire mathematical model, including the new source terms, is correct, the solution of the computational model should approach the analytical solution at the theoretical rate. Despite its known efficacy in detecting flaws in 59

68 computational models, MMS is not yet routinely applied, one reason being the codeintrusiveness of the method. While this argument is justified for code users, code developers should be asked to demonstrate that the computational model performs as intended. Assuming that the computational model is free of logical, algorithmic or programming errors, i.e. verified, the third step in model evaluation is validation: the comparison of simulation results with measured data. This step delivers information on the adequacy of the mathematical model to predict dispersion in urban or industrial environments. Contrary to the scientific evaluation, which is based on knowledge of the appropriate mathematical modelling for the physical processes involved, validation is based on observations from the real world. Before starting validation simulations or experiments, the target variables must be defined. The target variables are those variables from the numerical simulation and the experiment that shall be compared to each other. In the context of dispersion these are concentrations. For hazmat dispersion modelling peak concentrations and integrated concentrations, i.e. dosages, are the most relevant variables. For emergency response tools, these define the extent of the affected areas and the expected impact, mainly in health effects. Besides the target variables, intermediate variables that have a large influence on the target variables should also be validated. Especially in the common case of dispersion simulations based on a passive scalar, intermediate variables result from the model for the wind field, described by mean velocity components and, possibly, turbulent fluxes. When there is no influence of the concentration on the wind field, the latter can be calculated independently and in advance of the concentration distribution. Validation can then be performed sequentially, first for the wind field and then for the concentration. If, for example, model predictions for the wind field are already very different from measurements, a good agreement between the computed and measured concentrations cannot be expected. In this case good agreement between the simulation and measurement results for concentrations would mean that the model was right but for the wrong reason. Validation of intermediate variables is therefore necessary to detect the sources of differences between model outcomes and the measurements. For validation purposes it is necessary to acknowledge that the results of the computational model contain numerical errors due to approximations made in space and time in the mathematical model. These may be due to operations with finite precision arithmetic and incomplete iterative convergence if, as usual in advanced dispersion models, iterative solution methods are used. These errors have to be estimated in order to quantify their influence on the numerical solution of the mathematical solution. This is in general done by converting them empirically into confidence intervals or uncertainties. For validation simulations these epistemic numerical uncertainties should be negligibly small, so that the numerical solution is a highly accurate representation of the solution of the mathematical model to be assessed by the comparison with the measured data. 60
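Returning to the code-verification step described above, the Method of Manufactured Solutions can be illustrated with a deliberately simple example that is not tied to any particular dispersion code: for a 1-D steady advection-diffusion equation, a smooth solution is prescribed, the corresponding analytical source term is derived and added to the discretised equations, and the observed order of grid convergence is compared with the theoretical (here second) order. All parameter values in the sketch below are arbitrary.

```python
# MMS sketch for u dC/dx - K d2C/dx2 = S(x) on (0, 1) with central differences.
import numpy as np

U, K = 1.0, 0.1

def c_exact(x):
    return np.sin(np.pi * x)                      # manufactured solution

def source(x):
    # analytical residual of the manufactured solution, added as a source term
    return U * np.pi * np.cos(np.pi * x) + K * np.pi**2 * np.sin(np.pi * x)

def solve(n):
    """Solve on n interior points; Dirichlet values taken from the manufactured solution."""
    x = np.linspace(0.0, 1.0, n + 2)
    h = x[1] - x[0]
    A = np.zeros((n, n))
    b = source(x[1:-1]).copy()
    for i in range(n):
        A[i, i] = 2.0 * K / h**2
        if i > 0:
            A[i, i - 1] = -U / (2 * h) - K / h**2
        if i < n - 1:
            A[i, i + 1] = U / (2 * h) - K / h**2
    # fold the known boundary values into the right-hand side
    b[0] -= (-U / (2 * h) - K / h**2) * c_exact(x[0])
    b[-1] -= (U / (2 * h) - K / h**2) * c_exact(x[-1])
    c = np.linalg.solve(A, b)
    return h, float(np.max(np.abs(c - c_exact(x[1:-1]))))   # discretisation error

errors = [solve(n) for n in (20, 40, 80, 160)]
for (h1, e1), (h2, e2) in zip(errors, errors[1:]):
    print("observed order of accuracy:", round(np.log(e1 / e2) / np.log(h1 / h2), 2))  # ~2
```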

Even after the removal of the numerical uncertainties, other epistemic uncertainties remain in the numerical simulation results. They are rooted in the incomplete knowledge of data needed in the simulation. Examples are uncertainties in the thermo-chemical properties of the materials, uncertainties in the geometry of the actual site, or uncertainties in the boundary conditions. While these uncertainties should be small for proper validations, due to the requirements on the validation experiments described in the previous section, they can never be completely removed. This means that at least a sensitivity coefficient analysis should be performed for the model. More comprehensive information on the influence of different input uncertainties can be obtained using Monte Carlo or Latin Hypercube Sampling methods (ASME, 2009), or polynomial chaos methods, e.g. Xiu and Karniadakis (2002). The last three methods lead to stochastic simulation results in the form of a distribution for the epistemic uncertainty in the input data of the simulation.

With validation simulations performed in this way, uncertainties are known for the numerical results and for the measured data, as described in the previous section. This information should be included in the comparison of numerical results and measured data. Methods to do this are currently an area of active research, e.g. Oberkampf and Roy (2010). At present, however, neither uncertainties nor density functions of stochastic processes are taken into account when comparing numerical results and measured data. Current validation metrics for dispersion are solely based on mean values, as briefly discussed below.

In most studies (e.g. Hankin and Britter, 1999; Chang et al., 2003; Chang et al., 2005; Hanna and Chang, 2012), the evaluation methodology is designed referring to the performance measures defined by Hanna et al. (1993): the fractional bias (FB), the geometric mean bias (MG), the normalized mean square error (NMSE), the geometric variance (VG), and the fraction of predictions within a factor of 2 of observations (FAC2). It should be noted that the FB and NMSE are both normalised with the mean of the modelled and the mean of the observed data, and so provide better results for models that over-predict (Seibert, 2004). Also, uncertainty does not enter these metrics, except for the experimental uncertainty used to determine thresholds for the metrics MG and VG, which are very sensitive to small magnitude data (Chang and Hanna, 2004).

The listed measures are now defined, and the acceptance criteria for urban dispersion model evaluation are presented following the recommendations of Hanna and Chang (2012). Symbol C represents concentration, subscripts p and o refer to predicted and observed values, and the overbar represents an average.

\[ \mathrm{FB} = \frac{2\left(\overline{C_o} - \overline{C_p}\right)}{\overline{C_o} + \overline{C_p}} \]

The fractional mean bias FB should be less than 0.67 in magnitude, i.e. the relative mean bias should be less than a factor of 2. The normalized mean square error is defined as

\[ \mathrm{NMSE} = \frac{\overline{\left(C_o - C_p\right)^2}}{\overline{C_o}\;\overline{C_p}} \]

NMSE generally shows the most striking differences among models. If a model has a very low NMSE, then it is performing well in both space and time. On the other hand, high NMSE values do not necessarily mean that a model is completely wrong; that case could be due to time and/or space shifting. Moreover, it must be pointed out that differences in peak values have a higher weight in NMSE than differences in other values. The NMSE should be less than 6, i.e. the random scatter should be less than about 2.4 times the mean. The fraction of predictions that are within a factor of two of the observations (FAC2, where 0.5 < C_p/C_o < 2) should be more than 0.3. For all data paired in space, for which a threshold concentration must always be defined, the normalized absolute difference should be less than 0.50, when the threshold is three times the instrument's limit of quantification (LOQ).

In the large scale model evaluation exercises ATMES-II (Klug et al., 1992; Mosca et al., 1998) and ETEX (Girardi et al., 1998; van Dop and Nodop, 1998; Klug, 2000), a wide range of magnitudes of observed and predicted concentrations was found, especially due to the use of field data at distances from 100 to 2000 km from the source. Therefore, the use of the geometric mean bias (MG) as an extra index for the determination of model overestimation or underestimation was introduced. A "perfect" model would have MG = 1, but MG = 1 does not mean that predictions coincide with measurements. If the geometric mean bias is larger than 1 the model overestimates; if MG is less than 1 the model underestimates. Usually, the 5th and 95th percentiles of the geometric biases are taken as the limits of the confidence interval. The geometric mean bias is defined as

\[ \mathrm{MG} = \exp\left(\overline{\ln C_p} - \overline{\ln C_o}\right) \]

The geometric variance is defined as

\[ \mathrm{VG} = \exp\left(\overline{\left(\ln C_p - \ln C_o\right)^2}\right) \]

As for the geometric mean bias, the geometric variance is computed when the range of concentrations to be evaluated is wide. Contrary to NMSE, this index gives the same weight to pairs showing the same ratio, independently of the absolute value of the data. The MG and VG performance measures diverge (their logarithms approach positive or negative infinity) for C_o and/or C_p approaching 0.
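A direct transcription of these definitions into code can be useful in model evaluation exercises. The Python sketch below computes FB, NMSE, FAC2, MG and VG for paired observed and predicted concentrations; the small epsilon guarding the logarithms and the example data are assumptions for illustration, not part of the published definitions.

```python
# Sketch implementation of FB, NMSE, FAC2, MG and VG as defined above.
import numpy as np

def evaluation_metrics(co, cp, eps=1e-30):
    co, cp = np.asarray(co, float), np.asarray(cp, float)
    fb = 2.0 * (co.mean() - cp.mean()) / (co.mean() + cp.mean())
    nmse = np.mean((co - cp) ** 2) / (co.mean() * cp.mean())
    fac2 = np.mean((cp / co > 0.5) & (cp / co < 2.0))
    ln_ratio = np.log((cp + eps) / (co + eps))   # eps guards against zero concentrations
    mg = np.exp(ln_ratio.mean())                 # > 1: overestimation, as stated above
    vg = np.exp(np.mean(ln_ratio ** 2))
    return dict(FB=fb, NMSE=nmse, FAC2=fac2, MG=mg, VG=vg)

co = np.array([1.0, 2.5, 0.8, 4.0, 0.3])         # hypothetical observed concentrations
cp = np.array([1.4, 1.9, 1.1, 2.8, 0.5])         # hypothetical predicted concentrations
print({k: round(v, 2) for k, v in evaluation_metrics(co, cp).items()})
```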

Furthermore, Hanna and Chang (2012) use the threshold-based normalized absolute difference NAD,

\[ \mathrm{NAD} = \frac{A_F}{A_F + A_{OV}} \]

A_F is the average of the number of samplers where the model gives false positive predictions (C_p > C_T and C_o < C_T) and the number of samplers where the model gives false negative predictions (C_p < C_T and C_o > C_T). A_OV (where OV means overlap) is the number of samplers where model predictions and observations are both above the threshold (C_p > C_T and C_o > C_T). The threshold concentration, C_T, is arbitrarily taken to be three times the instrument's limit of quantification (LOQ), which should be published by the field experiment team in the data reports. NAD would equal zero for a perfect model (since A_F would be zero), and would approach 1.0 for a model that never produced a plume that overlapped the observed plume (since A_OV would be zero). A_OV could equal zero simply because of errors in the assumed wind direction, or could be caused by fundamental physics errors. In addition, the median, average, and maximum of C_o and C_p are often listed in summary tables.

The graphical summary of these statistical parameters, as suggested by Hanna et al. (1993), is shown in Figure 9.

Figure 9: Geometric mean bias MG and geometric variance VG for maximum plume centreline concentration predictions Cp and observations Co (instantaneous dense gas data set from the Thorney Island experiment, involving a total of 9 trials and 61 points). 95% confidence intervals on MG are indicated by the horizontal lines. The solid parabola is the "minimum VG" curve. The vertical dotted lines represent "factor of two" agreement between mean predictions and observations (Hanna et al., 1993).
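Following the same pattern, the threshold-based NAD defined above can be sketched as below; the sampler values and the choice of C_T are purely illustrative.

```python
# Sketch of the threshold-based NAD, using A_F (mean of false-positive and
# false-negative sampler counts) and A_OV (samplers where both exceed C_T).
import numpy as np

def nad(co, cp, c_threshold):
    co, cp = np.asarray(co, float), np.asarray(cp, float)
    false_pos = np.sum((cp > c_threshold) & (co < c_threshold))
    false_neg = np.sum((cp < c_threshold) & (co > c_threshold))
    a_f = 0.5 * (false_pos + false_neg)
    a_ov = np.sum((cp > c_threshold) & (co > c_threshold))
    return a_f / (a_f + a_ov) if (a_f + a_ov) > 0 else np.nan

# hypothetical sampler values, with C_T chosen as 3 x LOQ = 0.3 for the example
print(round(nad(co=[0.1, 0.5, 1.2, 0.05, 0.4],
                cp=[0.4, 0.6, 0.2, 0.02, 0.5],
                c_threshold=0.3), 2))
```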

Figure 10: Distributions of model residuals, Cp/Co, for the model HGSYSTEM for maximum (predicted versus observed) concentrations on the plume centreline for continuous dense gas data sets (Burro, Coyote, Desert Tortoise, Goldfish, Maplin Sands and Thorney Island) for selected distances from the source (x), wind speeds (u) and stability classes (PG class) (Hanna et al., 1993).

Box plots of model residuals for maximum concentrations on the plume centreline (example in Figure 10) indicate the 2nd, 16th, 50th, 84th and 98th percentiles of the cumulative distribution function of the N points in the box. Box plots reveal the general features of the distribution. If all the percentiles of the model that are drawn in the box plot are lower than the percentiles of the corresponding measurements, it means that the lower 75 % of the predictions is lower than the lower 75 % of the measurements. This could well correspond to a modelled distribution that is everywhere lower than the measured one or, less likely, to a model distribution that is much more peaked in the 25 (or less) % of higher values. In this context it must be pointed out that this comparison does not couple values. Thus, in the case of a global box plot the values are unpaired, while in the time-dependent box plot they are, obviously, paired in space. In both cases, the frequency distribution is evaluated with the same data filter as in the scatter diagram calculation. For further discussion of these statistical parameters and their application in the frame of model evaluation we refer to Schatzmann et al. (2010).

Scatter-plots are often used for the direct quantitative comparison of observed and predicted dosages (e.g. Chang et al., 2003) or maximum concentrations. Comparisons of time-series of observations with high temporal resolution (e.g. 60 s averages derived from high-frequency TGA data at 4 Hz) and model predictions reveal valuable information about puff arrival times and cloud advective speeds.
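A possible implementation of such a box-plot summary is sketched below: it computes the 2nd, 16th, 50th, 84th and 98th percentiles of the residuals Cp/Co for each data group (here, hypothetical downwind distance classes). The synthetic data are for illustration only.

```python
# Sketch of per-group residual percentiles for box-plot style summaries.
import numpy as np

def residual_percentiles(co, cp, groups):
    """Return {group: [2, 16, 50, 84, 98]th percentiles of Cp/Co}."""
    co, cp, groups = map(np.asarray, (co, cp, groups))
    out = {}
    for g in np.unique(groups):
        ratios = cp[groups == g] / co[groups == g]
        out[g] = np.percentile(ratios, [2, 16, 50, 84, 98])
    return out

rng = np.random.default_rng(2)
distance = rng.choice([100, 400, 800], size=200)     # hypothetical distances from the source (m)
co = rng.lognormal(0.0, 0.8, size=200)               # synthetic observations
cp = co * rng.lognormal(0.0, 0.6, size=200)          # synthetic model scatter around observations
for d, p in residual_percentiles(co, cp, distance).items():
    print(d, np.round(p, 2))
```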

73 General features that are desirable for validation metrics were proposed by Oberkampf and Barone (2006) and Ferson et al. (2008). These researchers proposed that metrics should be quantitative and objective and should at least take the uncertainties in the experimental results into account. Furthermore the metrics should be independent of the analysts acceptance criteria. This is for example not the case for the validation metric hit rate, recommended and used by COST Action 732 (Schatzmann et al., 2010). Oberkampf and Barone (2006) proposed a metric that measures the distance between the mean of predictions and the mean of observations, including the confidence level of the experiment. Another proposed metric measures the area between the prediction and observation distributions (Ferson et al., 2008). This metric therefore fully respects the stochastic nature of the underlying processes. While metrics are the ultimate outcome of a validation exercise to reach a condensed presentation of the computational models capabilities, simple graphical examinations and comparisons of model and measured results should be conducted at every stage of the exercise (Schatzmann et al., 2010). In addition it should be clearly communicated that validation against a particular data set or group of data sets only provides a basis from which it can be inferred how the model will perform in predicting dispersion for a different case. It is, for example, well known that a model, which produces good results when the time scale is two hours, might be completely wrong for a time scale of five minutes. The smaller the differences between the validation cases and the new prediction case, the higher the confidence that the model will perform similar to the validation cases. A detailed documentation of the model evaluation process helps to reduce the previously mentioned uncertainty in model results that depends upon the users. Furthermore, valuable information may be obtained from the sensitivity and uncertainty analysis and incorporated into best practice guidelines on how to conduct local-scale hazmat dispersion modelling. References Allwine, K.J., Shinn, J.H., Streit, G.E., Clawson, K.L. and Brown, M. (2002): Overview of Urban 2000: a multiscale field study of dispersion through an urban environment. Bull. Am. Meteorol. Soc. 83, ASME (2009): Standard for Verification and Validation in Computational Fluid Dynamics and Heat Transfer (ASME V&V ). ASME, New York, USA, Baumann-Stanzer, K., Stenzel, S. (2011): Uncertainties in modeling hazardous gas releases for emergency response. Meteorologische Zeitschrift, Vol. 20, No. 1, Britter, R.E., Caton, F., Di Sabatino, S., Cooke, K.M., Simmonds, P.G. and Nickless, G. (2000): Dispersion of a passive tracer within and above an urban canopy. In: Proceedings of the Third Symposium on the Urban Environment, American Meteorological Society, pp

74 Britter, R.E., S.T. Cole (1994): The evaluation of technical models used for major-accident hazard installations Publications Office European Communities / Union (EUR- OP/OOPEC/OPOCE) (May 4, 1994). ISBN-10: , ISBN-13: , pp. 41. Chang, J. C., Hanna, S. R. (2004): Air quality model performance evaluation. Meteo. Atmos. Phys., Vol. 87, Chang, J.C., Franzese, P., Chayantrakom, K., Hanna SR. (2003): Evaluations of CALPUFF, HPAC, and VLSTRACK with two mesoscale field data sets. J Appl. Meteorol., Vol. 42: Chang, J.C., Hanna, S.R., Boybeyi, Z. and Franzese, P. (2005): Use of Salt Lake City Urban 2000 data to evaluate the Urban-HPAC model. J Appl Meteorol., Vol. 44 (4): Cooke, K. M., Di Sabatino, S., Simmonds, P., Nickless, G., Britter, R.E. and Caton, F. (2000): Tracer and Dispersion of Gaseous Pollutants in an Urban Area. Birmingham Tracer Experiments. Technical Paper CUED/A-AERO/TR.27. Department of Engineering, Cambridge University. Davidson, M.J., Mylne, K.R., Jones, C.D., Phillips, J.C., Perkins, R.J., Fung, J.C.H. and Hunt, J.C.R. (1995): Plume dispersion through large groups of obstacles a field investigation. Atmospheric Environment, Vol. 29, Davidson, M.J., Snyder, W.H., Lawson, R.E., Hunt, J.C.R. (1996): Plume dispersion from point sources upwind of groups of obstacles wind tunnel simulations. Atmospheric Environment, Vol. 30, Dejoan A, Santiago, J.L., Martilli, A., Martin, F., Pinelli, A. (2010): Comparison between LES and RANS computations for the MUST field experiment. Part II: effects of incident wind deviation angle on the mean flow and plume dispersion. Boundary-Layer Meteorology, Vol. 135, Ferson, S., Oberkampf, W.L., Ginzburg, L. (2008): Model Validation and Predictive Capability for the Thermal Challenge Problem. Computer Methods in Applied Mechanics and Engineering, Vol. 197, Fox, D.G. (1984): Uncertainty in Air Quality Modeling. Bull. Amer. Meteor. Soc. 65, Girardi, F., Graziani, G., van Velzen, D., Galmarini, S., Mosca, S., Bianconi, R., Bellasio, R., Klug, W., Fraser, G. (1998): The European Tracer Experiment. Office for Official Publications of the European Communities, ISBN , 107pp. Hankin, R.K.S. and Britter, R.E. (1999): TWODEE: The Health and Safety Executive s shallow layer model for heavy gas dispersion. Part 3 Experimental validation (Thorney Island), Journal of Hazardous Materials, 66 (3). pp ISSN Roy, C.J., C.C. Nelson, T.M. Smith, C.C. Ober (2004): Verification of Euler/Navier Stokes codes using the method of manufactured solutions. Int. J. Numer. Meth. Fluids 44 (6), pp Hanna, S.R., Dharmavaram S., Zhang J., Sykes I., Witlox H., Khajehnajafi S., Koslan K. (2008): Comparison of six widely-used dense gas dispersion models for three recent chlorine railcar accidents. Process Safety Progress, Vol. 27; Hanna, S.R., Steinberg K.W. (2001): Overview of Petroleum Environmental Research Forum (PERF) dense gas dispersion modeling project. Atmos Environ., Vol. 35:

6. Summary

The application of emergency response systems to releases of hazardous agents at the local scale presents major challenges at all levels of involvement:

- Developers need to derive numerical schemes capable of modelling very complex physical processes at high resolution, very rapidly.
- Analysts using sophisticated modelling systems must have the training and experience necessary to apply the tools and interpret the results correctly.
- Decision makers have very limited time and uncertain information on which to base difficult decisions (such as sheltering or evacuation) that may have serious implications for human health or the environment.

Chapter 1 outlined how instantaneous accidental or deliberate releases of hazardous materials into the atmosphere in built-up areas can lead to catastrophic consequences, whether in terms of population casualties or of damage to ecosystems and infrastructure. The difficulties associated with developing improved emergency response tools that can cope with such releases were described. In particular, the challenge of rapidly providing first responders with accurate predictions of the dispersion of airborne hazards at the local scale was identified.

A variety of emergency response tools either already exist or are under development in a number of countries across Europe. One of the key drivers behind COST Action ES1006 is the recognition of an increasing need to share expertise and harmonise development efforts in order to improve emergency response capabilities and procedures within, and beyond, Europe.

The scope of the Action's activities is defined by the answers to the following questions: What is the problem? What has to be done? What is the main focus? It is important to note that the main focus of the Action is to improve the modelling capability that exists for predicting the atmospheric dispersion of agents at local scales within complex environments. The boundary of the activity is defined by the evaluation of the health and environmental effects.

The practical outputs of the activity will be:

1. An inventory of existing models and modelling systems;
2. An inventory of data sets that may be used for model validation;
3. A scientific and best-practice reference guide to modelling local-scale airborne hazards.

Item 3 will be the core scientific result of the Action and will be defined by common consent of the participants.

Chapter 2 identified the major physical characteristics of the hazardous materials that may threaten the population through atmospheric dispersion (section 2.1). It then described the modelling and operational challenges encountered in dispersion modelling and in the development of decision-support systems dedicated to crisis management (section 2.2).

In section 2.1:

- The terms danger, risk and threat are defined, as they are used extensively throughout the report.
- The origins of anthropogenic and natural emissions into the atmosphere are identified. Anthropogenic releases may occur as part of normal activities or processes (e.g. emissions from power stations), or may be unplanned (e.g. chemical accidents). Only the latter are considered in this report.
- The term threat agent, describing harmful materials that may be transported and dispersed by the atmosphere, is introduced. The division of agents into Chemical, Biological, Radioactive or Nuclear (CBRN) groups is described.
- Two categories of dangerous events are distinguished: accidents that occur in industrial plants or during transport, and malevolent or terrorist actions that lead to the dispersion of CBRN agents (a dirty bomb, for example).
- The major factor affecting the atmospheric dispersion of agents is their physical state, or phase. Agents may exist as gases or as airborne particles called aerosols, whose principal properties are described.
- The initial conditions that drive the emission of material into the atmosphere provide another distinction between threats. The principal types of release are described: passive releases and non-passive releases (buoyant, with initial momentum, flashing and/or evaporating, chemically reactive, etc.); a rough density check illustrating this distinction is sketched after this list.
- Finally, the probability and magnitude of the threats resulting from the atmospheric dispersion of hazardous materials are summarised. Although the events may differ in nature, they share requirements for atmospheric dispersion modelling and health impact assessment.
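Purely as an illustration of the passive versus non-passive distinction (this sketch is not part of the report), the following fragment compares the density of a released gas with that of ambient air using the ideal-gas law; the ten-percent threshold, the assumed temperatures and the chlorine example are arbitrary choices made for the example.

```python
# Illustrative sketch only: classify a pure gaseous release as roughly passive,
# dense or buoyant from its density relative to ambient air (ideal-gas law).
# The 10% threshold and the ambient conditions are assumptions for the example.

R = 8.314  # universal gas constant, J/(mol K)

def gas_density(molar_mass_kg_per_mol, temperature_k, pressure_pa=101325.0):
    """Ideal-gas density in kg/m3."""
    return pressure_pa * molar_mass_kg_per_mol / (R * temperature_k)

def release_character(molar_mass_kg_per_mol, release_temp_k, ambient_temp_k=288.0):
    """Very rough classification of a gaseous release."""
    rho_release = gas_density(molar_mass_kg_per_mol, release_temp_k)
    rho_air = gas_density(0.02897, ambient_temp_k)  # dry air, ~28.97 g/mol
    ratio = rho_release / rho_air
    if 0.9 < ratio < 1.1:  # within ~10% of the air density: treat as passive
        return "approximately passive"
    if ratio >= 1.1:
        return "denser than air: dense-gas behaviour likely"
    return "lighter than air: buoyant behaviour likely"

# Example: chlorine (molar mass ~70.9 g/mol) released at ambient temperature
print(release_character(0.0709, 288.0))
```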

In section 2.2:

- An overview of the challenges is presented to emphasise the numerous issues that modellers, and also code users, have to cope with in dispersion modelling. It is also emphasised that the dispersion modelling output is only the data from which the health effects are evaluated, and it is the health effects that are of practical interest to first responders and decision makers.
- As the same term may have different meanings in different contexts, tentative definitions of the terms source and source term are proposed, referring to the location, geometry, nature, quantity and kinetics of releases. It is also noted that the characteristics of the source term may be closely related to the choice of a dispersion model.
- The physical processes that must be taken into account in the modelling chain encompass not only dispersion, but also (1) an air flow computation which, at the local scale in a built environment (an industrial site, an urban district, or the inside of buildings), may be a challenging problem, especially if the scenario involves indoor/outdoor material transfers, and (2) the removal or neutralisation of the agents while dispersing (through chemical reaction, for example), including dry and wet deposition.
- Meteorological data form an essential input to any kind of atmospheric dispersion model. These data may be measurements or forecasts from models. The minimum meteorological data required to characterise the dispersion conditions within a particular area are identified.
- As hazardous releases often consist of airborne particles, a brief overview of the complex topic of aerosol physics is given in terms of size distribution, aerodynamics and deposition (a simple settling-velocity example is sketched after this list).
- Pragmatic considerations require mention of the computation time, which may impose a major limitation on the use of atmospheric dispersion modelling and simulation in an emergency or a crisis. The development of HPC facilities and of parallel versions of most CFD codes are important improvements in capability that cannot be divorced from improvements in the modelling of the physical processes.
- The requirements to be fulfilled in developing a decision-support system, including atmospheric dispersion modelling and health impact assessment, are identified. The key features required are quick and precise dispersion computations and a presentation of results consistent with their use by rescue teams at any stage of a crisis: before (preparedness), during (counter-measures) or after the crisis (experience feedback).
- In conclusion, a table is given that presents examples of accidental and malevolent (or terrorist) scenarios involving the release of CBRN agents. These scenarios provide a series of test cases guiding the activities of the Action.
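As a minimal illustration of the aerosol deposition physics mentioned in the list above (not taken from the report), the sketch below evaluates the Stokes gravitational settling velocity of small spherical particles; the particle density and air properties are assumed values, and the formula is only valid for particles smaller than roughly 50 micrometres.

```python
# Illustrative sketch only: Stokes settling velocity of small spherical
# particles in still air. Particle density and air properties are assumed
# values; the formula breaks down for particles above roughly 50 micrometres.

def stokes_settling_velocity(diameter_m, particle_density=1000.0,
                             air_density=1.2, air_viscosity=1.81e-5, g=9.81):
    """Terminal settling velocity (m/s) of a small sphere in still air."""
    return (particle_density - air_density) * g * diameter_m**2 / (18.0 * air_viscosity)

for d in (1e-6, 1e-5, 5e-5):  # 1, 10 and 50 micrometre particles
    print(f"d = {d * 1e6:4.0f} um  ->  v_s = {stokes_settling_velocity(d):.2e} m/s")
```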

Chapter 3 presented a summary of the various methods available for predicting wind flows and the dispersion of hazardous materials close to the source of release, with the primary focus on local-scale dispersion in built-up areas.

The methods available for modelling or estimating the dispersion of a pollutant range from empirical rule-of-thumb methods to complex mathematical models. The simplest ADMs are those based on Gaussian dispersion assumptions (illustrated in the sketch below), while the most complex include Eulerian RANS or LES models and Lagrangian particle models. The calculation of the required meteorological parameters can be either coupled with, or decoupled from, the dispersion calculation, depending on the physical properties of the dispersing material. Furthermore, the meteorological parameters can be calculated either diagnostically from measurements or prognostically from meteorological models (possibly through down-scaling).

The modelling requirements are likely to depend upon the emergency management phase (planning, response, recovery): simple and fast (but probably less accurate) methods are appropriate during the response phase, while more complex and slower (but potentially more accurate) methods can be used in the planning and recovery phases. The choice of the most appropriate method also depends on the spatial range: Gaussian and CFD models are typically applied only over local to urban scales (i.e. up to about 10 km from the emitting source), while Eulerian and Lagrangian models may be applied over far greater distances.

The data input requirements of dispersion models depend on the model's complexity. In general they include source-term-related information, meteorological data, geometry/topography data and possibly measured concentration data. Model outputs include predictions of:

- instantaneous and/or time-integrated values of concentration;
- peak concentrations;
- deposition of agent(s) onto surfaces (streets or building walls);
- affected areas.

For emergency planning and response, confidence estimates relating to the probability of concentrations or dosages exceeding specified values beyond the predicted hazard zones are also desirable.

Finally, examples of dispersion models, modelling systems and CBRN emergency management tools were given. These describe the current capability and provide a starting point for the detailed evaluations and database compilation activities that the Action will undertake.
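As a concrete illustration of the simplest ADM class mentioned above, the following sketch evaluates a textbook steady-state Gaussian plume for a continuous point source; the sigma parameterisation (power laws roughly representative of neutral, open-country conditions) and all numerical values are assumptions made for illustration only, not recommendations from this report.

```python
import math

# Minimal sketch of a steady-state Gaussian plume for a continuous point
# source, the simplest class of ADM mentioned above. The sigma power laws
# below (roughly neutral, open-country values) are assumptions for this
# example, not recommendations from the report.

def gaussian_plume(x, y, z, Q, u, H):
    """Concentration (kg/m3) at (x, y, z) metres from a continuous point source.

    x: downwind distance, y: crosswind distance, z: height above ground.
    Q: emission rate (kg/s), u: wind speed (m/s), H: effective release height (m).
    """
    sigma_y = 0.08 * x / math.sqrt(1.0 + 0.0001 * x)  # assumed crosswind spread
    sigma_z = 0.06 * x / math.sqrt(1.0 + 0.0015 * x)  # assumed vertical spread
    lateral = math.exp(-y**2 / (2.0 * sigma_y**2))
    # ground reflection: add an image source at z = -H
    vertical = (math.exp(-(z - H)**2 / (2.0 * sigma_z**2)) +
                math.exp(-(z + H)**2 / (2.0 * sigma_z**2)))
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centreline concentration 500 m downwind of a 1 kg/s release at 10 m height
print(gaussian_plume(x=500.0, y=0.0, z=0.0, Q=1.0, u=3.0, H=10.0))
```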

Chapter 4 focused on the conceptual as well as the practical challenges of undertaking dispersion modelling for emergency planning and response. It first provided an overview of the physical processes involved and then identified the importance of wind field modelling: this is extremely important because it determines the result of the dispersion and deposition calculations, which then feed into the health and environmental assessments. The modelling needs following a CBRN release were then considered, distinguishing between the study of past or academic dispersal events on the one hand, and the prediction of dispersion and health consequences on the other. The issues associated with down-scaling from the meso-scale to the local scale were also considered: the need to bridge the gap between these scales was highlighted and potential approaches were presented. This was followed by consideration of the different natures of the impact assessments required during a crisis and during post-crisis management. Finally, the future needs for model development were considered, in particular the need to address uncertainties, variability, confidence levels and data assimilation.

If dispersion models are to be used in emergency response tools, their predictive capabilities need to be quality-assured in a transparent and generally accepted way. This is needed to demonstrate that a specific emergency response system or tool will provide reliable information to decision makers in the defined application, in this case accidental releases in built-up environments. Nevertheless, even with access to an ideal model and users who apply it as intended, uncertainties will remain due to the lack of information about the source term and the complex meteorological conditions in a built-up environment.

The evaluation of hazmat dispersion models for urban or industrial environments requires different methods, measures and data sets from those used for evaluating air quality models. This is because:

1. the transient dispersion process from a short-duration or instantaneous release has to be predicted by the model;
2. the meteorology and dispersion are highly inhomogeneous in space and intermittent in time.

To address this, a consensus will be reached within the Action on which data are most important for the validation of hazmat dispersion models, and in what form the data are most useful. The selected data sets will then be put into an easily accessible database. An overview of the existing data sets and of the metrics generally used for evaluating dispersion model performance in urban environments is given in Chapter 5.
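To indicate the kind of performance measures referred to above (the specific metrics and the paired values below are examples, not prescriptions from this report), the following sketch implements three metrics widely used in dispersion-model evaluation: the fractional bias (FB), the normalised mean square error (NMSE) and the fraction of predictions within a factor of two of the observations (FAC2).

```python
import numpy as np

# Sketch of three metrics widely used for dispersion-model evaluation:
# fractional bias (FB), normalised mean square error (NMSE) and the fraction
# of predictions within a factor of two of the observations (FAC2).
# The observed/predicted pairs are assumed to be matched in space and time.

def fb(obs, pred):
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

def nmse(obs, pred):
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())

def fac2(obs, pred):
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    ratio = pred / obs
    return float(np.mean((ratio >= 0.5) & (ratio <= 2.0)))

obs = [1.0, 2.5, 4.0, 0.8]    # invented paired concentrations
pred = [1.2, 1.9, 5.5, 0.3]
print(f"FB = {fb(obs, pred):.2f}, NMSE = {nmse(obs, pred):.2f}, FAC2 = {fac2(obs, pred):.2f}")
```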

Emergency response systems: towards a complete solution?

A schematic representation of an emergency response system is shown in Fig. 1. Even though the ADM is the "heart" of the system, all the peripherals are essential for providing a comprehensive and reliable system. The target should be to provide versatile systems that are able to support a range of emergency scenarios, by offering a variety of models and options that can be selected according to the specific event.

A comprehensive system should be able to provide answers in the different phases of an event: preparedness, response and recovery. Each phase has its specific needs in terms of accuracy, input data and required outputs. A versatile system will include a full range of models capable of supporting each phase appropriately. An ideal emergency response system will include a toolbox of models that allows decision makers and operators to pick the most suitable tool for the event.

There is a significant gap between the models that represent the state of the art in the scientific community and those that are implemented in operational systems. There are many reasons for this:

- the computational resources required by sophisticated models prohibit their use in real time;
- state-of-the-art models require a large number of input parameter values that must be available in real time;
- they require high levels of skill and expertise from their users;
- there is a perception that the large uncertainties in the input parameters, combined with the model complexity, make the output less reliable than that of current operational models.

The benefits obtained by improving numerical models should be commensurate with the efforts invested in improving them. Figure 11 shows an idealisation of this process, in which it can be seen that beyond a certain point further improvements lead to decreasing benefits. It should be noted that "benefit" does not simply mean accuracy, but refers to overall performance (e.g. a 10% improvement in accuracy obtained by doubling the computation time is a model improvement, but does not lead to an overall benefit).

Figure 11: Schematic 'benefit-effort' function to be considered when evaluating the benefits of model improvements.

Model developers are not always aware of the implications of moving from a scientific or development model to an operational system. Improving operational capability is not only a matter of providing a more accurate or faster model, but also of providing robustness, a user-friendly interface, comprehensive training material and manuals, maintenance services, etc. This process can be perceived as tedious and time-consuming, but it is essential for delivering a fully operational model.

The two main scenarios considered in the Action are accidental and malevolent releases. They share many characteristics, but differ in others, with significant consequences for the modelling requirements. Table 2 lists some of these characteristics. For accidental releases originating in industrial plants, a large number of simulations can be conducted in advance (preparedness phase), covering a range of meteorological conditions and source terms (a minimal sketch of such a pre-computed library is given after Table 2). This is generally not possible for malevolent release scenarios, for which emergency response systems have to rely on simple, fast-running models.

Location
  Accidental: known (for transport releases, the itinerary is usually known in advance)
  Malevolent (terror): unknown

Source term
  Accidental: known (at least the agent, and approximately the amount that can be dispersed)
  Malevolent (terror): unknown

Source size
  Accidental: potentially large
  Malevolent (terror): expected to be relatively small

Potential damage
  Accidental: can be severe (depending on the source size and agent)
  Malevolent (terror): expected to be light to mild in terms of casualties, depending on the agent; severe in terms of public opinion, media reaction, etc.

Table 2: Characterisation of typical releases.
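The following is a hypothetical sketch of the preparedness-phase pre-computation mentioned above; the parameter ranges, the placeholder run_dispersion_model function and the dictionary-based lookup are invented for illustration only, to show how a library of pre-computed cases could be retrieved quickly during the response phase.

```python
import itertools

# Hypothetical sketch: build a library of pre-computed dispersion results for
# an industrial site during the preparedness phase, so that the response phase
# only needs a fast lookup. The parameter ranges and the placeholder
# run_dispersion_model() are invented for illustration.

wind_directions = range(0, 360, 30)           # degrees, 30-degree sectors
wind_speeds = (1.0, 3.0, 6.0)                 # m/s
stability_classes = ("B", "D", "F")           # example Pasquill classes
source_terms = ("tank_leak", "pipe_rupture")  # example site-specific scenarios

def run_dispersion_model(wdir, wspd, stability, source):
    """Placeholder for a (possibly slow) dispersion simulation."""
    return {"notes": f"{source} at {wdir} deg, {wspd} m/s, class {stability}"}

# Preparedness phase: run every combination once and store the results.
library = {
    key: run_dispersion_model(*key)
    for key in itertools.product(wind_directions, wind_speeds,
                                 stability_classes, source_terms)
}

# Response phase: match the observed conditions to the nearest pre-computed case
# (wind-direction wraparound is ignored for brevity).
observed = (95, 2.4, "D", "tank_leak")
nearest = min(
    library,
    key=lambda k: abs(k[0] - observed[0]) + abs(k[1] - observed[1])
    if k[2:] == observed[2:] else float("inf"),
)
print(library[nearest]["notes"])
```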

Cooperating with stakeholders

A key issue in improving emergency response systems is the cooperation and dialogue between developers, operators, stakeholders and decision makers throughout the whole life cycle of the system. This dialogue should run from the conceptual stage, through the development and implementation of the models, the design of the GUI and the training of personnel, to the maintenance of the system.

Figure 12 shows in a pictorial way how this kind of collaboration should proceed. Voinov and Bousquet (2010) point out that the loops go back and forth arbitrarily and that the different stages can be reshuffled at any moment. When dealing with stakeholders and decision makers a common language is essential, from basic matters such as the physical units used in dosimetry to the contour levels representing expected damage or exposure. Convincing stakeholders and decision makers to adopt new tools rests partly on the shoulders of the model developers, since they have the knowledge and expertise to build better tools and to explain their advantages. However, it is essential that model developers understand the needs of stakeholders, decision makers and operators in order to focus their work, keeping in mind that at the end of the road they are expected to deliver an operational system.

Figure 12: Schematic representation of the stakeholder involvement process.

Future needs for model development

Uncertainties / variability / confidence levels

Methodologies should be developed to assess (bounds of) the uncertainties in model predictions, possibly in the form of ranges of concentration values with associated probabilities, or as confidence estimates that prescribed concentrations will not be exceeded outside the predicted hazard zones. Models should provide not only average concentrations but also the event-to-event variability for a given release. A proposed method for this is the use of ensembles of model solutions for the urban area in