Partnership for AiR Transportation Noise and Emission Reduction
An FAA/NASA/TC-sponsored Center of Excellence

A Probabilistic Approach to Representing and Analyzing Uncertainty in Large-Scale Complex System Models
Doug Allaire, Karen Willcox, Ian Waitz
Presented by: Doug Allaire
This work was funded by the U.S. Federal Aviation Administration, Office of Environment and Energy.
Project Manager: Maryalice Locke
Contact: maryalice.locke@faa.gov

The results presented within are the outcomes of sample problems at various stages of model development. These sample problems had different assumptions and scenarios. They are shown for illustrative purposes and should not be interpreted as final results or quoted.
Outline
1. Framing the problem
2. Real-World Application
3. Approach
4. Future Work
Framing the Problem
Representing and analyzing uncertainty in large-scale, complex system models intended to support decision-making and policy-making processes.

Uncertainties in modeling are unavoidable, and uncertainty should be properly represented for:
- Estimates and predictions
- Risk analysis
- Cost-benefit analysis
- Furthering model development

Objective
To establish and demonstrate the applicability of a general probabilistic approach for representing and analyzing uncertainty in large-scale complex system models.
Environmental Impacts Tools-Suite
[Diagram: Policy and scenarios feed the APMT Partial Equilibrium Block (demand: consumers — operations, fares, schedule; supply: carriers — fleet), which drives AEDT (airport-level noise tools, global noise assessment, airport-level emissions tools, global emissions inventories) and EDS (technology impact forecasting, new vehicle noise and emissions design tools, vehicle cost assessment). Emissions and noise outputs flow into the APMT Benefits Valuation Block (climate impacts, local air quality impacts, noise impacts); collected costs and monetized benefits feed APMT Costs & Benefits.]
Approach
Approach
Step 1: Establish Goals

Furthering model development
- Identify gaps in functionality that significantly impact the achievement of model requirements, leading to the identification of high-priority areas for further development.
- Rank inputs based on their contributions to output variability to inform future research.

Informing decision-making
- Provide quantitative evaluation of model performance relative to fidelity requirements of various analysis scenarios.
- Provide quantitative comparisons of various policy scenarios.
- Properly identify and represent uncertainty in the model.
Approach
Steps 2 & 3: Document Assumptions/Limitations, Inputs/Outputs

Documentation of assumptions/limitations
- Many assumptions, e.g. the future growth scenario and the discount rate.
- Many limitations, e.g. the inability to account for certain technology, or the inability to vary certain parameters.

Documentation of inputs/outputs
- Many inputs with different types and levels of uncertainty associated with them.
- Many outputs that may be inputs to downstream models.
Approach
Step 4: Representing Uncertainty and Variability
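As a concrete illustration of this step, aleatory (irreducible) variability is typically represented by a probability distribution fit to data, while epistemic (reducible) uncertainty is often represented by a distribution over expert-elicited bounds. A minimal sketch in Python — the two inputs are ones named later in the talk, but every distribution and parameter value here is invented for illustration, not taken from the APMT modules:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Aleatory variability: a distribution fit to data, e.g. year-to-year
# variation in the housing growth rate. (Illustrative parameters.)
housing_growth = rng.normal(loc=0.01, scale=0.005, size=n)

# Epistemic uncertainty: limited knowledge of a fixed quantity, e.g.
# the discount rate, often represented by a uniform or triangular
# distribution over expert-elicited bounds. (Illustrative bounds.)
discount_rate = rng.triangular(left=0.02, mode=0.03, right=0.07, size=n)
```

Keeping the two kinds of uncertainty in separate sample sets is what later allows the conditional-expectation analysis to loop over them separately.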
Approach
Step 5: Analyzing Uncertainty

Probabilistic uncertainty analysis for model development
- Primary goal is to apportion output variability across inputs.
- Tools: distributional sensitivity analysis and global sensitivity analysis.
ANOVA-HDMR Decomposition
High-Dimensional Model Representation (HDMR) of f(x)

Example: HDMR for a function of three parameters, with terms grouped as mean value, main effects, first-order interactions, and the second-order interaction.
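Written out (the standard HDMR form for the three-parameter example named on this slide):

```latex
\begin{aligned}
f(x_1,x_2,x_3) ={}& f_0                                        % mean value
  + f_1(x_1) + f_2(x_2) + f_3(x_3)                             % main effects
  \\ &+ f_{12}(x_1,x_2) + f_{13}(x_1,x_3) + f_{23}(x_2,x_3)    % first-order interactions
  + f_{123}(x_1,x_2,x_3)                                       % second-order interaction
\end{aligned}
```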
Approach
Step 5: Analyzing Uncertainty

Probabilistic uncertainty analysis for decision-making
- Primary goal is to compare various policy scenarios.
- In many cases, both aleatory and epistemic uncertainties will be present.
- Principal aim is to determine the distribution of the conditional expectation, E[Y|E].
- Conditional expectations evaluated at specific values for inputs with epistemic uncertainty can be used to support policy-making tools.
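The distribution of the conditional expectation over the epistemic inputs can be estimated with a double-loop Monte Carlo: an inner loop averages over the aleatory inputs with the epistemic inputs held fixed, and an outer loop samples the epistemic inputs. A sketch, with a toy model and assumed distributions standing in for the actual modules:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(a, e):
    # Toy stand-in for a module output Y = f(A, E); the real modules
    # are far more complex -- this only illustrates the mechanics.
    return e * a + a**2

def conditional_expectation(e, n_inner=50_000):
    """Estimate E[Y | E = e] by averaging over the aleatory input A."""
    a = rng.normal(1.0, 0.5, size=n_inner)  # assumed aleatory distribution
    return model(a, e).mean()

# Outer loop over sampled epistemic values yields the distribution of
# the conditional expectation E[Y | E].
e_samples = rng.uniform(0.5, 1.5, size=200)  # assumed epistemic bounds
cond_means = np.array([conditional_expectation(e) for e in e_samples])
```

The spread of `cond_means` reflects only the reducible (epistemic) uncertainty, which is what makes it useful for comparing policy scenarios.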
Example Results

[Figure: BVB-Noise Module analysis — global sensitivity analysis bar chart of total sensitivity indices for housing growth rate, discount rate, quiet level, contour uncertainty, and NDI.]

[Figure: BVB-Climate Module analysis — total sensitivity indices for integrated temperature change and damage NPV, apportioned among short-lived RF*, climate sensitivity, the damage coefficient, and others.]
Summary & Future Work

Summary
- Representing and analyzing uncertainty in large-scale complex system models.
- General probabilistic approach for development and decision-making.
- Global sensitivity and distributional sensitivity analyses.

Future work
- Develop or identify a quantitatively rigorous method for computing E[Y|E=e] and var(Y|E=e) from Monte Carlo data.
- Develop or identify a quantitatively rigorous method for performing distributional sensitivity analysis.
- Create a general mapping from how uncertainty and variability in model inputs arise to how that uncertainty and variability should be analyzed.
- Develop various methods of presenting uncertainty analysis results.
Questions?
Supplementary Slides
The Sobol Method
HDMR constraint and ANOVA-HDMR orthogonality
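The conditions named here are, in the standard Sobol formulation (with inputs rescaled to the unit hypercube):

```latex
% Vanishing-mean constraint on each non-constant HDMR term:
\int_0^1 f_{i_1 \dots i_s}(x_{i_1},\dots,x_{i_s})\, dx_{i_k} = 0,
\qquad 1 \le k \le s.

% This implies orthogonality of distinct terms, hence ANOVA-HDMR:
\int_{[0,1]^n} f_{i_1 \dots i_s}\, f_{j_1 \dots j_t}\, dx = 0,
\qquad (i_1,\dots,i_s) \ne (j_1,\dots,j_t).
```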
The Sobol Method
Variance and partial variance
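In the same standard formulation, the total and partial variances, and the sensitivity indices built from them, are:

```latex
% Total variance:
D = \int_{[0,1]^n} f^2(x)\, dx - f_0^2

% Partial variance of each ANOVA-HDMR term:
D_{i_1 \dots i_s} = \int f_{i_1 \dots i_s}^2(x_{i_1},\dots,x_{i_s})\,
  dx_{i_1} \cdots dx_{i_s}

% Decomposition of variance and the Sobol sensitivity indices:
D = \sum_i D_i + \sum_{i<j} D_{ij} + \cdots,
\qquad S_{i_1 \dots i_s} = \frac{D_{i_1 \dots i_s}}{D}.
```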
Monte Carlo Estimates
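The Monte Carlo estimates this slide refers to can be sketched with the pick-freeze estimator for first-order Sobol indices. The two-input model below is a toy stand-in, and all names and sample sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # Toy additive model standing in for a module: Y = X1 + 2*X2,
    # with Xi ~ U(0,1), so the exact indices are S1 = 0.2, S2 = 0.8.
    return x[:, 0] + 2.0 * x[:, 1]

def sobol_first_order(f, dim, n=200_000):
    """Pick-freeze Monte Carlo estimate of the first-order Sobol indices."""
    A = rng.uniform(size=(n, dim))
    B = rng.uniform(size=(n, dim))
    yA, yB = f(A), f(B)
    D = yA.var()                       # total variance estimate
    S = np.empty(dim)
    for i in range(dim):
        ABi = B.copy()
        ABi[:, i] = A[:, i]            # freeze column i at the A values
        # Estimator of the partial variance D_i, normalized by D
        S[i] = np.mean(yA * (f(ABi) - yB)) / D
    return S

S = sobol_first_order(f, dim=2)
```

For an additive model like this one, the first-order indices sum to one; interaction terms would show up as a shortfall from one.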
Model Development Questions
AQ1. What are the key assumptions employed within the module? How do these assumptions translate into quantifiable uncertainty in module outputs?
AQ2. What are the key assumptions employed within the module databases? How do these assumptions translate into quantifiable uncertainty in module outputs?
AQ3. How do assumptions/limitations in modeling and databases impact the applicability of the module for certain classes of problems? What are the implications for future development efforts?
AQ4. How do uncertainties in module inputs propagate to uncertainties in module outputs? Further, what are the key inputs that contribute to variability in module outputs?
AQ5. For assumptions, limitations, and inputs where effects cannot be quantified, what are the expected influences (qualitatively) on module outputs?
AQ6. How do assessment results translate into guidelines for use?
E[Y|E=e]
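In the standard formulation, this quantity is the average of the model output over the aleatory inputs A with the epistemic inputs held fixed at e, together with its Monte Carlo estimate:

```latex
E\!\left[\,Y \mid E = e\,\right]
  = \int f(a, e)\, p_A(a)\, da
  \;\approx\; \frac{1}{N} \sum_{k=1}^{N} f\!\left(a^{(k)}, e\right),
\qquad a^{(k)} \sim p_A.
```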