
Opinion: the use of natural hazard modeling for decision making under uncertainty

Abstract

Decision making to mitigate the effects of natural hazards is a complex undertaking fraught with uncertainty. Models to describe risks associated with natural hazards have proliferated in recent years. Concurrently, a growing body of work has focused on developing best practices for natural hazard modeling and on creating structured evaluation criteria for complex environmental models. However, to our knowledge there has been less focus on the conditions under which decision makers can confidently rely on results from these models. In this review we propose a preliminary set of conditions necessary for the appropriate application of modeled results to natural hazard decision making and provide relevant examples from US wildfire management programs.

Introduction

Improving the quality of natural resource decision making under risk and uncertainty is a fundamental goal of the research described in this special issue. However, it is critical to recognize the appropriate role of system modeling in improving natural resource management decision making, particularly under conditions of high uncertainty. In 2009, Daniel Kahneman and Gary Klein authored an influential paper, ‘Conditions for intuitive expertise: A failure to disagree’ (Kahneman and Klein 2009). Klein’s research highlights the ability of experts, such as chess masters, to make high quality decisions in challenging environments. Kahneman, along with Amos Tversky, is considered a founder of behavioral economics, which focuses on understanding the conditions and cognitive biases under which humans consistently fail to make rational decisions in the face of uncertainty. Despite these diametrically opposed world views, the two were able to come together to define the set of conditions under which decisions by qualified experts are likely to result in high quality outcomes: the decision situation must provide stable, consistent cues (that is, cues capable of being consistently interpreted), and the decision maker must be competent enough to interpret those cues correctly. In the absence of these conditions, decision quality is likely to be poor.

Similar to the Kahneman and Klein proposal, we suggest that decision makers consider the conditions under which modeled results realistically represent the system to be managed when weighing the appropriate types of models and the application of their results. As researchers in wildfire risk and water development program assessment, we have both observed numerous research efforts and system models that, though technically sound, simply failed to capture the key issues of the system, and whose results provided unwarranted confidence and encouraged erroneous and/or overly simplified solutions. Under some circumstances, modeled results are recognized as inconsistent with expert understanding of the system. In these cases there is frequently a tendency to expand the complexity of the model to address the identified inconsistency, whether such adjustments better represent the reality of the system or simply produce results more in line with expectations. Rarely is there recognition that the complexity of the system precludes the pre-determined modeling approach that produced the questionable results.

Within the modeling community there is growing concern that the proliferation of increasingly complex environmental decision models has not led to a noticeable improvement in predictive capability (Arhonditsis et al. 2006; Robson 2014), and that increased complexity may not coincide with improved structural representation (see for example Flynn 2005). Given these concerns, there is interest in developing modeling best practices and structured evaluation criteria for complex environmental models (Jakeman et al. 2006; Bennett et al. 2013). Further, it is critical that the modeling technique be appropriate for the problem to be addressed. Kelly et al. (2013) developed a decision matrix for selecting an integrated environmental assessment and management modeling approach based on the intended purpose of the model, quantitative and qualitative data availability, the level of spatio-temporal detail required, and how uncertainty will be considered.

Scott et al. (2013) compared various wildfire modeling systems used in the US according to the following attributes: planning context and decisions supported, duration, fires considered, simulation type, type of burn probability, and source of variation. The authors then demonstrate a composite wildfire risk framework that integrates landscape simulation models, spatially explicit identification of values at risk, expert elicitation of wildfire effects, and leadership ranking of the relative importance of protecting those values to quantitatively define wildfire risk at various spatial scales. The framework has been successfully applied across a range of geographic scales to develop local to national scale wildfire risk assessments that support management decisions. Although information within the assessments may help support wildfire response, their primary focus and use are in supporting pre-wildfire season planning.

Proposed condition set

We find the increased scrutiny of, and drive for improved best practices in, model development to be an important effort. However, in this discussion we focus on the application of models to the decision environment. While Kelly et al. (2013) developed a matrix to define the appropriate modeling technique based on the system processes of the management problem, we propose a preliminary set of conditions necessary for the appropriate application of modeled results to natural resource decision making. We highlight these conditions with examples from wildfire management within the United States. Recommended criteria include: 1) The fundamental objective is clearly identified, appropriate for the temporal and spatial scale of the problem, and broadly accepted by managers and stakeholders. 2) The modeled system connections appropriately represent the system. 3) Input parameter estimates are known with some level of accuracy, and/or uncertainty can be handled through appropriate techniques such as Monte Carlo simulation or sensitivity analysis (a minimal sketch of such uncertainty propagation follows below). 4) There are either historically relevant events with which to compare model results, or sufficient data points that statistical analysis becomes relevant. 5) It is relatively clear when shocks to the system make modeled results no longer relevant to the decision problem at hand (i.e., the conditions under which the system transitions away from the modeled state are well understood).
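To make criterion 3 concrete, the short Python sketch below propagates input-parameter uncertainty through a toy loss model via Monte Carlo sampling, followed by a crude one-at-a-time sensitivity check. The parameter names, distributions, and loss function are hypothetical, chosen only to illustrate the technique; they are not drawn from any of the models cited in this paper.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n_draws = 10_000  # Monte Carlo sample size

# Hypothetical uncertain inputs (names and distributions are illustrative):
# fire spread rate (km/h) and value exposed to loss ($M).
spread_rate = rng.lognormal(mean=0.0, sigma=0.5, size=n_draws)
exposed_value = rng.normal(loc=10.0, scale=2.0, size=n_draws)

def toy_loss(spread, value):
    """Toy consequence model: loss rises with spread, capped by exposed value."""
    return np.minimum(value, 2.0 * spread)

loss = toy_loss(spread_rate, exposed_value)
print(f"mean loss: {loss.mean():.2f}  95th pct: {np.percentile(loss, 95):.2f}")

# One-at-a-time sensitivity: fix one input at its median and see how much
# of the output variability that input was responsible for.
for label, l in [
    ("spread_rate fixed", toy_loss(np.median(spread_rate), exposed_value)),
    ("exposed_value fixed", toy_loss(spread_rate, np.median(exposed_value))),
]:
    print(f"{label}: loss std {l.std():.2f} (full model: {loss.std():.2f})")
```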

Wildfire management in the US provides an interesting example. When an ignition is identified, initial attack (IA) suppression resources are dispatched in an attempt to extinguish the fire within the first burn period. Once the dispatched resources arrive, they attempt to build a fireline around the growing fire perimeter. If fireline production exceeds the rate of perimeter growth, the fire is successfully contained; if not, the fire is declared escaped and typically a larger management team is assigned to the event. A number of prescriptive models for IA have been developed (see Fried and Fried 1996; Ntaimo et al. 2013; Wei et al. 2014), and much of the dispatch rule set within the field is based on some form of application of these models. Application of these models to the IA management problem broadly meets the criteria defined above. Specifically: 1) There is a clear objective in both space and time that is agreed upon by all participants: contain the fire as rapidly as possible. 2) The system components of resource arrival time, fireline production rates, and wildfire growth modeling are well understood and broadly accepted. 3) The parameters are known with some level of confidence; the distributions around fireline production rates and fire behavior are generally understood, at least under conditions where initial attack is successful. 4) There have been extensive IA events (approximately 10,000 per year within the US Forest Service alone) with which to test the model components of resource productivity and fire behavior, providing the opportunity to statistically validate these models (although we know of no empirical test of the IA dispatch models to date). 5) There is typically a clear transition beyond which fire control is recognized as infeasible given the current set of inputs and fire behavior.
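The core IA containment logic described above reduces to a race between perimeter growth and fireline construction. The sketch below is a deliberately minimal illustration of that race, assuming circular fire growth at a constant spread rate and a single constant line-production rate; it is a toy under those stated assumptions, not a reimplementation of the cited dispatch models (e.g., Fried and Fried 1996).

```python
import math

def initial_attack_contained(spread_rate_m_per_h: float,
                             line_rate_m_per_h: float,
                             arrival_delay_h: float,
                             burn_period_h: float = 24.0) -> bool:
    """Crude containment check: circular fire, constant rates.

    The fire perimeter at time t is 2*pi*r(t), with r(t) = spread_rate * t.
    Crews arrive after `arrival_delay_h` and build line at `line_rate_m_per_h`.
    Containment succeeds if the built line catches the growing perimeter
    within the burn period. (Hypothetical simplification for illustration.)
    """
    dt = 0.1  # simulation step, hours
    t, line_built = 0.0, 0.0
    while t <= burn_period_h:
        perimeter = 2.0 * math.pi * spread_rate_m_per_h * t
        if t >= arrival_delay_h:
            line_built += line_rate_m_per_h * dt
            if line_built >= perimeter:
                return True   # line closed around the fire: contained
        t += dt
    return False              # fire escapes initial attack

# Slow fire, prompt arrival -> contained; fast fire outruns the crews.
print(initial_attack_contained(spread_rate_m_per_h=30, line_rate_m_per_h=500, arrival_delay_h=1.0))
print(initial_attack_contained(spread_rate_m_per_h=300, line_rate_m_per_h=500, arrival_delay_h=2.0))
```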

However, prescriptive models of large-fire management pose significant challenges and in general fail to meet the criteria we have put forth, as follows: 1) In many situations a clear spatially and temporally defined objective is not broadly accepted. Broadly speaking, the objective is to minimize the cost of management plus the net value change to affected resources. However, many of the primary resource values affected by fire are non-market in nature, their relative value is not well understood (Venn and Calkin 2011), and tradeoffs between competing resource objectives are not consistently interpreted. 2) There are complex interactions between fire behavior and wildfire suppression actions. For example, Finney et al. (2009) demonstrated that the primary factor leading to containment of large wildfires was the number of quiescent fire growth periods, and that the quantity of suppression resources could not be demonstrated to have any effect on the likelihood of containment. 3) There is high parameter uncertainty around system components, including longer term fire weather predictions, extreme fire behavior, and suppression effectiveness (Holmes and Calkin 2013). 4) Large wildfires are highly heterogeneous events in terms of weather, fire behavior, and management approach; further, the events that result in the highest losses have few historical precedents for comparison. 5) The conditions that lead the system to dramatic fire behavior transitions such as blow-ups and crown fire are poorly understood and extremely hard to predict.

Thus, a critical challenge in managing large wildfires is that both the conditions for intuitive decision making identified by Kahneman and Klein and the conditions for development and application of prescriptive environmental management models are absent. Decision makers are responsible for making decisions in highly uncertain environments with limited prescriptive modeling support. Additionally, fire managers, at least in the US, are incentivized to commit additional suppression resources and restrict potentially beneficial fire (Calkin et al. 2011).

Recommendations on model use

Proposed criterion 1 emphasizes that the application of natural hazard modeling requires a clearly identified objective, which allows the primary risk factors leading to high consequence events to be defined. It is critical that the objective is tiered to the spatial and temporal scale of the management problem. Comparison of landscape scale hazardous fuel reduction programs to IA provides a relevant example. Mechanical removal and prescribed burning of hazardous fuels are common practices across the globe. Within public land agencies in the United States, coordinated landscape scale fuels programs are developed across relatively large areas of the public domain and may cover tens of thousands of hectares. These treatment programs are designed to reduce the likely spread and intensity of future wildfires, and in many ecosystems reduced fuel loading influences fire behavior for 10 to 20 years post treatment. Compare this to the localized, short duration problem of IA: contain a wildfire at as small a size as possible within 24 hours of ignition.

Management actions intended to reduce loss from natural hazards need to address the core risk factors that drive high consequence events. This can be very challenging, since high consequence events are dominated by extreme conditions. Additionally, it should be clear whether risk mitigation actions address the likelihood of the hazardous event or the consequence of the event once it has begun. For example, two very common wildfire risk mitigation actions are landscape fuels reduction treatments aimed at reducing the likelihood of wildfire spread, and modification of the home and its immediate surroundings to reduce the likelihood of home destruction if a wildfire were proximate. Landscape fuels treatments address the likelihood of uncontrollable wildfire spread, whereas modification of the home environment reduces the consequence given that a wildfire is proximate; this decomposition can be written explicitly, as shown below.
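One way to make the likelihood/consequence distinction explicit is to write expected loss as a product of the two terms that each mitigation type targets. This is a standard risk decomposition offered here for illustration, not a formula taken from the wildfire literature cited in this paper:

```latex
\[
\mathbb{E}[\text{loss}]
  = \underbrace{P(\text{fire reaches the home site})}_{\text{targeted by landscape fuels treatments}}
  \times
  \underbrace{\mathbb{E}[\,\text{loss} \mid \text{fire proximate}\,]}_{\text{targeted by home-environment modification}}
\]
```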

When considering loss due to natural hazards, we must focus on the extreme events that result in the majority of economic loss. Mitigation actions, such as flood control or wildland fuels treatments, therefore need to be designed to reduce loss under the extreme conditions where loss actually occurs; these have been defined in the wildfire hazard literature as the reference conditions (see for example Calkin et al. 2014). Modeling the consequences of extreme events is a daunting challenge, yet it is only one necessary component of modeling the impact of specific mitigation actions designed to reduce losses from such events. In some cases we may be observing a ‘butterfly in China’ effect: what appears to be a relatively insignificant event, such as containing a small spot fire shortly after ignition, could in fact preclude a major fire disaster several days or weeks into the future.

Another critical recognition is that much of the effort in understanding natural disasters is focused on exploring events that resulted in system failure and high loss. We do not typically observe when an activity aimed at reducing the likelihood of a high consequence event succeeds, since it is difficult to confirm that significant loss would have occurred in the absence of the implemented action; this reduces the incentive for a manager to undertake expensive mitigation actions whose value may never be demonstrated (Collins et al. 2013). This brings up another significant challenge in the application of natural hazard models: models that confirm decision makers’ preconceived notions are more likely to be given undue weight, the well-known confirmation bias (Nickerson 1998). Since models are necessarily simplifications of real-world phenomena, and many of the phenomena being examined are highly complex integrated systems, models may be most useful when their results challenge firmly held assumptions; unfortunately, it is human nature to discount information under such conditions.

As system complexity and uncertainty increase, engineering and optimization models require simplifying assumptions and the selection of key parameters that may not be known with any level of accuracy. As such, applying their results to natural hazards is likely to be fraught with danger: are signals and results from the model an accurate portrayal of the system, or cascading errors from incorrect or overly simplistic model components and/or uncertain parameters? Under these conditions, simulation models and expert opinion systems that explore the potential interactions that may lead to system change may be more relevant than complex optimization models. For example, the Wildland Fire Decision Support System (WFDSS) in the US integrates an expert system model that aggregates a range of qualitative factors to define wildfire complexity and a recommended incident team type, simulation models that represent fire spread over single burn periods to multiple weeks, and spatially represented values at risk and potential management costs (Noonan-Wright et al. 2011). The individual model components help identify component risk factors, but the development of the fire management strategy is left to local decision makers in consultation with incident fire management team leadership.
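To convey the general flavor of this kind of expert-system aggregation, the sketch below scores qualitative complexity ratings into a team-type recommendation. The factor names, weights, and thresholds are entirely hypothetical and do not reflect WFDSS's actual rules (Noonan-Wright et al. 2011 describe the real system).

```python
# Hypothetical weighted scoring of qualitative complexity factors; the
# factors, weights, and team-type thresholds are illustrative only.
FACTOR_WEIGHTS = {
    "values_at_risk": 3.0,
    "fire_behavior_potential": 2.5,
    "resource_availability": 1.5,
    "political_sensitivity": 1.0,
}

TEAM_THRESHOLDS = [  # (minimum score, recommended team type)
    (20.0, "Type 1 incident management team"),
    (12.0, "Type 2 incident management team"),
    (0.0, "Type 3 / local organization"),
]

def recommend_team(ratings):
    """Aggregate expert ratings (1-5 per factor) into a team recommendation."""
    score = sum(FACTOR_WEIGHTS[f] * r for f, r in ratings.items())
    for threshold, team in TEAM_THRESHOLDS:
        if score >= threshold:
            return team

# Score = 3*4 + 2.5*3 + 1.5*2 + 1*1 = 23.5 -> Type 1 team recommended.
print(recommend_team({"values_at_risk": 4, "fire_behavior_potential": 3,
                      "resource_availability": 2, "political_sensitivity": 1}))
```

The point of such a system is not prediction but structured aggregation: it forces qualitative judgments into a transparent, repeatable form while leaving the strategy decision with local managers.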

As consequences from natural hazards become ever more severe, it is critical that we improve our ability to consider and model the impacts of a range of risk factors on forest ecosystems. Natural hazard models can be highly valuable in supporting decision making under uncertainty by assisting decision makers in simplifying complex systems to better understand potential outcomes from management actions. However, overreliance on modeled results, particularly as systems transition to states not considered within the model, may exacerbate negative consequences. To achieve improved outcomes in an increasingly hazardous world, it is critical that we understand the appropriate role and type of models that can best inform effective decision making to reduce loss from unexpected environmental and societal shocks.

References

  • Arhonditsis GB, Adams-VanHarn BA, Nielsen L, Stow CA, Reckhow KH (2006) Evaluation of the current state of mechanistic aquatic biogeochemical modeling: citation analysis and future perspectives. Environ Sci Technol 40(21):6547–6554


  • Bennett ND, Croke BFW, Guariso G, Guillaume JHA, Hamilton SH, Jakeman AJ, Marsili-Libelli S, Newham LTH, Norton JP, Perrin C, Pierce SA, Robson B, Seppelt R, Voinov AA, Fath BD, Andreassian V (2013) Characterising performance of environmental models. Environ Model Softw 40:1–20


  • Calkin DE, Finney MA, Ager AA, Thompson MP, Gebert KG (2011) Progress towards and barriers to implementation of a risk framework for Federal wildland fire policy and decision making in the United States. Forest Policy Econ 13(5):378–389


  • Calkin DE, Cohen JD, Finney MA, Thompson MP (2014) How risk management can prevent future wildfire disasters. Proc Natl Acad Sci 111(2):746–751


  • Collins R, de Neufville R, Claro J, Oliveira T, Pacheco A (2013) Forest fire management to avoid unintended consequences: a case study of Portugal using system dynamics. J Environ Manag 130:1–9


  • Finney MA, Grenfell IC, McHugh CW (2009) Modeling containment of large wildfires using generalized linear mixed-model analysis. For Sci 55:249–255


  • Fried JS, Fried BD (1996) Simulating wildfire containment with realistic tactics. For Sci 42(3):267–281


  • Holmes TP, Calkin DE (2013) Econometric analysis of fire suppression production functions for large wildland fires. Int J Wildland Fire 22(13):234–245


  • Jakeman AJ, Letcher RA, Norton JP (2006) Ten iterative steps in development and evaluation of environmental models. Environ Model Softw 21(5):602–614


  • Kahneman D, Klein G (2009) Conditions for intuitive expertise: a failure to disagree. Am Psychol 64(6):515


  • Kelly RA, Jakeman AJ, Barreteau O, Borsuk ME, ElSawah S, Hamilton SH, Henriksen HJ, Kuikka S, Maier HR, Rizzoli AE, van Delden H, Voinov AA (2013) Selecting among five common modelling approaches for integrated environmental assessment and management. Environ Model Softw 47:159–181


  • Nickerson RS (1998) Confirmation bias: a ubiquitous phenomenon in many guises. Rev Gen Psychol 2(2):175–220


  • Noonan-Wright E, Opperman TS, Finney MA, Zimmerman TG, Seli RC, Elenz LM, Calkin DE, Fiedler JR (2011) Developing the U.S. Wildland Fire Decision Support System (WFDSS). J Combust. doi:10.1155/2011/168473

  • Ntaimo L, Gallego-Arrubla JA, Gan J, Stripling C, Young J, Spencer T (2013) A simulation and stochastic integer programming approach to wildfire initial attack planning. For Sci 59(1):105–117


  • Robson BJ (2014) State of the art in modelling of phosphorus in aquatic systems: review, criticisms and commentary. Environ Model Softw. doi:10.1016/j.envsoft.2014.01.012


  • Scott JH, Thompson MP, Calkin DE (2013) A wildfire risk assessment framework for land and resource management. Gen. Tech. Rep. RMRS-GTR-315, U.S. Department of Agriculture, Forest Service, Rocky Mountain Research Station, p 83


  • Venn TJ, Calkin DE (2011) Accommodating non-market values in evaluation of wildfire management in the United States: challenges and opportunities. Int J Wildland Fire 20(3):327–339


  • Wei Y, Bevers M, Belval E, Bird B (2014) A chance-constrained programming model to allocate wildfire initial attack resources for a fire season. For Sci. doi:10.5849/forsci.14-112


Author information


Corresponding author

Correspondence to David E Calkin.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

DC developed the conceptual model and drafted the manuscript. MM contributed broad guidance, manuscript editing and review. Both authors read and approved the final manuscript.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0), which permits use, duplication, adaptation, distribution, and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Calkin, D.E., Mentis, M. Opinion: the use of natural hazard modeling for decision making under uncertainty. For. Ecosyst. 2, 11 (2015). https://doi.org/10.1186/s40663-015-0034-7

