
International Journal of Disaster Risk Reduction

Review Article

Risk interpretation and action: A conceptual framework for responses to natural hazards

J. Richard Eiser *,1, Ann Bostrom 2, Ian Burton 3, David M. Johnston 4, John McClure 5, Douglas Paton 6, Joop van der Pligt 7, Mathew P. White 8

ABSTRACT

Understanding how people interpret risks and choose actions based on their interpretations is vital to any strategy for disaster reduction. We review relevant literature with the aim of developing a conceptual framework to guide future research in this area. We stress that risks in the context of natural hazards always involve interactions between natural (physical) and human (behavioural) factors. Decision-making under conditions of uncertainty is inadequately described by traditional models of 'rational choice'. Instead, attention needs to be paid to how people's interpretations of risks are shaped by their own experience, personal feelings and values, cultural beliefs and interpersonal and societal dynamics. Furthermore, access to information and capacity for self-protection are typically distributed unevenly within populations. Hence trust is a critical moderator of the effectiveness of any policy for risk communication and public engagement.

© 2012 Elsevier Ltd. All rights reserved.

ARTICLE INFO

Article history:

Received 7 February 2012

Received in revised form 30 May 2012

Accepted 31 May 2012

Available online 15 June 2012

Keywords:
Risk
Hazard
Interpretation
Decision
Trust

Contents

1. Aim of the paper
2. Defining risk
3. Defining uncertainty
4. Characterising previous research on risk interpretation and decision-making
5. Individual decision-making under uncertainty: beyond 'rational choice'
6. Heuristics
7. Decisions from experience
8. Learning
9. Trust in others
10. Complexity, scale and social context
11. From risk interpretation to action

12. The way forward
Acknowledgements
References

* Corresponding author. Tel.: +44 1629 640938. E-mail address: j.r.eiser@shef.ac.uk (J. Richard Eiser).
1 University of Sheffield, Psychology, Western Bank, Sheffield S10 2TP, United Kingdom.
2 University of Washington, Daniel J. Evans School of Public Affairs, Seattle, Washington 98195-3055, United States of America.
3 University of Toronto, Geography, Toronto, Canada.
4 GNS Science/Massey University, Lower Hutt 5010, New Zealand.
5 Victoria University of Wellington, Psychology, PO Box 600, Wellington, New Zealand.
6 University of Tasmania, Psychology, Launceston, Tasmania, Australia.
7 University of Amsterdam, Psychology, Amsterdam, The Netherlands.
8 European Centre for Environment and Human Health, University of Exeter Medical School, Knowledge Spa, Truro, TR1 3HD, United Kingdom.

http://dx.doi.org/10.1016/j.ijdrr.2012.05.002


1. Aim of the paper

A recent report [1] highlighted the need for a better understanding of human decision-making in the face of risk as a priority for disaster risk reduction, noting that

"The risk associated with environmental hazards depends not only on physical conditions and events but also on human actions, conditions (vulnerability factors, etc.), decisions and culture... The seriousness of the consequences of any disaster will depend also on how many people choose, or feel they have no choice but, to live and work in areas at higher risk..." (ICSU [1], p. 14).

To address this challenge, we offer a critical overview of research and theory on the relationships between how people interpret risk and the decisions they make as a consequence of such interpretations. We aim to clarify the key concepts and theories concerning the processes underlying interpretation of risk and decision-making under uncertainty so as to make these more accessible to theoreticians and practitioners in the field of natural hazards.

In adopting this focus, we are not attempting to offer a comprehensive review of the large literature on the causes (still less, the consequences) of disasters. As many have argued, disasters cannot be properly understood, or indeed prevented, without attention to the critical role of human agency and societal processes [2]. Disasters occur—and hazards are experienced—within contexts characterised by varying levels of vulnerability and resilience and different kinds of cultural beliefs and world views [3]. Scientific research itself incorporates particular world views and we need to be vigilant of the implicit assumptions in the words with which different concepts and theories are framed.

Indeed, the very term 'natural' is contentious, to the extent that it could imply that disasters are merely the consequence of meteorological or geophysical events beyond human control or responsibility. This is absolutely not our intention. Instead, we use the term more mundanely and pragmatically so as to signal that (as in the ICSU [1] report) we do not attempt to cover the literatures on risk interpretation within the contexts of industrial hazards, pandemics, war and conflict, and such like. There are continuities with these other literatures, but they mostly fall beyond the scope of this paper, even though (as the Fukushima accident demonstrates) 'natural' and industrial hazards may exacerbate one another. Likewise, the issue of climate change is part of the background rather than foreground of this review. How disaster risk is influenced by climate change, and how climate change is influenced by human decisions and activities, are clearly questions of the greatest importance, but we cannot do justice to them in this review.

We see our present exercise, likewise, as complementing, rather than in any way contesting, the large literature on why some communities are more vulnerable, or resilient, than others [4-8]. Vulnerability is a function both of place— where people live and work—and of human activities and social interactions. Physical features of an environment (e.g. extent of seismic activity, susceptibility to storms or floods) are obviously important, but so too are the adaptive or maladaptive responses of individuals and communities to such hazards. The physical environment will itself reflect the impact of human activity, often in ways that increase such vulnerability, for instance with building on flood plains, or the destruction of mangroves for commercial fish farming. And, of course, human activity is a major driver of climate change, which in its turn increases the vulnerability of populations in large parts of the world to 'natural' disasters [7,9]. Thinking of behavioural processes as merely moderating the effects of physical hazards, therefore, is not just inadequate, but misconceived. As with cultural beliefs, vulnerability or resilience constitutes a context within which hazards are experienced. Too often, communities are vulnerable not simply to single hazards in isolation, but to combinations of perils, including disease, personal and financial insecurity, exploitation, violence and displacement. Understanding the historical and political processes that create and maintain such vulnerabilities is absolutely central to an explanation of disasters. But again, it is not the specific purpose of this paper to describe such processes in general. Vulnerability, like culture, is an aspect of the contexts within which hazards are experienced and risks interpreted [3].

Regardless of the sources of vulnerability in natural disasters, the question still remains: how are risks interpreted and acted upon within, and as a function of, such contexts? This is our question here, and to try and answer it, we need to examine a range of concepts, relating more closely to how people (individually and in groups) deal with uncertainty, update their beliefs on the basis of feedback and make choices among alternatives. In doing so, our review will draw especially on research and theory in psychology and decision sciences to illustrate the potential contribution of these disciplines to an understanding of responses to natural hazards.

2. Defining risk

The concept of risk is a central issue for policy in areas as diverse as health, environment, technology, finance and security [10]. How are risk beliefs developed and enacted? How can risks (that reflect relationships with the environment and are culturally, socially and psychologically constructed) be reliably identified? How can they be managed? Under what circumstances should they be accepted or rejected? Most importantly, how are they likely to be interpreted or 'perceived' by different people? It is difficult to give general answers to these questions if we define risk merely in terms of the kinds of events or activities we would commonly call 'risky'. Rather than attempting to define risk in terms of topics, however, we can attempt to do so in terms of processes, and these, we suggest, may turn out to be quite general and applicable across a wide range of topics. Decision sciences traditionally define risk as a function of (a) the likelihood and (b) the value of some possible future event or events. As we shall see, even the terms 'likelihood' and 'value' need to be used cautiously. Likelihood may mean more than statistical probability, and value more than economic benefit and cost. More importantly, however, risk arises not just from how some future can be described, but from the uncertainty, actual or perceived, surrounding that description. Indeed, it is only because we need to act under conditions of uncertainty that the concept of risk is of any interest whatsoever. Living with natural processes that are periodically hazardous means that people have choices to make, even though differences in social and cultural beliefs and characteristics may result in some communities and individuals having many more options open to them than others. These choices can have consequences for themselves and others, and it is because these consequences are uncertain, and may leave us better or worse off, that we talk about 'risk'.

Uncertainty often leads people to depend on others to provide information. These others can be scientists or government agencies, but also fellow community members who share or contest their interests and values [11-14]. Hence, the quality of people's relationships with these others (e.g., the degree to which they identify with them, or trust built up over time) influences how they deal with uncertainty. The processes to be discussed are commonly termed 'risk perception'. However, 'perception' usually implies that there is something 'out there' to be perceived. Among other problems, this leads too readily to looking at such discrepancies as may arise between the views of different stakeholders primarily in terms of whose views are more 'correct', whereas, as already noted, risks are inherently uncertain and no one perspective is likely to have a monopoly on the truth. The far more important question is why different individuals and groups hold the opinions that they do and how they are developed, enacted, sustained, and changed. We therefore prefer to use the phrase 'risk interpretation' to refer, more neutrally, to how we anticipate the outcomes of choices made either by ourselves, or by other decision-makers. Simply stated, interpretation of risk is a special case of the interpretation of uncertain information, and 'risk-taking,' 'preparing' and 'avoidance' are special kinds of actions chosen under conditions of uncertainty. How such information is interpreted and actions are chosen depends, as we shall see, on many factors. First, though, we need to define uncertainty.

3. Defining uncertainty

Risk arises from uncertainty, but how is uncertainty itself to be defined? Is uncertainty merely a state of mind—a reflection of our own incomplete knowledge (i.e. epistemic uncertainty)—or intrinsic to the nature of the very things about which we seek knowledge (i.e. aleatory uncertainty)? Applying this distinction, however, requires that we know the parameters of the distribution we are considering. In many, if not most, real-life situations, this is far from straightforward.

To illustrate this, consider the simple case of tossing a coin. If we toss the same coin a large number of times, the number of heads should approximately equal the number of tails. In this example, a probability is essentially a long-run frequency, representing a frequentist view of probability. However, when it comes to forecasting future events that haven't yet happened (for instance, low-probability high-consequence disasters), there simply is not a distribution of previous events from which to extrapolate, although sciences such as palaeogeography [15] are working on defining relevant distributions. Even when we are dealing with events that happen relatively frequently in particular locations (e.g. coastal or river flooding), a judgement has to be made whether the background conditions (under which previous events have been recorded) have remained stable or have altered, for instance due to climate change. In such circumstances, probabilities cannot strictly be calculated (deductively) but only estimated (inductively). Success and (particularly) failure in forecasting can prompt more systematic methods and observations. But we are still dealing with estimates, that is, interpretations of information, even in the case of 'expert' forecasts. Clearly, this applies with no less force to the judgements made by ordinary citizens on the basis of personal experience, and typically with a less formal understanding of relevant causal processes. In fact, rather than access to mere statistics, it is mainly the understanding of relevant causal processes, and often their incorporation into formal models capable of simulation and/or experimental testing, that both distinguishes expert from less expert interpretations and underpins reliable forecasting (where this is achievable).
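
To make the frequentist idea concrete, here is a minimal sketch in Python; the function name and trial counts are our own illustrative choices, not anything from the original analysis. It shows that frequency-based estimates converge on the true probability only as the sample grows.

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

def estimate_probability(p_true: float, n_trials: int) -> float:
    """Estimate an event's probability from its observed frequency."""
    events = sum(random.random() < p_true for _ in range(n_trials))
    return events / n_trials

# A fair coin: the long-run frequency settles near 0.5 ...
print(estimate_probability(0.5, 1_000_000))  # ~0.5000
# ... but short runs of the same coin scatter widely.
print([estimate_probability(0.5, 10) for _ in range(5)])  # e.g. 0.3, 0.6, ...
```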

As well as uncertainty over the likelihood of an event, there may be uncertainty over the value of the consequences, in part because 'value' means many things. In some contexts (e.g. monetary profit, insurance), it can be a shorthand for the magnitude of any consequences. Even regarding monetary outcomes, however, 'objective' value (e.g. as measured in dollars) is nonlinearly related to subjective feelings of (un)desirability, or 'utility' [16]. Understanding how people interpret risk is difficult partly because of the values they attach to different kinds of outcomes (actual and anticipated). Furthermore, one individual's benefit may be another's cost and the distribution of such consequences (costs and benefits) may be uneven and/or unstable across spatial and temporal domains. Societal impacts are not simply the sum total of impacts at the individual level, whether these relate to physical, psychological, social, or economic well-being.
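
The nonlinearity of 'value' noted above can be shown in a few lines. The square-root form and dollar amounts below are assumptions chosen purely for illustration, not a claim about the function estimated in [16]; any concave function makes the same point.

```python
import math

def utility(dollars: float) -> float:
    """Illustrative concave utility: each extra dollar adds less value."""
    return math.sqrt(dollars)

# Doubling the money does not double its subjective value:
print(utility(100))  # 10.0
print(utility(200))  # ~14.14, well short of 20.0
```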

4. Characterising previous research on risk interpretation and decision-making

The literature on risk interpretation and decision-making is both large and diverse and covers topics ranging from public concerns about threats from natural and industrial sources to changes in industrial development or changed land-use (wind farms, waste storage or incineration, commercial and housing development in rural areas). Similarly, there is a large body of work on the extent to which people feel that their health is endangered by a host of life-style factors, in particular smoking, alcohol, diet and exercise. There are lessons here for the connection between risk interpretation and action, since a major concern is with persuading people to adopt healthier habits. For instance, does telling smokers that cigarettes damage their health lead them to quit smoking? The answer is: sometimes and somewhat, but not always [17]. So if not, why not? How far might similar factors be involved when people fail to take protective action in the face of natural hazards?

On the other hand, much research examines general principles of how individuals formulate preferences and make decisions under conditions of uncertainty. The explicit focus here is more on the potentials and limitations of human rationality. At any rate, that is the claim. But if one is to propose general principles, one had better be sure that they really are general, and not just limited to the specific paradigms—let alone cultures—within which they have been developed. In drawing lessons from the previous literature—whether descriptive or theoretical—to the context of natural hazards and disasters, moreover, we constantly come across problems of scale. Most of the research deals with the reactions of individuals considered singly, facing a single threat or source of uncertainty at a single point of time. A more adequate conceptual framework must move beyond this to account for how individuals influence and are influenced by one another in social and institutional contexts, how multiple hazard events occur and interact with one another, how people affect hazards and hazards affect people, and how all such interrelations evolve dynamically over time.

5. Individual decision-making under uncertainty: beyond 'rational choice'

The 'rational choice' model, developed initially within classical economics, has been the starting point for much research in this area. In simplified form, this approach assumes that decision-makers compare the prospects of alternative actions in terms of two attributes: the benefits or costs of each possible outcome and the probability of each outcome. The product of the benefit (or cost) and probability then defines the 'expected value' (EV) of each outcome, and it is assumed that this, and this alone, determines preference. In other words, a 'rational' decision-maker should always prefer the option(s) with the most positive EV.

This model has been subject to empirical and theoretical challenges over many years. Especially pertinent is the 'standard gamble' paradigm, where participants indicate their preference for either of two options. Typically, one option (A) is presented as a 'sure thing', that is, a certain outcome, for example a guarantee of winning $10. This is then compared with a second option (B), which could be a 1-in-10 chance of winning $100 but a 9-in-10 chance of winning nothing. According to rational choice theory, participants should be indifferent between the two options in this example, since both have the same EV: 1 × $10 = $10 for A; (0.1 × $100) + (0.9 × $0) = $10 for B. In fact, participants fairly reliably tend to prefer option A when choosing between certain and uncertain gains. However, if the problem is stated as a choice between losses, option B tends to be preferred. In Kahneman and Tversky's Prospect Theory [18], this is expressed by saying that individuals tend to be 'risk-averse for gains' but 'risk-seeking for losses'. Risk is identified here with uncertainty, option B being termed 'risky' because it is uncertain and not because it is associated with a more negative EV.
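
This asymmetry can be reproduced with a toy value function in the spirit of Prospect Theory [18]. The curvature and loss-aversion parameters below are commonly cited estimates from Tversky and Kahneman's later work, used here only as illustrative assumptions; the sign of the comparison, not the exact numbers, is the point.

```python
def pt_value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Toy prospect-theory value function: concave for gains,
    convex and steeper (loss-averse) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def expected(outcomes, value=lambda x: x):
    """Probability-weighted sum over (outcome, probability) pairs."""
    return sum(p * value(x) for x, p in outcomes)

sure_gain, risky_gain = [(10, 1.0)], [(100, 0.1), (0, 0.9)]
print(expected(sure_gain), expected(risky_gain))  # 10.0 10.0 - identical EV
# The curved value function favours the sure thing for gains ...
print(expected(sure_gain, pt_value) > expected(risky_gain, pt_value))  # True
# ... and the gamble for the mirror-image choice between losses.
sure_loss, risky_loss = [(-10, 1.0)], [(-100, 0.1), (0, 0.9)]
print(expected(risky_loss, pt_value) > expected(sure_loss, pt_value))  # True
```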

Kahneman and Tversky [18] further stress that 'gain' and 'loss' are not absolute but relative to an implied reference point that represents one's expectations. This is illustrated vividly by the fact that decision-makers' preferences can be changed simply by altering the verbal description of a problem so as to imply a different reference point. Such 'framing' manipulations present the same outcome either as a gain (thereby inducing risk aversion in the sense above) or as a loss. For example, Tversky and Kahneman [19] had participants imagine a choice between two interventions to combat an epidemic where the expected death toll was 600. In one condition, the choice was between (A) an intervention that would save 200 and (B) one that had a 1/3 probability of saving 600, but a 2/3 probability of saving nobody; in this condition 72% preferred A. In another condition, the same dilemma was presented as one between (C) where 400 would die and (D) a 1/3 probability of nobody dying, but a 2/3 probability of 600 dying; in this condition, 78% preferred D. This is because the 'lives saved' frame implies a comparison with 600 deaths, whereas the 'lives lost' frame implies a comparison with 0 deaths.
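
A quick check confirms that all four options in this dilemma are equivalent in expectation; only the implied reference point differs.

```python
# Expected deaths out of 600 at risk, for each option in [19]:
a = 600 - 200                  # A: 200 saved for sure -> 400 deaths
b = (1/3) * 0 + (2/3) * 600    # B: 1/3 chance of saving all 600
c = 400                        # C: 400 die for sure
d = (1/3) * 0 + (2/3) * 600    # D: 1/3 chance nobody dies
print(a, b, c, d)              # 400 400.0 400 400.0 - identical throughout
```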

Research on health communication [20-22] shows that messages can differ in their effectiveness depending on whether they are framed as gains or losses, and whether they focus mainly on detection (of symptoms) or self-protection. There are important questions concerning which message framings are likely to be most effective in the context of natural hazards [23,24], where greater safety demands both detection of any increased threat (through attentiveness to warning signs) and anticipatory and reactive protective measures (e.g. defences to buildings and infrastructure, evacuation procedures).

6. Heuristics

Another important concept emerging from the critique of the 'rational choice' approach is that of cognitive heuristics [25,26]. For instance, the availability heuristic states that events are judged to be more probable if imagining or recalling similar instances from memory is easier. Consequently, people may give disproportionate weight to a few memorable events (for instance if they receive vivid press coverage) without recognising that their memory is selective. Of special relevance to judgement of risk is work by Slovic et al. [27-29] on the affect heuristic. This states that cognitive judgements, including estimates of probability, can be strongly influenced by affective reactions. For example, Dutch citizens who had more positive affective reactions to the risk of flooding expressed lower estimates of the likelihood of future floods and weaker intentions to take protective measures [30].

The affect heuristic also implies that evaluations of future events or prospects are often oversimplified. For an event (or hazard, or policy) with multiple consequences, there is no a priori reason why the evaluation of one consequence (or aspect) should depend on how any other consequence is evaluated. The null hypothesis is that these separate evaluations should be uncorrelated with each other. The evidence suggests otherwise. If an event or prospect is emotionally charged, individuals tend to resist acknowledging that it can have both clear benefits and clear costs. If they like it, they see it as having more benefits and fewer costs. If they dislike it, they see it as having fewer benefits and more costs. The tedious business of estimating the total value of all benefits and costs is avoided by relying on the feelings elicited by viewing the prospect as a single entity. Thus there are good things you feel good about, and bad things you feel bad about, and not much room for doubt in between.

At first reading, research on heuristics offers a somewhat depressing view of people's capacity for rational decision-making. Among other dangers, it can offer a justification for authorities to dismiss any opposition or dissent from ordinary citizens as ill-informed and irrational [31]. However, it is important to look critically at the evidence from which such inferences are derived.

In particular, the 'standard gamble' paradigm does not realistically simulate the kinds of dilemmas to be addressed by policy-makers or citizens when faced by actual hazards. In this paradigm, the probabilities and values of the outcomes to be considered are defined by the researcher. Even the uncertainties are known and defined. In other words, a large part of experimental work on cognitive heuristics presents participants with a description of the decision problem, whereas, outside the laboratory, decision-makers typically need to estimate the likelihood and magnitude of different consequences from their own experience.

Increasingly, research is showing that such decisions from experience often result in very different choices from decisions from description, even when 'descriptions' convey accurately in advance the information that could eventually be gained experientially from complete sampling of the evidence available [32-35]. This applies particularly to assessment of rare events that might not be directly experienced if sampling of potentially available data is incomplete [36]. This is especially relevant to natural disasters and hazardous events with a long return period. However, even with more frequently experienced events such as wildfires, risk assessments can be vulnerable to bias [37].

Closely related is a tendency for people to treat small samples of data as more representative than they really are. Kahneman and Tversky [38] use the phrase 'the law of small numbers' to refer to people's readiness to over-generalise from small sets of data. By definition, low-probability disasters occur infrequently within a given time period (or within a specific geographical area). If one has not personally experienced a disaster, reliance on personal experience may lead to an underestimation of the statistical risk. This can also lead to overconfidence in the effectiveness of safety procedures, the reliability of building and infrastructure, etc., essentially because these have not yet been fully put to the test. The other side of the story, however, is that, if a disaster does occur within the small sample of cases one experiences, one may over-generalise to regard all similar hazards as more dangerous than the statistics would otherwise suggest.
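
A small simulation illustrates both sides of this argument. The 1% annual probability and the 30-year window of direct experience are assumptions chosen for illustration only.

```python
import random

random.seed(2)

P_EVENT = 0.01   # assumed true annual probability of the hazard
YEARS = 30       # one person's window of direct experience

def personal_estimate() -> float:
    """Risk estimate based purely on one person's lifetime of observation."""
    events = sum(random.random() < P_EVENT for _ in range(YEARS))
    return events / YEARS

estimates = [personal_estimate() for _ in range(10_000)]
saw_nothing = sum(e == 0 for e in estimates) / len(estimates)
overshoot = sum(e > P_EVENT for e in estimates) / len(estimates)
print(f"{saw_nothing:.0%} experienced no event and would infer 'no risk';")
print(f"{overshoot:.0%} experienced at least one and overestimate the odds.")
```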

7. Decisions from experience

When interpreting risk and making decisions on the basis of experience, people attend to multiple characteristics of risks, including not only the severity of the threat or magnitude of potential consequences, but also their ability to do something about the risk, uncertainties and ambiguities about the risk, and what they know about the hazards creating the risk in question [39-42]. Throughout, experts differ from non-experts, and experts in one area differ from those with other expertise [43,44]. Just as people differ in their amounts and kinds of expertise, they differ in their personal experience.

To ask how individuals base decisions on experience is effectively to ask how people learn from their observations and the consequences of their decisions. Responding appropriately in the face of risk involves, first and foremost, an ability to discriminate potentially dangerous situations from ones that are more probably safe. A framework for considering the costs and benefits of different decisions derives from a classic theory of visual perception known as Signal Detection Theory (SDT) [45]. The problem this theory describes is the 'discrimination performance' of a perceiver faced with the task of identifying whether stimulus information is evidence of a 'signal' or merely 'noise'. For instance, how does a radar operator distinguish between a blip on a screen due to an approaching aircraft and one due to atmospheric disturbance?

We can illustrate with the case where a decision-maker is faced with (uncertain) information about a possible hazard event. The choice to be made is whether to treat any warning signs as evidence of a real and present danger ('signal') or to conclude that the situation is actually safe ('noise'). This can be represented in terms of a cross-tabulation (see Fig. 1—for the moment just consider the four cells in the first two rows) where one axis represents the hazard level (i.e., danger or safety) and the other axis represents the decision-maker's judgement (i.e., treat as dangerous vs. safe). Each of the resulting cells then has a distinct meaning. Treating a real danger as dangerous constitutes a 'true-positive' or 'hit'; treating a real danger as safe constitutes a 'false-negative' or 'miss'; treating what is actually safe as dangerous is a 'false-positive' or 'false alarm'; and treating a safe situation as safe is a 'true-negative' or 'correct rejection'. SDT describes the performance of decision-makers in terms of two parameters: sensitivity or discrimination ability (the proportion of correct responses, i.e. accuracy) and criterion or response bias (the tendency to respond in one direction, e.g. to say that danger is present, regardless of the facts). This second parameter (criterion) is reflected in the type of errors made and not merely their number.

Actual risk | Decision: treat as dangerous                   | Decision: treat as safe
Danger      | Hit                                            | Miss
Safety      | False alarm                                    | Correct all clear
Learning    | False alarms and hits difficult to distinguish | Misses may be fatal, or consequences sporadic

Fig. 1. Cross-tabulation of decision-outcome combinations, including potential sources of error in learning from feedback following decisions.

Specifically, adoption of a riskier criterion will tend to result in some ambiguous information being incorrectly interpreted as safe (reflected in more false-negatives, or misses), whereas a more cautious (risk-averse) criterion will tend to result in more false-positives (false-alarms), i.e. ambiguous information being incorrectly interpreted as dangerous.
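
For readers who want the standard formalisation, the sketch below computes the usual equal-variance Gaussian estimates of the two SDT parameters; the hit and false-alarm rates are invented for illustration.

```python
from statistics import NormalDist

z = NormalDist().inv_cdf  # inverse of the standard normal CDF

def sdt_parameters(hit_rate: float, false_alarm_rate: float):
    """Equal-variance Gaussian SDT: sensitivity (d') and criterion (c)."""
    d_prime = z(hit_rate) - z(false_alarm_rate)
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))
    return d_prime, criterion

# Two observers with near-identical sensitivity but opposite response biases:
print(sdt_parameters(0.90, 0.30))  # d' ~ 1.81, c < 0: cautious, cries 'danger'
print(sdt_parameters(0.69, 0.10))  # d' ~ 1.78, c > 0: risky, accepts misses
```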

So what determines the choice of a criterion for any decision problem? The first thing to appreciate is that no one criterion is any more correct than any other in an absolute sense. It all comes down to what kinds of errors we are prepared to accept and what kinds we are anxious to avoid. (Ideally, we don't want any errors, but that just amounts to saying that we aspire to a situation where we achieve perfect discrimination, in other words where information is unambiguous. In such an ideal world, we'd have no need to choose a decision criterion since there'd be no uncertainty). Of major importance in the choice of criterion are the anticipated costs and benefits of different decision outcomes. With natural hazards, the costs of a miss (a failure to detect or predict a hazard event) can be catastrophic. By itself, this should push decision-makers in the direction of adopting a cautious criterion (or 'precautionary principle'), where the chance of a miss is reduced at the price of accepting more false alarms. But the costs of false alarms are not necessarily trivial either, especially if they occur repeatedly [46]. They may induce complacency or cynicism among populations at risk if warnings of imminent disasters fail to materialise. Furthermore, preventive measures (e.g. evacuation) may cause disruption to normal life and economic activity. The important lesson here is that there is always a balance to be struck, and it is best if this is made explicit. We live in an uncertain, not an ideal, world.
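
The balance described above can be made explicit as a simple expected-cost rule; the cost figures below are assumed purely for illustration, not a recommendation of specific values.

```python
def call_danger(p_danger: float, cost_miss: float, cost_false_alarm: float) -> bool:
    """Issue a warning whenever its expected cost is lower than staying silent."""
    return p_danger * cost_miss > (1 - p_danger) * cost_false_alarm

# If a miss is catastrophic relative to a false alarm, even a small
# estimated probability of danger justifies a precautionary warning:
print(call_danger(0.02, cost_miss=1_000, cost_false_alarm=10))   # True
# If false alarms are nearly as costly, the same estimate does not:
print(call_danger(0.02, cost_miss=1_000, cost_false_alarm=500))  # False
```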

This then leads to several questions that are at the heart of risk interpretation and action. How should different costs and benefits be valued? Whose costs and benefits should be given most weight? How fair is any distribution of costs and benefits between different parties or stakeholders, between geographically separate regions and between present and future generations? These questions are intensely ethical and not merely empirical, but it is vital that ethical debates are informed by the best available empirical evidence. Next, how well can one anticipate any such costs and benefits? This bears on the more general question of how we learn from experience, and here there is much empirical evidence from which lessons can be drawn. The original formulation of SDT dealt with collections of discriminations between signals and noise, but not with how such discriminations are improved through learning and feedback concerning the outcome of such discriminations.

8. Learning

All learning is dynamic. That is to say, beliefs gained through learning change over time as new information is acquired. Most importantly, such beliefs allow us to predict events. Predictions can be based on observations of events that co-occur (associative learning) and/or observations of the consequences of our own or others' behaviour (instrumental learning). To understand how beliefs change through either type of learning, we need to consider what happens when our predictions appear to be confirmed, or not, by feedback from experience. Fairly obviously, beliefs are strengthened by apparently confirmatory feedback and weakened by apparently contradictory feedback. But why the qualification 'apparently'? For at least three reasons: first, because the evidence itself may be uncertain and incomplete; second, because individuals appear biased towards interpreting ambiguous information as consistent with their prior beliefs [47-49] as well as maintaining closer social relationships with others who share their views [50,51]; and third, because decisions cannot be postponed indefinitely, even if evidence is incomplete.
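
As a rational baseline for such belief revision, Bayes' rule shows how a single piece of feedback should strengthen or weaken a belief; the probabilities below are illustrative, and the biases just listed describe how real updating departs from this baseline.

```python
def update(prior: float, p_sign_if_danger: float, p_sign_if_safe: float) -> float:
    """Bayes' rule: revise the probability of 'danger' after one observation."""
    joint_danger = prior * p_sign_if_danger
    joint_safe = (1 - prior) * p_sign_if_safe
    return joint_danger / (joint_danger + joint_safe)

belief = 0.5
belief = update(belief, 0.6, 0.2)   # a warning sign 3x likelier under danger
print(round(belief, 2))             # 0.75 - belief strengthened
belief = update(belief, 0.2, 0.6)   # an 'all clear' likelier under safety
print(round(belief, 2))             # 0.50 - belief weakened again
```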

If a decision is followed by a good outcome, this makes the decision-maker more confident that the decision was correct. This in turn increases the probability of making the same decision in similar circumstances in the future, and if the outcome is still favourable, the decision-maker becomes even more convinced. However, there is a major constraint on such 'rationality'. Such choices not only reflect previous learning but shape future learning by constraining the kinds of feedback the decision-maker receives. Put differently, people only sample from a limited part of the 'problem space' (of decision-outcome contingencies) and so fail to learn whether different choices might have produced other, possibly better, outcomes [52].
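
A toy simulation in the spirit of this sampling problem [52], with payoffs and decision rule entirely our own illustrative assumptions: an agent who stops sampling an option after early bad outcomes can never discover that its true average value is positive.

```python
import random

random.seed(3)

def risky_payoff() -> float:
    """An option with a positive average payoff (+0.5) but frequent losses."""
    return 2.0 if random.random() < 0.5 else -1.0

def final_estimate(trials: int = 100) -> float:
    estimate, n = 1.0, 0      # mildly optimistic prior about the risky option
    for _ in range(trials):
        if estimate > 0.0:    # the agent only samples options it currently likes
            n += 1
            estimate += (risky_payoff() - estimate) / n  # running mean
    return estimate           # frozen forever once it turns non-positive

finals = [final_estimate() for _ in range(10_000)]
stuck = sum(f <= 0 for f in finals) / len(finals)
print(f"{stuck:.0%} of agents end up shunning an option worth +0.5 on average")
```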

The original formulation of SDT focused on problems of uncertain rather than incomplete information. Nonetheless, we can extend this framework to consider how the choice of criterion constrains the decision-maker's opportunity to learn from experience. A third row ('Learning') has been added to the standard 2 x 2 matrix of decision-outcome combinations in Fig. 1 to illustrate some of the difficulties in learning from the outcomes of one's decisions. First of all, let's suppose someone adopts a cautious or risk-averse criterion. This should lead to fewer instances of damage or disaster due to inadequate protective measures. The price of this is a greater number of false alarms. But how reliably can false alarms be distinguished from hits? Research has shown that people are quick to learn to avoid situations in which they have been hurt or frightened in the past and that such learnt avoidance behaviours (e.g. certain phobias) can be very persistent. Not only is avoidance rewarded by feelings of relief from stress, it also means that one's fears remain untested by seeing if the danger is real. Put differently, one cannot tell the difference between a hit and a false alarm unless one can tell what would have happened in the absence of such protective action.

In the above example, the protective measure chosen is effective in avoiding an actual threat, but is over-used where no threat exists. Not all avoidance behaviours, however, rely on firm evidence of effectiveness. For example, many people engage in superstitious rituals, adopt fad diets, or undergo unnecessary medical procedures, as ways of warding off personal misfortune or diseases which either wouldn't have happened anyway, or from which they would have spontaneously recovered. If nothing bad then happens, people (and in a medical context, not just patients, but doctors too [53]) believe such actions to be effective forms of protection. Not only does this mean that such actions are reinforced (and hence repeated), it means that people may ignore real signs of danger or recommendations for more effective forms of protection.

Now let's consider situations where individuals adopt a risky criterion, that is, treat some threats as less dangerous than they really are. This can arise in many contexts where instances of unsafe behaviour are not immediately or inevitably followed by harm. In other words, people often get away with behaving dangerously. Not all cases of dangerous or even drunk driving lead to accidents, and arriving unharmed at the end of such a drive may inflate false optimism about one's driving ability, or indeed capacity for alcohol. (Of course, such luck can run out, but then if the consequences for the individual concerned are fatal, learning stops anyway). Many dangerous health behaviours are linked only probabilistically to actual diseases (even cigarette smoking and lung cancer) and, importantly, such effects can be delayed for many years. In the meantime, such behaviours typically provide much more immediate gratification. Such cases are seen as examples of partial or delayed reinforcement. The point in each case is that individual experience often provides uncertain evidence of the actual level of danger. This can also take the form of a discounting of warnings where such advice can only be probabilistic. An example is hurricane warnings, where the strength or trajectory of the hurricane turns out to be less damaging than originally forecast. In addition, at least in Western cultures, many people show an optimistic bias whereby they judge their own risk from hazards to be lower than that of others [54]. This bias is likely to be aggravated with rare events for which people lack personal experience [55].

Natural hazards vary enormously in the 'reinforcement schedules' they impose on safer and less safe forms of individual behaviour and policy. Most geophysical hazards (less so for tsunamis than for earthquakes and volcanic eruptions) are reasonably predictable in terms of where they are most likely to happen (although prediction of impacts, not to mention their intensity, duration, etc., at a very local level is more difficult and/or contentious). However, predicting when they will occur is far more challenging, particularly when these are rare occurrences. By contrast, weather-related hazards may happen more frequently, and be associated with particular regions or seasons, but the intensity of any impacts may be less predictable, not least with the effects of climate change. Anecdotal evidence of novel weather patterns, animal behaviour and growth of crops and flora is increasingly being supported by more systematic analysis [56]. In all such cases, individuals may use their own personal experience alongside available scientific evidence when deciding on their response (if any) to hazards in their environment. However, it is unclear that people recognise the extent to which their decisions are based on information that is not merely incomplete, but potentially biased by their own previous decisions in the direction of confirming their prior judgements.

9. Trust in others

The question of how risk interpretation and action is influenced by our trust in others is of central importance in the field of natural hazards. Ordinary citizens may need to rely on scientific experts to provide information concerning the severity and/or urgency of any threat, and on risk managers to take many of the decisions required to protect them from such threats. In turn, hazard managers must trust that citizens will adopt recommended mitigation practices and react to events in predictable and socially acceptable ways. But who is trusted by whom and when, and how does acceptance of risk messages depend on recipients' prior belief systems?

In terms of who is trusted, research suggests that we need to distinguish between trust in others' knowledge or expertise on the one hand, and their motivation, honesty and integrity on the other hand. This closely parallels the SDT distinction between discrimination ability and criterion setting [57]. We want other people on whom we need to rely both to know what they are doing and to use their knowledge in a way that does not compromise our safety and well-being for some inappropriate motive, such as personal profit. When we consult doctors for medical treatment, we are putting our trust not only in their expertise, but in their integrity to prescribe the treatment that is most appropriate for us and not one that is most profitable to them personally, e.g. in terms of insurance payments. When we take a flight, we are putting our trust in a whole range of professionals and technological systems. Most of the time, we follow a routine without thinking deeply about how others make decisions on our behalf, but when an event disrupts this routine, as when volcanic ash leads to the lengthy closure of air space, and regulators need to decide if it is safe enough, or too dangerous, to fly, the balancing of benefits and costs to all parties becomes exposed.

There are individual and cultural differences in people's beliefs in their own, and others', ability to avoid or control risks. Individual differences in personality and cognitive style may be reflected in people's confidence in their ability as decision-makers, their willingness to make a decision at all as opposed to procrastinating or avoiding responsibility, and how much they prefer 'closure' to continuing uncertainty [58,59]. That is, some people may be particularly likely to turn to others to help resolve their uncertainty. Understanding who these individuals and groups are may help in designing more effective risk messages.

In terms of culture, research suggests that some cultures are more fatalistic than others when it comes to natural hazards [60]. Fatalistic attitudes may be underpinned by religious or spiritual beliefs of various kinds. The influence of spiritual or religious fatalism may also depend on the phase of disaster being examined. While it may undermine decisions to prepare, it may facilitate recovery. For example, the cultural fatalism within Buddhist beliefs may have assisted recovery in Thai populations affected by the 2004 Indian Ocean tsunami [61]. Likewise, Confucianism promotes long-term thinking, perseverance and the importance of preparing for future adversity, and these cultural qualities may translate an implicit fatalism into a degree of preparedness. The relationship between human beings and nature may also be viewed differently within different cultures. A classic study by Kluckhohn and Strodtbeck [62] found that cultural groups in western New Mexico differed in their preferences for value orientations described as 'Man subject to Nature', 'Man with Nature' and 'Man over Nature'. Hence, cross-cultural comparison should be based on comparing across cultural dimensions rather than countries [63], and work on cross-cultural equivalence should be complemented with research into culture-specific mechanisms such as Jishubo in Japan and the Hakka Spirit in Taiwan. The latter, in particular, demonstrates how a culturally implicit belief in learning to co-exist with nature influences risk beliefs and resilience in the event of experiencing a hazard event. Similarly, cultural beliefs about physical hazard characteristics, such as native Hawai'ian beliefs regarding the relationship between Pele and lava flows, can increase risk by limiting choice of mitigation [63]. However, acknowledging that risk interpretation can reflect cultural context does not require us to treat culture as cause and risk interpretation as effect. It is equally plausible to regard culture as partly characterised by how people interpret and respond to risks of various kinds [64].

As regards the content of risk messages, we recognise that for hazards, such as earthquakes, that cannot be prevented and can be predicted only with difficulty, it is understandable that many people will adopt a fatalistic attitude that such events are beyond their control [65]. A policy challenge here is that of convincing people that, nonetheless, there is much they can do that is within their control, such as securing furniture and fitments, preparing stores and planning what to do if forced to evacuate. Research showing which kinds of messages reduce people's fatalism about natural hazards, i.e. which messages are trusted, can be applied to this challenge [66,67].

In addition to the content and focus of communications, research also highlights ways to improve trust through the processes of communication, i.e. not just what but how. One construct that has been used to illuminate how trust develops is empowerment [13,61]. Thus, certain community characteristics (e.g., levels of active participation) and competencies (e.g., collective efficacy) may empower people to identify and represent their hazard management needs. Whether these needs are then met may depend on interactions with agencies that are responsive to community needs (i.e., that create empowering settings). These interactions influence trust beliefs, with trust mediating the relationship between such engagement and people's intentions and preparation.

Finally, social processes, beyond the source-recipient relation, are also strongly implicated in the way risk messages come to be interpreted, and reinterpreted, over time. Kasperson [68] uses the terms 'social amplification' and 'social attenuation' to describe how different kinds of risk are picked up (or not) by the media and other agents and attract greater or lesser attention. In emergencies, people do not simply attend individually to information about what to do, but often try to evaluate it collectively through comparing their interpretations with those of others—a phenomenon known as 'social milling' [69-71]. People will converse with each other about the significance of any risk message, so that what emerges from such 'social milling' is a composite of people's individual interpretations. This process may be greatly accelerated and widened through the internet and use of information communications technology.

Johnson [72] has argued that much of the previous research on public trust in risk/disaster managers has focused on perceptions of the different stakeholders separately, comparing them to see who is more or less trusted. However, in reality, he argues, individuals have to place their trust in whole systems of risk/disaster management that depend on the interplay of these different agents. Moreover, the public may trust the various actors in some respects but not others; e.g. they may trust scientists to assess the risks accurately but not necessarily to be most concerned about the economic impacts [73]. A 'mental models' approach [40] can improve our understanding of how the public conceptualises not just the hazard but the hazard management system, and how trust and distrust can emerge from these perceptions of competing interests and perspectives. Also pertinent is research on factors that can lead to a gain or loss in trust. There is a good deal of truth in the adage that trust is easy to lose and difficult to rebuild [13,57]. A dilemma faced by many risk managers is how far to go in admitting one's mistakes. Such admissions can help bolster one's perceived honesty, but weaken one's perceived competence. However, failing to admit a mistake and then being discovered in the deception is the worst combination.

10. Complexity, scale and social context

As noted, applying lessons from previous research on risk and decision-making to the context of natural hazards is especially challenging because of the increase in scale and complexity, and the fact that we need to consider the interactions between the decisions made by several actors rather than those of individuals considered singly. Different actors may have different levels of access to relevant (e.g. scientific) information, but this is only part of the story. They also have separate, sometimes opposing, interests, and may evaluate different outcomes very differently. The aspects of risks (physical, economic, political) borne by different actors often differ, and many decisions may have the effect (if not the intention) of transferring risk onto other groups, whether those with less economic or political power, those living further away (e.g. in a different country or jurisdiction) or future generations. Such risk transference may be motivated by cynical self-interest, but may also arise less deliberately from misinterpretation of risk and failure to take account of the knock-on effects of inappropriate attempts to mitigate vulnerability to particular hazards [74]. This clearly raises important ethical issues.

Scaling up from the individual to the societal level requires more than a consideration of social collectives and communities as single entities. Communities are not simply groups of individuals who happen to be categorised together, but groups of individuals who interact and communicate with one another, thus creating social systems. Such interactions contribute directly to hazard mitigation and community resilience [75-78].

11. From risk interpretation to action

Assessing risk is one thing, acting on the basis of such assessments is another. A pervasive misconception is that ordinary citizens typically fail to protect themselves from hazards because they are ignorant of 'the facts', irrational in how they interpret information, or both. Citizens may not always respond (as authorities hope) to risk warnings, not because they are 'irrational', but because they feel severely constrained in terms of the options open to them (as when evacuation in the face of a less-than-certain hazard will result in a loss of livelihood and means of supporting one's family). These constraints must be understood and anticipated in any plans for disaster prevention and risk mitigation.

When conceptualising preparedness, it is important not to see it as an all-or-none process. Some people decide not to prepare [79]. Others may be interested but need more guidance. These starting points are different, informed by different interpretive and decision processes, and interventions must acknowledge this. At the other end of the preparedness spectrum are those who have already acted; sustaining their preparedness may require engaging with them in different ways.

Other things being equal, we would expect people to choose actions that enhance or protect their health and well-being and avoid actions that put themselves and their families at risk. So why is there a gap between risk interpretation and action? For a start, other things very often are not equal. Knowing the risk is not the same as knowing how to respond to it, or being able to do so. The same activities can have the potential for both enhanced well-being and harm. Profitable activities can be relatively dangerous, as with farming on fertile slopes of volcanoes or flood-prone river valleys. Choices which optimise benefit while minimising risk may simply not be available, or affordable, for people in many real-life situations.

There is another vital distinction that is often overlooked. Beliefs or expectations concerning hazards (e.g. a hurricane or earthquake) differ from attitudes towards acts to be undertaken in the face of such hazards (e.g. evacuation, or making one's home more secure). A lot of this boils down to whether people think such acts will be effective and/or within their own control anyway. Research on individual health behaviour (e.g. smoking, alcohol use, dietary behaviour) contains several examples where unhealthy habits are supported by a whole set of pessimistic self-beliefs, based on personal experience, that changing one's habits is very difficult and trying to do so is likely to end in failure [80]. Research on natural hazards has found that people's expectations about the efficacy of preparedness measures influence actions [13].

Again, lessons can be drawn from the failures and successes of policies to influence health behaviour, such as cigarette smoking. One needs to start with evidence-based messages that cigarette smoking (say) is dangerous, but something else is needed, since many smokers still continue, and many young people take up the habit, despite 'knowing' the risks, at least in general terms. In many countries, this 'something else' now involves: (a) at the level of the individual, a finer-grain analysis of the reinforcement processes underpinning physical and psychological dependence; and (b) at the level of the society, re-engineering the environment to make the healthier option many people's default choice. However, if social support is lacking, or even acts against behaviour change (as in many groups and cultures where unhealthy behaviour is the norm [81]), policy initiatives by more remote government authorities or even health professionals are less likely to succeed.

How analogous are—or could be—policy initiatives for disaster reduction in the context of natural hazards? At the more individual level, there are opportunities for interventions to make preventive or protective measures more accessible and affordable. This could include micro-insurance schemes for farmers in developing countries. In more developed countries, grants and other fiscal instruments can be used to incentivise more sustainable behaviours such as lower energy consumption. At the societal or more macro level, there are measures that need to be put in place for which governments must carry the primary responsibility. These include: setting up effective facilities to monitor natural hazards and forecast hazard events; defining and enforcing regulations to prevent unsafe land-use, building practices and industrial activities that compromise environmental safety; protecting vital infrastructure and planning for emergencies, e.g. in relation to evacuation and relief provision. In many such cases, what needs to be done is broadly already known—what are needed, more typically, are the economic resources and the political will to confront special interest groups that may be more powerful than governments. International cooperation has an important part to play here, not only through the pooling of scientific knowledge and sharing of resources, but also through providing examples of best practice and even moral pressure from other states that can also be put at risk through poor practice by their neighbours. There are, however, many cases where the best that can be hoped for is a mitigation of risks that have developed historically, especially through population movements and the growth of cities in vulnerable locations.

Once again, though, the involvement and support of local communities are vital. Conversely, when national governments or international agencies are mistrusted by local communities, or vice versa, there will be huge difficulties in putting policies into practice.

12. The way forward

Ultimately, research on natural hazards and disasters, whether from a physical or behavioural science perspective, aims to offer knowledge that might help prevent death, damage and distress in some measure. This is, ideally, knowledge for the common good as well as for its own sake. Yet this ideal must be tempered by awareness that the translation of research into practice is frequently beset by obstacles of many kinds. Access to such knowledge is limited, and those who have it may neither share it nor use it for the common good.

Our focus in this paper has been on the lessons that can be drawn from previous research about how best to conceptualise the ways in which people interpret risks and choose actions based on such interpretations within the context of natural hazards. This context is defined not just by vulnerability to some physical event, but also by social relationships; indeed, vulnerability itself is partly a function of such relationships. The judgements and choices underlying risk interpretation and action, then, are not merely personal but also interpersonal. However, this is still a work in progress. The literature, while varied and extensive, is not yet well integrated. More research on social and collaboration networks within the field of natural hazards could promote better integration [82].

There is increasing acknowledgement of the role of human behaviour in determining whether hazard events develop into disasters. However, this acknowledgement is rarely accompanied by more than a superficial analysis of the factors that shape human behaviour, or of the observable differences between individuals and social groups in their feelings, cognitions and actions. This imbalance in research activity, funding, and proposals for research on disasters and risk decisions is exemplified by the recently released NRC report [83] on U.S. national earthquake resilience. By and large, integrated risk assessments are lacking, and where they do exist, integration with the social and behavioural sciences is weak.

The quality of data available to support more integrated risk assessments is also uneven. Data quality issues can stem from a lack of monitoring technologies, insufficient funding, or the suppression or delayed release of data (e.g., of disease outbreak information by governments). There have been major advances in earth observation and Geographical Information Systems (GIS). Advances in GIS have been hailed as offering potential means of forecasting a range of natural disasters, including landslides [84]. However, although considerable progress has been made, the diffusion of this technology is still hampered by factors such as problems in acquiring appropriate data, the complexity of predictive models, and a preference for data that can be acquired at low cost over data that are more relevant and predictive. Even in the (relatively successful) context of tsunami warning systems, it is only recently that much attention has been paid to what makes warnings effective, or to the social milling [69-71] that happens in disasters [85]. Some attempts have been made to develop a more holistic model for effective warning systems, such as that for volcanoes in New Zealand [86].

At the same time, much research in the social and behavioural sciences, such as some of that reviewed in this paper, has been conducted in rather abstract contexts. This means that vigilance is needed when extending the conclusions of such studies to disaster research. While many of the theoretical principles identified by such work may have a high degree of generality, it is important to look critically at the paradigms employed, since these may fail to incorporate factors that are crucial to much real-life decision-making. We have, for example, highlighted the fact that many laboratory experiments on decisions under uncertainty deal neither with the kind of uncertainty that arises from limited access to statistical information, nor with the updating of estimates based on feedback from experience. Moreover, any method that tries to isolate the effect of 'independent variables' while controlling for others has a major limitation: it is often ill suited to the study of dynamic interactions within complex systems over time and space. Natural hazards and disasters are prime examples of complex dynamical systems. From the standpoint of risk interpretation and action, the primary interaction of interest is that between human actors and the natural hazard. Take away the hazard, and the people are safe, at least from the specific peril in question. Take away the people, and we are left merely with a geophysical or meteorological event, not a disaster. But there are, of course, very many other interactions over all kinds of scales. The hazard events themselves have complex dynamics and interact with other events. Likewise, the human actors are not isolated decision-makers making one-off choices independently of everyone else. How individuals and communities interact with one another and shape their physical and social environments greatly influences whether vulnerability and risk are exacerbated, mitigated or transferred onto others separated by time and/or place. Natural hazards and disasters highlight, often graphically, our social dependence on one another. Yet we have to start somewhere, to find some thread to pull that may disentangle part of the knot. One such thread is how we, as individuals and as members of social networks, interpret risk and act upon our interpretations.
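The point about updating estimates from experiential feedback can be made concrete with a minimal simulation. The following sketch, written in Python with parameter values chosen purely for illustration (they are ours, not drawn from the cited studies), shows how agents who estimate the probability of a rare hazard event solely from their own limited experience tend, individually, to underweight it, in the spirit of the 'decisions from experience' findings discussed earlier [32,33]:

import random

# A population of agents, each estimating the probability of a rare
# hazard event from a limited run of personal experience.
TRUE_P = 0.01      # assumed per-period probability of the event
N_PERIODS = 50     # length of each agent's experience (illustrative)
N_AGENTS = 10_000  # size of the simulated population

random.seed(1)  # for reproducibility

estimates = []
never_experienced = 0
for _ in range(N_AGENTS):
    observed = sum(random.random() < TRUE_P for _ in range(N_PERIODS))
    estimates.append(observed / N_PERIODS)  # naive frequency estimate
    if observed == 0:
        never_experienced += 1

print(f"True probability:            {TRUE_P}")
print(f"Mean estimate across agents: {sum(estimates) / N_AGENTS:.4f}")
print(f"Agents who never experienced the event: "
      f"{never_experienced / N_AGENTS:.1%}")

With these assumed values, a proportion of roughly \(0.99^{50} \approx 0.61\) of agents never experience the event at all, so their experience-based estimate is zero; the population mean remains close to the true value, but the typical individual, relying only on personal feedback, behaves as if the rare hazard did not exist.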

Acknowledgements

An earlier version of this paper was prepared as part of the programme Integrated Research on Disaster Risk (www.irdrinternational.org), sponsored by the International Council for Science (ICSU), the International Social Science Council (ISSC), and the United Nations International Strategy for Disaster Reduction (UNISDR).

References

[1] ICSU. A science plan for integrated research on disaster risk: addressing the challenge of natural and human-induced environmental hazards. Paris: ICSU; 2008.

[2] Mileti D, editor. Disasters by design: a reassessment of natural hazards in the United States. Washington, DC: Joseph Henry Press; 1999.

[3] Oliver-Smith A, Hoffmann SM, editors. The angry Earth: disaster in anthropological perspective. New York: Routledge; 1999.

[4] Birkmann J, editor. Measuring vulnerability to natural hazards: towards disaster resilient societies. Tokyo: United Nations University Press; 2006.

[5] Burton I, Kates RW, White GF. The environment as hazard. 2nd ed. New York: Guilford Press; 1993.

[6] Hewitt K. Regions of risk: a geographical introduction to disasters. Harlow: Longman; 1997.

[7] Lewis J. The vulnerability of small island states to sea level rise: the need for holistic strategies. Disasters 1990;14:241-9.

[8] Lewis J, Kelman I. Places, people and perpetuity: community capacities in ecologies of catastrophe. ACME: An International E-Journal for Critical Geographies 2010;9:193-220.

[9] Burton I. Vulnerability and adaptive responses in the context of climate and climate change. Climatic Change 1997;36:185-96.

[10] Johnson E, Tversky A. Representations of perceptions of risks. Journal of Experimental Psychology: General 1984;113:55-70.

[11] Earle TC. Thinking aloud about trust: a protocol analysis of trust in risk management. Risk Analysis 2004;24:169-83.

[12] Lion R, Meertens RM, Bot I. Priorities in information desire about unknown risks. Risk Analysis 2002;22:765-76.

[13] Paton D. Risk communication and natural hazard mitigation: how trust influences its effectiveness. International Journal of Global Environmental Issues 2008;8:2-16.

[14] Poortinga W, Pidgeon NF. Trust, the asymmetry principle, and the role of prior beliefs. Risk Analysis 2004;24:1475-86.

[15] Atwater BF, Moore AL. A tsunami about 1000 years ago in Puget Sound: Holocene sedimentation and paleogeography. Washington Geology 1992:223-6.

[16] Bernoulli D. Exposition of a new theory on the measurement of risk. Econometrica 1954;22:23-36 (Translated by L. Sommer) (Reprinted from Commentarii Academiae Scientiarum Imperialis Petropolitanae, 5, 1738, 175-192).

[17] Eiser JR, Reicher SD, Podpadec TJ. Smokers' and non-smokers' estimates of their personal risk of cancer and of the incremental risk attributable to cigarette smoking. Addiction Research 1995;3: 221-9.

[18] Kahneman D, Tversky A. Prospect theory: an analysis of decision under risk. Econometrica 1979;47:263-91.

[19] Tversky A, Kahneman D. The framing of decisions and the psychology of choice. Science 1981;211:453-8.

[20] Detweiler JB, Bedell BT, Salovey P, Pronin E, Rothman AJ. Message framing and sunscreen use: gain-framed messages motivate beach-goers. Health Psychology 1999;18:189-96.

[21] Rothman AJ, Salovey P. Shaping perceptions to motivate healthy behavior: the role of message framing. Psychological Bulletin 1997;121:3-19.

[22] Rothman AJ, Martino SC, Bedell BT, Detweiler JB, Salovey P. The systematic influence of gain- and loss-framed messages on interest in and use of different types of health behavior. Personality and Social Psychology Bulletin 1999;25:1355-69.

[23] McClure J, Sibley CG. Framing effects on disaster preparation: is negative framing more effective? Australasian Journal of Disaster and Trauma Studies 2011.

[24] McClure J, White J, Sibley CG. Framing effects on preparation intentions: distinguishing actions and outcomes. Disaster Prevention and Management 2009;18:187-99.

[25] Kahneman D, Tversky A. On the psychology of prediction. Psychological Review 1973;80:237-51.

[26] Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science 1974;185:1124-31.

[27] Finucane ML, Alhakami AS, Slovic P, Johnson SM. The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making 2000;13:1-17.

[28] Slovic P, Finucane ML, Peters E, MacGregor DG. Risk as analysis and risk as feelings: some thoughts about affect, reason, risk, and rationality. Risk Analysis 2004;24:311-22.

[29] Slovic P, Finucane ML, Peters E, MacGregor DG. The affect heuristic. In: Gilovich T, Griffin D, Kahneman D, editors. Heuristics and Biases: The Psychology of Intuitive Judgment. New York: Cambridge University Press; 2002. p. 397-420.

[30] Terpstra T. Emotions, trust and perceived risk: affective and cognitive routes to flood preparedness behaviour. Risk Analysis 2011;31:1658-75.

[31] Eiser JR, van der Pligt J. Beliefs and values in the nuclear debate. Journal of Applied Social Psychology 1979;9:524-36.

[32] Barron G, Erev I. Small feedback-based decisions and their limited correspondence to description-based decisions. Journal of Behavioral Decision Making 2003;16:215-33.

[33] Hertwig R, Barron G, Weber EU, Erev I. Decisions from experience and the effect of rare events in risky choice. Psychological Science 2004;15:534-9.

[34] Weber EU. Experience-based and description-based perceptions of long-term risk: why global warming does not scare us (yet). Climatic Change 2006;77:103-20.

[35] Weber EU, Lindemann PG. From intuition to analysis: making decisions with our head, our heart, or by the book. In: Plessner H, Betsch C, Betsch T, editors. Intuition in Judgment and Decision Making. Mahwah, NJ: Lawrence Erlbaum; 2007. p. 191-208.

[36] Rakow T, Newell BR. Degrees of uncertainty: an overview and framework for future research on experience-based choice. Journal of Behavioral Decision Making 2010;23:1-14.

[37] McCaffrey S. Thinking of wildfire as a natural hazard. Society and Natural Resources 2004;17:509-16.

[38] Tversky A, Kahneman D. Belief in the law of small numbers. Psychological Bulletin 1971;76:105-10.

[39] Fischhoff B. Risk perception and communication. In: Detels R, Beaglehole R, Lansang MA, Gulliford M, editors. Oxford Textbook of Public Health. 5th ed. Oxford: Oxford University Press; 2009. p. 940-52.

[40] Morgan MG, Fischhoff B, Bostrom A, Atman CJ. Risk communication: a mental models approach. Cambridge: Cambridge University Press; 2002.

[41] Morgan MG, Fischhoff B, Bostrom A, Lave L, Atman CJ. Communicating risk to the public. Environmental Science & Technology 1992;26:2048-56.

[42] Witte K. Fear control and danger control: a test of the extended parallel process model. Communication Monographs 1994;61:113-34.

[43] Barke R, Jenkins-Smith H. Politics and scientific expertise: scientists, risk perception, and nuclear waste policy. Risk Analysis 1993;13:425-39.

[44] Bostrom A. Risk perceptions: experts vs. lay people. Duke Environmental Law & Policy Forum 1997;8:101-13.

[45] Swets JA. The receiver operating characteristic in psychology. Science 1973;182:990-1000.

[46] Simmons KM, Sutter D. False alarms, tornado warnings, and tornado casualties. Weather, Climate, and Society 2009;1:38-53.

[47] Darley JM, Fazio RH. Expectancy confirmation processes arising in the social interaction sequence. American Psychologist 1980;35:867-81.

[48] Russo JE, Carlson KA, Meloy MG, Yong K. The goal of consistency as a cause of information distortion. Journal of Experimental Psychology: General 2008;137:456-70.

[49] Russo JE, Medvec VH, Meloy MG. The distortion of information during decisions. Organizational Behavior and Human Decision Processes 1996;66:102-10.

[50] Newcomb TM. The acquaintance process. New York: Holt, Rinehart & Winston; 1961.

[51] Newcomb TM. Heiderian balance as a group phenomenon. Journal of Personality and Social Psychology 1981;40:862-7.

[52] Fazio RH, Eiser JR, Shook NJ. Attitude formation through exploration: valence asymmetries. Journal of Personality and Social Psychology 2004;87:293-311.

[53] Gigerenzer G. Reckoning with risk: learning to live with uncertainty. London: Penguin; 2002.

[54] Weinstein ND. Unrealistic optimism about future life events. Journal of Personality and Social Psychology 1980;39:806-20.

[55] Spittal MJ, McClure J, Siegert RJ, Walkey FH. Optimistic bias in relation to preparedness for earthquakes. Australasian Journal of Disaster and Trauma Studies 2005:1-10.

[56] Miles EL, Snover AK, Hamlet AF, Callahan B, Fluharty DL. Pacific Northwest Regional Assessment: the impacts of climate variability and climate change on the water resources of the Columbia River Basin. Journal of the American Water Resources Association 2000;36:399-420.

[57] White MP, Eiser JR. Marginal trust in decision makers: building and losing trust following decisions under uncertainty. Risk Analysis 2006;26:1187-203.

[58] Janis IL, Mann L. Decision making: a psychological analysis of conflict, choice, and commitment. New York: Free Press; 1977.

[59] Webster DM, Kruglanski AW. Individual differences in need for cognitive closure. Journal of Personality and Social Psychology 1994;67:1049-62.

[60] Becker J, Johnston D, Lazrus H, Crawford G, Nelson D. Use of traditional knowledge in emergency management for tsunami hazard: a case study from Washington State, USA. Disaster Prevention and Management 2008;17:488-502.

[61] Paton D, Tang DCS. Adaptive and growth outcomes following tsunami: the experience of Thai communities following the 2004 Indian Ocean tsunami. In: Askew Edward S, Bromley James P, editors. Atlantic and Indian Oceans: New Oceanographic Research. New York: Nova Science Publishers; 2009. p. 125-40.

[62] Kluckhohn FR, Strodtbeck FL. Variations in value orientations. Evanston, IL: Row, Peterson; 1961.

[63] Paton D, Sagala S, Okado N, Jang L, Burgelt PT, Gregg CE. Making sense of natural hazard mitigation: personal, social and cultural influences. Environmental Hazards 2010;9:183-96.

[64] Sunstein CR. Misfearing: a reply. Harvard Law Review 2006;119: 1110-25.

[65] McClure J, Walkey FH, Allen MW. When earthquake damage is seen as preventable: attributions, locus of control and attitudes to risk. Applied Psychology 1999;48:239-56.

[66] McClure J, Allen MW, Walkey FH. Countering fatalism: causal information in news reports affects judgements about earthquake damage. Basic and Applied Social Psychology 2001;23:109-21.

[67] McClure J, Sutton RM, Sibley CG. Listening to reporters or engineers: how different messages about building design affect earthquake fatalism. Journal of Applied Social Psychology 2007;37: 1956-73.

[68] Kasperson JX, Kasperson RE, Pidgeon N, Slovic P. The social amplification of risk: assessing fifteen years of research and theory. In: Pidgeon N, Kasperson RE, Slovic P, editors. The Social Amplification of Risk. Cambridge: Cambridge University Press; 2003. p. 13-46.

[69] Mileti DS, Peek L. The social psychology of public response to warnings of a nuclear power plant accident. Journal of Hazardous Materials 2000;75:181-94.

[70] Mileti DS, Peek L. Understanding individual and social characteristics in the promotion of household disaster preparedness. In: Dietz T, Stern PC, editors. New Tools for Environmental Protection: Education, Information, and Voluntary Measures. Washington, DC: National Academy Press; 2002. p. 125-39.

[71] Mileti DS, Sorensen J. Communication of emergency public warnings: a social science perspective and state-of-the-art assessment. Report ORNL-6609 for the Federal Emergency Management Agency. Oak Ridge, TN: Oak Ridge National Laboratory; 1990.

[72] Johnson BB. Trust judgments in complex hazard management systems: the potential role of concepts of the system. In: Cvetkovich G, Lofstedt R, editors. Social Trust and the Management of Risk. London: Earthscan; 1999. p. 62-72.

[73] Johnson B, White MP. The importance of multiple performance criteria for understanding trust in risk managers. Risk Analysis 2010;30:1099-115.

[74] Etkin D. Risk transference and related trends: driving forces towards more mega-disasters. Environmental Hazards 1999;1: 69-75.

[75] Berkes F. Understanding uncertainty and reducing vulnerability: lessons from resilience thinking. Natural Hazards 2007;41:283-95.

[76] Cutter S, Barnes S, Berry M, Burton C, Evans E, Tate E, et al. A place-based model for understanding community resilience to natural disasters. Global Environmental Change 2008;18:598-606.

[77] Pearce L. Disaster management and community planning, and public participation: how to achieve sustainable hazard mitigation. Natural Hazards 2003;28:211-28.

[78] Paton D. Disaster resilience: integrating individual, community, institutional and environmental perspectives. In: Paton D, Johnston DM, editors. Disaster Resilience: An Integrated Approach. Springfield, IL: Charles C. Thomas; 2006. p. 305-18.

[79] Grothmann T, Reusswig F. People at risk of flooding: why some residents take precautionary action while others do not. Natural Hazards 2006;38:101-20.

[80] Eiser JR, Sutton SR. Smoking as a subjectively rational choice. Addictive Behaviors 1977;2:129-34.

[81] Lazuras L, Eiser JR, Rodafinos A. Predicting Greek adolescents' intentions to smoke: a focus on normative processes. Health Psychology 2009;28:770-8.

[82] Schummer J. Multidisciplinarity, interdisciplinarity, and patterns of research collaboration in nanoscience and nanotechnology. Scientometrics 2004;59:425-65.

[83] National Research Council, Committee on National Earthquake Resilience Research, Implementation, and Outreach, and Committee on Seismology and Geodynamics. National earthquake resilience: research, implementation, and outreach. Washington, DC: National Academies Press; 2011.

[84] Carrara A, Guzzetti F, Cardinali M, Reichenbach P. Use of GIS technology in the prediction and monitoring of landslide hazard. Natural Hazards 1999;20:117-35.

[85] National Research Council. Tsunami warning and preparedness: an assessment of the U.S. tsunami program and the nation's preparedness efforts. Washington, DC: National Academies Press; 2010.

[86] Leonard GS, Johnston DM, Paton D, Christianson A, Becker J, Keys H. Developing effective warning systems: ongoing research at Ruapehu volcano, New Zealand. Journal of Volcanology and Geothermal Research 2008;172:199-215.