The function of fear, worry and denial: micro to macro




When fear, worry, and denial aggregate in a population, abhorrent or degenerate thinking and its associated behaviors visibly permeate the collective societal mindset. What follows is degenerative blame-gaming, as denial morphs into a downward spiral fueled by the failure to accept personal and collective responsibility for the measures that might have been taken to prepare for, and minimize, potential and actual threats.

After the event, a sub-function often takes over that drives people, particularly those in positions of power, to overreact in an attempt to retroactively rectify the threat that has become an event in reality. This overreaction can further agitate the situation, creating yet another level of cascading problems. Keep this overreaction in mind as the counterpoint to the sub-functions of denial delineated below.

Fear of impending doom, and denial: The normalcy bias, or normality bias, refers to a mental state people enter when facing a disaster. It causes people to underestimate both the possibility of a disaster occurring and its possible effects. This often results in situations where people fail to adequately prepare for a disaster and, on a larger scale, in the failure of governments to include the populace in their disaster preparations. The assumption made under the normalcy bias is that since a disaster has never occurred, it never will. The bias also leaves people unable to cope with a disaster once it occurs: people with a normalcy bias have difficulty reacting to something they have not experienced before. People also tend to interpret warnings in the most optimistic way possible, seizing on any ambiguities to infer a less serious situation.

The normalcy bias may be caused in part by the way the brain processes new data. Research suggests that even when the brain is calm, it takes 8–10 seconds to process new information. Stress slows the process, and when the brain cannot find an acceptable response to a situation, it fixates on a single, sometimes default, solution that may or may not be correct. An evolutionary reason for this response could be that paralysis gives an animal a better chance of surviving an attack; predators are less likely to eat prey that isn't struggling.[1]

The normalcy bias often results in unnecessary deaths in disaster situations. The lack of preparation for disasters often leads to inadequate shelter, supplies, and evacuation plans; even when all these things are in place, individuals with a normalcy bias often refuse to leave their homes. Studies have shown that more than 70% of people check with others before deciding to evacuate.[1] The normalcy bias also causes people to drastically underestimate the effects of the disaster: they assume that everything will be all right even while information from the radio, television, or neighbors gives them reason to believe there is a risk. This creates a cognitive dissonance that they then must work to eliminate. Some manage to eliminate it by refusing to believe new warnings and refusing to evacuate (maintaining the normalcy bias), while others eliminate the dissonance by escaping the danger. The possibility that some may refuse to evacuate causes significant problems in disaster planning.[2] Examples are not limited to, but most notably include:
- The Nazi genocide of millions of Jews. Even after knowing friends and family were being taken against their will, much of the Jewish community stayed put and refused to believe something was going on. Because of the extreme nature of the situation, it is understandable why most would deny it.
- The Little Sioux Scout camp tornado in June 2008. Despite being in the middle of Tornado Alley, the campground had no tornado shelter to offer protection from a strong tornado.[3]
- New Orleans before Hurricane Katrina. Inadequate government and citizen preparation, and the denial that the levees could fail, were examples of the normalcy bias, as were the thousands of people who refused to evacuate.
- The September 11 attacks, during which many in the World Trade Center returned to their offices mid-evacuation to turn off their computers, and ultimately died when the towers collapsed.

The negative effects can be combated through the four stages of disaster response:

- Preparation, including publicly acknowledging the possibility of disaster and forming contingency plans.
- Warning, including issuing clear, unambiguous, and frequent warnings and helping the public to understand and believe them.
- Impact, the stage at which the contingency plans take effect and emergency services, rescue teams, and disaster relief teams work in tandem.
- Aftermath, or reestablishing equilibrium after the fact by providing supplies and aid to those in need.

See also the black swan event: The black swan theory, or theory of black swan events, is a metaphor that describes an event that comes as a surprise, has a major effect, and is often inappropriately rationalized after the fact with the benefit of hindsight. The theory was developed by Nassim Nicholas Taleb to explain:

- The disproportionate role of high-profile, hard-to-predict, and rare events that are beyond the realm of normal expectations in history, science, finance, and technology.
- The non-computability of the probability of consequential rare events using scientific methods (owing to the very nature of small probabilities).
- The psychological biases that make people individually and collectively blind to uncertainty and unaware of the massive role of the rare event in historical affairs.

Unlike the earlier philosophical black swan problem, the black swan theory refers only to unexpected events of large magnitude and consequence and their dominant role in history. Such events, considered extreme outliers, collectively play vastly larger roles than regular occurrences.[1] More technically, in the scientific monograph Lectures on Probability and Risk in the Real World: Fat Tails (Volume 1), Taleb mathematically defines the black swan problem as stemming from the use of degenerate metaprobability.[2]

Black swan events were introduced by Nassim Nicholas Taleb in his 2001 book Fooled by Randomness, which concerned financial events. His 2007 book The Black Swan extended the metaphor to events outside of financial markets. Taleb regards almost all major scientific discoveries, historical events, and artistic accomplishments as black swans: undirected and unpredicted.
He gives the rise of the Internet, the personal computer, World War I, the dissolution of the Soviet Union, and the September 2001 attacks as examples of black swan events.[3]

The phrase "black swan" derives from a Latin expression; its oldest known occurrence is the poet Juvenal's characterization of something as "rara avis in terris nigroque simillima cygno" ("a rare bird in the lands, very much like a black swan"; 6.165).[4] In English, when the phrase was coined, the black swan was presumed not to exist. The importance of the simile lies in its analogy to the fragility of any system of thought: a set of conclusions is potentially undone once any of its fundamental postulates is disproved. In this case, the observation of a single black swan would be the undoing of the phrase's underlying logic, as well as any reasoning that followed from that logic.

Juvenal's phrase was a common expression in 16th-century London as a statement of impossibility. The London expression derives from the Old World presumption that all swans must be white, because all historical records of swans reported that they had white feathers.[5] In that context, a black swan was impossible, or at least nonexistent. After the Dutch explorer Willem de Vlamingh discovered black swans in Western Australia in 1697,[6] the term metamorphosed to connote that a perceived impossibility might later be disproven. Taleb notes that in the 19th century John Stuart Mill used the black swan logical fallacy as a new term to identify falsification.[7]

Taleb asserts:[8] "What we call here a Black Swan (and capitalize it) is an event with the following three attributes. First, it is an outlier, as it lies outside the realm of regular expectations, because nothing in the past can convincingly point to its possibility. Second, it carries an extreme impact. Third, in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable. I stop and summarize the triplet: rarity, extreme impact, and retrospective (though not prospective) predictability. A small number of Black Swans explains almost everything in our world, from the success of ideas and religions, to the dynamics of historical events, to elements of our own personal lives."

Based on the author's criteria:

- The event is a surprise (to the observer).
- The event has a major effect.
- After the first recorded instance of the event, it is rationalized by hindsight, as if it could have been expected; that is, the relevant data were available but unaccounted for in risk mitigation programs. The same is true for the personal perception of individuals.

The main idea in Taleb's book is not to attempt to predict black swan events, but to build robustness against negative ones that occur and to be able to exploit positive ones. Taleb contends that banks and trading firms are very vulnerable to hazardous black swan events and are exposed to losses beyond those predicted by their defective models. On the subject of business in particular, Taleb is highly critical of the widespread use of the normal distribution model as the basis for calculating risk. In the second edition of The Black Swan, Taleb provides "Ten Principles for a Black-Swan-Robust Society".[9]

Taleb states that a black swan event depends on the observer.
For example, what may be a black swan surprise for a turkey is not a black swan surprise for its butcher; hence the objective should be to "avoid being the turkey" by identifying areas of vulnerability in order to "turn the Black Swans white".[10]

Taleb's black swan is different from the earlier philosophical versions of the problem, specifically in epistemology, as it concerns a phenomenon with specific empirical and statistical properties which he calls "the fourth quadrant".[11] Taleb's problem is about epistemic limitations in some parts of the areas covered in decision making. These limitations are twofold: philosophical (mathematical) and empirical (known human epistemic biases). The philosophical problem is about the decrease in knowledge when it comes to rare events, as these are not visible in past samples and therefore require a strong a priori, or extrapolating, theory; accordingly, predictions of events depend more and more on theories as their probability becomes small. In the fourth quadrant, knowledge is uncertain and consequences are large, requiring more robustness.[citation needed]

According to Taleb,[12] thinkers who came before him and dealt with the notion of the improbable, such as Hume, Mill, and Popper, focused on the problem of induction in logic, specifically that of drawing general conclusions from specific observations. The central and unique attribute of Taleb's black swan event is its high profile. His claim is that almost all consequential events in history come from the unexpected, yet humans later convince themselves that these events are explainable in hindsight.

One problem, labeled "the ludic fallacy" by Taleb, is the belief that the unstructured randomness found in life resembles the structured randomness found in games. This stems from the assumption that the unexpected may be predicted by extrapolating from variations in statistics based on past observations, especially when these statistics are presumed to represent samples from a normal distribution. These concerns are often highly relevant in financial markets, where major players sometimes assume normal distributions when using value-at-risk models, although market returns typically have fat-tailed distributions.[citation needed]

Taleb said: "I don't particularly care about the usual. If you want to get an idea of a friend's temperament, ethics, and personal elegance, you need to look at him under the tests of severe circumstances, not under the regular rosy glow of daily life. Can you assess the danger a criminal poses by examining only what he does on an ordinary day? Can we understand health without considering wild diseases and epidemics? Indeed the normal is often irrelevant. Almost everything in social life is produced by rare but consequential shocks and jumps; all the while almost everything studied about social life focuses on the 'normal', particularly with 'bell curve' methods of inference that tell you close to nothing. Why? Because the bell curve ignores large deviations, cannot handle them, yet makes us confident that we have tamed uncertainty. Its nickname in this book is GIF, Great Intellectual Fraud."

More generally, decision theory, based on a fixed universe or a model of possible outcomes, ignores and minimizes the effect of events that are outside the model. For instance, a simple model of daily stock market returns may include extreme moves such as Black Monday (1987), but might not model the breakdown of markets following the 9/11 attacks. A fixed model considers the "known unknowns" but ignores the "unknown unknowns" (Donald Rumsfeld).[citation needed]
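To make the fat-tail point concrete, here is a minimal Python sketch (an illustration added here, not something from Taleb's text) comparing tail probabilities under a Gaussian model and under a fat-tailed alternative; the Student-t distribution with df=3 is an arbitrary illustrative stand-in for "fat tails", not a claim about any particular market:

    # Sketch: how far a bell-curve model understates extreme events,
    # relative to one fat-tailed alternative (Student-t; df=3 is arbitrary).
    from scipy.stats import norm, t

    for k in (3, 5, 10):
        p_gauss = norm.sf(k)   # P(X > k sigma) under the normal distribution
        p_fat = t.sf(k, df=3)  # the same tail probability under the fat-tailed model
        print(f"{k} sigma: gaussian={p_gauss:.2e}  fat-tailed={p_fat:.2e}  "
              f"ratio={p_fat / p_gauss:.1e}")

Under the Gaussian, a 10-sigma move is essentially impossible (a probability on the order of 1e-24), while the fat-tailed model still assigns it a small but non-negligible probability; that gap is precisely what Taleb accuses "bell curve" risk models of hiding.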
Taleb notes that other distributions are not usable with precision, but are often more descriptive, such as the fractal, power-law, or scalable distributions, and that awareness of these might help to temper expectations.[13] Beyond this, he emphasizes that many events simply are without precedent, undercutting the basis of this type of reasoning altogether. Taleb also argues for the use of counterfactual reasoning when considering risk.[14][15]
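As a rough illustration of what "scalable" means here (notation added for exposition; this is a generic power law, not Taleb's own formula), such a tail can be written as

    P(X > x) \approx \left( \frac{x}{x_{\min}} \right)^{-\alpha}, \qquad x \ge x_{\min}.

Because the tail decays polynomially rather than exponentially, extremes dominate: for a tail exponent \alpha \le 2 the variance is infinite, and for \alpha \le 1 even the mean is, which is one reason such distributions can describe extremes without being "usable with precision".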
Someone else's problem paradigm: Somebody Else's Problem (also known as Someone Else's Problem or SEP) is a psychological effect whereby individuals, or whole populations, choose to dissociate themselves from an issue that may be in critical need of recognition. Such issues may be of large concern to the population as a whole, yet an individual can easily choose to ignore them. Author Douglas Adams's description of the condition, which he ascribes to a physical "SEP field", has helped make it a generally recognized phenomenon. The phrase has been used to capture public attention on matters that may have been overlooked, and, less commonly, to identify concerns that an individual suffering symptoms of depression should ignore. It has also been employed as trivial shorthand to describe factors that are out of scope in the current context.[1]

Various areas of psychology and the philosophy of perception are concerned with the reasons why individuals often ignore issues that are of relative or critical importance. Optimism bias inclines thought processes to be overly positive: "Overly positive assumptions can lead to disastrous miscalculations — make us less likely to get health checkups, apply sunscreen or open a savings account, and more likely to bet the farm on a bad investment."[2] Where multiple individuals simultaneously experience the same stimulus, diffusion of responsibility and/or the bystander effect may release individuals from the need to act, and if no one from the group is seen to act, each individual may be further inhibited by conformity. An example is the murder of Kitty Genovese, who on March 13, 1964 was stabbed and killed outside her apartment building: "Most of the evidence suggests that at least half a dozen, and perhaps many more, of her 30 or so neighbours heard the events but failed to come to her aid. Most didn't even bother to call the police."[3]

When individuals are exposed to a multitude of messages about pressing matters of concern, information overload (now also known as Information Fatigue Syndrome) may result. In his article "Information Overload: Causes, Symptoms and Solutions", Joseph Ruff states, "Once capacity is surpassed additional information becomes noise and results in a decrease in information processing and decision quality."[4] A student who has spent the entire semester socializing instead of studying, for example, would find themselves in a state of information overload the day before the final exam.

There may also be a tendency to argue that since a proposed solution does not fit a problem entirely, the entire solution should be discarded. This is an example of the perfect solution fallacy, often employed by those who believe no action should be taken on a particular issue, who use it to argue against any proposed action.[5] However, taking responsibility for negative events that are outside an individual's control is related to depression and learned helplessness, particularly in adolescents.[6] Part of the solution is to help the individual realistically assign a proportion of responsibility to himself or herself, parents, and others (step I in the RIBEYE cognitive behavioral therapy problem-solving method).[6][7][8]

In politics and economics: French president Nicolas Sarkozy warned the U.S. Congress that "The [decline of the] dollar cannot remain someone else's problem. If we are not careful, monetary disarray could morph into economic war. We would all be victims."[9] The New York Times said that when the Shah of Iran was exiled in 1979, he became "someone else's problem" from the point of view of President Carter's administration.[10] In the 2006 election, the Swedish Centre Party used the phrase nånannanism ("someone-else-ism") as a catchword for its campaign. British politician Peter Ainsworth acknowledged that climate change can seem "huge, complex, remote and someone else's problem".[11]

An example contributing to the effects of climate change appears in Anthony Penna's book The Human Footprint: A Global Environmental History, in which Penna states, "Unregulated local industries continue to pollute the soil, water, and air, while the depletion of local wood reserves for fuel degrades the forests."[12] Such environmental destruction can be viewed as Someone Else's Problem: many individuals are unaware that unregulated destruction is taking place, and when they do discover it they may feel there is nothing they themselves can do to stop it, ultimately linking the condition to diffusion of responsibility.

Douglas Adams was himself concerned about such failures to recognize the need for action and, with Mark Carwardine, published the book Last Chance to See, which highlighted endangered animal species. This coincides with the quotation "Nature is it not thou", which sums up the contemporary trend by which many individuals and populations have "othered" themselves from the environment, resulting in devastating levels of destruction to the land and mass extinction rates.[13] "The background rate of extinction is somewhere between one and five species per year. But today, the extinction rate appears to be anywhere from 100 to 1,000 times greater than that."[14] Taken at face value, those figures imply on the order of 100 to 5,000 extinctions per year.
Posted on: Wed, 23 Oct 2013 13:13:41 +0000
