As you mention, Concorde is really famous as a sunk cost example, but when I looked into it for http://www.gwern.net/Sunk%20cost I had to conclude that it was a rhetorically effective and famous example, but not actually an honest one to use. The same was true of most of the business examples I looked at. Heuristics & biases texts tend to mention dramatic little experiments and let one infer that there are dramatic problems in the real world, but as you know from reading Stanovich 2010 (_Rationality and the Reflective Mind_), there's an old undercurrent in the field pushing back against this inference and pointing out that people don't perform as badly as one would expect.

I mentioned intelligence analysis. The main cognitive biases cited are confirmation bias and the closely related groupthink: with so much data available to an intelligence analyst, they can construct a convincing case by cherrypicking particular datapoints, and even if they have no axe to grind, they may start with a particular theory and begin to build it up. Those aren't the only ones, of course; McCoy in _The Politics of Heroin_ mentions that a State Department analyst pointed out that post-WWII France's Communist Party was going to have to give economic concessions to the austerity-weary public and that these did not represent an imminent takeover of France by the Comintern. He was ignored on the grounds that he was an avowed Communist, and the CIA intervened with substantial covert support to the French Corsican mafia, saving it from destruction and maintaining the flow of heroin into the US. This would be a genetic or ad hominem fallacy, or maybe a halo (horns) effect.

Richards Heuer's _Psychology of Intelligence Analysis_ https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/index.html is frequently cited in this vein. (You may remember Heuer as the man behind 'Analysis of Competing Hypotheses', which is used in argument diagramming http://lesswrong.com/lw/7g0/make_evidence_charts_not_review_papers_link/ ) Specifically:

- https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/art13.html
- https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/art14.html
- https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/art15.html
- https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/art16.html

Smoking as an example of underweighting statistical evidence (millions of lives lost, billions of dollars cost):

> For example, the Surgeon General's report linking cigarette smoking to cancer should have, logically, caused a decline in per-capita cigarette consumption. No such decline occurred for more than 20 years. The reaction of physicians was particularly informative. All doctors were aware of the statistical evidence and were more exposed than the general population to the health problems caused by smoking. How they reacted to this evidence depended upon their medical specialty. Twenty years after the Surgeon General's report, radiologists who examine lung x-rays every day had the lowest rate of smoking. Physicians who diagnosed and treated lung cancer victims were also quite unlikely to smoke.
> Many other types of physicians continued to smoke. The probability that a physician continued to smoke was directly related to the distance of the physician's specialty from the lungs. In other words, even physicians, who were well qualified to understand and appreciate the statistical data, were more influenced by their vivid personal experiences than by valid statistical data.94 [Nisbett and Ross, _Human Inference: Strategies and Shortcomings of Social Judgment_, p. 56]

And on the fundamental attribution error:

> A fundamental error made in judging the causes of behavior is to overestimate the role of internal factors and underestimate the role of external factors. When observing another's behavior, people are too inclined to infer that the behavior was caused by broad personal qualities or dispositions of the other person and to expect that these same inherent qualities will determine the actor's behavior under other circumstances. Not enough weight is assigned to external circumstances that may have influenced the other person's choice of behavior. This pervasive tendency has been demonstrated in many experiments under quite diverse circumstances117 and has often been observed in diplomatic and military interactions.118

118: Jervis's _Perception and Misperception in International Politics_ looks promising for incidents of wars; from the Amazon review:

> In a contemporary application of Jervis's ideas, some argue that Saddam Hussein invaded Kuwait in 1990 in part because he misread the signals of American leaders with regard to the independence of Kuwait. Also, leaders of the United States and Iraq in the run-up to the most recent Gulf War might have been operating under cognitive biases that made them value certain kinds of information more than others, whether or not the information was true. Jervis proved that, once a leader believed something, that perception would influence the way the leader perceived all other relevant information.

From _Of Knowledge And Power_, on Iraq, quoting the President's Commission on Weapons of Mass Destruction:

> "The analytical flaw was not that this premise was unreasonable (for it was not); rather, it was that the premise hardened into a presumption and analysts began to fit the facts to the theory, rather than the other way around."

(Iraq is so clearcut that I'm not going to pay too much attention to it unless it would be controversial not to.)

A general observation, that analysts may assume deception too much:

> The hypothesis has been advanced that deception is most likely when the stakes are exceptionally high.129 If this hypothesis is correct, analysts should be especially alert for deception in such instances. One can cite prominent examples to support the hypothesis, such as Pearl Harbor, the Normandy landings, and the German invasion of the Soviet Union. It seems as though the hypothesis has considerable support, given that it is so easy to recall examples of high stakes situations in which deception was employed. But consider what it would take to prove, empirically, that such a relationship actually exists. Figure 17 sets up the problem as a 2 x 2 contingency table. Barton Whaley researched 68 cases in which surprise or deception was present in strategic military operations between 1914 and 1968.130 Let us assume that some form of deception, as well as surprise, was present in all 68 cases and put this number in the upper left cell of the table. How many cases are there with high stakes when deception was not used? That is a lot harder to think about and to find out about; researchers seldom devote much effort to documenting negative cases, when something did not occur. Fortunately, Whaley did make a rough estimate that both deception and surprise were absent in one-third to one-half of the cases of "grand strategy" during this period, which is the basis for putting the number 35 in the lower left cell of Figure 17. ...It is not really clear whether there is a relationship between deception and high-stakes situations, because there are not enough data. Intuitively, your gut feeling may tell you there is, and this feeling may well be correct. But you may have this feeling mainly because you are inclined to focus only on those cases in the upper left cell that do suggest such a relationship. People tend to overlook cases where the relationship does not exist, inasmuch as these are much less salient.
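Heuer's point about the missing cells can be made concrete with a toy calculation. A minimal sketch in Python: the 68 and 35 are the figures from the quote (the high-stakes column of Figure 17), while the low-stakes column was never researched, so the two pairs of numbers I plug in for it are pure invention, just to show how much the conclusion depends on data nobody collected:

```python
# Toy version of Heuer's Figure 17 argument.
# Left (high-stakes) column: 68 deception cases, ~35 without, per the quote.
# Right (low-stakes) column: unknown; the guesses below are made up.

def p_deception(deception_cases, no_deception_cases):
    """Share of cases in a column of the 2x2 table that involved deception."""
    return deception_cases / (deception_cases + no_deception_cases)

p_high = p_deception(68, 35)  # ~0.66

for low_dec, low_no_dec in [(20, 80), (200, 100)]:
    p_low = p_deception(low_dec, low_no_dec)
    verdict = "supports" if p_high > p_low else "undercuts"
    print(f"assumed low-stakes column {low_dec}/{low_no_dec}: "
          f"P(deception|high)={p_high:.2f} vs P(deception|low)={p_low:.2f} -> {verdict} the hypothesis")
```

Depending on the invented right-hand column, the same left-hand column either supports or undercuts the hypothesis, which is exactly Heuer's complaint about judging from the salient upper-left cell alone.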
In another paper http://www.worldaffairsboard.com/attachments/staff-college/20727d1273228985-ebo-sod-limits-intelligence-analysis-fpri-winter-2005-heurer-.pdf Heuer goes into more detail on Whaley:

> For example, Barton Whaley researched 68 cases in which surprise or deception was present in military operations between 1914 and 1968. He found ten cases in which detailed military plans were compromised to an enemy prior to an intended military attack. In half of these cases, the plans were carefully fabricated deception, while in the other half they were a genuine breach of security. The fabricated plans were accepted as genuine in all five cases, while the genuine plans were rejected as fabrications in four of the five instances—a failure rate of 90 percent. [Barton Whaley, _Strategem: Deception and Surprise in War_ (Cambridge, Mass.: MIT Center for International Studies, 1969), p. 230] Establishing the bona fides of clandestine sources is an exceptionally difficult task. Sources are more likely to be considered reliable when they provide information that fits what we already think we know. [Richards J. Heuer, Jr., "Nosenko: Five Paths to Judgment," in _Inside CIA's Private World: Declassified Articles from the Agency's Internal Journal 1955–1992_, H. Bradford Westerfield, ed. (New Haven: Yale University Press, 1995), pp. 379–414] This reinforces existing preconceptions.

Continuing:

> US policymakers in the early years of our involvement in Vietnam had to imagine scenarios for what might happen if they did or did not commit US troops to the defense of South Vietnam. In judging the probability of alternative outcomes, our senior leaders were strongly influenced by the ready availability of two seemingly comparable scenarios--the failure of appeasement prior to World War II and the successful intervention in Korea.

Vietnam is often mentioned as a sunk costs example. (I once read a book by a historian analyzing how Vietnam was constantly compared to WWII and Korea by the top American politicians, per the above - a pretty good illustration of the availability heuristic - but I can't recall its name.)

(An application to current events I would suggest is Iran: more than a few intelligence agencies have concluded the weapons program is inactive. Should the US or Israel bomb Iran, the consequences will likely be negative...
Another potential example is China; while they are a favorite bogeyman now, there are indications their military is dangerously - to them - rotten: http://www.foreignpolicy.com/articles/2012/04/16/rotting_from_within?page=full I don't think these examples are usable right now, though, like most of the political examples, because the necessary information for a slam-dunk case is not available.)

Japan's attack on Pearl Harbor has been described as a case study in cognitive bias; see _Transforming US Intelligence_ (ed. Sims, 2005), _Of Knowledge and Power: The Complexities of National Intelligence_ (2008), http://republkusa.wordpress.com/2012/02/03/pearl-harbor-perpetual-happy-hour-misunderestimation-and-the-mother-of-all-biases/ , or http://www.nationmultimedia.com/opinion/Lessons-in-cognitive-bias-from-the-day-of-infamy-30172123.html :

> The so-called "day of infamy" is significant because of two ironies. For one thing, senior US officials knew about Japanese intentions to attack, thanks to a code-breaking system called Magic. But they refused to accept the data, since arousing American anger would have been suicidal (it was). Second, the Japanese knew that attacking Pearl Harbour would only buy time before the US retaliated in force...Writing in "The March To Folly", Pulitzer-winning historian Barbara Tuchman argues that such follies stem from "wooden-headedness". A common phenomenon through history are policies of governments that ran contrary to their national interests, she adds...Indeed, the 20th century is replete with similar follies. Every major intelligence failure in the 20th century - China's intervention in the Korean War in 1950, Egypt's surprise attack on Israel in 1973 and the fall of the Berlin Wall in 1989 - was due to rampant failures to "connect the dots". Wooden-headedness - or stubbornness to accept the facts - was largely to blame. In 1950, General Douglas MacArthur was too full of hubris to accept the fact that the Chinese would take on the American military machine. On November 9, 1989, CIA specialists were telling then US president George HW Bush why the Berlin Wall would not fall any time soon; at that point, another staff member asked the president to turn on the television, which was broadcasting the fall of the Wall. Prior to the September 11 attacks, American officials were not unaware of plots to fly hijacked planes into buildings. But confirmation bias - the refusal to accept such a possibility - led to a "failure of imagination", as the 9-11 Commission pointed out. The 2003 invasion of Iraq was the result of availability bias - US officials affected by the trauma of September 11 sought to avoid a similar intelligence failure. In turn, this led to the overestimation of Iraq's weapons programme. Confirmation bias could well afflict American policymakers on the issue of Iran's nuclear programme. They could well be shrugging off disturbing information about Tehran's intentions - say, the repeated threat to use nukes against Israel - and as a result be underestimating the threat.

(Stalin's refusal to believe that Germany might invade is also cited in places. But who the heck knows what was going on in Stalin's head? 'bias' is too specific a word.
If you want to read about it, see Wikipedia or pg16-21 of http://fas-polisci.rutgers.edu/levy/2009%20Intelligence%20Failure.pdf )

See also https://www.cia.gov/library/center-for-the-study-of-intelligence/kent-csi/vol7no3/html/v07i3a13p_0001.htm and http://en.wikipedia.org/wiki/Historian%27s_fallacy

I was going to be a little skeptical of this one, but then I saw an anecdote in _American Foreign Policy and The Politics of Fear: Threat Inflation Since 9/11_ that the Secretary of War, when given a message telling of a Japanese attack on Pearl Harbor, said 'My god, this can't be true. This must mean the Philippines.' Harold Ford (http://www.dni.gov/nic/PDF_GIF_anal_meth/tradecraft/purpose_of_estimating.pdf) seems like a good overview of all the ignored or misinterpreted evidence; pg8 gives some particularly racist quotes to the effect that Japanese aviation & technology were not dangerous (which combined with the previous opinions that Pearl Harbor was too shallow for *American* torpedoes to operate...).

The 9/11 Report (http://www.gpo.gov/fdsys/pkg/GPO-911REPORT/pdf/GPO-911REPORT.pdf), FWIW, does not mention biases, although it does seem to be talking about similar ideas:

> The next day, Wolfowitz renewed the argument, writing to Rumsfeld about the interest of Yousef's co-conspirator in the 1995 Manila air plot in crashing an explosives-laden plane into CIA headquarters, and about information from a foreign government regarding Iraqis' involvement in the attempted hijacking of a Gulf Air flight. Given this background, he wondered why so little thought had been devoted to the danger of suicide pilots, seeing a "failure of imagination" and a mind-set that dismissed possibilities.74

pg357 starts the general section. Some of the lines are a bit amusing:

> It is hard now to recapture the conventional wisdom before 9/11.

Indeed. (On the other hand, the person being discussed in that paragraph was basically right: super-terrorism hasn't been a problem, and airplane hijacking was easily fixed.) pg362:

> Imagination is not a gift usually associated with bureaucracies. For example, before Pearl Harbor the U.S. government had excellent intelligence that a Japanese attack was coming, especially after peace talks stalemated at the end of November 1941. These were days, one historian notes, of "excruciating uncertainty." The most likely targets were judged to be in Southeast Asia. An attack was coming, "but officials were at a loss to know where the blow would fall or what more might be done to prevent it."11 In retrospect, available intercepts pointed to Japanese examination of Hawaii as a possible target. But, another historian observes, "in the face of a clear warning, alert measures bowed to routine."12

On the other hand, pg365 is interesting, and I didn't notice this when I read the report back in 2004: it points out that a suicide attack with airplanes could have been prevented just by watching for certain unprecedented activities by people linked with terrorists - recruiting jumbo jet pilots, attending flight school, or purchasing simulation software. Before that, it argues that the usual intelligence-analysis methods for analyzing surprise attacks were never applied to the suggestion of hijacking aircraft.
This suggests a cost-benefit failure: by not analyzing a reasonable suggestion, they missed the opportunity to lay out several very cheap and effective prevention measures - cheap because Moussaoui had been detected and arrested before 9/11 and

> briefed to the DCI and other top CIA officials under the heading "Islamic Extremist Learns to Fly."24 Because the system was not tuned to comprehend the potential significance of this information, the news had no effect on warning.

(I am under the impression that armoring/locking cockpit doors is not very expensive either.)

Another common example of an intelligence failure traceable to bad analysis is the 1973 Arab-Israeli War (eg _Of Knowledge and Power: The Complexities of National Intelligence_ again). The summary here is that Israel had won so overwhelmingly before that they and the US assumed their enemies were so intrinsically incompetent as to be no threat (the fundamental attribution error?), and then Syria & Egypt ran an excellent deception campaign which gave Israel reasons to dismiss all the massing of forces and other signals. See http://onlinelibrary.wiley.com/doi/10.1111/0162-895X.00317/abstract or http://fas-polisci.rutgers.edu/levy/2009%20Intelligence%20Failure.pdf (from the intro; see also pg22-27, which discusses the important detail of an intelligence official lying about reports from Israel's best Egyptian source):

> The role of pre-existing mental images is given even greater emphasis in many analyses of the Israeli intelligence failure in 1973. Israeli intelligence officers and political leaders shared the beliefs (later known as "the conception") that Egypt would not go to war unless it was able to mount air strikes deep into Israel to neutralize Israel's air force, and that Syria would not go to war without Egypt. Since the first condition was not met, Israeli intelligence concluded that war would not occur in 1973, and this judgment led them to interpret the unprecedented magnitude of Syrian and Egyptian deployments at the front lines as evidence of routine Egyptian military exercises and Syrian defensive moves. Thus, the Agranat Commission traced the intelligence failure to the "persistent adherence to 'the conception.'" [Agranat Commission https://en.wikipedia.org/wiki/Agranat_Commission, Agranat Report, 18.]

That latter paper claims that the Bay of Pigs is a counter-example:

> For example, before launching operation "Zapata" in 1961, Central Intelligence Agency (CIA) officers consciously underestimated the power of the Castro regime and overestimated the likelihood that the Bay of Pigs invasion would trigger a popular uprising in Cuba. They acted so in order to obtain the political authorization for an operation to which they had become psychologically committed and which they believed would serve their organizational interests.27 We will argue that this factor played a critical role in the Israeli intelligence failure of 1973. [Peter Kornbluh, _Bay of Pigs Declassified: The Secret CIA Report on the Invasion of Cuba_ (New York: The New Press, 1999); Trumbull Higgins, _The Perfect Failure: Kennedy, Eisenhower, and the CIA at the Bay of Pigs_ (New York: Norton, 1987); David A. Phillips, _The Night Watch_ (New York: Ballantine, 1982).]

WWI as well?
> As Jack Snyder argued with respect to German assessments of the merits of the Schlieffen Plan on the eve of World War I, they saw "the 'necessary' as possible." The Schlieffen Plan had to work if Germany was to win the war, so German leaders were unconsciously motivated to believe that it would work. [Jack Snyder, _The Ideology of the Offensive_ (Ithaca, NY: Cornell University Press, 1984), chap. 5.]

Unfortunately, for all the research on the above, none of it seems to have materially changed the intelligence community, if the hundreds of interviews summarized in the book _Analytic Culture in the US Intelligence Community: an Ethnographic Study_ http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA507369 are to be believed. (Pg93 contains an interesting anecdote by the author about his mistaken beliefs on Tiananmen Square, and what he eventually learned was the real Chinese impression of it and why no one supported the students. I don't think I've ever come across his explanation before, but in retrospect it makes a lot of sense.)

http://hbswk.hbs.edu/item/3074.html & http://www.leighbureau.com/speakers/mroberto/essays/everest.pdf & http://sovereignnorth.com/subbywan/Military/The%20Art%20of%20Critical%20Decision%20Making.pdf : the author argues (pg9) that https://en.wikipedia.org/wiki/1996_Mount_Everest_disaster was due to cognitive biases: sunk cost (refusing to turn back) plus overconfidence (in one's skills & predictions) combined with ignorance of base rates (Everest had had bizarrely good weather in the preceding years). I'm not sure how widely this interpretation is accepted* - low oxygen produces general stupidity, and there's a suggestion that conditions were unusually low-pressure (http://www.bioedonline.org/news/news.cfm?art=986).

* http://hum.sagepub.com/content/60/7/1039.short & http://www.onepetro.org/mslib/servlet/onepetropreview?id=ASSE-06-754 may mention cognitive bias & Everest, but I couldn't get access easily.

https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/csi-studies/studies/vol.-55-no.-4/pdfs-vol.-55-no.-4/Brennan-Reflections%20on%20Outliers-13Jan.pdf demonstrates an interesting hindsight bias/groupthink effect by surveying people before (as it turned out) and after Osama bin Laden's assassination. It also provides a helpful list of 'outliers':

> Russia would destabilize the balance of power by deploying tactical nuclear missiles in Cuba. [The Special NIE on Cuba records the IC's unwillingness to support the hypothesis of nuclear missiles in Cuba. This required analysts to ignore eight refugee reports (outliers) out of thousands of reports as bad data. https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/csi-studies/studies/vol51no3/revisiting-sherman-kent2019s-defense-of-snie-85-3-62.html ]
>
> North Vietnam would invade South Vietnam in the spring of 1975, resulting in the complete collapse of the South Vietnamese government. [Interagency Intelligence Memorandum, "Response to National Security Study Memorandum 213--Part I: Intelligence Appraisal—Factors Influencing the Course of Events in the Republic of Vietnam over the Next Five Years," 18 November 1974. Accessed 6 December 2011 at http://gateway.proquest.com.openurl?url_ver=Z39.88-2004&res_dat=xri:dnsartf_dat=xri:dnsa:article:CVW01271 ]
>
> An Islamic cleric would distribute sermons via cassettes, and the Iranian people would then overthrow their government. [NSC staffer Gary Sick later concluded, "The Iranian revolution...refused to conform to the conventional wisdom of the day, and contemporary analyses often had more to say about the prejudices and assumptions of the observer than about the new reality being created in the mosques and in the streets of Iran." Gary Sick, _All Fall Down: America's Tragic Encounter with Iran_ (New York: Random House, 1985), 106.]
>
> Yugoslavia would not remain intact through the 1990s. [In this case, the Intelligence Community correctly estimated the situation, but was considered the outlier in a policy community unwilling to accept that forecast. (Based on interview with the NIE author, August 2011).]
>
> Saddam Hussein would abandon his weapons of mass destruction (WMD) program.
>
> A fruit vendor's self-immolation in Tunisia would set off a firestorm of demonstrations for self-determination across the Near East.

Another list, from the previously linked ethnography:

> Often, participants' responses were not definitions at all but statements meant to represent familiar historical examples: The attack on Pearl Harbor. The Chinese sending combat troops into Korea. The Tet Offensive. The Soviet invasion of Afghanistan. The collapse of the Soviet Union. The Indian nuclear test. September Eleventh.

I think that about exhausts the material on intelligence failure. The next obvious vein is financial data: statistical economics papers on people trading too much, holding onto losers, and so on. One could try to extrapolate losses from the samples out to the whole population of investors (a crude sketch of what I mean is below). Is that the kind of thing you had in mind as examples?
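To be clear about what I mean by "extrapolate": a minimal back-of-the-envelope sketch in Python, where every number is a placeholder I've invented for illustration rather than a figure from any particular paper - a real version would plug in a study's sample estimates:

```python
# Toy extrapolation of aggregate losses from overtrading.
# All inputs are invented placeholders, not estimates from any study.

num_households = 50_000_000        # hypothetical count of retail-investing households
avg_portfolio_usd = 40_000         # hypothetical average portfolio size
return_gap_per_year = 0.02         # hypothetical annual underperformance attributed to overtrading

annual_loss = num_households * avg_portfolio_usd * return_gap_per_year
print(f"implied aggregate annual loss: ${annual_loss / 1e9:.0f} billion")
```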