DU Home » Latest Threads » Forums & Groups » Topics » Science » Science (Group) » The bias that can cause catastrophe

Thu Oct 3, 2019, 04:13 PM

The bias that can cause catastrophe

The outcome bias erodes your sense of risk and makes you blind to error, explaining everything from fatal plane crashes to the Columbia disaster and the Deepwater Horizon oil spill.


Like much of our understanding of human irrationality, the outcome bias was first observed in the 1980s, with a seminal study of medical decision-making.

Participants were given descriptions of various scenarios, including the risks and benefits of the different procedures, and then asked to rate the quality of the doctors’ judgement.

The participants were told about a doctor’s choice to offer a patient a heart bypass, for instance – potentially adding many more years of good health, but with a small chance of death during the operation. Perhaps predictably, the participants judged the doctor’s decision far more harshly if they were told the patient subsequently died than when they were told that the patient lived – even though the benefits and risks were exactly the same in each case.

The outcome bias is so deeply ingrained in our brains that it's easy to understand why the participants felt the doctor should be punished for the patient's death. Yet their reasoning is not logical, since there was no better way for the doctor to have weighed up the evidence – at the time of making the decision, there was every chance the operation would be a success. Once you know about the tragedy, however, it's hard to escape the nagging feeling that the doctor was nevertheless at fault – leading the participants to question his competence.



This isn't strictly a science article, but the study of outcome bias certainly is science.

Outcome bias partly explains many tragedies, from bad science to climate change denial.

This BBC article is by journalist David Robson who wrote the book The Intelligence Trap: Why Smart People Make Dumb Mistakes.

There's a very worthwhile review of the book at AAAS Science:


Replies to this discussion thread (4 replies):

- The bias that can cause catastrophe (Original post), hunter, Oct 2019
- Reply #1, Mike 03, Oct 2019
- Reply #2, unblock, Oct 2019
- Reply #3, hunter, Oct 2019
- Reply #4, NNadir, Oct 2019

Response to hunter (Original post), by Mike 03

Thu Oct 3, 2019, 04:17 PM

1. This looks fascinating. Bookmarking to read later.


Response to hunter (Original post), by unblock

Thu Oct 3, 2019, 04:52 PM

2. bad example. this fallacy comes up in poker allll the time.

poker players make a decision based on the odds, and the result may hinge on the next card. there's a strong tendency to think you made a bad decision if that card works against you, even though the odds were in your favor.

but that's a clean example, it's just math.

the medical field is more problematic, because it most certainly *is* possible that the doctor was at fault. some of the poor outcomes are simply based on the patient and information that cannot be ascertained prior to or even during surgery. however, some of the poor outcomes may be due to improper hygiene on the part of the doctor/team/facility, or perhaps the doctor missed something that a different doctor might not have.

the more automated the procedure is, the more judging by outcomes is a fallacy. the more there's room for doctor error, the more judging by outcomes makes sense.
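The poker case can be made concrete with a toy simulation. All the numbers here are illustrative, not taken from any real hand: the point is only that a call that is clearly correct by expected value still loses a large fraction of the time, which is exactly the situation in which outcome bias tempts us to call a good decision bad.

```python
import random

# A stylized poker spot: you call a bet that wins with probability 0.6
# (the odds favor you), but any single hand can still lose.
WIN_PROB = 0.6
POT = 100   # amount won on a win
BET = 100   # amount lost on a loss

def expected_value(win_prob, pot, bet):
    """EV of calling: positive means the call is the 'good' decision."""
    return win_prob * pot - (1 - win_prob) * bet

def simulate_losses(n_hands, seed=0):
    """Make the same +EV call n_hands times; count how often it loses."""
    rng = random.Random(seed)
    return sum(1 for _ in range(n_hands) if rng.random() >= WIN_PROB)

ev = expected_value(WIN_PROB, POT, BET)
losses = simulate_losses(10_000)
print(f"EV per hand: {ev:+.0f}")            # positive: calling is correct
print(f"Losses in 10,000 hands: {losses}")  # roughly 4,000 losses anyway
```

Judging the decision by any one of those ~4,000 losing hands would condemn a call that is unambiguously right by the math.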


Response to unblock (Reply #2), by hunter

Thu Oct 3, 2019, 07:56 PM

3. That, in a nutshell, is the fallacy being studied.

I think this is the original paper...

Baron, J., & Hershey, J. C. (1988). Outcome bias in decision evaluation. Journal of Personality and Social Psychology, 54(4), 569–579.


In 5 studies, undergraduate subjects were given descriptions and outcomes of decisions made by others under conditions of uncertainty. Decisions concerned either medical matters or monetary gambles. Subjects rated the quality of thinking of the decisions, the competence of the decision maker, or their willingness to let the decision maker decide on their behalf. Subjects understood that they had all relevant information available to the decision maker. Subjects rated the thinking as better, rated the decision maker as more competent, or indicated greater willingness to yield the decision when the outcome was favorable than when it was unfavorable. In monetary gambles, subjects rated the thinking as better when the outcome of the option not chosen turned out poorly than when it turned out well. Although subjects who were asked felt that they should not consider outcomes in making these evaluations, they did so. This effect of outcome knowledge on evaluation may be explained partly in terms of its effect on the salience of arguments for each side of the choice. Implications for the theory of rationality and for practical situations are discussed.


In complex situations "competency" is irrationally judged by outcome even when everything else is controlled for.

Poker is an interesting example. Computers are now world-class poker players, yet all they can see is the cards. They don't need to know anything about the emotional aspects of the game; "poker faces" and "tells" don't matter.


In a similar way, we might find problems in medicine through statistics, not through how we judge the competency of the players.

That's scary when you are heavily invested in a poker hand. It's even scarier when you are a doctor or a patient dealing with a life threatening medical condition.

A doctor can follow all the known rules in medicine, do everything by the highest standards, and things still go wrong.

It's the same with poker players or pilots and it's really scary when a string of "wins" fools us into thinking we are competent people as we fly into the storm.
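That last point, a string of "wins" fooling us, can be sketched numerically too. Assuming, purely for illustration, 1,000 decision-makers of identical and purely average skill, a surprising number of them will produce an impressive-looking winning streak by luck alone:

```python
import random

def longest_streak(n_cases, p_win, rng):
    """Longest run of consecutive good outcomes in n_cases trials."""
    best = run = 0
    for _ in range(n_cases):
        run = run + 1 if rng.random() < p_win else 0
        best = max(best, run)
    return best

# 1,000 "players", each facing 50 cases that go well with probability 0.5.
# How many rack up 8 good outcomes in a row and look like experts?
rng = random.Random(42)
lucky = sum(1 for _ in range(1000) if longest_streak(50, 0.5, rng) >= 8)
print(f"{lucky} of 1,000 equally average players had a streak of 8+ wins")
```

Roughly one in ten of these identical players gets such a streak by chance, so judging competence from the streak alone is exactly the outcome bias the paper describes.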


Response to hunter (Original post), by NNadir

Sat Oct 12, 2019, 08:38 AM

4. I'm kicking this interesting thread up, as I intend to reference it in an upcoming post. n/t.
