Wednesday, August 10, 2005

Errors

Talin, a regular guest and panelist at Loscon, has a favorite saying about errors: "The point is not preventing mistakes, the point is having a good error correction mechanism." He says it because you're never going to prevent all mistakes.

Walter Williams makes the same point:

We're not omniscient. That means making errors is unavoidable. Understanding the nature of errors is vital to our well-being. Let's look at it.

In this article, Williams looks at errors in knowledge – guessing wrong when absolute certainty is not available. Since absolute certainty is never available in the real world, we do a lot of guessing, and some of our guesses will be wrong.


There are two types of errors, nicely named the type I error and the type II error. The type I error is when we reject a true hypothesis when we should accept it. The type II error is when we accept a false hypothesis when we should reject it. In decision-making, there's always a non-zero probability of making one error or the other. That means we're confronted with asking the question: Which error is least costly?

You'll also find these called the alpha error and the beta error, respectively, and – in the same order – the "false negative" and the "false positive". For example, a test for some disease can come back correct, it can yield a false positive result – you test positive for the disease when you don't actually have it – or it can yield a false negative result – the test tells you you don't have the disease when in fact you do.

In the case of this test, a false positive means more tests to confirm or rule out the disease, some worry, and maybe a few unnecessary medical treatments. A false negative means your disease goes untreated, which can be fatal.
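To make that weighing concrete, here is a minimal Python sketch of the comparison Williams is describing. Every number in it – the prevalence, the test's accuracy, and the cost assigned to each error – is a made-up assumption for illustration, not a figure from his column.

    # Hypothetical screening test; all figures below are assumptions for illustration.
    prevalence = 0.01        # 1% of people tested actually have the disease
    sensitivity = 0.95       # P(positive result | disease): misses 5% of real cases
    specificity = 0.90       # P(negative result | no disease): flags 10% of healthy people

    cost_false_positive = 2_000    # follow-up tests, worry, unnecessary treatment (arbitrary units)
    cost_false_negative = 500_000  # an untreated, possibly fatal disease (arbitrary units)

    # How often does each error happen to a randomly screened person?
    p_false_negative = prevalence * (1 - sensitivity)        # sick, but told "negative"
    p_false_positive = (1 - prevalence) * (1 - specificity)  # healthy, but told "positive"

    # Expected cost per person screened, for each kind of error
    expected_cost_fn = p_false_negative * cost_false_negative
    expected_cost_fp = p_false_positive * cost_false_positive

    print(f"false negative: p = {p_false_negative:.4f}, expected cost = {expected_cost_fn:.0f}")
    print(f"false positive: p = {p_false_positive:.4f}, expected cost = {expected_cost_fp:.0f}")

With these made-up numbers the false positive is roughly 200 times more common, yet the rare false negative still carries the larger expected cost per person screened – which is exactly the question Williams says we can't avoid: which error is least costly?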

He applies the concept of Type I and Type II errors to some real-world issues:

The invasion of Iraq, based on evidence Iraq had weapons of mass destruction
False positive: there aren't any WMD after all, but Saddam Hussein has been toppled and is no longer filling up mass graves, promoting terrorism in Israel, or threatening the stability of the region. False negative: there is no invasion, but Saddam Hussein uses WMD on a neighbor or a western country, or provides them to Al Qaeda or some other terrorist group. (Maybe the Palestinians would have wanted some?)
Waging war against Germany based on its atomic bomb program
During World War II, our intelligence agencies thought that Germany was close to having an atomic bomb. That intelligence was later found to be flawed, but it played an important role in the conduct of the war.
Approval of drugs by the FDA
FDA officials can approve a drug that turns out to have nasty side-effects, or they can hold up or ban a drug that's actually safe.
Which error do FDA officials have the greater incentive to make? If an FDA official errs by approving a drug that has unanticipated, dangerous side effects, he risks congressional hearings, disgrace and termination. Erring on the side of under-caution produces visible, sick victims who are represented by counsel and whose plight is hyped by the media. Erring on the side of over-caution is another matter.

Between 1967 and 1976, beta blockers were in use in Europe but were held up by the FDA. In 1979, a pharmacologist estimated that a single beta blocker could have saved 10,000 lives each year by preventing deaths from irregular heartbeat.

Assuming that's a typical figure for each beta blocker, and assuming two beta blockers were being held up by the FDA over those nine years, that's 180,000 people who died as a result of FDA inaction.
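For what it's worth, that 180,000 is just those assumptions multiplied out; the arithmetic only comes out right if the 1967–1976 hold-up is counted as nine years of delay:

    # The post's assumptions: 10,000 preventable deaths per year per drug,
    # two beta blockers held up, and a nine-year delay (1967-1976).
    deaths_per_year_per_drug = 10_000
    drugs_held_up = 2
    years_of_delay = 9

    print(deaths_per_year_per_drug * drugs_held_up * years_of_delay)  # 180000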

The type I error, erring on the side of over-caution, has little or no cost to FDA officials. Grieving survivors of those 10,000 people who unnecessarily died each year don't know why their loved one died, and surely they don't connect the death to FDA over-caution. For FDA officials, these are the best kind of victims – invisible ones. When an FDA official holds a press conference to announce the approval of a new life-saving drug, I'd like to see just one reporter ask: How many lives would have been saved had the FDA not delayed the drug's approval?

To be sure, these are "theoretical lives". These people might have died when they did anyway. But then again, when we're prepared to worry ourselves to death over whether the water has five parts per billion of arsenic or ten parts per billion, we're down in the realm of theoretical lives. There's no way to point to any one death and say it was a result of the extra five parts per billion of arsenic. Why are those theoretical deaths worthy of concern, while the theoretical deaths caused by holding up a life-saving drug aren't?

The bottom line is, we humans are not perfect. We will make errors. Rationality requires that we recognize and weigh the cost of one error against the other.

And the first step is to recognize that there are costs on both sides. Always.
