
Walter E. Williams

Bradley Prize Winner 2017

Professor of Economics.
wwilliam@gmu.edu
(703) 993-1148
D158 Buchanan Hall
Department of Economics
George Mason University


We’re not omniscient. That means making errors is unavoidable. Understanding the nature of errors is vital to our well-being. Let’s look at it. There are two types of errors, nicely named the type I error and the type II error. A type I error is rejecting a true hypothesis that we should have accepted. A type II error is accepting a false hypothesis that we should have rejected. In decision-making, there’s always a non-zero probability of making one error or the other. That means we’re confronted with asking the question: Which error is less costly? Let’s apply this concept to a couple of issues.

The stated reason for going to war with Iraq was that our intelligence agencies surmised Saddam Hussein had, or was near having, nuclear, biological and chemical weapons of mass destruction. Intelligence is never perfect. During World War II, our intelligence agencies thought that Germany was close to having an atomic bomb. That intelligence was later found to be flawed, but it played an important role in the conduct of the war.

Since intelligence is always less than perfect, we’re forced to decide which error is less costly. Leading up to our war with Iraq, the potential errors confronting us were these: Saddam Hussein had weapons of mass destruction and we incorrectly assumed he didn’t, or he didn’t have weapons of mass destruction and we incorrectly assumed he did. Both errors are costly, but which is more costly? It’s my guess that it would have been more costly for us to make the first error: Saddam Hussein had weapons of mass destruction and we incorrectly assumed he didn’t.

Another example of type I and type II errors hits closer to home. Food and Drug Administration (FDA) officials, in their drug-approval process, can essentially make two errors. They can approve a drug that has unanticipated dangerous side effects (type II), or they can disapprove, or hold up approval of, a drug that’s perfectly safe and effective (type I). In other words, they can err on the side of under-caution or err on the side of over-caution. Which error do FDA officials have the greater incentive to make?

If an FDA official errs by approving a drug that has unanticipated, dangerous side effects, he risks congressional hearings, disgrace and termination. Erring on the side of under-caution produces visible, sick victims who are represented by counsel and whose plight is hyped by the media.

Erring on the side of over-caution is another matter. A classic example was beta-blockers, which an American Heart Association study said would “lengthen the lives of people at risk of sudden death due to irregular heartbeats.” The beta-blockers in question were available in Europe in 1967, yet the FDA didn’t approve them for use in the U.S. until 1976. In 1979, Dr. William Wardell, a professor of pharmacology, toxicology and medicine at the University of Rochester, estimated that a single beta-blocker, alprenolol, which had already been sold for three years in Europe but not approved for use in the U.S., could have saved more than 10,000 lives a year. The type I error, erring on the side of over-caution, has little or no cost to FDA officials. Grieving survivors of those 10,000 people who unnecessarily died each year don’t know why their loved ones died, and they surely don’t connect the deaths to FDA over-caution. For FDA officials, these are the best kind of victims – invisible ones. When an FDA official holds a press conference to announce the agency’s approval of a new life-saving drug, I’d like to see just one reporter ask: How many lives would have been saved had the FDA not delayed the drug’s approval?

The bottom line is that we humans are not perfect. We will make errors. Rationality requires that we recognize and weigh the cost of one error against the other.
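To make that weighing concrete, here is a minimal sketch in Python of the comparison described above. Every probability and cost figure in it is a hypothetical placeholder, not a number drawn from the column; the point is only that a rational decision rule compares the expected cost of each error rather than treating all errors alike.

# Hypothetical illustration of weighing type I vs. type II error costs.
# All numbers are invented for demonstration purposes only.

def expected_cost(probability: float, cost: float) -> float:
    """Expected cost of an error: chance of committing it times the harm it causes."""
    return probability * cost

# Type I error: reject a true hypothesis (e.g., withhold a safe, effective drug).
p_type1, cost_type1 = 0.10, 10_000  # hypothetical probability and harm
# Type II error: accept a false hypothesis (e.g., approve a drug with dangerous side effects).
p_type2, cost_type2 = 0.05, 4_000   # hypothetical probability and harm

e1 = expected_cost(p_type1, cost_type1)
e2 = expected_cost(p_type2, cost_type2)

print(f"Expected cost of type I error:  {e1:.0f}")
print(f"Expected cost of type II error: {e2:.0f}")
print("Less costly error to risk:", "type II" if e2 < e1 else "type I")

On this framing, the FDA incentive problem described above is that the official’s private costs differ from the public’s: the type II error carries a large personal cost, while the type I error carries almost none, so the official’s weighing diverges from the one a fully accountable decision-maker would make.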