The Biggest Mistake Patients Make
The biggest mistake patients make isn’t what you think. It isn’t turning down tests or treatments their doctors recommend. Nor is it deciding not to take the medicines their doctors prescribe. It isn’t insisting on getting a test or beginning a treatment their doctors recommend against, either, and it isn’t failing to exercise, stay out of the sun (or use sunscreen), quit smoking, or lose weight. No, the biggest mistake patients make is thinking anecdotally rather than statistically.
We all tend to arrive at our beliefs about the frequency with which things occur not from statistical analysis but from the ease with which we can recall examples of their happening (see Daniel Kahneman’s book, Thinking, Fast and Slow). So if we’ve recently heard a story of an airplane crash in the news, we’ll believe the likelihood that the airplane in which we’re flying might crash to be greater than it actually is. Or if a friend tells us about a complication he suffered following surgery, we’ll believe the likelihood of that complication happening after our surgery to be greater than statistics suggest.
We all tend to believe stories more than facts. And when faced with the need to make a decision—to start a medication, to have surgery—far more often than not (and mostly without consciously realizing it) we rely on our emotional beliefs about the risks and benefits. And our emotional beliefs come mostly from our experience and the stories we tell ourselves about it. “My wife’s sister’s boyfriend took that pill and had a terrible reaction. There’s no way I’m going to take it!” one patient tells me. “Dr. X operated on a friend of mine and he’s been in pain ever since. No way I’m letting that guy touch me!” says another. “I’ve seen that drug advertised on television. What do you think about me trying that one instead?” a third asks.
We believe thinking this way leads us to make wise decisions, but it doesn’t. We hear about a friend or relative having a known complication as a result of a surgery (one we’re told by more than one doctor that we need ourselves) and decide as a result of hearing that story that we don’t want the surgery—even though the statistical likelihood of such a complication happening to us is less than one percent and has, in fact, never happened to any of our own surgeon’s patients. Or we read about the side effects of a drug our doctor recommends and decide we don’t want to take it even though studies show that the risk of those complications is far lower than the likelihood that it will treat our symptoms or even prolong our life.
Sometimes our intuition actually serves us well. Sometimes the recommendations doctors make are based on nothing more than their clinical judgment and a presumption that they know better than their patients what their patients should do. And while the former is unavoidable (much of what we do in medicine requires judgment because studies that provide clear-cut answers haven’t been done) the latter represents a mistake that often leads doctors to have greater faith in the value of their recommendations than is warranted. But just because we may disagree that our doctor knows what’s best for us, we shouldn’t automatically dismiss his advice if it runs counter to our inclinations, for doctors have a crucial advantage over the patients for whom they care: the ability to think dispassionately about the choices their patients must face.
I’m not advocating that you surrender your judgment to your doctor. I’m saying that when deciding upon the best course of action to take, you need to critique your own thought process mercilessly. Most of us make our decisions emotionally. And while bringing emotion into decision making isn’t wrong per se (how do we place value on something, after all, if not with our hearts?), our feelings can easily mislead us if not based on sound reasoning. And allowing our fears to be swayed by anecdotes rather than statistics is about as far away from sound reasoning as you can get.
To think statistically is to calculate the true likelihood that something bad—or good—will happen to us. And in far more cases than most would believe, we have information that allows us to do so accurately—and not only in the medical arena. We know, for example, that the likelihood of any one of us becoming the victim of a terrorist attack is about one in twenty million. But think: is your fear of terrorism proportional to that statistic or to the frequency with which you hear about terrorism almost weekly on the news? Another example: you should be far more afraid of driving a car than flying in an airplane. Not only are car accidents statistically more likely than airplane crashes, but also most of us drive far more often than we fly, thereby exposing ourselves to the risk of a car accident far more often than we expose ourselves to the risk of an airplane crash. But how often do you worry about getting in an accident when you take your car? I’m not suggesting you need to; in fact, I’m arguing the opposite: that because of our exposure to anecdotes, we often worry far more than we should about things whose statistical likelihood is actually small (and, conversely, not enough about things whose statistical likelihood is actually large). So the next time you contemplate refusing a recommended test or treatment, notice the source of your anxiety: is it from a calculated statistic that yields the true likelihood of harm—or from a story that stirs you up far out of proportion to the real risk?
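The driving-versus-flying comparison above rests on simple arithmetic: your overall risk depends not just on the danger of a single trip but on how many trips you take. A minimal sketch of that calculation, using made-up placeholder numbers purely for illustration (not real accident statistics):

```python
def cumulative_risk(per_trip_risk: float, trips: int) -> float:
    """Probability of at least one bad event over `trips` independent exposures."""
    return 1 - (1 - per_trip_risk) ** trips

# Hypothetical, illustrative numbers only: driving carries a larger per-trip
# risk AND happens far more often than flying.
drive = cumulative_risk(per_trip_risk=1e-6, trips=500)  # ~500 car trips a year
fly = cumulative_risk(per_trip_risk=1e-8, trips=4)      # ~4 flights a year

print(f"driving: {drive:.2e}, flying: {fly:.2e}")
```

Even with these invented figures, the point holds: frequent exposure to a modest risk accumulates into a far larger annual risk than rare exposure to a tiny one, which is the opposite of what our anecdote-driven fears tell us.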
Next Week: The True Cause Of Cruelty, Redux
Yes true, I think it’s inevitable that we bring our stories to the consulting room. But if we are aware of them then hopefully they can be an integrated part of our decision making and reactions.
I couldn’t agree with you more about the biggest mistake that patients make. I’ll add another example into the mix. I suffer from a chronic illness of unknown cause. I came down with a serious viral infection in 2001 and never recovered. I feel as if I have the flu all the time—not fatigue but flu. I don’t fit clearly into any diagnostic code. There’s one for Chronic Fatigue Syndrome but I don’t fit the CDC criteria for it. However, because the CDC’s definition does include a few of my symptoms and the definition comes closest to describing my illness, it’s the diagnostic code that my doctor uses (with my approval).
When someone has a diagnosis of CFS, I’ve found that people are always suggesting “cures.” They say, “I drank this type of water and recovered completely” or “I took this combination of supplements and I got better.” I used to be very susceptible to this anecdotal information. I tried “cure” after “cure” (some of them ridiculously expensive), but none of them worked.
My husband and I finally arrived at this conclusion about what’s going on here. Some people with a diagnosis of CFS do make a spontaneous recovery. For unknown reasons, the body “rights” itself. When this happens to people, they attribute their recovery to whatever treatment they happened to be trying at the time. Convinced that it accounts for their recovery, with the best of intentions, they try to “spread the word” to everyone else with the same diagnosis. This has been very harmful to many people.
Now, I think this way: “If this or that treatment were really a cure, wouldn’t the medical community be on to it? Certainly the major CFS associations and organizations that are devoted to finding a cause and a cure for this illness would be investigating it.” This is how I evaluate anecdotal information.
….but…but I read it on the internet, Dr.!! 😀
Anecdote and clinical experience are hard to separate. As a horse doctor (retired) I heard this all the time and ended up mostly just nodding my head and not wasting too much time arguing. If it did no harm (or not too much), I counted on the old 80 percent getting better anyhow as the body eventually fixed itself.
Good points to balance the patient’s decision process and the doctor patient communication.
Just an additional thought on your comment of “If this or that treatment were really a cure, wouldn’t the medical community be on to it? …” As many know, the western medical community can be very slow to widely embrace practices or treatments that are outside their formal training (which may have been 20+ years ago). Though to their credit, many physicians are now recommending meditation and other “relaxing” habits, noting that stress (our minds) actually does cause much of the dis-ease our bodies experience. It is heartening to see some shifting from “fix the symptom with surgery or medication” to “let’s enter a dialogue on what things may help alleviate the cause of some of our dis-ease.”
Thank you for being a beacon in western medicine!
I’d be thrilled if DOCTORS started thinking statistically, rather than anecdotally. It’s a lot to ask of an untrained group. 🙂
I’m fighting cancer for the second time. My doctors tell me not to think about the statistics of my survival. If I went by statistics alone, I would stop fighting this disease and surrender. I’m already suffering from serious clinical depression. By omitting the statistics my doctors are giving me hope. Maybe it’s false hope, but nonetheless it’s all I have now. It’s what is getting me through the day, through all the tests. Life shouldn’t be so black and white. From a patient’s point of view there’s nothing wrong with thinking anecdotally; doctors already know the facts and the statistics. I’ll choose on my own terms, not by a bunch of numbers.