The Two Kinds Of Belief

“I don’t want to give them up,” my patient told me.

“Why not?” I asked.

“I’ve been reading some articles on the Internet that say they might cure me.”

Tragically, he was referring to vitamin supplements, which he’d somehow come to believe would cure him of Stage 4 metastatic colon cancer. I’d suggested he stop taking them because they seemed, by his own report, to be making him nauseated.

In the kindest tone I could muster, I told him I knew of no reliable studies that showed vitamins have any effect on colon cancer.  And though I certainly didn’t object to people taking them in general, if they were indeed the cause of his nausea, they were compromising the quality of however much life he had left while almost certainly providing him no real benefit.  I urged him to experiment by stopping them to see if his nausea receded.  If it didn’t after two weeks or so, I argued, he could always start them back up again.  In fact, I said, even if his nausea did cease once he stopped them, he could start them back up again to see if it came back in order to produce more definitive evidence that the vitamins were in fact his nausea’s cause.

He paused, and after a few moments reluctantly agreed.

WHAT IS BELIEF?

Simply, a belief defines an idea or principle that we judge to be true. When we stop to think about it, functionally this is no small thing: lives are routinely sacrificed and saved based simply on what people believe. Yet I routinely encounter people who believe things that are not just unproven but have been definitively shown to be false. In fact, I so commonly hear people profess complete certainty in the truth of ideas with insufficient evidence to support them that the alarm it used to trigger in me no longer goes off. I’ll challenge a false belief when, in my judgment, it poses a risk to a patient’s life or limb, but I let far more unjustified beliefs pass me by than I stop to confront. If I didn’t, I wouldn’t have time to talk about anything else.

What exactly is going on here?  Why are we all (myself included) so apparently predisposed to believe false propositions?

The answer lies in neuropsychology’s growing recognition of just how irrational our rational thinking can be, according to an article in Mother Jones by Chris Mooney.  We now know that our intellectual value judgments—that is, the degree to which we believe or disbelieve an idea—are powerfully influenced by our brains’ proclivity for attachment.  Our brains are attachment machines, attaching not just to people and places, but to ideas.  And not just in a coldly rational manner.  Our brains become intimately emotionally entangled with ideas we come to believe are true (however we came to that conclusion) and emotionally allergic to ideas we believe to be false.  This emotional dimension to our rational judgment explains a gamut of measurable biases that show just how unlike computers our minds are:

  1. Confirmation bias, which causes us to pay more attention and assign greater credence to ideas that support our current beliefs.  That is, we cherry pick the evidence that supports a contention we already believe and ignore evidence that argues against it.
  2. Disconfirmation bias, which causes us to expend disproportionate energy trying to disprove ideas that contradict our current beliefs.

Accuracy of belief isn’t our only cognitive goal.  Our other goal is to validate our pre-existing beliefs, beliefs that we’ve been building block by block into a cohesive whole our entire lives.  In the fight to accomplish the latter, confirmation bias and disconfirmation bias represent two of the most powerful weapons at our disposal, but simultaneously compromise our ability to judge ideas on their merits and the evidence for or against them.

EVIDENCE VS. EMOTION

Which isn’t to say we can’t become aware of our cognitive biases and guard against them—just that it’s hard work.  But if we really do want to believe only what’s actually true, it’s necessary work.  In fact, I would argue that if we want to minimize the impact of confirmation and disconfirmation bias, we need to reason more like infants than adults.

Though many people think belief can occur only in self-aware species possessing higher intelligence, I would argue that both infants and animals also believe things, the only difference being they’re not aware they believe them. That is, they do indeed judge certain ideas “true”—if not with self-aware minds, with minds that act based on the truth of them nonetheless. Around 8 to 12 months of age, infants learn that objects don’t cease to exist when placed behind a curtain, a belief called object permanence, which scientists infer from the surprise infants of this age exhibit when the curtain is lifted and the object has been removed. Animals will run from predators because they know—that is, believe—they will be eaten if they don’t. In this sense, even protozoa can be said to believe things (e.g., they will move toward energy sources rather than away because they know, or “believe,” engulfing such sources will continue their existence).

Infants and animals, however, are free of the emotional biases that color the reasoning of adults because they haven’t yet developed (or won’t, in the case of animals) the meta-cognitive abilities of adults, i.e., the ability to look back on their conclusions and form opinions about them. Infants and animals are therefore forced into drawing conclusions I consider compulsory beliefs—“compulsory” because such beliefs are based on principles of reason and evidence that neither infants nor animals are actually free to disbelieve.

This leads to the rather ironic conclusion that infants and animals are actually better at reasoning from evidence than adults. Not that adults are, by any means, able to avoid forming compulsory beliefs when incontrovertible evidence presents itself (e.g., if a rock is dropped, it will fall), but adults are so mired in their own meta-cognitions that few facts absorbed by their minds can escape being attached to a legion of biases, often creating what I consider rationalized beliefs—“rationalized” because adult judgments about whether an idea is true are so often powerfully influenced by what they want to be true. This is why, for example, creationists continue to disbelieve in evolution despite overwhelming evidence in support of it and activist actors and actresses with autistic children continue to believe that immunizations cause autism despite overwhelming evidence against it.

But if we look down upon people who seem blind to evidence that we ourselves find compelling, imagining ourselves to be paragons of reason and immune to believing erroneous conclusions as a result of the influence of our own pre-existing beliefs, more likely than not we’re only deceiving ourselves about the strength of our objectivity.  Certainly, some of us are better at managing our biases than others, but all of us have biases with which we must contend.

What then can be done to mitigate their impact?  First, we have to be honest with ourselves in recognizing just how biased we are.  If we only suspect that what we want to be true is having an influence on what we believe is true, we’re coming late to the party.  Second, we have to identify the specific biases we’ve accumulated with merciless precision.  And third, we have to practice noticing how (not when) those specific biases are exerting influence over the judgments we make about new facts.  If we fail to practice these three steps, we’re doomed to reason, as Jonathan Haidt argues, often more like lawyers than scientists—that is, backward from our predetermined conclusions rather than forward from evidence.

Some evidence suggests we’re less apt to become automatically dismissive of new ideas that contradict our current beliefs if those ideas are presented in a non-worldview-threatening manner or by someone who we perceive thinks as we do.  Knowing, for example, that my patient was more predisposed to consider ideas if they came from me, his doctor (whom he trusted had his best interests at heart), I felt obligated to wield that power for the good, to challenge any ideas that had the potential to cause him more harm than good.  (Though an argument could be made that I shouldn’t have challenged his misguided belief in the power of vitamins to treat colon cancer, when he stopped taking them, his nausea did indeed resolve.)

Despite my reluctance to challenge beliefs that people hold more strongly than evidence justifies, the inconvenient truth is that what one of us believes has immense power to affect us all (think of the incalculable harm the smallest fraction of us have caused because they believe if they die in the act of killing infidels, they’ll be surrounded by virgins in the afterlife).  As a society, therefore, we have critically important reasons to reject bad (untrue) ideas and promulgate good (true) ones.  When we speak out, however, we must realize that reason alone will rarely if ever be sufficient to correct misconceptions.  If we truly care to promote belief in what’s true, we need to first find a way to circumvent the emotional biases in ourselves that prevent us from recognizing the truth when we see it.

Next Week: Inverting Education


  • You were very rightly concerned about maximizing the quality of your patient’s remaining life. I do wonder if this would have been better served by eliminating his nausea or by leaving intact his (false) hope that vitamins might cure him.

    Joe: It’s a good question. Ultimately unknowable, so I used my best judgment.

    Alex

  • A great post! What, I wonder, do we do when we know that we hold an untrue belief, but that belief guides us in good ways? For instance, I know on some level, rationally, that the world is not a fair place, but I have a deep belief that it is, or at least that it should or could be. This motivates me to do social justice work, trying to reach the goal of making the world better, more fair, and more just. Some days I really struggle to maintain my faith in this pursuit, in much the same way I imagine clergy struggle with their religious faith from time to time.

  • Very interesting, Alex. When my dad was dying of a brain tumor many years ago, my mother actually spoke to a man who believed in laetrile—that apricot pit cure—which was sheer hogwash. Our family doc persuaded her it was worthless. Because she felt so helpless, though—dad was 58—she found out about “juicing” and “live enzymes” which were supposed to keep you alive. None of this was harmful to dad and he enjoyed slurping the juices from his bed, but as for me, I just wished him a speedy death. He would say to me when I came on my daily visits, “How much longer is this gonna last?”

  • I happen to be a peer counselor and happen to hold up the credence of the bias of the afterlife because I’m also an orthodox Muslim. And I plan to use the example Alex uses in his article about his theory versus the medical model.

    Please follow up on these concepts; it seems interesting…

  • In the pet animal world you see this all the time and I too have stopped giving a forceful opinion against most “beliefs.” I keep in mind that even in animals the placebo effect seems to work some, probably in conjunction with my favorite medicine TOT, or tincture of time. Coupled with the “fact” that if you cause no harm, a basic medical rule, you get better 80% of the time anyhow. It makes medicine somewhat confusing at times.

    At the same time I think I see the belief based on what you paid which I compare to the taste of wine. The $40 bottle almost always tastes better than the $5 bottle when you are told what they cost. But in a blind test they don’t. I find the whole subject interesting and amusing.

  • And as a scientist I would like to point out that scientists can be the worst of all when it comes to holding beliefs and using confirmation and disconfirmation bias. We pride ourselves in our rationality and the strength of the scientific method, but we rarely accept how common it is in a lab to reject the conclusion of an experiment because it did not confirm our initial hypothesis!!

    Maria: So true.

    Alex

  • Thank you for this article. It comes at a time when I feel particularly abused by some friends about my fear of getting my child vaccinated. I haven’t said no!! I just drag my feet to ensure she’s fit enough to withstand the side effects. I’ve experienced two friends whose child developed seizures as a result of vaccinations here in Jamaica and it makes me very nervous. A little information is almost just as dangerous as none at all. I’d also love to see you do a blog on the stress of raising a child in this information age, where EVERYBODY has a study that “shows.” My mother never had as much information at her fingertips and though I see a psychologist to get “over” myself, I’m an okay person. I feel like if I don’t follow the advice of every new warning I’m a horrible parent. Maybe you can write about that some time.

  • Imagine a political campaign without the usual confirmation/disconfirmation bias determining the rhetoric! What a difference this perspective would make if our three major political candidates for Canadian Prime Minister only made statements that had a basis in evidence and not emotion—a very different election campaign.

  • Wow! What a great post. I have wrestled with this notion for a long time.

    This framework can be applied to so many different schools of thought.

    There is a lesson in here for marketers too.

    Thanks for such great insight!

  • How would you differentiate faith and belief? Faith is also most of the time not really based on logic or reason…isn’t it in a sense belief?

    CB: The word “faith” connotes blind faith, or faith in something without evidence, in Western tradition. Belief, it seems to me, can be used to define what we think is true with or without evidence. When belief is present because of evidence, I call that understanding.

    Alex

  • I’m convinced animals have some kind of belief mechanism going on. We once built a second staircase onto a deck at our house. Our dog went freely up and down the first staircase for years with no problem, but had to be coaxed to go on the new staircase. Each step he would “test” with his paw first, then crouch as he went, as though he didn’t quite believe the stairs were there. It took about a week of this kind of behavior before he believed his experience that it was safe to go up and down those stairs.

    Njeri, having seen two cases of bad vaccination experiences near you has understandably made you nervous about vaccinating your own kids. Who wants to risk harming their kids? But take a look around you at the many healthy, successfully-vaccinated children, who will not suffer the diseases they’ve been vaccinated for. I’m sure there are many. Hold that thought and get your kids vaccinated!

  • “We need to reason more like infants than adults.”

    Brilliantly put! This is precisely the conclusion I have come to: As they “mature,” most people abandon the natural human scientific approach to finding out what is true and embrace some cultural construct that eases their emotional or social discomfort instead.

    Why does this happen, do you think? Seems to me like a system that has gone haywire—something that worked at one time and in one context (maybe to reinforce tribal bonds or allegiance to a leader?) but now causes far more harm than good.

    For me, abandoning scientific thinking also seems linked with our increasing immersion in a world of abstractions rather than one of what is concrete. After all, it’s easier to continue to believe something no one can disprove by running experiments with tangible objects. Maybe that’s why what began as a useful tool for manipulating information so as to solve problems (abstraction) seems to have supplanted hard evidence in today’s world?

  • Thank you, Alex, for this post. It really helps explain why we cannot move beyond our many beliefs—regarding our personal and political lives. Rarely do we open ourselves up to new ways of looking at issues as they don’t “fit” into our constructs. And arguing only seems to cement positions further… and makes compromise and understanding more difficult. Appreciate your thoughts weekly (not weakly!)

  • A great post. As other commentators have noted, confirmation/disconfirmation biases are often obvious in the political arena. In fact, I’m often struck by how frequently I notice them in people who disagree with me.

    George: 😉

    Alex

  • This article really hit home for me. Recently my sister was diagnosed with colon cancer, had the right side of her colon removed (completely occluded by tumor), her colon re-sectioned, and was given intensive chemo treatments.

    She is a self-taught nutritionist who believes in megadoses of vitamins, distrusts doctors, and is convinced of the pharm-mafia conspiracy to keep people ill. She spent 20 years counseling people in California (without certification from an accredited program and no degree or formal education) on mental and physical health matters, replacing conventional treatment with vitamin therapy. She considers herself on the same level as a doctor/psychologist; no, actually a level above because we are all charlatans. She also lectured for various vitamin retail establishments, with some accounts lasting 10-15 years. I respect her efforts and her convictions, but also have concerns for the lack of parameters and regulations that provide a safety measure for her and her clients/patients.

    My background includes an MA in psychology (community counseling) and two undergraduate degrees: a BS in psychology (physiological emphasis/biology minor) and a BS in sociology. My passion lies in research and I have been recognized nationally with honors and served as project manager for a major research project funded through our State and a national agency (SAMHSA).

    Needless to say, we operate from two completely different mindsets and theoretical perspectives, yet we are sisters. She is 69 and I am 55. I do not believe in megadose vitamin therapy as there is no scientific evidence to support this approach; in fact there is contradictory information regarding the efficacy and safety of this type of treatment.

    During her chemo she was angry, rude and demanding with staff, tried to control the situation by insisting on reading her chart, knowing her blood counts, contradicting the doctor, etc. Needless to say, she did not follow orders and took megadoses of many supplements including antioxidants, which she would not admit to the oncology team, leaving them in the dark. When she asked me my opinion of her decision to supplement her treatment program, well, I was obviously reluctant to offer any input, but started to gather peer-reviewed articles to support my stance if I chose to inject an answer.

    I had been active in her care from the first diagnosis and emergency surgery, learning about the port, the infu-system pump so I could disconnect it without her having to return to the chemo clinic, shopped for her, sat up with her at night, never violated her right to autonomy, self-determination, or interfered in the doctor/patient relationship. I was completely supportive of her decisions, including taking the anti-oxidants and numerous other supplements. It was my intention to continue to care for her because frankly no one else in the family would. That includes three other sisters and her daughter who had not talked to her in 20+ years.

    I made arrangements for her to see a counselor. Someone for whom I have great respect. I never assumed that role. I remained in the role of the loving, supportive sister. She never made an appointment.

    When she asked me to tell her my position, I attempted to avoid what I knew would be a confrontation. Based on our differences and past behaviors, it would be at the very least a risky conversation. I was extremely concerned about the impact of her actions on her treatment and recovery, and was hoping she would accept my effort and synthesize the information, making a well-informed decision. So, I stepped out of my comfort zone and assumed the risk, being ready to accept the consequences. She is extremely rigid in her thinking, and with no peer-reviewed feedback or supervision over the years, I knew deep down that an open, scholarly discussion, exchange of ideas, acceptance of my effort as loving and well-intended, and consideration of any opposing method would probably not be forthcoming. After some hesitance and with complete verbal honesty and disclosure of my fears as to her reactions, I relented.

    As I stated earlier, I came prepared. In my attempt to explain the offsetting effects of antioxidants on the oxidative treatments involved in chemo and my opinion of the dangers of megadose treatment, she started getting belligerent. I immediately read the situation and backed off saying, of course, this was simply my opinion and that I totally supported her right to choose her course of treatment and that I loved her; my opinion didn’t matter, it was just that, my opinion. She called me a “right fighter” and said several other vicious and hurtful things, belittled my education and research efforts, and quit talking to me.

    She has not made the progress expected by the doctor and is now facing more surgeries and the prognosis is grim. It has been 7 months now since we have talked. Our mother (87) lives with me and this is taking a toll on her, as well as the rest of the family. She will not visit Mom because she has to see me. I have even offered to drop Mom off at her house, or go somewhere when she comes to visit. It makes no difference.

    Our belief systems and rationalizations have an impact on every aspect of our lives; they create our worldview, define our personality, and serve as navigator in the course of life. They create a safety net, a survival mechanism, a way of making sense of things that don’t make sense. Unfortunately as you have so eloquently and correctly brought to light, we convince ourselves that something is true even when it isn’t. This can and does create for an individual the illusions that allow them to maintain a psychological homeostasis. The illusion serves to protect the fragile psyche. Something becomes true because we desperately want to believe that it is true, we need to believe that it is true, because we cannot handle the alternative. The behavior that is spawned by the illusion can be adaptive or maladaptive. And to further complicate matters, the behavior is influenced by culture, regulated by the mores of the society, the agents of socialization, and the life experience of the individual. Some behaviors are self-destructive, yet as you and I both experienced, the individual will not give up the belief in spite of the consequence.

    However, the other side of the issue is when the illusion is shattered, the belief challenged and the truth revealed, crisis can result. After spending five years as a counselor at a crisis center, I have a healthy respect for the mind’s ability to mask the truth in order to perpetuate a comfortable or tolerable existence. It allows an individual to feel as if they are in control of themselves, their life, their environment. When the illusion is intact, there is a safety zone so to speak. When we remove the illusion, we remove the protection. We must, as you did, weigh the consequence of challenging a belief no matter how irrational. If it is creating harm or the type of dysfunction that disrupts or jeopardizes one’s life, then we must try. And as you demonstrated, we must honor the person’s ability to maintain dignity in their discovery. In this way there is no judgment, no shame, humiliation, or embarrassment.

    I know that in order for my sister to accept her treatment, trust her doctors, or consider the efficacy of any other method, she would have to challenge the truths she has based her life on. In spite of her current condition and the possible outcome, she is unwilling, and quite possibly unable to accept any adjustment in her personal philosophy.

    I am saddened by this fact, but it is after all her truths that dictate the direction in which her ship sails. And no matter how much I love her, or what I wish for her, I cannot make the changes for her.

    My philosophy is, “we cannot cause the winds to blow, but we can adjust our sails.” It is resilience, not rigidity, that will, in my opinion, provide the best survival response. An organism or species must adapt to survive. Sometimes research looks at a problem from a backward perspective. If something kills, then what is it that keeps the organism alive? People who become suicidal are acutely aware of the truth of their circumstances and are stripped of all ability to convince themselves otherwise. The suicidal mind gives us the opportunity to look at the results of no belief, no illusion, as well as the result of too much illusion without adaptability.

    In spite of my education, all of the tools at my disposal, and the love for my sister, I cannot help her. Thank you for helping me to accept this by writing about your experience. It does not change the situation, it does however give me some peace.

    Jodee: What an articulate and sad experience. My condolences on the loss of your relationship with your sister. Even if she never recognizes the selflessness and compassion with which you supported her, it comes right through here in your writing.

    Alex

  • Having lived in Mexico, in a small Mexican community in an area that has few foreigners, I have learned to be somewhat skeptical of social science explanations of social phenomena as they are offered in the United States and to some extent the rest of the “Western” world. Living here, one becomes aware of just how culture-bound social science approaches, including neuroscience, are. These approaches can be useful, but they are far from complete. Before I go further, I would like to make clear that I am not an advocate of alternative medicine, etc. Rather, I would like to advocate more humility on the part of neuroscientists and social scientists whose article in Mother Jones you cite. The good thing about this research is that it challenges the notion that we humans will change our minds in a “rational” way when confronted with rational argument. The not-so-good thing is that it assumes that social science tools can discover and measure the factors that cause humans to change or not change their minds.

    This is a terribly complicated subject. You state that “a belief defines an idea or principle we judge to be true. When we think about it, functionally this is no small thing…”

    You then move on to a discussion of Chris Mooney’s excellent if limited article in Mother Jones. In this part of your discussion, you don’t mention the social pressure he describes that causes us to hold certain ideas, though you hint at it.

    Essentially (I’m boiling this down too much) you come down to confirmation and disconfirmation bias and to emotional biases coloring reasoning.

    I think this whole issue is much more complicated. Beliefs have to do with finding meaning in our lives. If new beliefs challenge too much, we find ourselves in an existential crisis, which is something most of us would prefer to avoid and the depressed among us try desperately to escape. Many of us do move slowly to new points of view, and with good reason. SLOWLY is the key. Thus, at least some Tea Partiers have worked very hard to construct a world in which they feel safe and secure. This world does include a certain amount of prejudice, a certain amount of ignorance. I think of my former mother-in-law who came from a poor family in Canada, whose father was an alcoholic and whose husband worked very hard to provide her with stability. The stability was very fragile indeed. Anything that threatened it was verboten, including her son seeing the movie Klute in the 1960s. She would be a classic Tea Party person.

    Another, different example: poor people in our colonia who do things the “old way” because at least THAT way they know they can survive. Ironically, more of them are convinced of climate change than are people in the US. Here we SEE it, and it hurts.

    People in the US probably would assume our neighbors are superstitious Catholics. And yet these people in our colonia have a certain contempt for the Church: they are not bound to various church practices though they are loath to give them up at special holidays. They have SEEN the negative consequences of the Pope’s hostility to birth control.

    You are a Buddhist. I am drawn to a lot of the outlook of at least some Buddhists. Currently I am reading Thoughts without a Thinker by Mark Epstein. A long time ago I came to the point where I didn’t believe in being “good” so as not to go to hell. But what if one still believed that? What if one’s minister did, and one’s minister was the same minister one’s parents were married by? And what if he was a Tea Party type?

    In Thoughts without a Thinker a traditional American (whatever he/she is) is confronted with terribly frightening ideas. How threatening is it to people who believe at the core of their being that they must seek security, certainty, to hear that everything is already broken? That we are empty? What of Buddhist challenges to western child-rearing practices such as the idea that one should construct a self to please one’s parents who want achievement western style? Just a few examples.

    So to assert in a very traditional Christian family in the heart of Red America (if you will) that climate change is real is close to impossible. Not only are we dealing with confirmatory and non-confirmatory biases, we are challenging parents and community, reality, God, the fragile hold on reality we have, the doubts we try to bury in the very core of our being.

    And in climate change we are dealing with something that is truly cataclysmic.

    Finally, I would like to say that this social science approach is itself filled with cultural bias. We think it is science, but it is culture. It is culture which speaks to studies rather than the Bible as the way to truth. But who picks the questions? Who decides that if 60% of a population sees things in a certain way, they have some grasp on the truth? And when did we decide that social science, neuroscience, which makes it a rule to strip away poetry and nuance, had, because of its bare bones, some grip on truth?

    I do not think we should be using social science to demonstrate to people that climate change is real. We should be using a variety of approaches: we should be saying to some, your crops are lagging because of climate change; you see toucans flying across the valley because of climate change. We should also be learning cultural contexts so we can say, using them, that we need to make changes.

  • “When we speak out, however, we must realize that reason alone will rarely if ever be sufficient to correct misconceptions.”

    This point of view deserves a bit more elucidation from you than just being a mere disclaimer. It’s very difficult to pinpoint misconceptions, and beliefs themselves have the capacity of creating conceptions; and who knows, since we believe, we are what we believe ourselves to be, so having the capacity to believe is much more important. If it were simple to form conceptions from evidence, well, I fear anything in this world is conceptual at all. I may sound a bit paranoid here. However the point I want to make is that conceptions and misconceptions are not as simple a matter as you have presented. Please comment.

    Siddarth: The whole point of the post was to explain and justify that quotation. It sounds as if, for you, I didn’t succeed.

    Alex

  • Great post, as always! I appreciate your desire to blend the Western “scientific” approach with the organic experience of living-feeling-human-beings.

    It is unfortunate that, traditionally, “science” begins from a point of “make wrong” to prove the existence/non-existence of any and everything. The habit of intellectualizing life is, perhaps, the greatest detriment to experiencing it!

    Humans are simultaneously mental-emotional-physical-spiritual beings. (I use the term “spiritual” to describe the “unknown” quality that religion & science battle to own and define.) To insist that we can define Truth at all is ludicrous since, for example, there are numerous people who have cured themselves of terminal diseases against all odds and scientific facts to the contrary.

    To quote you: “…the inconvenient truth is that what one of us believes has immense power to affect us all (think of the incalculable harm the smallest fraction of us have caused because they believe if they die in the act of killing infidels, they’ll be surrounded by virgins in the afterlife)…”

    I point out that this is the same inconvenient truth as when the missionaries spread Christianity along with disease and manifest destiny, which destroyed entire cultures and ecosystems and degenerated the world into a state in which people feel the need to continue fighting infidels. The only difference is which infidel belongs to the dominant culture that dictates its society’s view of “right” and “wrong.”

    “… If we truly care to promote belief in what’s true, we need to first find a way to circumvent the emotional biases in ourselves that prevent us from recognizing the truth when we see it.”

    It continues to be True in my experience that when one begins from the “make-wrong” approach of insisting that what appears as truth to a group of people (and to individuals)—even backed by “scientific proof”—IS actually THE ONLY TRUTH, one cuts off POSSIBILITY for enhancement of truth for her/himself and the collective consciousness. Without POSSIBILITY for what is beyond our ability to know from our limitations (social, economic, cultural, religious, etc.), there is no truth—only what people insist, agree, and wish to impose upon others.

    I believe that adopting a mindset of “possibility” is the first step to “circumvent[ing] the emotional biases in ourselves that prevent us from recognizing the truth when we see it.”


  • Maybe this will provide some insight into our confirmation biases: https://edge.org/conversation/the-argumentative-theory

    I read the extremely interesting post on Edge. Two challenges (not tested): one is that individuals can reason by themselves. The second is that groups can impede as well as help. Both of these challenges are based on the idea that not all situations call for argumentation. For instance, one can have simple curiosity, or have no opinion yet but have certain interests at heart that direct one’s use of reason without distorting it. Groups obviously sometimes don’t do ANYTHING to enhance reasoning. In some groups, you can’t belong if you don’t go along—go along to get along. Similarly (sort of), certain group leaders can overpower any kind of reason, as should be obvious to anyone who knows any history.

    The need to identify with groups can overwhelm this argumentative theory, I think.

    Argumentative theory seems to hit the target, though, in some, but not all, contexts.



  • Every Christian should read this post.

    This is an interesting article. My mother-in-law has stage 4 lung cancer, which has essentially spread to most areas of her body. She has been taking cannabis oil to help not only with pain, but also to possibly cure the cancer, or at least stop it from spreading further (I know this is a controversial topic). My mother-in-law is also taking pain medication (as she is staying at the hospital). So, needless to say, she is very doped up, confused, tired, etc. The doctors have told her the cannabis oil will not cure her cancer. She believes it will and remains very positive. I’ve always wondered if it is better to get her off the drugs so she is 100% mentally clear while still alive, or if it makes more sense to keep her on the drugs so she spends every day somewhat high and under the belief that she will beat the beast of cancer (even if that isn’t the truth). I know if it were me, I’d rather not live with the knowledge that death will be knocking at my door any day. How depressing the last days would be with that knowledge. I’d rather believe I could beat cancer and live happily again, even if I’m in denial. I’d go with the old saying: ignorance is bliss. And who knows, maybe the mind-over-matter theory would kick in.

    Deborah: Your question is a crucial one. I wish I knew how to answer it.

    Alex
