Sunday, November 06, 2016

How "belief systems" are born and developed

Looking around at a world full of conflicts and wars, of incompatible and irrational belief systems, it is worth examining the processes by which all belief systems are created.  How, living in the same world, did we come to hold such widely discrepant belief systems, totally incompatible with one another and each believed to be totally right?  How did individuals grow up with such peculiar beliefs about themselves and the people around them?

We seem to have the inborn trait of curiosity and speculation about how the world works.  We want to know what influences what, what controls what.  We want to predict and control the future.  Where this trait arises is open to speculation.  How we use it is fairly clear.  Humans make theories about causation.

Two factors are important.  The first is the post-hoc fallacy: the assumption that when thing B happens directly after thing A, thing A must have "caused" thing B.  It is a fallacy because it is not always and invariably true.  However, it is true a lot of the time, and it leads to our first discoveries of the laws of the universe.  Eating the fruit of a strange plant, followed by miserable illness, leads us not to eat that fruit again.  We don't know for sure that the plant was poisonous, but logical certainty is not as important as not taking the chance.

The key phrase here is "not sure".  We form theories of connection or causality.  We think "A may have caused B".  Eating the fruit MAY have caused our illness.  How do we know?  We try it out, or at least observe carefully.  We look for evidence that our theory is valid.  It is important to our survival that we try to understand how things work and make guesses (theories) about what might hurt us.  We have to accept probabilities, that is, relative proof rather than absolute proof.  We have to look at the data coming in and allow it to strengthen or weaken our theories.
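For readers who like to see the arithmetic, here is a small illustrative sketch in Python (all numbers invented purely for the example) of how a single piece of evidence can strengthen a theory without ever proving it.  Bayes' rule is just one formal version of the everyday weighing described above.

```python
# A toy "did the fruit make me sick?" update.  All numbers are made up
# for illustration only.
prior = 0.10             # initial guess: a 10% chance the strange fruit is poisonous
p_sick_if_poison = 0.80  # assumed chance of getting sick if it really is poisonous
p_sick_if_safe = 0.05    # assumed chance of getting sick that night anyway

# We ate the fruit and got sick.  How much should the theory strengthen?
p_sick = prior * p_sick_if_poison + (1 - prior) * p_sick_if_safe
posterior = (prior * p_sick_if_poison) / p_sick

print(f"Belief the fruit is poisonous, before: {prior:.2f}, after: {posterior:.2f}")
# Roughly 0.10 -> 0.64: much stronger, but still far from certain.
```

The particular numbers don't matter; the shape of the process does.  The evidence moves the belief, but it does not settle it.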

The second factor is what we psychologists call confirmation bias.  This bias means that when we think theory X may be true, we pay selective attention to evidence supporting X.  We do NOT look systematically for evidence disproving X, at least not until the birth of scientific thought.  And even scientists trained in collecting data don't think scientifically most of the time.

For instance, someone who believes they are "unlucky" will selectively attend to "evidence" of bad luck and selectively ignore evidence of good luck.  The "unlucky" person accumulates data over time that "proves" the theory about luck to be correct.  Someone who believes they are unlovable will collect rejections, and even invite them, believing rejection to be inevitable.  It is easy to see how religious and political beliefs are supported.

These two factors are sufficient to give rise to thousands, millions, of conflicting ideas and beliefs, many of which are so strongly held that people will kill to defend them.  Our beliefs tell us what to look for, what to believe, how to behave.  They define our civilizations, our religions, and our politics. They define which groups are "good", and which "bad".

In children the process is easier to observe than in adults, but adults function in pretty much the same way.  Suppose we are given a theory, such as: step on a crack and you'll have bad luck all day.  We then begin paying selective attention to cracks.  We try stepping on one or two and then observe the events that follow, which, by way of the post-hoc fallacy, we believe to be directly connected to the crack-stepping.  A number of things happen, as they always do on any given day.  However, because of confirmation bias, we notice particularly the events that "confirm" our theory about cracks.  We discount or minimize the events that do not confirm it.  For at least a few days, while we are paying attention, the theory seems to be more and more true.  We do accept negative evidence, but it takes a lot more of it to disprove the theory than it takes positive evidence to confirm it.

When events occur that have special emotional meaning to us, we try to find a theory that accounts for them.  We wonder what we did or observed that might have "caused" the event to happen.  We form a theory.  When we are young, our standards for a good theory are loose.  (Hopefully they get tighter as we mature.)  A small child once asked me if her mother had died because the child had "bad thoughts".  A child is not capable of seeing how weak the connection is between her thoughts and her mother's accidental death.  So all of our theories seem worth investigating, at least while we are young and not appropriately skeptical.

Many events can give rise to theories, but events with a lot of emotion attached are the primary stimuli.  Theory: if I don't take a raincoat to work, it will rain.  Event: I don't take a raincoat and it does rain; the theory is supported.  Event: I don't take a raincoat and it does not rain; that doesn't count.  So theories mostly find support and rarely find disproof.  They get stronger over the years as we collect more "supportive evidence" and continue to discount negative evidence.
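If it helps to see that asymmetry in action, here is a toy simulation in Python (purely hypothetical numbers, not a model of any real mind).  Confirming days nudge the belief up a lot; disconfirming days nudge it down only a little, the "doesn't count" effect described above.

```python
import random

def update_belief(belief, supported, up=0.10, down=0.02):
    """Nudge a belief (between 0 and 1) after one day's "evidence".

    Confirming evidence is weighted far more heavily than disconfirming
    evidence -- a cartoon of the asymmetry described in the text.
    """
    if supported:
        return min(1.0, belief + up * (1.0 - belief))
    return max(0.0, belief - down * belief)

random.seed(1)
belief = 0.5  # "If I don't take a raincoat, it will rain."
for day in range(365):
    rained = random.random() < 0.5      # the evidence itself is a coin flip
    belief = update_belief(belief, supported=rained)

print(f"Belief after a year of 50/50 evidence: {belief:.2f}")
# Typically ends up around 0.8 -- the theory "got stronger" on no real evidence.
```

Nothing about the weather changed; only the bookkeeping was biased.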

This pattern shapes our beliefs about ourselves as we grow older.  Something happens to get our attention and we form a theory of connection.  We accumulate support for that theory, but not disconfirmation.  Suppose, for example, that some event causes us to form a theory about ourselves: imagine getting a bad grade on a test in the first grade.  We might begin to form a theory, such as: "Maybe I'm stupid".  We then begin to look for evidence, but we pay most attention to the evidence that supports our belief in being stupid.  From then on we accumulate more evidence and become more convinced that we're "stupid".

Religions get formed in the same way.  In the dawn of time, a loving parent falls to his knees and prays to the heavens for the return to health of his child.  The child recovers.  The parent forms a theory: praying to the heavens results in blessings.  He tells his friends what happened.  They all begin collecting evidence that supports the theory and discounting the evidence against it.  When a parent prays for a child and the child dies anyway, the parent discounts the negative evidence by forming a new theory: one must pray in a specific way for it to work, and he must have gotten it wrong.  From then on the evidence is heavily weighted in favor of whatever theory comes next.

Some of the theories formed may be valid, others not so much.  But they continue anyway as if they were confirmed.  We still throw rice at weddings, even when we are not strongly in favor of immediate fertility.  One problem is that theories can never be absolutely proven or disproven.  There is always the possibility of getting more evidence.  We may find connections between events A and B that we didn't know about before.  So our world grows more and more full of divergent and supported (but not proven) beliefs.

We believe we are right.  We forget that "belief" is not proof.  We do not really question our beliefs unless something happens that forces us to reconsider.  That takes a lot of force.  For instance, many people believe the universe is "fair".  A cursory reading of the newspaper should be enough to cause doubts about that theory.  However, in order to keep the theory intact, people develop new "theories" as to why the universe appears unfair:  the people to whom bad things happen "must have deserved it" or "there must be some higher purpose we don't understand" or any number of theories designed to allow the old theory to continue in the absence of supportive evidence.

Overcoming our own confirmation bias requires conscious attention and respect for new data: a willingness to question our beliefs and an equal willingness to consider and evaluate contradictory data on its merits.  For instance, to overcome a belief that you are unlovable, you have to be willing to consider data that supports your being lovable.  By challenging beliefs, you become more aware of contradictory data, and vice versa.  Perhaps you can't entirely eliminate beliefs that have accumulated "support" over the years, but you can weaken them over time.  (This is a central tenet of CBT, cognitive behavioral therapy.)
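Carrying the earlier toy simulation one step further (again, just an illustrative sketch, not a clinical claim): if confirming and disconfirming evidence are given equal weight, an inflated belief slowly drifts back toward what the evidence actually supports, weakened rather than erased.

```python
import random

def update_belief(belief, supported, up=0.05, down=0.05):
    """Same toy update as before, but confirming and disconfirming
    evidence now carry equal weight."""
    if supported:
        return min(1.0, belief + up * (1.0 - belief))
    return max(0.0, belief - down * belief)

random.seed(2)
belief = 0.85  # start from the inflated belief the biased weighting produced
for day in range(365):
    belief = update_belief(belief, supported=random.random() < 0.5)

print(f"Belief after a year of even-handed weighting: {belief:.2f}")
# Typically drifts back toward 0.5 rather than staying stuck near 0.85.
```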

I always value comments.
