What happens when a personal belief meets an irrefutable fact? The belief wins every time–facts be damned.
Human nature is such that we believe what we want to believe, and we don’t believe anything we don’t want to believe regardless of the verifiable facts. That’s the entire premise of an ideology, which is a set of shared beliefs that any group uses to explain its existence and way of life.
I am as I believe
What you believe is central to your identity. So if someone presents you with a verifiable, irrefutable fact that negates a core principle of your belief system (who you believe yourself to be), you will simply say, “I don’t believe it,” and that’s that.
And if someone can distill that belief into a few words for you, so much the better. Try these on:
- America is the greatest nation on earth.
- The system is rigged.
- Democracy is the best form of government.
- Live fast, love hard, die young. (Oops, sorry. That is a country song, but a belief system nonetheless.)
- The white race is superior to all others.
- Black lives matter.
- All men are created equal.
- Might makes right.
- It’s every man for himself.
- We’re all in this together.
- Get a good education, work hard, and you’ll be successful.
- You can become anything in life you want.
- Winning is the only thing that counts.
- There is only one true God.
- God is dead.
You get the point.
A true believer, or a believer in truth?
A true believer accepts the party line no matter what. Every bit of information is cast in light of the underlying beliefs. All information contrary to the core belief must be explained and reexplained and reconfigured until it fits the belief.
On the other hand, a believer in truth will seek out information, verify it, consider that information in context of the situation in which it is presented and then arrive at a belief based on it. A believer in truth has no qualms about saying they used to believe one thing, and now they believe something different because they learned new facts.
I would rather be liberated by facts than enslaved by my beliefs. The most liberating phrase I know is, “I might have been wrong about that.” So, why is that not as easy as it sounds?
Psychologists Carol Tavris and Elliot Aronson explain, “The brain is designed with blind spots, optical and psychological, and one of its cleverest tricks is to confer on us the comforting delusion that we, personally, do not have any” (Mistakes Were Made (But Not by Me), 2007).
It’s not that we can’t see the truth, it’s that we won’t see the truth.
Simply put, we lie to ourselves and believe we are being truthful at the same time. Why? It makes us feel good about ourselves–that we are right.
The paradox of the lie
We want to believe the lie because we believe we will benefit from it, even though it’s a lie.
Because we define ourselves by our beliefs and values, we flock with those most like us. We confirm our beliefs to each other. Then, we tell each other how good we all are and that those not like us are not as good as we are. Us versus them.
Yeah, we would rather believe the lie because we like the lie better than the truth.
The apostle Paul had it right when he warned Timothy, his younger protégé, to be diligent about teaching “sound doctrine.” He wrote that the time would come when people “will gather around them a great number of teachers to say what their itching ears want to hear,” to suit their own desires (2 Timothy 4:3).
The case for faith
Where does this lead us?
For me, it leads to fact-checking and allowing the facts to shape my beliefs. I also must admit that some things are not explained by the empirical facts I know or have available to me. Therefore, I do accept some things on faith: a belief, grounded in personal experience, that some things are true even though I can’t explain exactly why.
And I will keep that faith, until I have facts that would compel me to believe otherwise. See how that works?
I do believe in the fundamental goodness of people, that we sincerely want to do the right thing as we believe it to be.
Partisanship is like handling fire. It can provide illumination or destruction depending on how it is managed. Right now, in America, it’s blazing out of control and threatening to consume our democracy.
But there is an escape
A fire needs three ingredients: fuel, oxygen, and heat. Eliminate any one of these, and the fire goes out. We control fires by regulating the balance of these three ingredients.
A partisan takes a side. Nothing inherently wrong with that. We need people to represent different points of view to help us balance our own thinking and to help us find solutions to difficult problems. That’s illuminating.
When out of control, the heat of passion ignites the fuel of content (words and positions) in an oxygen-rich environment of “us versus them,” “if you aren’t with us, you’re against us,” and “winning is all that matters.” That’s destructive.
A leader is a firefighter
Such a leader can manage the fire, but not if they are adding to the fuel or fanning the flames.
Unfortunately, our political discourse right now is fueled by the rhetoric of extremism couched in half-truths and outright lies, heated by the passion of contempt and hatred toward the opposition, in an environment where partisans are lined up shouting, “Burn, baby, burn,” at each other.
It’s time to isolate the political pyromaniacs and deprive them of the heat and oxygen they need to destroy everything in their path.
We need INPowering leaders who can cool the passions, moderate the message from inflammatory diatribes to reasoned discourse, and regulate the environment by creating breathing space for conversation and dialogue instead of screaming across partisan divides.
These partisan movements run on the fuel of “B.S.” We must hold all sides, even our own, accountable for the truth in context of their message. If you must lie and twist the facts to make your point, then you don’t have a point.
We must dig a fire line around our passions so they don’t race out of control, igniting what would otherwise not be in jeopardy.
We must create an environment where opposing points of view can be discussed with cool heads and compassionate hearts.
True leadership rises above partisanship. It illuminates.
Think about the way you think: the process of thinking. Having a brain no more guarantees you can think than having a mouth guarantees you can communicate.
I got tired of being misled by those who smiled at me while saying, “Trust me.”
So, as a matter of rational self-defense, I began studying the process of thinking rationally.
Conclusion: most people do not think rationally at all. They confuse rationalizing with being rational. And those two processes are polar opposites, even though they sound like the same thing.
A big difference that determines who is in control of your mind
Rationalizing: making up your mind first, then looking for information that supports what you want to believe (or others want you to believe without thinking about it).
Rational: gathering information, then processing it systematically to guide you to what you should believe about it. You are in control of your decisions.
What causes us to rationalize?
I’ve been studying the thinking process for over twenty-five years. Why? I felt that I had been misled by the biases and belief systems of others who merely adopted what they had been taught without any rational investigation into why they believed as they did.
No blame intended. They were well-intentioned and just wanted me to believe the right things, as they had been taught the right things by other well-intentioned people.
So I did the natural thing. I bought into their beliefs. It seemed like the rational thing to do.
Until . . . I discovered information counter to what I had been told to believe. And it troubled me.
I found out there is a term for that feeling–cognitive dissonance. The feeling you get when you realize you hold a belief that is not supported by the information you gathered, but you continue to hold that belief anyway.
I didn’t like that feeling. The quest began.
I had to come to an understanding of why I believed what I did.
As I embarked, I was astounded that so few were with me on that journey. When I told them what I was doing, some patronized me, “Good luck, and I hope you find yourself.” Others stared back blankly as if they did not understand the concept. And others outright ridiculed me as a liberal elite.
I did not know what a liberal elite was, and come to find out neither did my critics. The best I could determine was that a liberal elite is what they called anyone who contradicted their opinions.
Why bother with rational information to support what you believe when hurling derogatory insults is good enough?
There’s nothing new about that tactic. There’s even a term for it–epithet. The tactic places a derogatory label (a.k.a. name calling) on a person, an idea, a point of view, a way of doing something, or an institution, to make listeners biased against it.
The speaker is trying to get you to believe him by calling his opponents names that you would also find reprehensible. In other words, when you don’t have a rational argument to back up what you are saying, start calling your opponent names or belittling them or slurring them.
The labeling tactic pretty much works every time on people who are in rationalizing mode.
I’m assuming you do want to think for yourself. You can learn to listen to all sides of a position, educate yourself on a broad range of facts and perspectives, understand the context in which the facts exist, apply a critical thinking process, and come to an informed conclusion. Think rationally.
You’ll risk being called a liberal elite. But it’s worth it.
The line between conviction and obstinacy is thin. When we consider our convictions to be absolute truth, to which everyone else must yield, we have set ourselves up for a world of frustration and strife.
I think it’s a sad state when people are so firm in a belief that they cannot even consider the possibility that they might be misinformed or have distorted the facts.
My arrogance was surpassed only by my obnoxiousness.
I’ve learned that the hard way. I would argue about my opinions at the drop of a hat, and I’d drop the hat. I was right, and I had to convince everyone that my way was the right way.
I worked hard on being right about my opinions. So, if you disagreed, you must be wrong. After all, why would I intentionally be wrong about my beliefs?
One day it dawned on me why, when I joined the conversation group, I was soon the only one left standing there. I was literally running everyone off. Fortunately I’ve mellowed, and I have more friends now.
But isn’t that the catch with our beliefs? No one in their right mind would intentionally be wrong about their beliefs.
Then I realized that an opinion is just that: my subjective understanding of what facts mean to me.
Often we overlook our own inconsistencies and ignorance about simple facts.
I saw the caption on the back of a t-shirt recently that said something like, “Our national parks belong to the people, not the government. Get the government out of our national parks.”
I’m sure the man wearing that shirt believed passionately in its message. But here’s the problem he obviously did not see: without the government, there would be no national parks. National parks are established by an act of Congress. I’m sure that trying to convince him that having the government involved is a good thing for national parks was a conversation better left alone.
Conviction and facts do not always coincide.
Recently I saw on television a side-by-side comparison of Facebook posts about the same set of facts as posted on liberal-leaning versus conservative-leaning pages. You got it: each side represented the identical set of facts with a different interpretation depending on its leanings, a.k.a. convictions.
The truth is, we usually believe what we want to believe, and we cherry-pick the facts to back up those beliefs. That’s why discussing politics and religion with most people is futile. We also will rationalize facts that are contradictory to our beliefs rather than change our minds. Boy are we in trouble.
Let’s get sane by being willing to change our minds.
I still have my personal beliefs, but I realize they come from my personal experiences and understanding of information at the time. Sometimes new and more accurate information comes to light, and my personal beliefs are subject to change.
However, I am a fact checker. Before I jump on whatever bandwagon you are riding, I’ll challenge your facts and the way you handle your facts.
I recommend everyone get in the fact-checking habit. Several groups do so in the political and government sectors, such as FactCheck.org, PolitiFact.com, Snopes.com, OpenSecrets.org, and TruthOrFiction.com.
Our mind plays tricks on us, and our eyes and ears are unwitting co-conspirators. They continually get us in trouble. It’s a WYSINWII kind of thing (What You See Is Not What It Is).
Remember Benjamin Franklin’s advice, “Believe none of what you hear, and only half of what you see.”
Perception is the process of interpreting sensory stimuli. So, in a literal sense, touch, taste, and smell get in on the game, too. Our sensory organs take in information from our surroundings and feed it into the brain where it is interpreted into our reality. IIWISII (It Is What I Say It Is)–EOD (End Of Discussion).
Every one of us has experienced a situation in which we were absolutely certain we had our objective facts correct, only to find out that we had been fooled again. Drats.
Misperception is the root of all kinds of unintended conflicts.
And here’s the danger zone: our misperceptions (our misinterpretations) become our reality, and we act on them. Then, someone perceives our intentions and reacts to our actions. Then, we react to the reaction, and the vicious cycle takes on a life of its own.
What we have here is a failure to communicate: the other person obviously doesn’t understand. After all, how could I possibly be wrong about what I saw with my own eyes and heard with my own ears?
Been there? Done that? I have.
How, then, can we avoid getting fooled?
I have not found a failsafe workaround. But I have been able to come up with a few rules of thumb that help keep me balanced. Maybe they will work for you, too.
- Everyone is always right in their own mind, at any point in time. Think about it. Why would anyone intentionally be wrong? I accept that everyone has his or her own understanding of events, and it is as legitimate to them as mine is to me.
- Never tell anyone they are wrong. Allow their perceptions. Seek to understand the underlying information and how they worked through it to arrive at their understanding. Have a conversation, not an argument.
- I like myself more when I admit that I could be wrong about something without damaging my self-esteem. Always having to be right is too big a burden to bear. Being wrong does not mean I’m incompetent or inadequate–just human.
- Do a double-take. Most things are not quite as they initially seemed. After my initial knee-jerk reaction, which I cannot prevent because the fight-or-flight system hijacks me, I stop and reflect on what is happening. Reserve judgment about right-wrong, good-bad, and take another look at the situation. Focusing on different aspects of the information might result in a different interpretation of it.
- Change perspective. I shift my point of view to that of the others involved. I try to envision how I would see the situation through their eyes and experience. This also helps me to be more empathetic, and even compassionate, thus potentially altering my response.
I have never been more embarrassed than
when I was so certain of my opinion
and told everyone so.
No one likes to admit he or she is wrong because of the stigma attached to it. But most often, the right thing to do is fess up. The problem with that is, even though confession is good for the soul, it goes against our nature.
However, once we admit the error of our ways, we can get on with setting things straight–improving, learning, and growing.
There are two things working against making meaningful personal change: blind spots and self-justification.
We have a physical blind spot in our visual field–that place on the retina where there are no photoreceptors. Anytime the eye lens focuses an image on that spot where the optic nerve exits the back of the retina, objects disappear right before our eyes.
We also have intellectual blind spots–those places in our mindset that are simply unreceptive to any ideas or points of view that are contrary to the way we have learned to see the world and position ourselves in it. Our idea receptors are missing. Because we cannot see any other variations on our reality, we cannot fathom that our view of the world could use any adjustment.
By the way, our blind spots cleverly delude us into believing we don’t have any, although everyone else seems to.
Such blind spots can cause some vexing ethical problems. In their book, Blind Spots: Why We Fail to Do What’s Right and What to Do about It, Max Bazerman and Ann Tenbrunsel write, “Ethics interventions have failed and will continue to fail because they are predicated on a false assumption: that individuals recognize an ethical dilemma when it is presented to them.”
It does not occur to us that we could have been wrong, irresponsible, or corrupt. After all, we are good people with good intentions. Bazerman and Tenbrunsel contend that ethical judgments are based on factors outside of our awareness. In other words, they manifest from our intellectual and ethical blind spots. How could we have possibly been wrong?
Blind spot fixes
Step one: admit that blind spots exist. We can physically find the blind spot in our visual field. Finding blind spots in our intellectual and ethical fields is more difficult. Just the awareness that we are susceptible to them will put us on guard and open us up to those who try to help us see what we are missing.
Step two: aggressively seek other points of view so we can see issues from many perspectives. We can expand the field of view by expanding our sources for information.
Step three: accept the idea that everyone’s view of the world is real and legitimate to them. It is no more and no less real to them than ours is to us.
I have come to believe that everyone, in his or her own mind, at any point in time, is always exactly right.
If we end up with two ideas in mind that contradict each other, we are experiencing cognitive dissonance. This creates a psychological tension that throws us into disequilibrium. It must be resolved so we can find internal balance and feel good about ourselves again.
Psychologists Carol Tavris and Elliot Aronson explore self-justification in depth in Mistakes Were Made (But Not by Me). They explain the connection with blind spots: “(D)issonance theory is a theory of blind spots–of how and why people intentionally blind themselves so they can fail to notice vital events and information that might make them question their behaviors or their convictions” (p. 42).
The goal of self-justification is to tell our stories to ourselves so that we always come out OK. We are always the hero of our own story, never the villain.
The notion that we might be mistaken never enters our minds. If it does, it quickly evaporates. No change required.
Self-justification manifests in debates of all types: religion, politics, whose version of an incident is correct, personality conflicts, customer complaints, employee disputes, and on and on. Usually neither party is willing to accept blame, and all parties want to be vindicated.
I repeat, everyone, in his or her own mind, at any point in time, is always exactly right.
But what if one said, “You know, I might have been wrong about that”?
Step one: as with blind spots, admit that this happens and that we all are guilty of doing it.
Step two, according to Tavris and Aronson: find a few trusted naysayers who will help us avoid operating in a hall of mirrors, “in which all we see are distorted reflections of our own desires and convictions” (p. 66). Their job is to keep us honest with ourselves.
Step three: find new explanations that take into account honest appraisals of situations and lead us to more constructive solutions. If our need is to maintain self-esteem, wouldn’t our self-esteem be strengthened by knowing that through self-reflection we can come out wiser, more inclusive, more balanced, more aware, and more able to deal with the complexities of our world?
If the goal of arguing is to change minds, it’s an exercise in futility. I’ve never won an argument, and I’ve never lost one either. When people are arguing, they are not interested in changing their own minds. They are only interested in changing their opponent’s mind.
No one is listening.
Asking someone to change his or her mind is like asking them to redefine themselves.
Why? Our personal identity is tied up in our belief system. It defines us. Our belief systems come from our life experiences and how we explain them to ourselves so they make sense to us. Our reality comes from the stories we tell ourselves about what events, ideas, and experiences mean to us and why they matter.
Just listen to the way we speak to ourselves about ourselves:
I am an American.
I am a conservative (or liberal, progressive, moderate, etc).
I am a Christian, and more specifically a Baptist (substitute your faith and faction or denomination).
I am an atheist.
I am _______________. (Fill in the blank.)
Our quest for truth becomes our search for information to validate what we already believe about who we are. We are not interested in changing our minds.
How to think differently about ourselves
Dick Cavett, a former talk show host and columnist, said, “It’s a rare person who wants to hear what he doesn’t want to hear.”
To learn and grow means being willing to explore information that contradicts our prevailing beliefs about who we are. It also means running the risk of being labeled a heretic by others of our belief, and that’s scary.
The INPowered find strength in flexibility. Making judgments derived from critical investigation that helps us to INLighten the mind will make us strong in our beliefs without becoming so rigid and inflexible that we cannot evolve, learn, and grow.
After the heretic leads the way, others follow, and beliefs change.
Winning arguments is not as important as winning the respect and trust of those who hold different opinions and beliefs. You will be respected if you can say,