Nate Anglin

How To Articulate Facts That Will Help You Avoid Biased Opinions

Opinions are riddled with emotion. They pull, prod, and probe you into believing a certain way.

Over time, opinions become siloed, biased, and funneled. They’re inflexible. Anything beyond what you believe to be true is labeled as dogshit—a conspiracy.

Opinions are the “medium between knowledge and ignorance,” as Plato wrote in The Republic.

It takes self-awareness to see your emotion-fueled opinions. It takes an open mind to understand your views are meaningless without guidance.

It happens every second on social media. People read a headline, form a fact-less opinion on the topic, and share it with the world.

They never dive deeper into the topic. Not once do they question what they’re reading.

We’ve all done it. We’re all victims of the storm of our opinions.

Opinions vs. facts.

Can opinions be wrong? What separates a fact from an opinion? Is there a clear principle of distinction?

Some say facts are “concrete,” but opinions can also be true (or false) regardless of what you believe.

“The earth revolves around the sun” would be a fact for modern civilizations, but not for medieval ones, or possibly even for today’s flat-earth skeptics.

It’s not useful if any statement can be either fact, opinion, or a combination of both based on personal preference.

All statements express beliefs. It’s these beliefs that screw things up when there’s no substance to them.

The goal is to determine which are factual beliefs and which express opinions.

“A fact is a statement that can be proven true.”

Or can it? Does the truth get muddied by our beliefs? Do we let cognitive bias rule what is fact versus what is opinion?

“An opinion expresses someone’s belief, feeling, view, idea, or judgment about something or someone.”

These opinion vs. fact definitions still lack clarity. 

A better definition is, “a statement of fact is one that has objective content and is well-supported by the available evidence” while “a statement of opinion is one whose content is either subjective or else not well-supported by the available evidence.”

What’s more important than these definitions is whether someone can articulate good reasons for the claims they make.

If they’re vague, it’s an opinion, not a well-thought-out fact. 

Your opinions don’t matter; here’s why.

Opinions are sticky. They cling to your mind like an over-processed candy bar.

Once we form an opinion, we fall in love with it. We yell it from the virtual rooftops for everyone to hear.

“I believe...”

“My opinion is...”

We declare, “my opinion is right, therefore yours is wrong.”

Our beliefs bleed into our identity. Changing our mind means changing who we are. 

It’s why disagreements become fueled by anger.

We rarely reflect on what we’re saying. We never test our opinions. We avoid counterarguments and label them as misinformation.

It’s a vicious cycle: consume, form an opinion, repeat.

When’s the last time you sat down to think about an opinion you held? 

We accept our opinion as fact–never taking the time to think about a topic. 

Just because we believe something doesn’t mean we’re better informed than others.

Belief superiority is when you believe your opinion is more accurate than other people’s.

Across five studies, it was “found that those people with the highest belief superiority also tended to have the largest gap between their perceived and actual knowledge — the belief superior consistently suffered from the illusion that they were better informed than they were.”

And “despite being badly informed compared to their self-perception, these participants chose to neglect sources of information that would enhance their knowledge.”

Think about all the parent/child relationships that have been destroyed because the child became a writer and not a doctor. Married a woman, and not a man. Became a Buddhist and not a Christian.

We hold opinions on how other people should live, even when their life does no harm. There’s no injustice. No wrongdoing.

We can live a fuller life with fact-based thinking.

Scientists refrain from stating opinions. They focus on generating a working hypothesis. The key word is “working,” which translates to a work in progress.

When you’re continually working on something, it’s never final. A hypothesis can be changed or discarded based on the facts.

“Opinions are defended, but working hypotheses are tested.”

One person’s opinion is hazy, riddled with emotion and bias.

We must validate our opinions, or what we believe is a fact, through “independent validation and multiple testing sources.” This will get us “closer to twenty-twenty vision.”

Our feelings, which are often blurred into opinions, don’t mean anything. “They merely mean whatever we allow them to mean,” says Mark Manson.

Having an opinion, especially one broadcasted to the world, should require us to sit our asses down and seriously think about a topic. 

It does no good to repeat someone else’s opinion. You become a bullhorn of ill-informed knowledge.

If we never create something from our ideas, we miss the opportunity to force them into coherence, rather than just reporting a headline we read.

To seriously think, we must draw on a variety of sources, often competing and conflicting ones. We can then form them into a hypothesis that we continuously work on.

But that’s rarely good enough.

Always question the facts.

Walter Isaacson said, “one mark of a great mind is the willingness to change it.”

Facts are stubborn things, as John Adams once said, but our minds are even more stubborn.

We prefer to focus on “facts” that smack us right in the face and neglect the ones hidden in our blind spots.

It’s why digital media is dangerous. 

It feeds us related topics that reinforce what we believe. It strokes our opinionated ego.

If you believe the earth is flat, you’re going to be exposed to more opinions of why the earth is flat.

It’s what I’ve described as Factual Nearsightedness: we’re only exposed to the facts we believe are true.

It’s selective perception, which “is the process by which individuals perceive what they want in media messages while ignoring opposing viewpoints. It is a broad term to identify the behavior all people exhibit to tend to ‘see things’ based on their particular frame of reference.”

“Just because you don’t like some element of reality doesn’t mean you can change it,” Ryan Holiday reminds us. Life is uncertain. It’s uncomfortable. “It doesn’t really care whether we want or need something.”

“Facts don’t care how hard they are. Just because you can’t bear something doesn’t mean it doesn’t have to be borne. Just because you have an opinion–or a need–doesn’t mean it’s relevant.”

To prevent factual distortion, take these steps to always question the facts–even the ones you’ve attached your identity to.

Kill Your Intellectual Darlings

We seek to prove ourselves right. Every yes makes us feel good. Every no generates a feeling of failure. Every yes makes us more intimate with our beliefs.

But every no brings us closer to the truth, and “progress occurs only when we generate negative outcomes,” as Ozan Varol said in his book Think Like A Rocket Scientist.

A scientific theory isn’t about being proven right. It’s about not being proven wrong.

Most facts have a half-life. “What we’re advised with confidence this year is reversed the next.”

If there’s no way to test a hypothesis and disprove it, it’s useless.

Stewart Brand, who founded the Whole Earth Catalog, regularly asks, “how many things am I dead wrong about?”

We must poke holes in our arguments. Look for opposing views. Look for disconfirming facts and ask, “what fact would change my mind?”

Follow Darwin’s “golden rule.” 

Whenever he stumbled upon a fact that contradicted his beliefs, he wrote it down.

“I had, also, during many years, followed a golden rule, namely, that whenever a published fact, a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from memory than favorable ones. Owing to this habit, very few objections were raised against my views which I had not at least noticed and attempted to answer.”

Constantly question your deepest beliefs.

Seek to kill your intellectual darlings, and follow Ozan’s advice that “our goal should be to find what’s right—not to be right.”

Work With An Adversarial Team

When you surround yourself with “yes” people, your ego will be stroked, and your beliefs will harden.

Nothing in this world is a fact for very long.

Physicists shine a light on the unknown. It’s how they develop their theories and models.

“A good theory is one that makes new predictions that can be tested in the lab, but if those experimental results conflict with the theory, then it has to be modified, or even discarded,” explains Jim Al-Khalili in his book The World According to Physics.

We’re all human. Surprise! 

Journalists and fact-checkers are born with the same mental constraints. The same psychological biases. They’re subject to selective perception–facts are processed based on their beliefs and values, whether you like it or not.

For this reason, you must question everything. 

You should always work with an adversarial team. Work with people who are likely to tell you “no, you’re wrong,” and explain why, in a healthy and constructive debate.

Abandon the temptation of objectivity and surround yourself, an idea, or a hypothesis with competing claims.

Don’t be comfortable with the world feeding you what you want to be fed–siloing your information into a vacuum of redundancy. 

Stephen Ceci says it best, “the key to cognitive reasoning is for both sides’ claims to appear simultaneously in the very same report. This would minimize the creation of false beliefs that emerge as a consequence of exposure to only one side.”

Begin to accept your opinions are hazy. Seek facts. Surround your views with competing sources. And always put “facts” through ongoing tests.

Factual complacency is a killer.