How do we know what we know?

Source analysis toolbox: an ongoing work-in-progress.

Look in the mirror.
This is really the most important thing when analyzing a source for credibility or bias: knowing your own beliefs and your own possible biases. It’s always tempting to accept something uncritically because it fits what we think we already know.

Premises / logic / values.
Know where you differ: over what the facts are (premises), over what consequences follow from them (logic), or over whether something is good or bad (values).

The spam filter.
All of us have a mental filter that works like the spam filter in your e-mail program: it examines input and tries to weed out stuff that appears irrelevant, false, or that just wastes your time. It’s what makes you tune out crazy people. And we need that filter – our mind couldn’t function without it. But the filter is not perfect, and every so often we have to re-calibrate it (just as we need to check our spam folder every so often) to make sure we’re not missing something important. Every once in a while, even the person we thought was crazy might have something important to say.

Flattering beliefs.
We have a natural tendency to want to believe things that are flattering to ourselves. For example, I may be tempted to hold a belief because it makes me feel more virtuous or more intelligent, rather than because it is supported by the evidence.

Competing values.
“I believe in peace.” “I believe in justice.” “I believe in freedom.” “I believe in security.” And so on. These are all perfectly fine things to believe in, but real life often requires us to make trade-offs. Very often the person who disagrees with you believes in the same good things you believe in, but assesses the trade-offs differently.

Confirmation bias.
This is our natural tendency to believe things that fit our world-view. I find it helpful to distinguish between “things I think I know” and “things I know I know”. Only verified factual information – things I KNOW that I know – is useful for evaluating the truth or falsity of a new claim.

Narrative.
What kind of overall picture, or “narrative”, is the source trying to present?

Baseline.
Before you can determine whether an event is significant or unusual (for example, a crime wave), you need to know what the normal state of affairs is (for example, the average crime rate).

Question sensational reports.
There’s a military saying that “nothing is as good or as bad as first reported”. Sensational reports do just what the name says – they appeal to our sensations (of fear, hope, disgust, arousal, etc.) and can short-circuit our critical thinking. News stories with especially lurid details should be treated with skepticism.

Internal consistency.
Do all the pieces fit together in a way that makes sense?

External consistency.
Does the report agree with verified facts – things I know I know?

Dialog and dissent.
Does the source welcome opposing views and seek to respond to them?

Awareness of objections.
Does the source attempt to anticipate and refute objections?

Nuance.
By nuance I mean the recognition that a thing can be true in general and still admit of exceptions. For example, it may be true that tall people are generally better basketball players, but it can also be true that some short people may be outstanding players.

Logical fallacies.
There are many mistakes in basic reasoning that can lead us to wrong conclusions.

Red herrings / straw men.
A straw man is an argument that can be easily overcome, but that nobody on the other side actually made; you can “refute” this kind of argument to try to make it look like you refuted your opponent’s argument, but you didn’t actually respond to the claim they were making. A red herring is any kind of argument that is irrelevant to the main issue, and distracts you from it.

Unexamined assumptions.
If I ask, “Why is Smith so evil?” I am not questioning whether he is evil, and the form of my question does not allow you to question the assumption “Smith is evil” either. Similarly, if I say “Jones, who was responsible for the disastrous Program X …”, I am closing off any question as to whether Program X was a disaster, or whether Jones was responsible for it. These are examples of framing a question or a statement so as to avoid debating certain things that you don’t want to debate – assumptions that you don’t want to examine.

Snarl / purr words.
Some words have negative connotations (snarl words) or positive ones (purr words). Using them can be a way to appeal to people’s emotions instead of arguing by reason.

Vague quantifiers.
“Many experts believe …” Stop! How many is “many”? A majority? Half? Two or three? A claim involving numbers needs to give you specifics, or it tells you nothing.

Attributions.
Misquoting another party is, literally, the oldest trick in the Book – going all the way back to the Serpent in Genesis. It is also easy to selectively or misleadingly quote somebody, to give a false impression of what they said. My rule is, “go by what the person said, not what somebody else SAID they said.”

Black propaganda – rhetorical false flag.
This is a particularly nasty trick: creating outrageous or shocking arguments and making them appear to be coming from your opponent, to discredit the opponent.

Discrediting by association – “57 Communists”.
This is a little more subtle than the rhetorical false flag: planting statements that are knowingly false, and easily disproved, so that they appear to come from your opponent’s side. The goal is to damage your opponent’s credibility – or, more precisely, the credibility of a claim your opponent makes. A real-life example was ‘National Report’ – the granddaddy of fake-news sites – which created all kinds of hoax stories designed to fool conservatives; when the stories were shown to be false, the conservatives who had repeated them were made to look gullible. (See the “fifty-seven Communists” scene in the film ‘The Manchurian Candidate’.)

Bias of intermediaries.
More subtle than the ‘straw man’ is the practice of pretending to present a neutral forum for debate, but deliberately choosing a more articulate, stronger debater for one side and a weaker debater for the other.

What are the source’s financial interests?
I think this one is a no-brainer: a person who owns a lot of stock in XYZ Corporation is going to have an incentive to promote pro-XYZ legislation and contracts. In the case of the MSM, we all know that “bad news sells”.

Debts and favors.
Is the source looking for a payoff down the road? If I go on record saying nice things about Candidate A, maybe I am hoping to get appointed to a nice comfy job if A wins the election.

The medium is the message.
News stories go through news networks, broadcast networks, and publishers. Books go through publishing houses. In other words, somebody has to provide the materials for the message to be communicated. Somewhere, a network executive makes decisions about what gets on the air and what doesn’t. Somewhere, an editor or publisher decides what gets printed and what doesn’t. So if you’re reading a book you have to think about not only the author’s background and point of view, but also the publisher’s orientation: for example, they might publish mostly liberal books or mostly conservative books. Knowing something about the background of a publisher or a broadcast network can help give you an idea of what to expect.

What are the source’s own experiences? How might those experiences be relevant, and how might they affect the source’s perceptions?
First-hand knowledge of any issue is always helpful; on the other hand, a person might have had an experience that was atypical or unrepresentative. A soldier on the front lines is going to have a very vivid, detailed, and specific recollection of a battle. The general in a command bunker may not see the battle up close, but he will have information on the “big picture” of troop strengths, enemy positions, strategic decisions, and other things that the soldier will not know, and may not be allowed to know. The soldier’s memory may be distorted by trauma, confusion, fear, or shame (of a real or imagined failing on the battlefield); the general may ignore or suppress key information, perhaps with his career in mind. Both perspectives are valuable, and both have their limitations.

Psychological factors.
There are basic psychological factors that operate in all of us to one degree or another. Resistance to change is one. There is a need for the approval of others; there is also a need for a sense of autonomy and a belief that we determine our own destiny. And of course we all like to be thought knowledgeable, which is why we are often tempted to speak more than we actually know.

The human voice.
By this I mean an intangible quality that may include a distinctive personality, awareness of ambivalence, self-analysis and self-criticism. This one is not a matter of rigorous logic but of gut instinct: something tells you that the person sounds real or fake.

Hard to win a debate, easy to lose one.
When you’re debating an issue, it is very difficult to “win” in the sense that your opponent throws up their hands and says “Oh, you were right and I was wrong.” Or even to definitively convince an audience that your position is the correct one. However, it is very very easy to LOSE a debate, simply by saying or doing something that brings discredit to yourself and your cause: getting your facts wrong, making a basic logic error, or losing your cool and cursing or attacking your opponent. Sometimes the most important part of debating is knowing when to stop.