2018-03-18

Source analysis toolbox.

Ever since I started blogging, I've been interested not only in current events, but also in the meta-questions of "How do we know what we know?" In 2005, I posted "How can you determine a source's biases?" in an attempt to list some of the mental processes and checklists I go through to try to decide what to believe and what not to believe. The more recent phenomenon of fake news (in the sense of overtly false and spurious hoax news sites) gave fresh urgency to the problem, as I posted at my LiveJournal. Related posts are collected under my epistemology tag. This post is the latest update to my checklist.

Look in the mirror.
This is really the most important thing when analyzing a source for credibility or bias: knowing your own beliefs and your own possible biases. It's always tempting to accept something uncritically because it fits what we think we already know.

Premises / logic / values.
Know exactly where the disagreement lies: over what the facts are, over what consequences follow from them, or over whether something is good or bad.

Confirmation bias.
This is our natural tendency to believe things that fit our world-view. I find it helpful to distinguish between "things I think I know" and "things I know I know". Only verified factual information - things I KNOW that I know - is useful for evaluating the truth or falsity of a new claim.

Narrative.
What kind of overall picture, or "narrative", is the source trying to present?

Baseline.
Before you can determine whether an event is significant or unusual (for example, a crime wave), you need to know what the normal state of affairs is (for example, the average crime rate).

Question sensational reports.
There's a military saying that "nothing is as good or as bad as first reported". Sensational reports do just what the name says - they appeal to our sensations (of fear, hope, disgust, arousal, etc.) and can short-circuit our critical thinking. News stories with especially lurid details should be treated with skepticism.

Internal consistency.
Do all the pieces fit together in a way that makes sense?

External consistency.
Does the report agree with verified facts - things I know I know?

Dialog and dissent.
Does the source welcome opposing views and seek to respond to them?

Awareness of objections.
Does the source attempt to anticipate and refute objections?

Nuance.
By nuance I mean the recognition that a thing can be true in general and still admit of exceptions. For example, it may be true that tall people are generally better basketball players, and also true that some short players are outstanding.

Logical fallacies.
There are many mistakes in basic reasoning that can lead us to wrong conclusions.

Red herrings / straw men.
A straw man is an argument that nobody on the other side actually made, but that can be easily knocked down; "refuting" it makes it look as if you have refuted your opponent's argument, when in fact you never responded to the claim they actually made. A red herring is any argument that is irrelevant to the main issue and distracts you from it.

Snarl / purr words.
Some words have negative connotations (snarl words) or positive ones (purr words). Using them can be a way to appeal to people's emotions instead of arguing by reason.

Vague quantifiers.
"Many experts believe ..." Stop! How many is "many"? A majority? Half? Two or three? A claim involving numbers needs to give you specifics, or it tells you nothing.

Attributions.
Misquoting another party is, literally, the oldest trick in the Book - going all the way back to the Serpent in Genesis. It is also easy to selectively or misleadingly quote somebody, to give a false impression of what they said. My rule is, "go by what the person said, not what somebody else SAID they said."

Black propaganda - rhetorical false flag.
This is a particularly nasty trick: creating outrageous or shocking arguments and making them appear to be coming from your opponent, to discredit the opponent.

Discrediting by association - "57 Communists".
This is a little more subtle than the rhetorical false flag. It is the practice of planting deliberately false statements - easily disproved - that appear to come from your opponent's side or to support it; the goal is to damage the opponent's credibility when the statements are debunked. A real-life example was 'National Report' - the granddaddy of fake-news sites - which created all kinds of hoax stories designed to fool conservatives; conservatives who passed the stories along were then made to look gullible when the stories were shown to be false. (See the "fifty-seven Communists" scene in the film 'The Manchurian Candidate'.)

Bias of intermediaries.
More subtle than the 'straw man' is the practice of pretending to present a neutral forum for debate, but deliberately choosing a more articulate, stronger debater for one side and a weaker debater for the other.

The human voice.
By this I mean an intangible quality that may include a distinctive personality, awareness of ambivalence, self-analysis and self-criticism. This one is not a matter of rigorous logic but of gut instinct: something tells you that the person sounds real or fake.

Hard to win a debate, easy to lose one.
When you're debating an issue, it is very difficult to "win" in the sense that your opponent throws up their hands and says, "Oh, you were right and I was wrong," or even to definitively convince an audience that your position is the correct one. However, it is very, very easy to LOSE a debate, simply by saying or doing something that brings discredit to yourself and your cause: getting your facts wrong, making a basic logic error, or losing your cool and cursing or attacking your opponent. Sometimes the most important part of debating is knowing when to stop.