Friday, September 10, 2010

Truth, Or, Why We Should All Be Running Around, Screaming, With Our Pants On Our Heads

OK. This is the last post about Narratives and Narrative Engineering for a while. I really do want to touch on the other aspects, and this is going to be a pretty good note to leave on. Today's topic is: The crisis of believability in the Information Age. The groundwork: In order for information to be useful to an observer, it has to be true.

Humans are naturally driven to ascertain the truth value of propositions for themselves. Small children, when they're told something is hot, will often touch the object to be sure. After a few bad experiences, the child will learn to trust that when that particular speaker (be it a parent, sibling, or so forth) says that an object is hot, it is likely so. The human brain, a pattern recognizer, shifts the burden of proof from the individual to another, trusted, individual. This trust-based shift can extend to other entities—organizations, religious institutions, and sources of information such as media outlets.

When two sources deliver conflicting information or opinions, we tend to side with a) the sources we have trusted frequently in the past and b) the sources that agree with our own existing viewpoints. But how do we choose which informational sources to trust in the first place? Here we find a gap worth exploring, but alas, I don't have a full range of studies to glean from to explain it. Suffice it to say that the actual truth value of propositions has little bearing on our perception of their truth.

Jonah Lehrer, who might actually be the most interesting blogger on the internet, wrote a feature for Wired's December 2009 issue called "Accept Defeat: the Neuroscience of Screwing Up". It's a tremendous piece about what it takes to be a good scientist, and well worth reading in its entirety. The part that interests me, however, is the bit where a brain scientist wired undergrads up and made them watch videos of falling objects. There were two videos: one where two differently sized objects fell at the same rate, as they do in nature, and one where the larger object fell faster. The undergrads, who were unversed in physics, experienced activity in the anterior cingulate cortex, something of a BS detector in the brain, when viewing the accurate video.

Physics majors, however, had the reverse experience. The mental error alert popped up while watching the inaccurate video. In both cases, the dorsolateral prefrontal cortex, which suppresses memory, was also activated. The brain actively attempts to erase that which it perceives as impossible or incorrect.

As we are presented with information that we cannot confirm or deny ourselves, our biases work in combination with each other to accept and reject information and conclusions. Unfortunately for us, information needs to be true to be useful to the one receiving it, but this is not the case for the one delivering it. Some sources may (and very likely do) deliver information that is intentionally misleading or untrue in order to further an ulterior purpose. Other sources adopt a stance because they believe it to be the morally correct one, or as a means to capture a segment of the content market.

For the most part, it appears we know this. We're aware that not all institutional media outlets are to be implicitly trusted. And, of course, as human beings, we're going to touch the burner to see if it's hot—and to build a worldview that will work as a shortcut to the discernment of burner temperature in the future. I see five of these shortcuts frequently trusted by content consumers (and have fallen prey to some of them myself). Here they are, along with their shortcomings:

Triangulation. This method entails checking multiple reputable news sources, and believing what's reported by a large number of them. This will definitely cut some of the clutter, but not all, as can be seen from the curious story of "Jenny DryErase". Humor repository site "The Chive" ran 32 images of a young woman using a whiteboard to tell the story of the ridiculous work troubles that caused her to quit her job. Various newspapers (including some international outlets) ran the story as if it were true, apparently never bothering to fact-check it. I don't blame them—the images appeared to speak for themselves. The whole exercise turned out to be a hoax.

A fake, satirical Congressman, Rep. Jack Kimble, was made up by a blogger, and reported by a Washington Post blogger as if he were real. There are plenty of other examples of why we can't just triangulate our way to truth and happiness.

APCs. Similar to this is the "APCs" concept discussed by Craig Silverman on his media corrections blog, Regret the Error. The idea is that you can judge a source's validity by checking its Authority, Point of View, and Currency. The problem with this is the very same as the triangulation issue—supposedly authoritative, neutral, current outlets occasionally make major mistakes. Besides that, blogs and other features appended to major outlets trade on the authority of their parent brand, making it difficult to hold them to the expected level of quality. Further, since we identified above that we form our opinions based on other preconceived notions, our judgments of a source's "authority" and "point of view" are themselves colored by those notions, and so are essentially unavailable to us as neutral measures.

Bias Stripping. This method is as it sounds—read the clearly biased news, then try to mentally remove the perceived bias from the story. Along with sharing a few of the problems already discussed, there is this: while we're pretty good at filtering out extraneous, editorializing content as we read, a bigger problem arises when news sources omit information in order to maintain a narrative. You can't strip out a bias you never see. For all his own biases, Jon Stewart illustrates this fairly well. (Note: long clip.)

Third Party Fact-Checkers. Politifact, FactCheck.org, and Snopes are great sites run, I believe, by people who are honestly trying to produce bias-free content. But, two things: first, no one is bias-free, so any given contributor is liable to be harder on one person and softer on another. Second: what sources do these sites consider trustworthy? We've just gone through the idea that you cannot ever really tell for sure.

Finding the Median Between Extremes. This is simple: you take a clearly biased source and another clearly biased, but opposing source, and split the difference. As it has been shown that bias is often a symptom of an unfounded belief, for this to work, you would have to assume that the bias of the two sources a) represented two points on a spectrum that contained all the possible representations of reality, the most accurate being located at the center, b) were nearly equidistant from that center, and c) were intentionally biased. If they were unintentionally biased, it seems unlikely that both outlets would be systemically and systematically biased in the same direction.

It seems incredibly unlikely that any major dichotomy available in the realm of public discourse contains all possible accurate representations of reality, especially the political Left-Right dichotomy in America. (DO NOT read this as an endorsement of any political viewpoint that eschews the Left vs. Right continuum.)

In short, there's nothing we can do to be sure that any relayed information is accurate. I would highly suggest reading primary sources (like the text of bills, speeches, laws, etc.) to obtain accurate information. Primary sources tend to be long and boring, though, which means our ability to be informed is limited by time and attention—and in many cases, that time and attention might be just as well spent on other pursuits. My head is starting to hurt a little now, I'll be honest.

Cue the pants.

2 comments:

  1. Great stuff, as always. Still waiting for an RSS feed, though.

  2. Your wish is my command:

    http://thefuture.chatterbucket.net/atom.xml
