I’ve been reading Daniel Kahneman’s very accessible and entertaining Thinking, Fast and Slow, his highly anecdotal take on more than 40 years of research by him, his students and his associates into how the mind operates. I recommend it highly to anyone who wants to know why people think as they do and how to improve their own decision-making, be it in selecting new employees or in investing.
Now Kahneman is not a biologist who will depict the chemical and physical processes the brain undergoes. His interest is how the conscious part of the brain—I like to call it the mind—thinks, and specifically, how it makes decisions.
Kahneman proposes that we have two thinking systems, #1 and #2. System #1 is fast and impressionistic, while System #2 is methodical and rational, yet it will accept the conclusions System #1 hands it at a moment’s notice.
What I found most interesting in Thinking, Fast and Slow is that virtually all the tricks by which journalists, academics and speech-writers twist the truth are made possible by peculiarities of the human mind.
For example, the mind tends to make decisions on the principle of “what you see is all there is” (WYSIATI), meaning that people assume the information at their disposal is all the information relevant to a decision. By their selection of criteria, experts and details, writers create a world of facts that readers tend to take as WYSIATI, which is why propaganda techniques built on selection and omission work. No one bothers to ask why only the right-wing, anti-labor expert is being asked about the impact of the strike, or why none of the options being discussed involves raising taxes on the wealthy. We simply accept the facts and experts the media selects for us.
People also tend to let the order in which they see facts or events influence their thinking. Pudovkin and Vertov demonstrated this oddity in the 1920s by screening the same film shots in different orders for audiences. Every audience changed its emotional reaction to the narrative depending on how the shots were ordered. Kahneman describes research showing that we do the same thing when we make decisions or evaluate people. Propagandists have used this kind of editing to distort the truth, from the ancient Greek sophists up to Andrew Breitbart and the maker of the right-wing, anti-union Waiting for Superman.
Kahneman’s book describes extensive research demonstrating that people will believe an anecdote much more readily than they will believe statistics that contradict their current ideas. This peculiarity of thought, documented in multiple contexts, explains why argument by anecdote is such an effective propaganda tool. Argument by anecdote proposes that one story proves a trend even when the statistics show otherwise. Thus the “Willie Horton” case history that haunted the Dukakis presidential campaign. The anecdote doesn’t even have to be true; witness the great success of Reagan’s “welfare queens” remark, which Newt Gingrich and Rick Santorum want to resurrect as “food stamp squires.”
Another weird thing about the mind is that, when asked a difficult question, it will substitute an easier question and answer that one instead. Propagandists take advantage of this predilection to dodge the really tough question when they use such rhetorical devices as conflation, criteria rigging, false conclusions, the Matt Drudge Gambit, question rigging and trivialization. For non-rhetoricians, here are some quick definitions:
- Conflation: Equating two events, objects, trends or facts that have nothing in common; for example, using fictional evidence to prove a historical trend or comparing Bush II’s spotty National Guard stint to the military record of war hero John Kerry.
- Criteria Rigging: Selecting the criteria that will prove the point you want to make; for example, the studies that pick criteria suburbs happen to satisfy in order to show that the top places to live are all suburbs.
- False Conclusions: Putting a false conclusion at the end of a paragraph or article that is otherwise factually based and logically reasoned.
- Matt Drudge Gambit: Reporting that a disreputable reporter or media outlet, such as Matt Drudge or Glenn Beck, said something that you know is probably false.
- Question Rigging: Selecting questions that will produce the answer you want. For example, instead of asking people whether they believed global warming was occurring, research groups asked them whether they thought the news media reported too much on global warming. Asked the second way, many more people appear not to believe that global warming is occurring.
- Trivialization: Reducing discussions of important decisions to trivialities; for example, focusing on the personality differences between opponents while ignoring their substantive differences.
If we could just overcome these propensities to think illogically, or at least learn to recognize why we are reaching a conclusion, we would all make better decisions and be less susceptible to the smoke and mirrors of politicians and pundits.
I’m not sure how many people will end up thinking more rationally after reading Thinking, Fast and Slow, but for the rest of us there is at least the delight of reading about all the neat research measuring and analyzing human decisions and reactions in both real and artificial environments. I highly recommend Thinking, Fast and Slow.