When I was in graduate school, I took a course in urban geography, and one of the texts we read was City of Plagues: Disease, Poverty, and Deviance in San Francisco by Susan Craddock. In the opening paragraphs of the first chapter, Craddock describes a visit to a San Francisco homeless shelter in 1991. She found ‘a number of small rooms … located off a central hallway, each room containing six built-in bunk beds, three on each side with perhaps three feet separating each bed. None of the rooms has a window.’ She also observed that the few windows in the hallway ‘look as if they have never been opened.’ There’s more, but you probably get the picture.
At a time when fully one third of the city’s homeless population was testing positive for tuberculosis, it would be hard to imagine a worse design for limiting the spread of the disease. Or, as she put it, ‘it would seem that the homeless shelter was designed specifically with the transmission of tuberculosis in mind,’ but not in exactly the way we might expect.
Even back in 1991, tuberculosis (or ‘consumption,’ if you’re from the early part of the 1900s) was a disease with a long history, one the medical and public health establishments had understood reasonably well for decades. If you imagine an early-20th-century tuberculosis sanitarium (like the one pictured above), you are probably picturing big, open rooms and big, open porches scattered with rocking chairs and frail women wrapped in blankets. You are probably not imagining a ‘beehive’ of cramped, windowless rooms with folks sleeping in bunk-bed close quarters.
I vividly remember reading that chapter for the first time and thinking how bizarre it was that people could so easily dismiss or ignore information that was readily available. I’m sure I had heard of cognitive dissonance and cognitive bias before, but this example was so concrete and so jarring that it really stuck with me.
Cognitive dissonance and cognitive bias fascinate me because they are so central to the ways in which our knowledge can fail to inform our behavior, not to mention the ways that our behavior can limit our knowledge. And even when you know your brain is playing tricks on you, the lure can be hard to resist. A few examples:
- Confirmation Bias – Ah, election years. When you search for, focus on, and remember information in a way that confirms your previously held beliefs? When you ignore new information that challenges those beliefs? That’s confirmation bias. (OK, maybe not this election year. I don’t think anyone saw any of this year’s stuff coming.)
- Anchoring Bias – What was the first thing you learned about, say, how to cook? Or how to buy a car? Anchoring is when we rely too heavily (or ‘anchor’) on one piece of information (usually the first thing we learned) when making decisions. Was it Dear Abby or Miss Manners who said that, for most folks, the ‘right’ way to wash dishes or fold towels is the way your parents taught you? The way your partner’s parents did it? That way is the wrong way.
- Dunning-Kruger Effect – You know this one, too, though you might not know that you do. It describes the situation in which someone with a little bit of knowledge or skill takes themselves to be an expert, while the person who actually is an expert tends to underestimate their own knowledge or ability. This is one of my favorites, and it reminds me of the lines from Yeats’s ‘The Second Coming’:
> The best lack all conviction, while the worst
> Are full of passionate intensity.
And of course, I say ‘we’ and ‘our’ throughout because I know I am not immune to flawed thinking. That doesn’t make it any less fascinating, though, and in any case, knowing about all of this stuff can’t hurt, now can it? (There’s probably a cognitive bias at work there somewhere, but I’ll ignore it for now.)