At my company’s annual retreat, I presented a talk about rationality in design. There’s nothing I like more than being allowed to natter endlessly about rationality, and people seemed to even like it! I’m posting the transcript in parts, as it was long.
I spoke with my talented colleague and all-around best bud Doug Stuart, who had to Skype in as he’s in freaking Glasgow. The slides are his.
Here’s what we’re trying to do as designers and BAs:
- Find the best solution
- Solve problems
- Maintain relationships (with each other and with client)
- Make an excellent product by doing science
But sometimes we really screw it up:
- We alienate the client
- We over-design and solve problems that don’t need solving
- We think our designs are wonderful and are miffed when someone doesn’t like them
- We fight with each other
Why do we do that? We use research, we’re smart, why do we sometimes fail?
I think it’s because we don’t really do science and we don’t get to our conclusions rationally.
“WHAT?” the straw man says. “I’m super rational!”
No you’re not, buddy. In fact, claiming you’re a super rational person is a leading indicator (no citation, just observed by me) of someone who is not rational, just smart. And that can be very dangerous for you!
We are all familiar with biases in other people but have a hard time observing them in ourselves. Of course we all have them; we just live with them, often without noticing.
Being intelligent but not carefully rational can make your biases worse! You’ve probably heard of confirmation bias, or favoring information that supports your side. You could also fall prey to the following effects:
- Prior attitude effect. Subjects who feel strongly about an issue—even when encouraged to be objective—will evaluate supportive arguments more favorably than contrary arguments.
- Attitude polarization. Exposing subjects to an apparently balanced set of pro and con arguments will exaggerate their initial polarization.
- Attitude strength effect. Subjects voicing stronger attitudes will be more prone to confirmation and disconfirmation biases.
- And the bias that’s probably most dangerous to ThoughtWorkers: Sophistication effect. Knowledgeable subjects, because they possess greater ammunition with which to counter-argue incongruent facts and arguments, will be more prone to the above biases.
If you’re intelligent, but biased at the start, having more knowledge can hurt you! You’ll only gain more ammunition with which to argue a biased point. Your irrationality will deepen, and you will be more and more sure that you are right.
Even scientists are prone to this. The group of people among us most driven by truth can be blinded by their biases. For example, there is the famous story of the researchers who discovered the cause of ulcers. The common knowledge in medicine at the time was that stress caused ulcers. Scientists and doctors alike believed this to be true, and were attached to that belief.
Researchers Barry Marshall and Robin Warren found that ulcers were actually caused by a bacterium, Helicobacter pylori, and not stress. They were laughed out of conferences, rejected from journals, and generally ridiculed for their experimental discovery. For a decade, they couldn’t convince anyone. In 1984, Marshall even swallowed a culture of the bacteria himself and showed the infection taking hold in his own gut. Finally, the scientific community believed them, and in 2005 they were awarded the Nobel Prize. People are attached to their beliefs in ways they can’t always detect. http://discovermagazine.com/2010/mar/07-dr-drank-broth-gave-ulcer-solved-medical-mystery
What is rationality anyway?
- Rationality is a method of reasoning grounded in the principles of scientific experimentation.
- We acknowledge that Truth exists and that we can learn about it by studying the world and suppressing our emotional attachment to how we want the world to work.
- If you are interested in rationality, you must constantly examine and refine the lens you use to understand the world.
- We call that lens the Model: we want our Model of the truth to match reality. We use our Model to evaluate new information as true or not. It is therefore very important for your Model to be correct.
- If your Model and someone’s report of reality don’t match, either your Model is wrong or their report is.
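The idea of a Model that we refine as evidence comes in can be made concrete with a toy Bayes’ rule calculation. This is my own illustration, not something from the talk — the numbers are made up — but it shows the mechanic: a strong prior belief getting revised downward by evidence that fits it poorly.

```python
# A toy sketch (my illustration, with invented numbers) of updating a
# "Model" -- a belief held with some probability -- using Bayes' rule.

def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability that a belief is true,
    given one new piece of evidence, via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Example: our Model says stress causes ulcers, and we're 90% sure.
# A study finds bacteria in ulcer patients -- an observation we'd
# expect rarely (10%) if stress were the cause, but often (80%) if
# it weren't.
belief = 0.9
belief = update(belief, 0.1, 0.8)
print(round(belief, 2))  # prints 0.53 -- confidence drops sharply
```

One strong prior plus one piece of disconfirming evidence leaves us near a coin flip; the biases above are, in effect, ways of refusing to run this update.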
In this talk, we’ll cover the thinking and decision-making mistakes you can make if you’re not careful, and how to avoid those mistakes to make better decisions.