Practical School

The world is incredible: Honey Bees Edition

Watch this video:

SPOILER ALERT: the bees kill the wasps. Not only that, but they all signal to each other with a thorax waggle (what??) to swarm around the spy wasp (they’re bees?? How do they know to do this???) and vibrate so furiously that the swarm cooks the wasp but not the bees. It’s an evolutionary edge that is laughably slim: they can survive temperatures three degrees hotter than the wasp can. The wasp dies, the bees live, and the hive survives.

This does not jibe with my understanding of evolution. Evolution does not work through detective work, where a species analyzes its strengths against its foe, then strategizes how to take advantage of them. I have NO IDEA how a hive of bees could figure out that this is how they can kill wasps, that they have a secret edge, and don’t get me started on how some bees could teach their sisters that a thorax waggle means ‘swarm and cook that big thing over there.’

I was so flummoxed I told everyone I knew about this issue. Hang out with me if you want to learn about bugs. And Sam Hotop (QA wunderkind) shed some light. “Maybe all bugs cook at the same temperature?”

Ah! What if, many many years ago, these Japanese bees somehow learned to cook wasps, vibrating their little hearts out, and also cooked themselves? Bees are more related to each other than we are to our siblings (citation: The Selfish Gene) and are that much more willing to die to protect each other. The wasp dies, some bees die, and the hive survives. Then, over millennia, hives whose bees survive at a slightly higher temperature have a slightly better survival edge. The success of the cooking method selects for bees with a higher boiling point.
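That selection story can be sketched as a toy simulation (bus stop!). Every number here is invented for illustration, and this is a cartoon of selection, not real bee biology: hives whose bees tolerate slightly more heat survive the cooking defense more often, so the average tolerance drifts upward over generations.

```python
import random

# Toy model: each hive has a heat-tolerance value (in degrees C). Every
# generation, hives that cook wasps survive with a probability that rises
# with their margin above the (hypothetical) swarm-ball temperature.
# Survivors repopulate the next generation with a small random mutation.
random.seed(0)

COOK_TEMP = 46.0      # invented swarm-ball temperature
GENERATIONS = 200
HIVES = 100

hives = [random.gauss(45.0, 0.5) for _ in range(HIVES)]  # starting tolerances

for _ in range(GENERATIONS):
    # Survival probability climbs from 0.1 to 0.9 as tolerance rises
    # from one degree below COOK_TEMP to one degree above it.
    survivors = [t for t in hives
                 if random.random() < 0.5 + 0.4 * min(max(t - COOK_TEMP, -1.0), 1.0)]
    if not survivors:
        survivors = hives  # avoid extinction in the toy model
    # Repopulate from survivors with small mutations.
    hives = [random.gauss(random.choice(survivors), 0.1) for _ in range(HIVES)]

mean_tolerance = sum(hives) / len(hives)
print(f"mean heat tolerance after selection: {mean_tolerance:.1f} C")
```

No hive ever "figures out" anything in this sketch; the mean tolerance creeps up simply because the hotter-tolerant hives die less often.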

Still doesn’t make any sense how they figured out the cooking in the first place, but I’m satisfied. Bee wearing contest. China.

The Things I Know

I am a fact-motivated person. Maybe ‘fact’ isn’t the right word: I naturally connect trivia to other trivia, and I compulsively share them. I’m trying to connect with people the way someone might talk about sports, or new diets, or astrology.

If I am having a conversation, there’s often a flare-up of something I’m reminded of. It could be a metaphor, a scientific principle, or a story I heard on a podcast. I call it ‘bus stopping.’ My thoughts travel a big, dark metropolitan bus network that suddenly lights up when I’m talking, and the bus stops at 30 bus stops at once. I can’t help talking about the bus stops as they come up.

I’ve come to accept bus stopping in myself, and I understand it’s not for everyone. People who don’t like it should run from a conversation with me, because I can’t stop it. My loving family has a cruel impression of me, pointer finger up, droning about the beauty of mathematics. You know you’re something if there’s a caricature of you.

But I like these bus stops, and I like describing them. I’ve also recently learned about meta-knowledge from “On Being Certain” by Robert A. Burton (bus stop!): how we know that we know something, and what that feeling is like. These facts and trivia lie dormant in my mind until someone triggers the bus network, and I remember all that I know.

I’ve started to list some of these bus stops in my notes, and I’ll start to write little blurbs about them. Map out the bus network.

Pigeons can be superstitious too


Awesome post on Quora Digest this morning.

Psychologist B. F. Skinner thought free will in humans was a farce; behavior, he argued, has much more to do with reinforcement. His peers at the time offered superstition as a counterexample: other animals don’t show it, and it develops without true reinforcement.

Skinner was able to show that pigeons develop superstitions when reinforcement arrives independently of their behavior, just like humans. We don’t control when it rains, for example, but we have plenty of practices that try to bring storms. The pigeons didn’t control when the food pellet came, but they’d turn in circles like they had the first time, or put their heads in the corner.

The research was done in the ’40s, so I’m not sure where the idea stands today. But I still love it for pointing out that superstitions are based on logical reactions: it doesn’t matter that the ritual only works some of the time (because the two events are unrelated), we’re still able to draw a connection between them.
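The setup can be sketched as a toy simulation (bus stop!). The actions, weights, and feeding schedule are all invented for illustration: food arrives on a timer regardless of what the pigeon does, whatever action happened to come right before the food gets strengthened, and an arbitrary ritual snowballs.

```python
import random

# Toy sketch of accidental reinforcement: food arrives on a fixed schedule,
# independent of behavior. Whatever action preceded the food gets a weight
# boost, so the pigeon does it more, so it gets "rewarded" more often, and
# one arbitrary ritual comes to dominate. All actions/numbers are invented.
random.seed(1)

actions = ["turn in circles", "head in corner", "peck floor", "flap wings"]
weights = {a: 1.0 for a in actions}

for step in range(1, 2001):
    # The pigeon picks an action in proportion to its learned weights.
    action = random.choices(actions, weights=[weights[a] for a in actions])[0]
    if step % 15 == 0:            # food pellet on a fixed timer
        weights[action] += 1.0    # last action is accidentally reinforced

ritual = max(weights, key=weights.get)
print("dominant ritual:", ritual)
print({a: round(w, 1) for a, w in weights.items()})
```

Which ritual wins depends entirely on the random seed; the point is that some ritual wins even though no action ever actually caused the food.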

You’re Being Irrational: Becoming a Better Designer (Part 1)

At my company’s annual retreat, I presented a talk about rationality in design. There’s nothing I like more than being allowed to natter endlessly about rationality, and people even seemed to like it! I’m posting the transcript in parts, as it was long.

I spoke with my talented colleague and all-around best bud Doug Stuart, who had to Skype in because he’s in freaking Glasgow. The slides are his.


Here’s what we’re trying to do as designers and BAs:

  • Find the best solution
  • Solve problems
  • Maintain relationships (with each other and with the client)
  • Make an excellent product by doing science

But sometimes we really screw it up

  • we alienate the client
  • we over design and solve problems that don’t need solving
  • we think our designs are wonderful and are miffed when someone doesn’t like them
  • we fight with each other

Why do we do that? We use research, we’re smart, why do we sometimes fail?

I think it’s because we don’t really do science and we don’t get to our conclusions rationally.

“WHAT?” the straw man says. “I’m super rational!”


No you’re not, buddy. In fact, claiming to be super rational is a leading indicator (no citation, just my own observation) that someone is not rational, just smart. And that can be very dangerous for you!

We are all familiar with biases in other people but have a hard time observing them in ourselves. Of course we all have them; we just live with them, often without noticing.


Being intelligent but not carefully rational can make your biases worse! You’ve probably heard of confirmation bias, or favoring information that supports your side. You could also fall prey to the following effects:

  • Prior attitude effect. Subjects who feel strongly about an issue—even when encouraged to be objective—will evaluate supportive arguments more favorably than contrary arguments.
  • Attitude polarization. Exposing subjects to an apparently balanced set of pro and con arguments will exaggerate their initial polarization.
  • Attitude strength effect. Subjects voicing stronger attitudes will be more prone to confirmation and disconfirmation biases.
  • And the bias that’s probably most dangerous to ThoughtWorkers: Sophistication effect. Knowledgeable subjects, because they possess greater ammunition with which to counter-argue incongruent facts and arguments, will be more prone to the above biases.

If you’re intelligent, but biased at the start, having more knowledge can hurt you! You’ll only gain more ammunition with which to argue a biased point. Your irrationality will deepen, and you will be more and more sure that you are right.


Even scientists are prone to this. The group of people among us most driven by truth can be blinded by their biases. Take the famous story of the researchers who discovered the cause of ulcers. The common knowledge in medicine at the time was that stress caused ulcers. Scientists and doctors alike believed this, and were attached to that belief.

Researchers Barry Marshall and Robin Warren found that ulcers were actually caused by a bacterium, Helicobacter pylori, and not by stress. They were laughed out of conferences, rejected from journals, and generally ridiculed for their experimental discovery. For a decade, they couldn’t convince anyone. Finally, in the ’80s, Marshall swallowed the bacteria himself and showed that it caused disease in his own gut. The scientific community eventually came around, and the pair were awarded the Nobel Prize. People are attached to their beliefs in ways they can’t always detect.


What is rationality anyway?

  • Rationality is a method of reasoning based on strict principles of scientific experimentation.
  • We acknowledge that Truth exists and that we can learn about it by studying the world and suppressing our emotional attachment to how we want the world to work.
  • If you are interested in rationality, you must always be examining and refining the lens you use to understand the world.
  • We call that lens the Model: we want our Model of the truth to match reality. We use our Model to evaluate new information as true or not, so it is very important for your Model to be correct.
  • If your Model and someone’s report of reality don’t match, either your Model is wrong or they are lying.

In this specific talk, we’ll be talking about the mistakes of thinking and decision making you can make if you’re not careful, and how to avoid those mistakes to make better decisions.

Beginner’s Guide to Irrational Behavior

Because of a recommendation from Quora, I have just belatedly started a Behavioral Economics class on Coursera called “A Beginner’s Guide to Irrational Behavior.” Right up my alley. I’ve already written down some interesting points and I’m only 20 minutes in.

  1. Humans have difficulty with the Planning Fallacy. These dopes consistently underestimate how long a task will take, even when they are extremely experienced at that task. The mistake only happens when you estimate for yourself; as an unrelated observer, you overestimate instead. Selfish, counterproductive optimism.
  2. We may have good intentions, plenty of experience, and strong intuitions but we can still be wrong. The fundamental principle of irrationality is that we think we know the right thing to do, but we are wrong.
  3. We do not see with our eyes; we see with our brains. Our brains incorporate our expectations into our perceptions, which is why optical illusions work. Though our brains are very strong and very good at vision, we still make predictable, systematic errors. We evolved to be good at eyesight, and yet that is still the case. Consider skills we aren’t evolutionarily optimized for (investing money, for example), and it becomes clear that we will make predictable, systematic errors in those areas too.

It’s really good shit. I’ll keep at it.

Betting on the Future

Long Bets is run by the people behind the Long Now Foundation, and has been used by the likes of Warren Buffett. I like the Rules. Some favorite bets:

That last bet is only for $1000. Worth it.

Willpower that drains

I read something about willpower, but I don’t remember where. I told my friend Sam about it. The author, whoever it was, said you have a finite amount of willpower at the start of the day, and any expenditure, however small, diminishes it. That includes the willpower spent not checking your phone, or not checking Facebook. Sam countered that his willpower is bolstered when he accomplishes something, as in, “I went for a run this morning and now I have the willpower later on not to eat like shit.” I’ve felt similarly.
So we made a distinction between active and passive expenditures. Successfully finishing something is of a different character than the endless small obstacle of not doing something. There’s no triumph, no boost of confidence; it’s just draining. The suggestion in that thing I read was to remove those drains on willpower: go without your phone, block Facebook, and make the expenditure of willpower available only to the task at hand. Remove yourself from distractions when you have to do the things you don’t want to do. Now all I have to do is find that link, but first I have to check my email again.