A Virus of Cognitive Errors

You may have seen this question:

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake? 

It’s a variation of the “how many times would you need to fold a piece of paper in half for the thickness to reach the moon?” puzzle you have probably heard. There’s also the rice-on-a-chessboard version.

Why are there so many covers of the same idea? Because even when people know it’s a trap, they still get it wrong. Like watching someone smell something you warned was gross, it never gets old. And these questions never get old because we have no intuition for geometric growth.
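The three puzzles are the same arithmetic in different costumes. A quick sketch (assuming an illustrative 0.1 mm paper thickness and a rounded 384,400 km Earth–Moon distance; the exact inputs don’t change the punchline):

```python
import math

# Lily pads: the patch doubles daily, so it covers half the
# lake exactly one day before it covers the whole lake.
full_day = 48
half_day = full_day - 1  # 47, not 24

# Paper folding: each fold doubles the thickness.
paper_m = 0.0001             # 0.1 mm, expressed in meters (assumed)
moon_m = 384_400_000         # Earth-Moon distance in meters (rounded)
folds = math.ceil(math.log2(moon_m / paper_m))  # about 42 folds

# Chessboard rice: 1 grain on square 1, doubling on each of 64 squares.
grains = sum(2**k for k in range(64))  # 2**64 - 1 grains
```

Forty-odd folds to the moon and a 20-digit grain count are exactly the kind of answers linear intuition refuses to produce.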

Jacob Falkovich writes:

Before Rationality gained a capital letter and a community, a psychologist developed a simple test to identify people who can override an intuitive and wrong answer with a reflective and correct one.

Feel free to take that “test” here. (Link)

A Fatal Combination Of Cognitive Errors

So it appears our System 1 thinking is restricted to linear intuition. In isolation, that’s not a problem. It becomes one if most people can’t pass a CRT and override that System 1 thinking. I’m not well-versed in the CRT literature, but I suspect most subjects don’t even have an intuition for when a growth problem lives in Mediocristan or Extremistan. There’s another angle, though. If it turns out most people are at least socially aware that these questions are traps and they’re still getting them wrong, then I’m extra sorry. That means we can recognize something’s up, but the bottleneck is second-grade arithmetic.

So we don’t know when our slower, methodical thoughts should take the reins from our gut reactions. Or worse, our slower thoughts don’t even know how to drive. But really getting stuck in the mud requires a wider community cognitive failure.

Falkovich continues:

Most people sitting alone in a room will quickly get out if it starts filling up with smoke. But if two other people in the room seem unperturbed, almost everyone will stay put. That is the result of a famous experiment from the 1960s and its replications — people will sit and nervously look around at their peers for 20 minutes even as the thick smoke starts obscuring their vision.

The coronavirus was identified on January 7th and spread outside China by January 13th. American media ran some stories about how you should worry about the seasonal flu instead. The markets didn’t budge. Rationalist Twitter started tweeting excitedly about R0 and supply chains. (Link)

So let’s sum up:

  • We have poor intuition about geometric processes.
  • Many people don’t override this intuition because they don’t realize when they should.
  • Even if they realize they should, they often can’t add.
  • And those who do override it are socially inhibited.

The devil is too smart to knock on each person’s front door. He waits for people to get together, then slips the poison in the punch — remember, alarmism about any 1% event has a 99% chance of being indistinguishable from crying wolf.

The Flu Kills More People

Wrong logic. A frequentist will look at Zika, SARS, Ebola, and swine flu and conclude “overblown.” This is the definition of survivorship bias: the fat, happy turkey who thinks November will be just like the prior months. Two people can come to opposite conclusions if one merely counts past results while the other goes below the surface to find the underlying dynamic. Tyler Cowen generalizes the camps into “base raters” vs. “growthers”. (Link)
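The split between the two camps is easy to make concrete. Feed both the same early data (the numbers below are invented for illustration) and let the base rater extend it linearly while the growther extends it geometrically:

```python
# Hypothetical early case counts, doubling daily (invented numbers).
cases = [100, 200, 400, 800]

# Base rater: average daily increase, extended linearly for 26 more days.
avg_step = (cases[-1] - cases[0]) / (len(cases) - 1)  # ~233 cases/day
linear_day30 = cases[-1] + avg_step * 26              # a few thousand

# Growther: estimate the growth ratio, extend it geometrically.
ratio = cases[-1] / cases[-2]                         # 2.0
geometric_day30 = cases[-1] * ratio**26               # tens of billions
```

Same four data points, projections that differ by six orders of magnitude. That gap, not the raw counts, is what the two camps are actually arguing about.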

This has been my favorite thread quantifying the trajectory and timing of COVID penetration, hospital-bed and mask shortages, and the interaction of these variables. (Link)
