The climate changes (but it snows in the backyard of the cousin of my uncle’s neighbour)

A myth is an established belief in something that is scientifically incorrect. I didn't come up with that definition myself: Stephan Lewandowsky and John Cook described and researched it. A fun fact (actually not so funny) is that their article met with so much resistance from people who believe in myths that the journal withdrew it. Not because the research was bad or the results were incorrect, but because it was 'too sensitive'. 1-0 for the myths, so to speak.

Not everyone is equally susceptible to believing in myths, but myths are extremely persistent and contagious. Once you believe them, they are stuck in your head. And there is even scientific evidence that a myth affects your worldview once you hear it, even if you don't believe it. As they say: it cannot be unheard. What makes such a theory so darned attractive?

That's because of all those different shortcuts and preferences in our heads I mentioned earlier. We love facts that confirm our worldview. We see patterns everywhere. We attach more importance to one fact that we have experienced ourselves (or that a friend's uncle has experienced) than to facts backed by large numbers. Large numbers are scary. And so on, and so on.

Take, as an example, the myth that the climate does not change at all. Why is it so tempting to believe this? Research shows that this myth has an enormous appeal, especially among people with a strong belief in the free market. If you believe that the market is the best organising mechanism for society, you tend not to believe that man is responsible for global warming. From a confirmation-bias point of view this correlation is easy to understand: if you believed in climate change, you would also have to admit that the free market may have some limitations. Furthermore, global warming is, of course, pre-eminently a theory of large numbers, long horizons and macro-effects. A snowball, on the other hand, is close by, tangible and… cold. Quod erat demonstrandum.

There is that senator again with his snowball.

All these different mechanisms in our brain mean that a wrong theory that feels right (fits our moral compass), fits our worldview, explains patterns we see and comes with facts we can relate to (from that friend's uncle) is very attractive to remember (remember Darwin?). And remembering is the first step to believing. Compare it to songs like the Macarena, Sofia or Gangnam Style: music that gets so stuck in our heads that we eventually find it beautiful.

Besides being very 'sticky' on the way into our memory, myths are also very 'sticky' when we try to remove them again. Research has shown that when you try to correct conspiracy theories, there is a very real danger that you only strengthen the theory in people's heads. How is that possible?

That too is mainly due to the confirmation bias, and to selective memory. When we pierce a myth with facts, and do so imprudently, people only remember 'there was something about that myth'. The myth becomes more familiar in people's minds, and thus more accepted.

Fortunately, there is help with de-mythisation. There are three simple rules you have to follow, according to the 'Debunking Handbook' by John Cook and Stephan Lewandowsky:

1. Focus on the facts, not on the myth.
2. Do not mention the myth itself, or if you have to, warn of misinformation before quoting the myth.
3. Provide an alternative explanation for the myth.

What they say actually comes down to this: mentioning the myth strengthens it, so avoid doing so. Especially in a title, a heading or an introduction. After all, over time we only remember the titles of things: the facts sink away and the myth remains. Be clear and brief: too many facts means nothing is remembered. Keep It Simple, Stupid! And use pictures, because we always remember pictures better than words.

Besides the sparse and clear use of facts, they also say that you should always explicitly announce it when you quote an untruth (the myth). And make sure you stay away from the 'somewhere in the middle' swamp. In that swamp, people are presented with two opinions (the climate changes as a result of human action, and the climate does not change). Despite the fact that one opinion is held far more widely and is far better substantiated, your brain mainly hears 'there are two opinions'. Your brain's automatic reaction to that is: the truth will be somewhere in the middle.

Finally: give an alternative. People are much better at disbelieving something if they have an alternative that they can store in their brain in its place. Just think how much easier it is to believe that your son hasn't taken a biscuit out of the biscuit tin if you find out that your husband was hungry last night. That knowledge works much better than the sweet eyes of your child, who declares to have 'really done nothing'.

Back to climate change. How do you tackle that, as a myth piercer? John and Stephan have applied their rules to this specific myth, and wrote the following article:

The image is from the Debunking Handbook, where you can also find a larger version.

See how the fact is the title, not the myth. See the beautiful infographic. See the warning that explicitly announces that untrue information is coming. And finally, see the alternative explanation. All the ingredients of a true myth piercer, put together. Believe me, it works!

Oh, and do you know what also works (even if it feels a bit crazy)? Before you confront myth believers with the facts, first let them tell you what they are good at. The better people feel about themselves, the more easily they change their minds. Bizarre perhaps, but really true.

Conspiracy Theory

When you try to get pregnant, you see prams everywhere. When you try to lose weight, you see cakes everywhere. You see what you pay attention to.

I learned this when I was learning to ride a motorcycle. My instructor sighed in disappointment after yet another time I fell over because I had hit a curb or a traffic cone: you drive towards where you are looking.

Photo from www.verkeerstraining.nl; you can see that this person is indeed looking in the right direction.

Our head is built in such a way that we see and remember the facts and events around us best if they are in line with the worldview we already have. We have a desire for consistency: once we have formed an opinion, our brain no longer wants the continuous feeling that it has to do the work all over again. And so our brain prefers to pay attention only to those things that strengthen our opinion. As Charles Darwin wrote in his autobiography: “I have followed a golden rule for many years, namely that as soon as I came across a fact, a new observation or a thought that contradicted my theories, I had to write it down as soon as possible: for experience has taught me that I forgot those kinds of facts or thoughts much faster than facts or thoughts that supported my theory.”

Charles Darwin with another famous quote relevant to my research.

This trick of your brain that Darwin saw and described is called the confirmation bias. The confirmation bias has been described for hundreds of years, but is also increasingly being researched scientifically. Recently, researchers at the University of Amsterdam established that the confirmation bias occurs even in simple decisions where no status, identity or self-esteem is at stake. In the research they asked people which way a certain dot pattern was moving (do you remember what I said about seeing patterns in everything?). They asked this twice, in different studies, and what turned out to be the case? The people who decided in the first test that the dots moved mainly to the right saw this image confirmed in the next test. And the people who decided that the dots moved mainly to the left in the first test? You guessed it: they too thought that the second study confirmed their earlier opinion.

Scientists have been studying the confirmation bias for some time now. It was also mentioned in the research into an American football match in 1951 that had turned out to be incredibly rough. The game was rough, with many fouls and even injuries, and led to accusations between the two teams' colleges for weeks afterwards. The researchers asked supporters from both sides questions about the match. Again, the confirmation bias was crystal clear: both groups of supporters saw a totally different match. In the words of the researchers: “the ‘same’ sensory input from the football field, transmitted via the visual mechanisms to the brain, clearly gave a totally different experience to different people”.

Photo of the 1952 Super Bowl, Wikipedia

According to Stephan Lewandowsky, a psychologist who researches the belief in conspiracy theories, the confirmation bias plays an incredibly important role in polarising people's opinions. He calls it 'cherry picking': people choose the one scientific fact or the one personal experience that confirms their thesis. Think of: “Smoking is not bad for you, that's a fabrication by the government. My great-uncle smoked a packet a day all his life and he turned 98!” Or think of: “The climate doesn't heat up at all! How do I know that? Look at this snowball!”

The confirmation bias: our best friend in times of uncertainty. Our warm blanket of consistency. The director in our head who tells us where to look, what to see. Nice and clear and quiet.

But the confirmation bias is also our most dangerous opponent. Because it so often stands between us and other people. Because it prevents us from getting to know and appreciating other points of view. Because it pushes us onto the barricades and makes us shout insults on the internet.

So let's all be a bit more like Darwin: record those moments when your worldview wobbles. Write them down, photograph them, share them on Instagram. Let's celebrate together when our bubbles burst! (It also makes for very nice memes.)

The Doctor changes

I know, I still owe you a blog about a very important heuristic: the confirmation bias. The heuristic that makes us see and remember facts that fit our worldview better than facts that contradict it.

But first I just want to catch up. September is over, the first real month of active research for me. August was the month of reflection, of accepting that my subject is really worth investigating. Of feeling how absurdly difficult it can really be to change my mind.

In September I started to find out why this is so. My search led me further and further away from my home base as a sociologist, and further and further into the wonderful world of stories, brains and primates. I've learned a lot (and I haven't finished yet; most books and courses are only halfway through) about how your brain works. About how my brain works. About why I froze completely when I listened to someone telling me that antidepressants don't work. About why I think I have seen enough patterns, and know enough facts, to prove that they do work. The journey I made into the fields of neurology and psychology was very instructive and confrontational. Daniel Kahneman, Nobel Prize winner and author, even made me really unhappy at one point when he announced that he considers improving how you think, how your head works, an essentially impossible task.

Daniel Kahneman on TED

If that's true, then we're stuck with this brain, with these heuristics. If that is the case, then it isn't a matter of brain training to make it easier for people to change their minds. A conclusion that is still a bit too premature for me (what confirmation bias?), but I will tentatively let it 'roam around' in my head for a while.

A second excursion in September was into primatology. That's because I heard it said so many times (especially when I tried to find out how people think): it comes from our hunter-gatherer brain. What is that, a hunter-gatherer brain? And why does our past from millions of years ago still influence our current behaviour?

What this excursion taught me is the power of sharing food. And the power of stories. And above all: the power of groups, of socially connected living. The founders of sociology were our ancestors, who already needed each other to survive, and who had to learn and grow across generations to guarantee their existence. Why eating together and telling stories together was so central to this, I would like to come back to at another time. But for now it provides me with a very nice research thread for October or November.

Finally, in September I immersed myself in political philosophy. That's actually a result of my first research into how our brain works. How we think. That research led me to the tentative conclusion that we are better at changing our minds if we don't know we are doing it. If we use our heuristics. That quickly brings us to subjects like nudges, big data, Facebook and Cambridge Analytica. A 'detour' that caused me to do two important things in September: I quit Facebook, and I started following the Justice course at Harvard. Political philosophy, the basis of moral equality, is a complicated but fascinating subject. I'm far from finished with this either, but that's fine: the morality of the group brings me back to stories and sociology by another route. To what our society needs, to make it as easy as possible for us to still change our opinions in the 21st century.

September was a busy and beautiful month. October promises to be that too, but perhaps for other reasons. For example, I teach a lot in October, I have to complete a number of jobs and answer some urgent questions. 'Real' life pulls its strings on me this month. Keeping up with the research, and making time for it, will be a real challenge. But that's okay: I had to tackle that issue sooner or later, so bring it on! Because after just one month I know that this research is incredibly fun, interesting and also useful. I probably 'know' this because of all the heuristics and my hunter-gatherer brain, but still ☺

Oh, and the title of this catch-up blog? That is due to last night, when the BBC aired the first episode of the new Doctor Who series. With, for the first time in over 50 years, a female Doctor. A fact that has caused quite some controversy in recent months. In response, the new Doctor spoke the following encouraging words in yesterday's episode: “We are all capable of the most incredible change. We can evolve while still staying true to who we are. We can honor who we've been and choose who we want to be next.”


The Sting

I promised I would tell you a little more about my favourite fallacy: the Gambler's Fallacy. This fallacious reasoning, this shortcut in our heads, is rooted in the fact that we are so bad at dealing with coincidence (read Welcome to the Matrix for more about this subject).

The gambler's fallacy is easy to explain: imagine you're at the casino's roulette wheel. The last four times, the ball has landed on a red number. What do you do when you bet? You gamble on black, right? Because 'it's due'!

Another good example also comes from inside the casino. You see people standing there for hours and hours at the same slot machine. Because, as they say: this one is due a jackpot! What they mean is that the machine hasn't had a big payout for a very long time, so it's 'time' for the jackpot to fall.


As you all know from my previous blogs, coincidence really is coincidence, even if we don't want it to be. The chance that the ball will land on a black number is (a bit less than) 50%. The chance that the jackpot will fall is exactly as big as the slot machine has been set to (and trust me, that chance is not high). These odds only change if you adjust the machines. Not if you use them more often.
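Don't want to take my word for it? Let a computer play the game for you. Here is a minimal sketch in Python (the pocket layout is the standard European wheel; the number of spins is just a big number I picked) that compares the chance of black on any spin with the chance of black right after four reds in a row:

```python
import random
from collections import deque

# European roulette wheel: 18 red, 18 black and 1 green pocket (the zero).
POCKETS = ["red"] * 18 + ["black"] * 18 + ["green"]

SPINS = 1_000_000
last_four = deque(maxlen=4)  # remembers only the previous four outcomes
streaks = 0                  # how often we saw four reds in a row
black_next = 0               # ...and the very next spin landed on black

for _ in range(SPINS):
    outcome = random.choice(POCKETS)
    if list(last_four) == ["red"] * 4:
        streaks += 1
        if outcome == "black":
            black_next += 1
    last_four.append(outcome)

print(f"P(black) on any spin:        {18 / 37:.4f}")
print(f"P(black) right after 4 reds: {black_next / streaks:.4f}")
```

Run it and both numbers come out around 0.49. The wheel has no memory: the four reds change nothing.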

The tricky thing about this fallacy is that it is about chance, but also about statistics (the science of collecting and comparing numbers). And our head is not built for numbers (read more about that in Hans Rosling's Factfulness). We're bad at crunching numbers: we pull them out of proportion and turn them into something with an emotional, or at least normative, value (a lot! a little! dangerous!). While statistics collects and calculates, we feel and draw conclusions.

Take the teacher who compliments a student on a particularly high exam grade, only to see that this student does worse next time. While the student he told off for an incredibly bad grade does better next time. The teacher draws the (emotionally very logical) conclusion that students are more affected by punishments than by rewards. While statistics has probably had a greater effect on that second series of grades than his rewards and punishments did.

The effect of statistics on these two students works as follows: there is an average score for all students in an examination. A student with an exceptionally high grade has a higher chance of scoring closer to average next time (in the picture below, 68% of the students score around average) than of scoring such an exceptionally high grade again (16% of the students score exceptionally high). The student with an exceptionally low grade also has a higher chance of scoring closer to average next time (68% chance) than of scoring another exceptionally low grade. This regression to the mean is much less intuitive, but for the teacher it is probably a stronger predictor of future results than his reward or punishment policy.
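You can watch this happen in a little simulation. The sketch below (in Python; the 1-to-10 grading scale, the spread of abilities and the 'exceptional' cut-off of 9 are all numbers I made up for illustration) gives every student a stable ability plus some exam-day luck, and then checks how the top scorers of the first exam do on the second:

```python
import random

random.seed(42)
N = 10_000

# Made-up model: grade = stable ability + exam-day luck.
ability = [random.gauss(7.0, 0.8) for _ in range(N)]    # 1-to-10 scale
exam1 = [a + random.gauss(0.0, 1.0) for a in ability]
exam2 = [a + random.gauss(0.0, 1.0) for a in ability]

# Students with an exceptionally high grade on the first exam...
top = [i for i in range(N) if exam1[i] > 9.0]

avg1 = sum(exam1[i] for i in top) / len(top)
avg2 = sum(exam2[i] for i in top) / len(top)
print(f"Top students, exam 1 average:  {avg1:.2f}")
print(f"Same students, exam 2 average: {avg2:.2f}")
```

Nobody rewarded or punished these students between the two exams, and yet the group that scored above 9 drops back towards the average on the second exam. Their luck simply didn't repeat.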

Disclaimer: I am not trying to say that studying does not help improve your grade. That is the difficulty with statistics: its conclusions are always about larger groups and averages. And on average, the chance is simply higher that you get a grade somewhere in the middle of the curve.

Isn't it strange? It feels as if we're all suddenly living in a lab. Of course we all understand that when we toss a coin in the air many times, about half of the throws will be heads and half will be tails. But that we are coins ourselves, subject to the same laws of chance, the big averages and the probability calculations: that feels strange and unnatural. And yet that is the consequence of accepting that so much in our lives does not happen for a reason, but by coincidence.
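That coin is, by the way, the easiest of these experiments to try at your keyboard. A few lines of Python (the toss counts are arbitrary) show how wobbly small numbers of throws are, and how the share of heads settles towards one half as the numbers grow:

```python
import random

# Toss a fair coin n times and report the share of heads.
for n in (10, 100, 1_000, 10_000, 100_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>7} tosses: {heads / n:.3f} heads")
```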

So the next time you think 'it must go well/badly now, because the previous couple of times…', stop yourself and smile at yourself in the mirror. Take a coin from your wallet and toss it, just to make visible that what will happen that day depends to a large extent on chance. Of course that's scary, but secretly it's also quite liberating: after all, if there is that much coincidence around you, not everything is entirely up to you anymore. Kinda nice, right?

For lots more on statistics, and how to use it in the decisions you make, visit the website of poker champion and science fan Liv Boeree. Or, if you just want to spend six more minutes, watch her TED talk on '3 lessons on decision-making from a poker champion'.