“Consider The Opposite”

A few weeks ago, I listened to a very interesting episode of “Hidden Brain.” I listen to the show almost every Sunday on WBEZ, and I know that most of its episodes are great and discuss unexpected topics. However, that one was especially interesting. I liked it so much that I listened to it one more time to make sure I captured all the details.

In this episode, Shankar Vedantam talks to the economist Alex Edmans, the author of the book May Contain Lies: How Stories, Statistics and Studies Exploit Our Biases and What We Can Do About It. The episode is titled “When the Truth Lies,” and you could say the topic is confirmation bias, but actually, it’s way more than that. Alex Edmans says that we often think about misinformation as something spread by our enemies; however, quite often, we are our own enemies, and we feed false information to ourselves. He talks about “data mining.” I only knew the term in its technical sense, but Alex Edmans defines data mining the following way:

Data mining is that you start with a preferred result that you want to find, and then you mine the data, you run the data in so many different ways until you get a positive result. 
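Since data mining in this sense is really just running many tests until one of them looks good, here is a small Python sketch (my own illustration, not from the book or the podcast) of how testing enough slices of pure noise will eventually produce a “significant” result you can report:

```python
# A sketch of data mining in Edmans' sense: generate pure noise,
# test many hypotheses, and report only the one that looks "significant."
import random
import statistics

random.seed(1)

def fake_study(n=50):
    """Two groups of random numbers with no real difference between them."""
    group_a = [random.gauss(0, 1) for _ in range(n)]
    group_b = [random.gauss(0, 1) for _ in range(n)]
    return group_a, group_b

def t_like_score(a, b):
    """A rough two-sample t statistic; large absolute values look 'significant'."""
    mean_diff = statistics.mean(a) - statistics.mean(b)
    pooled_se = ((statistics.variance(a) / len(a)) +
                 (statistics.variance(b) / len(b))) ** 0.5
    return mean_diff / pooled_se

# "Run the data in so many different ways": here, 100 different tries.
scores = [t_like_score(*fake_study()) for _ in range(100)]
best = max(scores, key=abs)
print(f"Best-looking result out of 100 tries: t = {best:.2f}")
# With enough tries, some |t| will exceed 2 purely by chance;
# report only that one, and the noise looks like a finding.
```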

He talks about diversity in the company workforce and the inability of research to find a direct link between diversity and company performance, even though he personally supports diversity:

Absolutely, and this is irrespective of your personal views on the topic. So personally, I believe that diversity is important. It’s an important thing. As an ethnic minority, it’s something which is dear to me. But I believe that as a scientist, you should be like an expert witness in a criminal trial. Your role as a scientist is to state the evidence, just like your role as a witness is to state the evidence clearly irrespective of your views of the issue.

Alex Edmans challenges several statements that the majority of us consider solidly proven, like the 10,000-hour rule or the importance of breastfeeding for the child’s brain development (you can read all the details by following the link above). But then the question comes: how can we combat these confirmation biases? Edmans suggests using the “consider the opposite” rule. Here is how he explains it:

So the consider the opposite idea is to try to get around this problem of confirmation bias. So again, what is confirmation bias? We latch on to something uncritically if it confirms what we want to be true, and we reject something out of hand if we don’t want it to be true. So why is this interesting? Because what it means is that we are able to show discernment. If there’s a study that we don’t like, we can come up with a whole host of reasons for why it’s unreliable. And so what I’m doing with the Consider the Opposite Rule is to try to activate the discernment that we already have and we use selectively for studies that we don’t like, but now apply it to studies that we do like. So maybe just by giving an example, this will come to life. So let’s say I want an excuse after finishing this podcast to drink loads of red wine. So I might look up on Google why red wine is good for your health, and I find studies that people who drink red wine live longer. But consider the opposite. We’ll ask, what if I found the opposite result? People who drink red wine live shorter. How would I try to attack that result? I might say, well, maybe people who drink red wine are poor. They can’t afford champagne. They have to drink red wine instead. And it’s that poverty which leads to a shorter life, not the red wine. Well, but now I’ve alerted myself to the alternative explanation of income being the driver. Then I should ask, is this the driver of the result that I do want? Again, red wine is correlated with longer life. Is it the case that the people who can afford red wine are richer and it’s their wealth that leads to the longer life? So the idea of considering the opposite is to trigger the discernment that we exercise selectively and make sure it’s now universal.
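The red wine example is really about a hidden confounder. Just to make the mechanism concrete, here is a tiny simulation (my own sketch, not from the episode) where wealth drives both wine drinking and lifespan, wine itself has no effect at all, and yet wine and lifespan still come out correlated:

```python
# Sketch of a confounder: wealth raises both wine consumption and lifespan.
# Wine itself has zero effect, yet it correlates with longer life.
# (statistics.correlation requires Python 3.10+.)
import random
import statistics

random.seed(7)

people = []
for _ in range(10_000):
    wealth = random.gauss(0, 1)
    wine = 0.8 * wealth + random.gauss(0, 1)           # richer people drink more wine
    lifespan = 75 + 3.0 * wealth + random.gauss(0, 5)  # wealth, not wine, extends life
    people.append((wine, lifespan))

wines = [p[0] for p in people]
lives = [p[1] for p in people]
print(f"correlation(wine, lifespan) = {statistics.correlation(wines, lives):.2f}")
# Prints a clearly positive correlation even though wine never enters the
# lifespan formula. "Consider the opposite" is what prompts you to go
# looking for this kind of alternative explanation.
```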

Another technique is exercising your curiosity:

So this is also interesting because often we think that just general knowledge is perhaps a way to avoid misinformation because the smarter we are, the more able we are to separate the wheat from the chaff. But unfortunately, this is not the case. There are some studies which actually suggest that knowledge makes things worse because the more sophisticated we are, the more intelligent we are, the easier it is for us to slam evidence we don’t like, and to come up with reasons for why we don’t want to believe it. But again, we deploy this only selectively. We don’t deploy this to the evidence that we do like. So even if knowledge doesn’t work, well, actually curiosity does. So there was a study which looked at the effect of knowledge, found it had no effect, but curiosity did have an effect. So these researchers found that the more curious you were, the more balanced you were on issues such as climate change. In particular, your views on climate change were less linked to your political affiliation. So you were going based on the evidence, not based on your identity.

And now I want to read this book!
