Daniel Levitin on how to spot a lie

In his latest book, A Field Guide to Lies and Statistics, neuroscientist Daniel Levitin shows how an understanding of statistics can help us to cope with the information that swamps our lives.

Published: January 19, 2017 at 12:00 am

Is the amount of misinformation in the world increasing?

Absolutely. Not just in raw numbers, but also as a proportion of the [total] information.

I think digital technology and the web have given a platform to people who otherwise wouldn't have had one. It reminds me of my childhood in 1950s America, when every neighbourhood had a nutjob with a hand printing press. They would create their own little newspapers about the terrible conspiracies that were going on, handing them out or leaving them on people's doormats. You'd know just by looking that they weren't the work of a mainstream journalist.

Now some teenager in Macedonia can make a website look as real as the BBC. The internet is a double-edged sword: we have access to ideas we wouldn't otherwise encounter, but that includes fringe theories, fake news and lies.

Isn't it too ambitious to prepare readers for any form of deception they might encounter?

Critical thinking is a dynamic process. I can teach you to identify one kind of distorted claim, but there’s an arms race between the liars and the truth-tellers. They’re always coming up with new ways to deceive! This book puts people in a mindset where they ask “How do I know this is true?”. Part of critical thinking is asking “Who’s a reliable source?” and “What makes a source reliable?” It’s a process I’m trying to teach, not a template.

How often are we lied to?

It depends on where you live, what you do and where you get your news. If you get all your news from Facebook, you’ll be lied to a lot more than someone who reads a mainstream newspaper. There are hierarchies of information sources, and we can’t treat all sources as equivalent. If a journalist in Syria says there’s been a gas attack by the Syrian government, I trust that more than someone with no experience of the situation.

Which is more tainted by untruth in our daily lives: words or numbers?

I haven’t done a study to quantify it, but my hunch is that it will vary. Many people think numbers are facts – they’re not. They’re collected by people who have their own biases and limitations, and then interpreted and contextualised; errors can creep in at any stage. The problem is that people encounter a number and think it must be true. I’m fascinated by framing effects – you can get people to believe all kinds of things if you frame them correctly.

What are some examples of framing effects?

In medicine, for example, if you say a procedure has a 70 per cent chance of saving you, people are more likely to choose it than if it has a 30 per cent chance of killing you. That’s framing.

Averages are also a kind of framing and can be misleading. Saying one in two marriages will end in divorce doesn't mean that figure will apply in your office, because multiple factors contribute to divorce. You have to be careful about the group you're looking at.

Averages can sometimes give you the illusion of understanding something when actually it’s quite subtle and nuanced. In my book I give the example that, on average, humans have one testicle, even though that’s not representative.

What are your top three tips for spotting a lie?

When looking at information, three things to think about are plausibility, source and specificity. For plausibility, you’re asking if the claim is even possible. A taxi driver once told me there are 17 billion people without internet. There aren’t that many people in the world! For source, we might ask who collected the data – do they actually have access to that information? In terms of specificity, if someone tells you “Crime has dropped by 30 per cent in Manchester” you need to ask “What kind of crime?”. Perhaps violent crime actually increased, but the police were so busy pursuing it they stopped giving out tickets for minor misdemeanours.

Without naming names, who lies more: politicians or the media?

I don’t think that the mainstream media lies. Some politicians lie, cherry-picking facts or being very careful about how they say things to give an impression that will please the greatest number of people. The journalists I know and interact with are underpaid and have no motivation to lie. They went into the field for truth, not for glory or riches. In that way they’re like scientists, who do the work because they find it intellectually stimulating. Of course there are journalists and scientists who do lie and they undermine the whole process.

If it causes no direct harm, is it ever OK to mislead the public for the greater good?

I’m not qualified to talk about the role of lying in society, but for issues of national security I think it can be necessary for public figures to lie. You don’t talk about a stealth operation in another country or about undercover agents if it will endanger lives. There’s a delicate balance between what the public has a right to know and what they don’t. It’s a constant struggle and the press plays a critical role.

On a personal note, I think lies are sometimes necessary. You don’t tell Aunt Tilly you don’t like her new hat because you don’t want to hurt her feelings, and you try to clear your plate at a fancy dinner because you don’t want to hurt the host’s feelings, even though the food tasted like shoe leather. These little niceties are the glue that keeps society running.

What’s your guiltiest irrational habit?

Like everybody, I'm prone to faulty reasoning. One of the big ones is that a vivid story will loom larger in my decision-making process than actual statistical data. I have to fight that tendency; it's a constant effort. Making sense of an irrational world is a constant effort too. It's not something you just do once in a while – it's something that needs to be practised and exercised like a muscle.

How can we be sceptical, but not cynical?

To me, scepticism is not rejecting every claim you encounter – it’s being open-minded and trying to understand where a claim comes from and what backs it up.

As scientists we try to hold our emotions and preconceptions at bay. Much like a police investigator, you collect evidence and hold off the decision-making process as long as possible. But evidence is always ambiguous – it rarely points to one conclusion or totally excludes another. You end up with a mound of evidence that is weighted to one side or the other, and at some point you form a conclusion. I’m not advocating for a world without emotions, but rather to keep them at bay so they don’t cloud the decision-making process. Once you make a decision, emotion can help rally you to take action based on that decision.

A Field Guide to Lies and Statistics by Daniel Levitin is out on 26 January (Viking, £14.99)