Most people do not regard hypocrisy as a case of inconsistent behaviour, experimental social psychologist Daniel Effron says.
Effron, a professor of organisational behaviour at the London Business School, designs experiments to try to better understand how people think about moral and immoral behaviour.
“What I'm really interested in is not so much how hypocrisy should be defined, but how the average person thinks about hypocrisy,” he told Kathryn Ryan.
His research suggests that people are bothered not so much by a person's failure to practise what they preach as by a person trying to appear more virtuous than they deserve to.
This has darker implications in the political domain, he says, where people's perceptions of behaviour are skewed by their political affiliations.
“There was this Republican, a number of years ago, named William Bennett, who framed himself as a family-values conservative.
“He edited a volume called The Book of Virtues and he was a drug czar for a while, anti-illegal drugs and helping to enforce those laws.
“And it turns out he was in debt for almost a million dollars because he had a gambling addiction.”
Democrats, he says, saw this as straight-out hypocrisy; Republicans not so much.
“Republicans said, well, that's not hypocrisy, there's not even any inconsistency here. He's preaching about drugs and he's practising gambling; drugs are illegal, gambling is not.
“There's no contradiction here. He's not pretending to be more virtuous than he is.”
When it comes to lies more generally in politics, his research suggests there are three factors that influence how people react to them.
One is our motivation, he says.
“If you like what the person is saying, if you like the lie, even if you recognise it as false, it doesn't bother you as much and you're more willing to let the person off the hook.”
The second factor, he says, is repetition.
“It turns out that if you hear the same lie again and again and again, even if you never believe it, you stop caring. It doesn't seem quite as unethical to tell the lie.”
The third factor, Effron says, is imagination.
“As humans we have this amazing capacity to imagine the world as it might have been, or as it might become, and that's really useful for planning for the future or learning from our past mistakes.
“But it also has a dark side, this capacity for imagination. And that's when we imagine that a lie could have been true, or it might become true in the future.”
His research shows we can become desensitised to dishonesty when we hear the same lie repeatedly.
“In one study, with some collaborators at Vanderbilt University in the US, we had 100 volunteers sign up to receive text messages on their smartphones over the course of a week.
“At random intervals we texted them made-up news headlines about various corporate scandals, things like a company injured a monkey doing cosmetics testing, or a flight attendant slapped a crying baby in the face during a flight.”
After two weeks the participants were asked, “a really simple question”.
“We said how unethical do you think the behaviour is that this headline describes?
"So, for example, how unethical is it to slap a crying baby in the face on a crowded flight?
“Now, it turns out, luckily, everyone thought that these behaviours were unethical, because they were pretty severe.
“But the scary thing is that they thought the behaviours were a little bit less unethical if we had described them in a headline they had seen lots of times over the past couple of weeks, compared with a headline they had never seen at all, or had seen only a small number of times.”
The conclusion – repeated exposure to a description of wrongdoing makes it seem less unethical, he says.
His strong suspicion is that the social media age is supercharging this phenomenon.
“I strongly suspect that this moral desensitisation effect has gotten worse. And the simple reason is that we are inundated with information, moment to moment throughout our days in a way that we humans have never been before.”