It is the end of a dangerous moral experiment
Published in The Times on 7th July 2012
The banking scandals, rate fixing and resignations may have a silver lining if they awaken us to a fact about which we have been in denial for decades.
Morality matters. Not just laws, regulations, supervisory authorities, committees of inquiry, courts, fines and punishments, but morality: the inner voice of self-restraint that tells us not to do something even when it is to our advantage, even though it may be legal, and even if there is a fair chance it won’t be found out. Because it’s wrong. Because it’s dishonourable. Because it is a breach of trust.
We are reaching the endgame of a great experiment that didn’t work: society’s attempt to live without a shared moral code. The 1960s applied this to private life. The 1980s applied it to the market. It was the age of deregulation and faith in the power of exchange. Hadn’t Adam Smith convincingly shown that the market, by the alchemy of “the invisible hand,” turned the pursuit of self-interest into collective gain? Smith never said that greed is good, but some of his followers did. To which, after a succession of scandals that has shaken the financial system and brought the economy to its knees, we are entitled to say, “Up to a point.” In fact, with little fanfare, a discovery in the early 1950s had already undermined this central premise of classical economics: that self-interest, left to itself, reliably serves the common good.
It emerged from one of the most brilliant minds of the twentieth century, John von Neumann. Von Neumann was a mathematician and physicist, but he was also the son of a banker who had a habit of discussing the day’s business over the dinner table. This was enough to tell von Neumann that key decisions in banking and finance didn’t work the way economic theory said they did. They didn’t follow abstract computations of profit and loss. Whether a decision was good or bad depended on how others responded to it, and you could not predict that in advance. To help make decisions under conditions of uncertainty, von Neumann invented a new discipline: game theory.
Game theory gave rise to a famous puzzle known as the Prisoner’s Dilemma, which shows that two or more rational agents, each acting in their own self-interest, will produce an outcome that is bad for all of them, individually and collectively. This was to classical economics what Einstein was to Newton. It proved that there are things the invisible hand can’t handle.
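The logic can be made concrete with a standard payoff table. The following is a minimal sketch, with illustrative payoff numbers of my choosing rather than anything from the article: each side does better by defecting whatever the other does, yet mutual defection leaves both worse off than mutual cooperation would have.

```python
# Illustrative one-shot Prisoner's Dilemma (higher payoffs are better).
# Each entry maps (A's move, B's move) to (payoff to A, payoff to B).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),   # mutual trust: both do well
    ("cooperate", "defect"):    (0, 5),   # the trusting party is exploited
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),   # mutual self-interest: both do badly
}

def best_reply(opponent_move):
    """Return the move that maximises A's payoff against a fixed opponent move."""
    return max(["cooperate", "defect"],
               key=lambda move: PAYOFFS[(move, opponent_move)][0])

# Defection is the best reply whatever the other side does...
assert best_reply("cooperate") == "defect"
assert best_reply("defect") == "defect"

# ...yet mutual defection leaves both worse off than mutual cooperation.
assert PAYOFFS[("defect", "defect")] < PAYOFFS[("cooperate", "cooperate")]
print("Rational self-interest yields (1, 1); trust would have yielded (3, 3).")
```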
The key variable turns out to be trust. With it, the market economy works. Without it, it fails. The choice is simple. Either you have a trust economy or a risk economy. In the first, you can rely on people to act with due regard to the interests of those they serve. In the second, you depend instead on a structure of laws, regulations, supervisory authorities, contracts, courts, punishments and fines. Transaction costs are high. Even so, ingenious people will find ways of outwitting the most elaborate regulations. Without trust, self-interest defeats regulations, undermines institutions and eventually causes systems to collapse.
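Game theorists’ standard answer to why trust changes this arithmetic is repetition: when the same parties deal with each other again and again, cooperation can be sustained by reciprocity. The sketch below is illustrative only, reusing the payoff numbers above, with the well-known “tit for tat” strategy standing in for trust backed by reputation.

```python
# Minimal sketch (not from the article): the same game, repeated.
# "Tit for tat" cooperates first, then mirrors the other side's last move.
PAYOFFS = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
           ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(strategy_a, strategy_b, rounds=100):
    """Total payoffs when two strategies play the game repeatedly."""
    history_a, history_b = [], []
    total_a = total_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)  # each strategy sees the other's history
        move_b = strategy_b(history_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        total_a += pay_a
        total_b += pay_b
        history_a.append(move_a)
        history_b.append(move_b)
    return total_a, total_b

tit_for_tat = lambda opp: "C" if not opp else opp[-1]
always_defect = lambda opp: "D"

print(play(tit_for_tat, tit_for_tat))      # (300, 300): the trust economy
print(play(always_defect, always_defect))  # (100, 100): the risk economy
```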
But aren’t most people trustworthy? Not according to research published by behavioural economist Dan Ariely in his recent book, The (Honest) Truth about Dishonesty. His essential finding is that most of us are willing to cheat, given the temptation and opportunity. We’re just not willing to admit that we do. We cheat just a little, enough to pass unnoticed and to convince ourselves that we aren’t really being dishonest. After all, we say, everyone would do likewise given the chance.
The key, says Ariely, is the “fudge factor.” We want to benefit from cheating, but we also want to view ourselves as honest, honourable people. We resolve the conflict by “our amazing cognitive flexibility” – academic-speak for self-deception. He illustrates it with a simple story. Eight-year-old Jimmy comes home from school with a note from his teacher saying, “Jimmy stole a pencil from the student sitting next to him.” His father is furious. “If you needed a pencil, why didn’t you ask? I could have brought you dozens back from work.” We notice other people’s dishonesty while remaining blind to our own.
Ariely and his academic colleagues found that the “fudge factor” is greatest when there is a distance between act and consequence, when there are grey areas, and when we have financial incentives to act against the interests of clients. We are more likely to cheat when stressed or exhausted. The more creative we are, the greater our ability to find self-justifying reasons for bad behaviour. We believe our own fictions (the Harvard sociologist David Riesman once defined sincerity as “believing your own propaganda”). Dishonesty is contagious: seeing colleagues cheat makes us more likely to do so ourselves. Most tempting of all, says Ariely, is “altruistic” cheating. If we can persuade ourselves that an act of dishonesty is for the good of our colleagues, even the best can go bad. Many of these factors were present in the Libor rate-fixing affair.
How do you change a corporate culture? You need to go beyond codes of conduct, says Ariely. He and his team tested students from two universities. Those at the first were asked at the outset to sign an agreement that they would abide by their university’s code of honour; those at the second weren’t. Predictably, the second group cheated and the first did not. The irony is that the first university didn’t have a code of honour, while the second did. What matters, says Ariely, is not the code but the constant reminder.
The vast rewards, skewed incentives, high pressure and extreme opacity of modern finance combine the maximum of temptation with the maximum of opportunity. We have, it seems, an impressive capacity for bending the rules in our favour while telling ourselves we are doing nothing wrong. But the market economy needs trust, and without it, it will fail. Essential though legislation and regulation are, they are not enough. Trust depends on virtues of self-restraint, embedded in a culture, embodied by its leaders and embraced by individuals. Until morality returns to the market, we will continue to pay a heavy price.