Our lives are governed by both fast and slow – by quick, intuitive decisions based on our gut feelings; and by deliberate, ponderous ones based on careful reflection. How do these varying speeds affect our choices? Consider the many situations when we must pit our own self-interest against the public good, from giving to charity to paying taxes. Are we naturally prone to selfishness, behaving altruistically only through slow acts of self-control? Or do we intuitively reveal our better angels, giving way to self-interest as we take time to think?
According to David Rand from Harvard University, it’s the latter. Through a series of experiments, he has found that, on average, people behave more selflessly if they make decisions quickly and intuitively. If they take time to weigh things up, cooperation gives way to selfishness. The title of his paper – “Spontaneous giving and calculated greed” – says it all.
Working with Joshua Greene and Martin Nowak, Rand asked volunteers to play the sort of games that economists have used for years. They have to decide how to divvy up, steal, invest or monopolise a pot of money, sometimes with the option to reward or punish other players. These games are useful research tools, but there’s an unspoken simplicity to them. Sure, the size of the payoffs or the number of rounds may vary, but the experiments assume that people play consistently, according to their personal preferences. We know from personal experience that this is unlikely to be true, and Rand’s experiments confirm as much. They show that speed matters.
Rand started with a simple public goods game, where players decide how much money to put into a pot. The pot is then doubled and split evenly among them. The group gets the best returns if everyone goes all-in, but each individual does best if they withhold their money and reap the rewards nonetheless.
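The game’s arithmetic can be sketched in a few lines of Python. The article only specifies that the pot is doubled and split evenly; the player count and endowment below are illustrative assumptions:

```python
def payoff(contributions, endowment=1.0, multiplier=2.0):
    """Each player's payoff in a one-shot public goods game.

    Players keep whatever they withhold; the communal pot is
    multiplied (doubled, in the version described here) and
    split evenly among all players.
    """
    n = len(contributions)
    share = multiplier * sum(contributions) / n
    return [endowment - c + share for c in contributions]

# Four players all go all-in: everyone doubles their money.
print(payoff([1.0, 1.0, 1.0, 1.0]))  # [2.0, 2.0, 2.0, 2.0]

# One free rider among three contributors does better individually,
# even though the group as a whole earns less overall.
print(payoff([0.0, 1.0, 1.0, 1.0]))  # [2.5, 1.5, 1.5, 1.5]
```

With more than two players, each unit contributed returns only 2/n to the contributor, which is why withholding is always the individually “rational” move even as it shrinks the group’s total.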
Rand recruited 212 people for the experiment using Amazon’s Mechanical Turk, an internet marketplace where people can outsource tasks to worldwide volunteers. The Turk provided two advantages: the volunteers were more diverse than the W.E.I.R.D. (Western, Educated, Industrialised, Rich and Democratic) undergraduates who normally take part in psychological studies; and Rand could measure how quickly they made their decisions. With that data, he found that players contributed around 67 percent of their money if they made decisions within 10 seconds, but only around 53 percent if they took longer.
Rand also went back to four of his earlier studies, where he had recorded reaction times (including one I’ve written about before). All of these involved college students, but they used different economic games. Nonetheless, all showed the same link between faster decisions and more cooperative choices. “Although the cold logic of self-interest is seductive, our first impulse is to cooperate,” Rand writes.
Of course, these results are just correlations. Does rapid-fire decision-making actually foster cooperation, or are selfless decisions just quicker to make? To find out, Rand recruited more volunteers through the Mechanical Turk, and got them to play another public goods game. This time, he manipulated the speed of their decisions—he either told them to choose quickly, or asked them to write about a time when their intuitions served them well or careful reasoning led them astray. Under both conditions, the volunteers made faster choices, and contributed more money to the communal pot. If, however, they were told to decide slowly, or to write about times when reflection beat intuition, they stumped up less money.
From these results, it’s tempting to conclude that cooperation is somehow “innate” or “hardwired” and that selfishness is somehow imposed upon these predispositions. But Rand points out that our intuitions are also shaped by our daily lives. In so many of our choices, cooperation is the sensible call; if we cheat, we may be punished, lose our reputation, or deny ourselves the future goodwill of those we wrong.
So, when volunteers take part in the experiments, “their automatic first response is to be cooperative,” Rand writes. “It then requires reflection to overcome this cooperative impulse and instead adapt to the unusual situation created in these experiments, in which cooperation is not advantageous.” He found two lines of support for this idea when he surveyed his volunteers: The link between fast-thinking and charity only held for people who said that their daily lives were mostly filled with cooperative interactions; and it only held for those who hadn’t taken part in similar experimental games before.
“This shows how it’s difficult to consider experimental play in isolation from things outside the lab or as completely determined by the game’s monetary payoffs, as a lot of economists do,” says Anna Dreber Almenberg from the Stockholm School of Economics, one of Rand’s former colleagues.
Obviously, all of these results are averages, and the individuals in the study varied greatly. Some people hardly cooperated at all, regardless of how quickly or slowly they thought. Others took a long time, and still erred on the side of selflessness. “We weren’t able to find any traits that differentiated these people from everyone else, but it’s something we are quite interested in exploring more,” says Rand. And more importantly, no one showed the opposite trend—no one cooperated more when they made reflective rather than intuitive decisions.
Rand now wants to find out more about how the link between decision speed and cooperation varies between individuals, and across different cultures. He also wants to understand how these trade-offs play out in real settings, and he suspects that “fact-based rational pitches may be less effective than more emotional appeals, if you are trying to get other people to be cooperative.”
Reference: Rand, Greene & Nowak. 2012. Spontaneous giving and calculated greed. Nature http://dx.doi.org/10.1038/nature11467
Image by Marcusrg