Darwin in the Crib

Last week my editor at the New York Times asked me to write an article about the evolution of crying, to accompany an article by Sandra Blakeslee on colic. Both articles (mine and Blakeslee’s) are coming out tomorrow. As I’ve written here before, human babies are by no means the only young animals that cry, and there’s evidence that natural selection has shaped their signals, whether they have feathers or hair. Among animals, there’s a lot of evidence that infants can benefit from manipulating their signals to get more from their parents. On the other hand, evolution may sometimes favor "honest advertisements" that prevent offspring from deceiving their parents. Human crying may be the product of the same conflict of evolutionary interests between parents and children.

This was a tricky article to write, because on the one hand there are some very interesting ideas to examine, but on the other hand, they’re only hypotheses that haven’t been put to much of a test in humans. I’ve come across two big papers in the past couple years, this one by Jonathan C.K. Wells in the Quarterly Review of Biology in 2003 and another by Joseph Soltis in the latest issue of Behavioral and Brain Sciences. They offer and evaluate a number of hypotheses for human crying. They even give some thought to colic, that maddening far end of the crying spectrum where perfectly healthy babies cry for hours, turning their parents into shambling wrecks. According to one hypothesis, colic is just a case of deceptive signals from child to mother, carried to an absurd extreme.

These are just preliminary hypotheses, though, and they face a lot of tough tests. As I mention in the article, chimpanzees show no sign of colic, which makes you wonder how deep the evolutionary roots of colic could go if it is not found among our closest living primate relatives. What I didn’t have room to mention in the article were some comments published in response to Soltis’s paper in Behavioral and Brain Sciences by Hillary Fouts of NIH and her colleagues. They study foraging societies in Africa, and in their years of observing how these people raise kids, they haven’t seen any colic either.

One way to account for this pattern is the possibility that colic is a disease of affluence–an adaptation turned maladaptive in the modern age, like a taste for sweets that was once satisfied by fruits and can now be drowned in a sea of high-fructose corn syrup. Wells even suggests that the modern Western food supply may have cut down the cost of crying, making it easier for kids to cry more. In foraging societies, mothers nurse their children up to four times an hour, while mothers in farming and industrial societies nurse their babies far less. Babies also cry to be held (perhaps for warmth and protection from attack), and while foragers hold their babies constantly, Westerners keep their babies separated from them much of the time in cribs, carriages, and car seats. Wells suggests that when a colicky baby sends its cranked-up signal and doesn’t get the right response, it cranks up even more.

Again, this is only a hypothesis–a starting point for investigation. Hillary Fouts and her colleagues show what this sort of investigation can look like. In the latest issue of Current Anthropology, they report on a study about the end of crying, comparing how babies respond to weaning in two cultures. Both cultures are found in the same rain forests of the Central African Republic. One group lives as foragers, the other as farmers. The foragers nurse their children many times a day and wean them by gradually tapering off nursing. The farmers, on the other hand, cut off their children abruptly–in part because the women need to get back to working in their fields.

Fouts and her colleagues found that the farmer children fussed and cried a lot around the time of weaning, while the forager children didn’t show much difference. But the researchers kept following the children and found something interesting: the farmer children stopped fussing before long and then cried a lot less in general. The forager children, on the other hand, kept crying more than the farmer children long after they had been weaned.

Fouts and her colleagues see a subtle strategy at work here. The farmer children may cry in response to weaning because it represents the end of a reliable milk supply, and perhaps even because weaning raises the odds that their mothers will get pregnant with another child that will compete for the mother’s investment. But once the farmer children are weaned and it is clear that their cries will not do them any more good, they don’t waste any further effort on the tears.

The forager children, on the other hand, never get that clear signal of an impending cut-off, and so they don’t fuss and wail more in response to weaning. But it’s also important to bear in mind that in the foraging community, the children are always around some relative who will be quick to pick up a child. So even after weaning, crying still has some value as a signal, and so the children keep it up.

What I find particularly interesting about this study is that it suggests that we shouldn’t use evolution to manufacture a false sense of nostalgia. Just because our ancestors lived in a particular way doesn’t mean that the way we live now is automatically bad. Our evolutionary heritage is not completely fossilized; it can in some respects alter itself in response to the conditions in which we grow up. If colic follows this pattern, it is not a cause for collective Western guilt that we don’t live as foragers. Instead, it’s a call to understand the evolutionary roots of the behavior of our children–both for their well-being and our own sanity.