On April 17, 1997, Bill and Hillary Clinton organized a one-day meeting with a long and lofty title: The White House Conference on Early Childhood Development and Learning: What New Research on the Brain Tells Us About Our Youngest Children.
The meeting featured eight-minute presentations from experts in public policy, education and child development, and one neuroscientist. They discussed, among other things, how 6-month-old infants learn to discriminate the sounds of their native language, and how, if a kitten’s eye is patched during early development — and therefore deprived of light inputs — it will go permanently blind in that eye, even after the patch comes off. The First Lady gave the gist of the meeting in her opening remarks: The first three years of life, she reportedly said, “can determine whether children will grow up to be peaceful or violent citizens, focused or undisciplined workers, attentive or detached parents themselves.”
Behind the hyperbole of that statement is an important idea based in solid science. The first few years of life are a “critical period” for brain development, during which experiences — strong parental attachments, exposure to written and spoken language, social interactions — sculpt brain circuits in a way that’s difficult to un-sculpt. When a developing brain isn’t adequately stimulated, as often happens to children living in poverty, for example, or in the foster care system, this deprivation can lead to problems in cognition, attention and social behaviors.
The conference spurred a media frenzy. “Suddenly, every magazine and newspaper is saying, ‘Oh my god, life ends at the age of 3 when the critical period ends’,” recalls Charles Nelson, a developmental neuroscientist at Harvard. You might think that all the attention on critical periods would have led to more research on disadvantaged children. To some extent, it did (more on that later), but the vast majority of the public discussion went toward the other end of the socioeconomic spectrum.
America’s new obsession with the critical period launched a cottage industry of educational materials — such as the well-known Baby Einstein/Baby Mozart brand — which were endorsed and distributed by several nonprofits and state governments. The governor of Georgia, Zell Miller, even convinced hospitals to give out classical music cassette tapes to all new parents, with instructions to play them for their newborns. All of it was based on an untested premise: If depriving a baby during the critical period leads to terrible psychological outcomes, then giving her extra stimulation (or “enriched environments”) should lead to a super-duper brain.
There was (and still is) little science to back this up, and Nelson and other researchers did the best they could to clear up misconceptions. A few years ago, after a study came out showing that children who watch Baby Einstein videos actually do worse at learning words, the company stopped claiming its materials have educational benefits*. Yet many parents are still being told that enriched environments — whether colorful mobiles, “sensorial materials,” car seat galleries, or horseback riding lessons — spur early brain development.
That well-intentioned parents may be wasting money on a lot of shiny toys doesn’t exactly keep me awake at night. What’s disappointing is that the enrichment meme seems to have overshadowed the real lesson of the research on critical periods: that poverty and child neglect often have devastating and long-lasting effects on the brain.
Two studies published in the past week, for example, have shown that children who experienced severe neglect, abuse, or injury in childhood (even after age 3, by the way) have abnormal brain wiring when they hit adolescence.
The first report, published by Nelson and his colleagues in the Proceedings of the National Academy of Sciences, was part of a 12-year study tracking the fates of 136 Romanian orphans, some of whom were raised in state-run institutions and others in foster care families. Around age 8, children who grew up in institutions had less white matter, the tissue that links up different brain regions, compared with those raised in families, the study found.
Some may think of this as an unfortunate, though unsurprising, reality of life in a post-Communist country — a tragic story for Romania, but not particularly relevant to us here in the U.S. Not true. Nelson points out that the defining element of institutional living, the absence of invested caregivers, is also what happens to many children in poverty. The work in Romania, he says, “is a wake-up call to the millions of children in the U.S. who are living in circumstances that are only marginally better than kids living in institutions.”
This idea is bolstered by the second new study, published yesterday in Neuropsychopharmacology. Researchers in Texas scanned the brains of adolescents who had experienced neglect or abuse before age 10. These kids had weaker white matter tracts in adolescence compared with peers who didn’t experience early adversity. What’s more, the adolescents with deficits in brain connectivity were more likely to be dealing with depression or substance abuse five years later.
When it comes to the reception of this research in yuppie circles, how much responsibility falls on science journalists? Whenever I pitch a story, an editor is bound to ask me about its relevance, or “take-home message,” for the publication’s readers. This is legitimate; of course I want my readers to be interested. But I suspect that sometimes, in framing a story for a targeted demographic, its message for those outside the bubble is lost. I can’t help but use the obvious metaphor. There seems to have been a critical period for reporting on critical period research, and the misinterpretations sculpted during that window are difficult to un-sculpt.
*Baby Einstein contests the claims of this study.
Photos courtesy of Charles Nelson
This post was originally published on The Last Word on Nothing