Can you grow new brain cells? The debate could reveal the secret of superagers.
A pair of new studies have provided fresh evidence in the long-running scientific debate—and the result could be game-changing for treating diseases like Alzheimer’s and dementia.

Ever since the dawn of modern neuroscience, there's been one seemingly absolute belief, laid out by Santiago Ramón y Cajal, the father of the field: In adults, brain cells are “fixed, ended, and immutable. Everything may die, nothing may be regenerated,” he proclaimed in the early 20th century. “It is for the science of the future to change, if possible, this harsh decree.”
Over the past few decades, science has been steadily trying to do just that. And that has led to a long-running debate about whether adult humans form new brain cells—a process called neurogenesis.
A paper published last week in the journal Nature created a flurry of excitement when it appeared to settle that debate in the affirmative. It suggested that the secret of superagers—people with exceptional cognitive ability for their age—may be the fact that they have more new neurons than their peers. It was the second study in the last year that seemed to support the possibility of neurogenesis.
(Superagers seem to share this one key personality trait.)
This is an important study, says Eva Feldman, a neuroscientist at the University of Michigan who studies neurogenesis, because “it at least associates a person over 80 who has an exceptional memory with all these immature neurons.”
Still, not all experts are convinced. It turns out that finding evidence of neurogenesis is very tricky. But the discovery could be game-changing for how we treat diseases like Alzheimer’s and dementia.
The debate over neurogenesis heats up
In 1962, Joseph Altman, then a neurobiologist at MIT, was among the first to challenge Cajal’s theory. He used radioactive tagging to show that in adult rats new neurons were formed in several brain regions, including the hippocampus, an area of the brain that plays a role in learning and memory.
Most experts agreed that Altman’s findings in rodents did not apply to humans. It took another three decades for anyone to find evidence of neurogenesis in adult human brains.
In 1998, Fred “Rusty” Gage, a neuroscientist now at the Salk Institute for Biological Studies, examined postmortem tissue of cancer patients who had been treated with bromodeoxyuridine, a chemical taken up by cells that are in the process of dividing. Gage and his team found that new neurons were indeed being generated in the hippocampus of these patients. This was the first evidence that adult humans grow new brain cells.
(Scientists identify 4 key turning points for your brain as you age.)
Still, this research did not observe new brain cells actually being formed. It relied on a chemical marker, bromodeoxyuridine, and on analysis of postmortem tissue. Some experts have pointed out that bromodeoxyuridine labeling can be difficult to interpret because cell death or cell repair can potentially cause false positives.
For Agnes (Yu) Luo, a molecular geneticist at the University of Cincinnati who studies stem cells and regeneration, the most convincing evidence of adult neurogenesis was a 2014 paper that brought a novel approach to the question by examining the level of radioactive particles in postmortem human brain tissues.
Between 1945 and 1963, atomic bomb tests spewed radioactive particles into the atmosphere. Among these were carbon-14 isotopes, which accumulate in the DNA of dividing cells, and so mark newly created cells. When scientists examined the hippocampal tissues of people who had been adults at the time of the bomb tests, they found high levels of the isotope in their DNA, suggesting that new neurons were being formed well into the fifth decade of life.
Clever as it is, that study has been criticized as well, including by those who point out that the carbon could have been incorporated by cellular processes, such as methylation and demethylation, in which carbon atoms are added and removed.
(Lithium plays a mysterious role in the brain. Could it be used to prevent Alzheimer’s?)
This critique was also raised in a 2020 review of the scientific literature, which argued that adult neurogenesis is a chimera. The review analyzed neurogenesis studies that had been conducted in decades prior, citing methodological errors, conflicting findings, and other problems, and it called for more standardization in methods used to look for neurogenesis.
If neurogenesis does occur, it is extremely rare, says Shawn Sorrells, a neuroscientist at the University of Pittsburgh who studies brain development. In a 2018 study, he and colleagues used a variety of methods for testing tissue samples and found that after the first few years of life, neurogenesis in the human hippocampus is nonexistent or at least extremely rare.
Still, proponents of the neurogenesis theory have argued that there are flaws with these conclusions too, and the debate has continued.
A new look at the debate
Some experts say that two recent papers may have delivered the one-two punch that finally ends the debate.
In July 2025, researchers reported in the journal Science that, using gene sequencing and with the help of artificial intelligence, they were able to identify cells in adult human brain tissue that had the genetic hallmarks of cells that divide to create neurons. This proved, the researchers say, that at least some people make new neurons in adulthood, though the new neurons grow slowly.
In late February 2026 another intriguing paper was published, this time in the journal Nature. Researchers used genetic analysis to identify cells at various stages of growth and found that superagers have around twice as many immature neurons as cognitively normal adults, suggesting that they maintain a pool of immature neurons ready to grow into functional ones.
(Your body ages rapidly in two 'bursts,' at 44 and 60. Here's how to prepare.)
Luo, for one, is convinced. “I think the Science paper is more or less accepted by the field, that it sort of sealed the deal that we have [adult neurogenesis] in the human brain.”
Eva Feldman agrees and says the debate now is not so much about whether neurogenesis occurs in humans, but how much and how consistently it occurs. “When these new neurons differentiate, when they become adults, what is their function and how does that clinically move the needle?” The paper that studied superagers, she says, “makes one think that these new neurons when they mature, when they grow up, so to speak, do contribute to learning and memory.”
However, she adds, that is an association, not yet a causation. And Sorrells is still skeptical. All of these studies, he says, are based on indirect evidence. “We don't actually see one cell physically changing into the next cell.”
Even with these genetic markers, it’s still not clear whether the neurons were ever going to mature. “You could have cells being born that then die immediately and never get used,” says Staci Bilbo, neuroscientist and interim chair of the Department of Neurobiology at Duke University.
Debra Lynn Silver, neuroscientist and molecular geneticist, also at Duke, agrees. “These studies are not necessarily asking about function, whether functional neurons are made or looking at things at the protein level.”
As is often the case in science, the devil is in the details.
Ultimately, learning more about how the brain works is one of today’s most important scientific projects. If scientists can discover ways to replace dead neurons or goose new ones to grow faster, they might be able to treat a variety of neurodegenerative diseases, including Alzheimer’s disease.
But other scientists argue that perhaps we don’t need new cells after all. “Maybe the human brain has figured out different ways to be flexible that don't rely on maintaining neural stem-cell pools and then differentiating neurons slowly over time,” says Sorrells. Perhaps our lack, or relative lack, of neurogenesis is a feature, not a bug.