You’ve probably heard the astonishing fact that your brain contains around 86 billion neurons. It’s a number so vast it’s almost meaningless, a piece of trivia you might drop at a party or see in a pop-science article. It feels concrete, precise, and scientifically absolute. But have you ever stopped to wonder… who counted them? And how on Earth do you count something so small and so impossibly numerous, packed inside the most complex object in the known universe?
You might picture scientists in white lab coats, hunched over microscopes for years, painstakingly clicking a tally counter for every single cell they see. It’s a noble image, but it’s completely wrong. The real method, the one that gave us that 86-billion-neuron figure, is both more bizarre and more brilliant than you can imagine. It’s a story of challenging old assumptions, of radical thinking, and it involves a process that sounds more like something from a mad scientist’s kitchen than a cutting-edge neuroscience lab. It involves turning the human brain… into a uniform, homogenous soup.
That’s right. To get the most accurate count of brain cells we’ve ever had, scientists had to first figure out how to liquefy a brain. This isn’t a story about a single, simple number. This is the story of how a revolutionary technique completely upended a century of dogma, and how, even today, that famous number is the subject of intense scientific debate. The journey to counting our neurons is a perfect illustration of how science actually works: it’s messy, it’s ingenious, and the final answer is almost never as simple as it seems. We’re going to pull back the curtain on this entire process, from the frustrating old ways to the mind-bending new one. By the end, you won’t just know the number; you’ll understand the incredible intellectual journey it took to find it, and why that journey is far from over.
Section 1: The Impossible Task – Why Counting Neurons is So Hard
Before we can appreciate the genius of brain soup, we have to understand why scientists needed to resort to such an extreme measure in the first place. Why is counting neurons one of the most difficult accounting jobs in all of biology? The problem isn’t just that there are billions of them. The problem is the brain itself.
Think about what a brain looks like. It’s a three-pound, wrinkled, lumpy organ. It is anything but uniform. Different regions have vastly different architectures. The cerebral cortex, the outer, wrinkly layer responsible for higher thought, has its neurons arranged in six distinct layers. The cerebellum, tucked underneath at the back and crucial for motor control, is one of the most densely packed structures known to exist. It’s estimated to contain about half of all the brain’s neurons, despite only making up about 10% of the brain’s volume. Then you have deep brain structures, like the thalamus or the brainstem, each with its own unique cellular layout.
This extreme non-uniformity, or heterogeneity, is the core of the problem. A cubic millimeter of tissue from the cortex has a completely different number of neurons than a cubic millimeter from the cerebellum. This means you can’t just take one small piece, count the neurons in it, and multiply that by the brain’s total volume. It would be like trying to estimate the population of a country by sampling a single city block in Manhattan and then scaling it up. The result would be wildly inaccurate. You’d be ignoring vast stretches of sparsely populated farmland and the different densities of other cities. The brain is the same; it’s a landscape of bustling metropolises and quiet rural villages, all at the cellular level.
For decades, the only real tool neuroscientists had was a method called stereology. The name sounds complex, but the concept is basically a highly sophisticated version of that sampling idea. For a long time, it was considered the gold standard. Here’s how it worked: a researcher would take a brain, fix it in a chemical like formaldehyde, and then use a machine called a microtome to slice the entire brain into incredibly thin sections. Imagine slicing a loaf of bread, but your loaf is a human brain and each slice is almost transparent.
These delicate slices would then be stained with special dyes to make the neurons visible under a microscope. Now came the truly painstaking part. A scientist couldn’t count every neuron on every slice; it would take a lifetime. So, they would sample, using a grid to count the neurons in specific, randomly selected squares on a systematic sample of the slices. The final step was the extrapolation. Using a series of complex mathematical calculations, they could estimate the total number of neurons. This was a huge improvement over just guessing.
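The sampling-and-scaling logic above can be made concrete with a toy Python sketch. It leaves out the real stereological corrections (such as handling cells cut across slice edges), and every number in it is invented purely for illustration:

```python
import random

random.seed(0)

# Toy stereology: estimate total neurons in a region from a
# systematic sample of slices and grid squares.
# All numbers here are invented for illustration.
TOTAL_SLICES = 1200          # slices produced by the microtome
SQUARES_PER_SLICE = 400      # grid squares covering one slice

def count_in_square():
    """Stand-in for a scientist counting stained nuclei in one grid square."""
    return random.randint(8, 12)   # pretend density of ~10 neurons per square

# Sample every 40th slice, and 5 random squares on each sampled slice.
sampled_slices = range(0, TOTAL_SLICES, 40)
counts = []
for _ in sampled_slices:
    counts.extend(count_in_square() for _ in range(5))

# Extrapolate: mean density per square, scaled up to the whole region.
mean_per_square = sum(counts) / len(counts)
estimate = mean_per_square * SQUARES_PER_SLICE * TOTAL_SLICES
print(f"Sampled {len(counts)} squares; estimated total ≈ {estimate:,.0f}")
```

The point of the sketch is the trade-off: the scientist examines only 150 squares out of 480,000, so the answer is fast but carries sampling error.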
But it had massive drawbacks. First, it was incredibly slow and labor-intensive. Preparing, slicing, staining, and then spending weeks peering through a microscope was a monumental effort. Second, it was still based on sampling, which always has a margin of error. You’re only ever looking at a tiny fraction of the whole brain. Because of this immense effort, studies were often limited to very small brain regions. Counting the neurons in a whole human brain this way was practically unthinkable.
This is why, for most of the 20th century, the number of neurons in the human brain wasn’t a known fact; it was more like scientific folklore. You’d see estimates in textbooks from 100 billion to even a trillion, but the citations often led nowhere. The scientific community had settled on “about 100 billion neurons” and “about a trillion glial cells,” the brain’s support cells, creating a famous 10-to-1 glia-to-neuron ratio. But nobody had ever truly counted them. The tools just weren’t up to the job. The world was waiting for a revolution.
Section 2: The “Aha!” Moment – The Revolutionary Idea
The revolution didn’t come from improving the old methods. It came from someone who looked at the problem and decided to throw out the rulebook. That someone was Brazilian neuroscientist Dr. Suzana Herculano-Houzel. In the early 2000s, she was frustrated by the flimsy foundations of her field. She was studying how the brains of different species compare, but how can you compare brains if you don’t have a reliable count of their building blocks?
She was bothered by the “100 billion neurons” figure that everyone repeated but no one could source. She knew that the old stereological methods were too slow and cumbersome to ever provide a definitive answer. As she saw it, the problem was the brain’s solid, lumpy structure. As long as you were counting cells in an intact brain, you were stuck with the problem of sampling.
This is where the leap of genius happens. The story goes that she was pondering how to make the brain uniform when a simple, almost domestic analogy came to mind. If you want to know how many carrots are in a soup, you don’t have to count them in the pot. You can blend the whole thing into a smooth liquid. Then, you can take one spoonful, count the carrot bits in that manageable sample, measure the total volume of the soup, and calculate the total number of carrots. The key is to destroy the structure to make the contents countable.
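The carrot arithmetic is simple enough to write down. Here is a minimal Python sketch of it, with invented quantities:

```python
# The carrot-soup logic: count one spoonful, scale by total volume.
# All quantities are invented for illustration.
soup_volume_ml = 2000.0        # the whole pot, blended smooth
spoonful_ml = 15.0             # one well-mixed spoonful
carrot_bits_in_spoonful = 12   # counted by hand

# Because the soup is homogeneous, the density in the spoonful
# equals the density in the pot.
bits_per_ml = carrot_bits_in_spoonful / spoonful_ml
total_bits = bits_per_ml * soup_volume_ml
print(f"Estimated carrot bits in the pot: {total_bits:,.0f}")  # 1,600
```

Destroying the structure is what makes this one-line calculation valid: it only works if every spoonful is statistically identical to every other.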
What if you could do the same thing to a brain? What if you could turn this complex organ into a homogenous “brain soup,” where every drop was identical? If you could do that, the problem of heterogeneity would vanish. You’d no longer worry about the difference between the cortex and the cerebellum. You’d have a perfectly mixed suspension of brain cells.
The central insight was this: a cell’s identity is held within its nucleus. Every cell has exactly one nucleus, a tough little packet containing its DNA. So, if you could dissolve the soft cell membranes while leaving the nuclei perfectly intact, you’d have a soup of free-floating nuclei. Counting those nuclei would be the exact same thing as counting the cells they came from.
This idea was radical. It went against the entire tradition of neuroanatomy, which was dedicated to preserving the brain’s intricate structure. Dr. Herculano-Houzel was proposing the exact opposite: to obliterate that structure in service of a single, accurate number. It was a trade-off—giving up spatial information to gain quantitative precision.
She and her colleague Roberto Lent developed this method in their lab in Rio de Janeiro. They experimented to find the perfect chemical cocktail, landing on a solution with a detergent called Triton X-100 to dissolve the membranes but preserve the nuclei. They had their theory and their recipe. Now, they had to test it on rat brains. They would take a brain, place it in a glass tissue grinder called a homogenizer, and add their detergent solution. The device would grind the tissue, creating a cloudy suspension. They called their new method the “isotropic fractionator.” “Isotropic” means “uniform in all directions,” and “fractionator” refers to sampling a fraction of the whole. The “Aha!” moment was complete. The stage was now set to apply this bizarre, brilliant technique to the human brain.
Section 3: The Reveal – The “Brain Soup” Method Step-by-Step
So, how do you actually go from a whole human brain to a number like 86 billion? Let’s walk through the isotropic fractionator method, step by step.
**Step 1: The Source Material**
Everything begins with the brain itself, typically from medical donors who willed their bodies to science. For the landmark study, the team used four adult male brains. The brain is first “fixed” in a chemical solution like paraformaldehyde to preserve the tissue and prevent decomposition.
**Step 2: Dissection and Preparation**
Next, the brain is carefully dissected into its major regions—the cerebral cortex, the cerebellum, the brainstem, and so on. This is because scientists want to know how many neurons are in each part, not just the total. The different regions are weighed precisely. This is the last time the brain’s original structure matters.
**Step 3: The Creation of the “Soup”**
This is the most dramatic step. A dissected brain region, say the cerebral cortex, is placed into a glass homogenizer. This isn’t your kitchen blender; it’s a precise piece of lab equipment. The researcher adds a measured volume of the special detergent solution. Then, the homogenization begins. A pestle shears the tissue, and this mechanical force, combined with the detergent, tears apart the cell membranes. The membranes dissolve, but the tough nuclei are left intact, floating freely. What remains is a cloudy, milky liquid: the “isotropic suspension,” or our “brain soup.” The magic of this soup is that it’s homogenous. The nuclei are distributed perfectly evenly. The impossible problem of sampling a solid organ has been solved.
**Step 4: Distinguishing Neurons from Other Cells**
Now we have a soup full of nuclei, but they came from neurons and other cells, like glial cells. We need to tell them apart. This is where a technique called immunocytochemistry comes in. A small, measured sample of the soup is taken, and two fluorescent markers are added. The first is a dye called DAPI, which binds to the DNA inside every nucleus, making it glow bright blue under a special microscope. Counting all the blue dots gives you the total number of cells. The second marker is the magic bullet: an antibody engineered to attach to a protein found only in the nuclei of most neurons, called NeuN (Neuronal Nuclei). This NeuN antibody is tagged with a different colored fluorescent molecule, usually red or green. So now, the nuclei from glial cells will only be stained blue. But the nuclei from neurons will be stained by both the blue DAPI and the red NeuN antibody, making them appear purple. This gives scientists a clear way to distinguish a neuron from a non-neuron.
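The classification boils down to a two-channel check: every nucleus is DAPI-positive, and only (most) neuronal nuclei are also NeuN-positive. A toy Python sketch with made-up observations:

```python
# Two-channel classification sketch. Each dict is one observed dot;
# the data are invented for illustration.
nuclei = [
    {"dapi": True, "neun": True},    # neuron: blue + red, appears purple
    {"dapi": True, "neun": False},   # glial or other non-neuronal cell: blue only
    {"dapi": True, "neun": True},
    {"dapi": True, "neun": False},
    {"dapi": True, "neun": False},
]

total_cells = sum(n["dapi"] for n in nuclei)            # every nucleus glows blue
neurons = sum(n["dapi"] and n["neun"] for n in nuclei)  # only double-stained dots
neuron_fraction = neurons / total_cells
print(total_cells, neurons, neuron_fraction)  # 5 2 0.4
```

Note the built-in caveat discussed later in this story: if a neuron doesn’t express NeuN, this logic silently files it under “non-neuron.”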
**Step 5: The Count**
The stained sample is now ready to be counted on a special microscope slide called a hemocytometer, which has a grid of a known, precise volume. A scientist looks through a fluorescence microscope and counts the glowing dots. They count all the blue dots for the total number of nuclei. Then, they count only the dots that are also positive for NeuN, which gives them the number of neuronal nuclei. They typically count at least 500 total nuclei to ensure the number is statistically robust. From this, they calculate a percentage. For example, if they counted 500 total nuclei and 200 were also positive for NeuN, they know that 40% of the cells in this brain region are neurons.
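A short Python sketch shows why counting at least 500 nuclei matters: the statistical uncertainty of the estimated fraction shrinks as the count grows. The standard-error formula here is the ordinary binomial one, not something specific to this study:

```python
import math

# Why count at least 500 nuclei? A quick binomial-error sketch.
# Counting n nuclei of which a fraction p are NeuN-positive gives a
# standard error of sqrt(p * (1 - p) / n) on the estimated fraction.
def fraction_with_error(neun_positive, total):
    p = neun_positive / total
    se = math.sqrt(p * (1 - p) / total)
    return p, se

# The example from the text: 200 NeuN-positive out of 500 nuclei.
p, se = fraction_with_error(200, 500)
print(f"neuron fraction = {p:.2f} ± {se:.3f}")   # 0.40 ± 0.022

# The same fraction estimated from only 50 nuclei is far less certain.
p_small, se_small = fraction_with_error(20, 50)
print(f"from 50 nuclei:  {p_small:.2f} ± {se_small:.3f}")  # 0.40 ± 0.069
```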
**Step 6: The Final Calculation**
This is where it all comes together. First, they determine the total number of nuclei in the entire brain soup by scaling up their count from the small grid to the total volume of the liquid. If they counted 500 nuclei in one-millionth of a liter, and their total soup volume was one liter, they know there are 500 million total cells in that original piece of tissue. Second, they use the percentage they just found. If there are 500 million total cells in the cortex, and 40% are neurons, the final calculation is simple: 40% of 500 million is 200 million neurons. They repeat this for each dissected part of the brain. Finally, they just add up the neuron counts from all the different parts to get the grand total. When Dr. Herculano-Houzel and her team did this for their four male human brains, the average number they arrived at wasn’t the neat, round 100 billion. It was 86 billion. A new, empirically verified number was born.
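The whole Step 6 arithmetic fits in a few lines of Python. Only the cortex figures mirror the worked example in the text; the other regions’ numbers are invented for illustration:

```python
# End-to-end arithmetic of Step 6, per region and summed.
# Only the cortex row echoes the text's example; the rest is invented.
regions = {
    # region: (nuclei counted, sample volume in liters,
    #          total soup volume in liters, NeuN-positive fraction)
    "cerebral cortex": (500, 1e-6, 1.0, 0.40),
    "cerebellum":      (600, 1e-6, 0.5, 0.80),
    "rest of brain":   (400, 1e-6, 0.3, 0.20),
}

grand_total = 0.0
for name, (count, sample_vol, soup_vol, neuron_frac) in regions.items():
    total_nuclei = count / sample_vol * soup_vol   # scale the spoonful to the pot
    neurons = total_nuclei * neuron_frac           # keep only the NeuN-positive share
    grand_total += neurons
    print(f"{name}: {neurons:,.0f} neurons")

print(f"whole brain: {grand_total:,.0f} neurons")
```

The cortex row reproduces the text’s numbers: 500 nuclei in a millionth of a liter scales to 500 million cells, and 40% of those is 200 million neurons.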
Section 4: The Implications, The Debate, and The Messiness of Science
The arrival of the isotropic fractionator and the 86-billion-neuron figure was a paradigm shift. Neuroscientists now had a fast and seemingly reliable method to get accurate cell counts for entire brains. A process that once took months could now be done in about 24 hours.
One of the first pieces of scientific dogma to fall was the 10-to-1 glia-to-neuron ratio. The new counts revealed that the ratio in the human brain is actually much closer to 1-to-1, with roughly 86 billion neurons and about 85 billion non-neuronal cells. This completely changed our understanding of the cellular composition of our minds. The data also confirmed the immense importance of the cerebellum, which held a staggering 69 billion of the total 86 billion neurons. The 86 billion number quickly became the new standard. It felt like a solid foundation.
But in real science, a groundbreaking discovery is the beginning of a new chapter of scrutiny and debate. The clean, simple picture of 86 billion neurons has become significantly more complicated. The first point of contention is the sample size of that original, famous study. The 86 billion average came from just four adult male brains. What about the other half of the population? One study analyzing five healthy female brains with the same method found a neuron count that ranged between 61 and 73 billion. Let that sink in. The range for the four male brains was roughly 79 to 95 billion neurons. The range for the five female brains was 61 to 73 billion. These two ranges do not overlap at all. Does this represent a true, massive sex difference in the human brain? Or does it suggest the method itself has more variability than initially thought? The answer is still being debated.
The second challenge targets the heart of the method: the NeuN antibody. The technique rests on the assumption that NeuN reliably tags all neurons and only neurons. But we’ve since learned this isn’t strictly true. Some types of neurons, like the iconic Purkinje cells of the cerebellum, often don’t express the NeuN protein and therefore don’t get counted. NeuN expression can also change with age or disease. This means the isotropic fractionator method likely produces a slight, systematic undercount of the true number of neurons.
This leads to the most recent challenge to the 86 billion consensus. A 2025 academic paper took a hard look back at the original data from the 2009 study. The authors argued that given the small sample size and the variability between individual brains, the data isn’t statistically strong enough to support such a precise number as “86 billion.” They contend that a more scientifically cautious statement would be that the average number of neurons is likely “between about 73 and 99 billion neurons.” This may seem like splitting hairs, but it’s a crucial distinction. It’s the difference between presenting a figure as a hard fact and presenting it as the best current estimate within a range of uncertainty.
So where does that leave us? It leaves us with a perfect example of the scientific method in action. Science is not a collection of unchanging facts. It is a dynamic process of inquiry, discovery, debate, and revision. The isotropic fractionator was a brilliant leap forward. The controversies happening now are not a sign that the science is weak; they are a sign that it’s working. We replaced a vague number like 100 billion with a specific one like 86 billion. Then, we investigated further and found the real answer is more nuanced and variable. The final number may be less certain, but our understanding of the complexity is far greater.
Conclusion
So, we began with a simple question: How do you count 86 billion neurons? The answer, it turns out, is to blend the brain into a soup. We saw how the brain’s complex structure made traditional counting nearly impossible, and how the classic method of stereology was too slow and painstaking to ever get a full-brain count.
Then, we witnessed a moment of scientific brilliance: the idea to eliminate the brain’s complexity by transforming it into a homogenous, countable liquid. By dissolving cell membranes while preserving their nuclei and using fluorescent tags to tell neurons apart, scientists could finally get a direct count. This method reshaped our understanding, busting the myth of a 10-to-1 glia-to-neuron ratio and giving us the celebrated 86 billion neuron figure.
But our story didn’t end there. We saw that in science, no answer is ever truly final. We uncovered the active debate surrounding this famous number, from the original study’s small sample size to the puzzling differences found between male and female brain counts and questions about the tools used. New analyses are calling for more caution, suggesting we think in ranges rather than absolute certainties.
The quest to count our brain’s neurons is not just about a number. It’s a story about human ingenuity and our relentless drive to understand ourselves. It shows that science isn’t a straight line to a single truth, but a messy, fascinating process of discovery, correction, and refinement. The real insight isn’t the number 86 billion, but the incredible journey to find it, and the ongoing conversation that continues to shape our knowledge of the amazing organ inside our heads.