Dude, Where’s My Frontal Cortex?

There’s a method to the madness of the teenage brain.

Nautilus

Illustration by John Hendrix.

In the foothills of the Sierra Nevada, a few hours east of San Francisco, are the Moaning Caverns, a cave system that begins, after a narrow, twisting descent of 30-some feet, with an abrupt 180-foot drop. The Park Service has found ancient human skeletons at the bottom of that drop. The Native Americans living there at the time didn’t practice human sacrifice. Instead, these were explorers who took one step too far in the gloom. The skeletons belonged to adolescents.

No surprises there. After all, adolescence is the time of life when someone is most likely to join a cult, kill, be killed, invent an art form, help overthrow a dictator, ethnically cleanse a village, care for the needy, transform physics, adopt a hideous fashion style, commit to God, and be convinced that all the forces of history have converged to make this moment the most consequential ever, fraught with peril and promise.

For all this we can thank the teenage brain. Some have argued that adolescence is a cultural construct: in traditional cultures, there is typically a single qualitative transition at puberty, after which the individual is a young adult. Yet the progression from birth to adulthood is not smoothly linear. The teenage brain is unique. It’s not merely an adult brain that is half-cooked or a child’s brain left unrefrigerated for too long. Its distinctiveness arises from a key region, the frontal cortex, not being fully developed. This largely explains the turbulence of adolescence. It also reflects an important evolutionary pressure.

The frontal cortex is the most recently evolved part of the human brain. It’s where the sensible mature stuff happens: long-term planning, executive function, impulse control, and emotional regulation. It’s what makes you do the right thing when it’s the harder thing to do. But its neurons are not fully wired up until your mid-20s. Why?

It’s a central tenet of genetics that the genome that we start off life with, back when we were just a fertilized egg, is passed on to every subsequent cell in the body. But if the frontal cortex is the last part of the brain to fully mature, it’s the brain region least shaped by that genome and most sculpted by experience.

Our success as primates, and as individuals, revolves around social intelligence and adapting to the subtleties and idiosyncrasies of the environment. This is the portfolio of the frontal cortex. So if it’s going to develop the ability to do this right, it’s going to have to be profoundly shaped and informed by every smidgen of experience along the way.

Like evolution itself, though, maturation seldom follows a straight and narrow path. Adolescence reminds us that life has many rivers to cross, some fraught with turbulence.


Around the onset of adolescence, the frontal cortex is the only brain region that has not reached adult levels of gray matter, which is made up of neuronal cell bodies. It would seem logical that gray matter levels would increase thereafter. But no: over the course of adolescence, frontal cortical gray matter volume decreases.

This occurs because of one of the cleverest things that brains ever evolved. During fetal development, mammalian brains generate far more neurons than are found in the adult brain. Why? Because the fetal brain holds a dramatic competition. Winning neurons get to migrate to the correct location and form the optimal number of connections with other neurons. Neurons that miss the cut undergo “programmed cell death.” Neuronal overproduction followed by competitive pruning (a process that has been termed “Neural Darwinism”) allows more complex and optimized neural circuitry, a wonderful example of less being more.

The same plays out in the adolescent frontal cortex. At the beginning of adolescence, gray matter volume is greater than it is in adults, and subsequently declines, as less optimally connected neurons are pruned away. Within the frontal cortex, it is the evolutionarily oldest sub-regions that mature first; the spanking new dorsolateral prefrontal cortex, for example, does not even begin to lose gray matter volume until the end of adolescence. This delayed frontal cortical maturation means that adolescents aren’t at adult levels of expertise at various cognitive tasks, like recognizing irony or Theory of Mind—the ability to operate with the knowledge that someone else has different information than you do.

In an adult, the frontal cortex steadies the activity of parts of the limbic system, a brain region involved in emotion; in contrast, in the teenage brain, the limbic system is already going at full speed, while the frontal cortex is still trying to make sense of the assembly instructions. One result of this imbalance is that emotions are more intense. Stick people in a brain scanner and show them pictures of faces expressing strong emotions. In the adult, there is activation of a prime limbic structure, the amygdala; shortly after, there is activation of frontal cortical regions, which damp the amygdaloid response: “OK, calm down, it’s just a picture of an angry/sad/happy/scared face, not the real thing.”

But in the teenager, the frontal cortical response is muted, and the amygdala’s response is augmented. That means emotional highs are higher, and the lows are lower. This is shown in studies of limbic pathways that release dopamine, a neurotransmitter central to anticipation of reward and pleasure (cocaine, for example, works on this limbic dopamine system). Average dopamine levels in adolescents and adults do not differ; what differs are patterns of dopamine release. Put an adult in a brain scanner, give them a small reward, and there’s a small degree of activation of this dopamine circuitry. Medium reward, medium activation; big reward, big activation. Now do the same with a teenager. Medium reward, medium activation, same as in the adult. With a big reward, though, dopamine signaling increases far more than in an adult, and with a small reward, dopamine signaling decreases. A small reward feels like a punishment. This is a neurochemical gyroscope that’s way off-kilter.

The delayed maturation of the frontal cortex also helps explain the defining feature of adolescence, namely the weird predilection for bungee jumping. During risky decision-making, adolescents show less activation of some key sub-regions of the frontal cortex than do adults, and among adolescents, the less activity in these regions, the poorer the risk assessment.

And adolescents are bad at risk assessment in a particular way, shown by Sarah-Jayne Blakemore of University College London. Ask test subjects to estimate the likelihood of some event happening to them and then tell them the actual likelihood of it happening. The feedback can constitute good news, as subjects learn that a good event is more likely to occur than they thought. Conversely, the feedback can constitute bad news, as subjects learn that a bad event can happen more often than they expected. When the feedback is good news, both adults and teens are likely to adjust their assessments when asked again. But teens alone don’t get the picture about bad news. Researcher: What do you think the odds are of your having a car accident if you’re driving drunk? Adolescent: One chance in a gazillion. Researcher: Actually, the risk for people in general is 50 percent; now what do you think your chances are? Adolescent: Hey, we’re talking about me; one chance in a gazillion. This helps explain why adolescents have two to four times the rate of pathological gambling that adults do.

So adolescents are lousy at risk assessment and take more risks. But there’s more to the story of those skeletons in Moaning Caverns. It’s not the case that adolescents and adults have an equal desire to do the same dumb-ass thing, and the sole difference is that the fully mature frontal cortex in the latter prevents them from doing so. Adolescents feel the allure of jumping off things. Middle-aged adults just recklessly cheat on their diets. Adolescents not only take more risks, they seek more novelty.

Adolescence is the time of life when we develop our tastes in music, food, and fashion, with openness to novelty declining thereafter. We’re not alone. When are lab rats most willing to try a novel food? During rodential adolescence. Among many social mammals, the adolescents of one of the sexes leave their natal group, avoiding inbreeding. Consider impalas, who live in a “harem structure,” a group of related females and offspring with one breeding male (and where the rest of the males knock around disconsolately in “bachelor herds”). When a young male hits puberty, he is driven out by the resident breeding male.

It’s different among primates. Take baboons. Two troops encounter each other on opposite sides of a stream. The males hoot and holler at each other in a manly fashion until they get bored and go back to foraging, ignoring the interlopers. An adolescent stands on the edge of the stream, riveted. “New baboons, a whole bunch of ’em!” Nervous and agitated, he runs five steps toward them, runs back four. He tentatively crosses the stream, sits on the very edge of the other bank, scampers back as soon as anyone so much as glances at him.

The next day he stays on the other side for an hour. Then an afternoon. Then a night, breaking the umbilical cord. He wasn’t pushed out of his home troop, as with impalas. He had simply reached the point where, if he had to spend one more day with the same baboons he’d known his whole life, he was going to scream. The same thing happens with adolescent female chimps, who can’t get off the farm fast enough, heading off into the great unknown in the next valley. We primates don’t get driven out of our herds at adolescence. We get a mad craving for novelty.

These traits are exacerbated when adolescents are around peers. In one study, Laurence Steinberg of Temple University discovered that adolescents and adults, when left on their own, don’t differ in the risks they take in a driving simulator. Add peers egging them on and risk-taking doesn’t budge in adults but becomes significantly higher in teens. When the study is carried out in a brain scanner, the presence of peers (egging them on by intercom) lessens frontal cortical activity and enhances activity in the limbic dopamine system in adolescents, but not in adults.

This teenage vulnerability to peer pressure is worsened by the fact that such pressure rarely takes the form of hesitant adolescents coerced into joining in the fun of committing random acts of kindness. Instead, pressure disproportionately takes the form of “deviance training,” increasing the likelihood of risky sexual behavior, poor health habits, substance abuse, and violence. As has been said, the greatest crime-fighting tool available to society is a 30th birthday.

One brain-imaging study reveals the neural depths of adolescent pain in not belonging. Put someone in a scanner to play a video game with two other individuals, and manipulate things so that the subject believes they are being ostracized. In adults, this social exclusion activates the amygdala along with other limbic regions associated with pain, disgust, anger, and sadness. But then the frontal cortex kicks in—“Come on, it’s a stupid game”—and the limbic structures quiet down. Do the same with an adolescent and the frontal cortex remains silent and that agonized limbic network of teenage angst wails.

The slowpoke frontal cortex is not the only explanation for teen behavior. Another factor comes into play that keeps that teen brain off balance, namely gonadal hormones like estrogen and progesterone in females, and testosterone in males. This helps explain why adolescence is more turbulent than childhood—the frontal cortex is immature at both ages, but the tsunamis of hormones haven’t started in pre-adolescents. Hormones have numerous effects on the function of both the limbic system and frontal cortex. Testosterone decreases the ability of the frontal cortex to communicate with and rein in the amygdala. Not surprisingly, landmarks of adolescent maturation in brain and behavior are less related to chronological age than to time since puberty.

The onset of puberty is not merely about the sudden onslaught of gonadal hormones. The defining feature of ovarian endocrine function is, of course, the oscillations of hormone release. In adolescent females, puberty does not arrive in full flower, so to speak, with one’s first period. Instead, for the first few years, only about half of cycles actually involve ovulation and the accompanying hormonal surges. So not only are there major fluctuations in gonadal hormone levels due to ovulation, but also higher-order fluctuations as to whether ovulation occurs. And fluctuations in hormones affect emotion and cognition. (While an adolescent male does not have the hormonal fluctuations of his female counterparts, it can’t be a great thing that his frontal cortex probably keeps getting hypoxic from all that priapic blood flow to his crotch.)

As adolescence dawns, the frontal cortex’s efficiency is diluted by superfluous connections that have yet to be pruned away. The limbic system is fully online and dopamine is careening all over the place. Meanwhile, the brain is being marinated in the ebb and flow of gonadal hormones. No wonder the human species produces beings like Justin Bieber and Miley Cyrus, who make a handy living catering to their constituency.

Party in the Brain: In adolescence, the frontal cortex is diluted with superfluous connections, dopamine is careening all over the place, and the brain is being marinated in the ebb and flow of hormones. Photo from Getty Images for MTV.


But adolescence isn’t always as dark as it’s made out to be. There’s a feature of adolescence that makes up for the stupid risk-taking and hideous fashion decisions. And that’s an adolescent’s frenzied, agitated, incandescent ability to feel someone else’s pain, to feel the pains of the entire world, to want to right all its wrongs. Adolescents are nature’s most wondrous example of empathy, where the forcefulness of feeling as the other can border on nearly being the other.

This intensity is at the intersection of so many facets of adolescence. With the highs higher and lows lower, the empathic pain scalds and the glow of having done the right thing makes it seem plausible that we are here for a purpose. Another factor is the openness to novelty. An open mind is a prerequisite for an open heart, and the adolescent hunger for the new readily presents opportunities to walk a mile in someone else’s shoes. And there is the egoism of adolescence. There was a period during my late adolescence when I hung out with Quakers. They’d often say, “All God has is thee.” This is a God of limited means, not just a God who needs the help of humans to right a wrong, but one who needs your help most of all. It’s a message tailor-made for adolescent egoism. Throw in inexhaustible energy and the sense of omnipotence and it seems possible to make the world whole.

A few years ago, I saw a magnificent example of the empathy that a sluggish frontal cortex can produce in teenagers. My daughter is seriously into theater and at the time she was in a production of a superb, searing play about the Bosnian genocide, Stefanie Zadravec’s Honey Brown Eyes. She played a 12-year-old Bosnian girl for whom things don’t go so great, and whose life-or-death fate is ambiguous as the play ends.

Some high school kids had come to a performance as a group outing for an English class. About halfway through the play, my daughter’s character appears for the first time, cautiously emerging from a ventilation duct in her kitchen where she’d been hiding, unaware that the soldier who had just left the apartment after killing her mother was going to return. Up until that point, she had only been hinted at as a character. The soldier had his ethnic-cleansing to-do list of names of Bosnians in the building to kill, and kept demanding of the mother, “Where’s your daughter? It says you have a daughter.” “I don’t have a daughter,” the mother repeated up until her death. So as the girl begins to emerge from the ventilation duct, the realization sweeps through the audience: there is a daughter. As my daughter began to crawl out, the teenagers in the audience did something you’re not supposed to do in a theater, something no adult with a developed frontal cortex would do. After a moment of hushed silence, two or three voices called out, “No!” Another called, “Go back in, it’s not safe!” Another, “He’s coming back!” After the play, the teenagers clustered around my little girl when she came out of the stage door, hugging her, reassuring themselves that both she and her character were OK.

This is the picture of adolescents with their hearts on their sleeves, limbic systems going full blast, and their frontal cortices straining to catch up with some emotional self-regulation. When I see the best of my university students in that agitated, optimistic state, I always have the same thought: It used to be so much easier to be like this. Having this adult frontal cortex of mine probably enables me to do good in a more efficacious, detached way. The trouble, though, is the same detachment makes it so much easier to decide that it’s really not my problem.


So what is the adaptive advantage of human brain development evolving this way? Potentially, there is no advantage. Perhaps the human frontal cortical maturation is delayed because it is the most challenging construction project the brain undertakes. In this view, wiring up something like the visual cortex can pretty much be wrapped up in the first year of life, but doing the same in the frontal cortex takes another quarter century. This seems unlikely. The frontal cortex has the same basic structure as the rest of the cortex, uses the same neurotransmitters and types of neurons. On a nuts and bolts level, its maturation is slower than necessary, suggesting that there has indeed been active selection for the delay, that there is something advantageous about it.

One possibility, of course, is the adolescent turbulence. If the frontal cortex wired up at the same speed as the rest of the brain, there’d be no Sturm und Drang, no emotional fireworks, no psychic acne. Just a simple transition from kids to fertile adults around age 12. One can imagine that something would then be lost—that developmental phase of antsy, itchy exploration and creativity that has been evolutionarily enriching. We might not have had that long line of pimply adolescent geniuses who worked away to invent fire, cave-painting, and the wheel.

Maybe. But this just-so story has to accommodate the fact that behavior doesn’t evolve for the good of the species, but for passing on copies of an individual’s genes. And throughout history, for every teenager who scored big-time in the reproduction department thanks to some adolescent inventiveness, there’ve been far more who broke their necks from some adolescent imprudence.

No, I think that the genetic program of brain development has evolved to help free the frontal cortex from the straitjacket of genes. If the frontal cortex is the last part of the brain to fully mature, it is by definition the brain region least shaped by genes and most sculpted by experience. With each passing day, the frontal cortex is more the creation of what life has thrown at you, and thus who you become.

A key theme in current neuroscience—the plasticity of the adult brain—underscores how this happens. As shown with a vast body of research, repeated stimulation of a synapse (the connection between two neurons) causes it to become more excitable, to pass on messages more readily; the synapse “remembers,” as a likely building block of how memory works. Similarly, the dense arbor of neuronal processes that determine which neurons connect to each other can expand or contract, depending on experience. Spend a summer learning how to juggle and entire circuits in the motor cortex will remap. Despite a zillion years of neuroscience dogma, it is now clear that the adult brain makes new neurons in response to things like an enriched environment. All of these occur in adults of any age, but most readily in the young adult brain. And if the frontal cortex is unique in still developing then, that makes it the brain’s hotspot for plasticity.

Why is this important? One answer comes from recent studies of intelligence in education. Some educators stress that a student’s “emotional intelligence” or “social intelligence” (as measured in various ways) is a better predictor of adult success and happiness than their IQ or SAT scores. It’s all about social memory rather than memory of vocabulary words, about emotional perspective-taking, impulse control, empathy, ability to work with others, self-regulation.

There’s a parallel in other primates, with their big, slow-maturing frontal cortices. What makes a “successful” male baboon in a world of dominance interactions? Attaining a high rank is all about muscle, sharp canines, well-timed aggression. But once alpha status has been achieved, maintaining it is all about social smarts—which potential coalitions to form, which to stay away from, how to cow a rival through psychological intimidation, having sufficient impulse control to walk away from provocations, and avoiding displacing aggression onto everyone else when you’re having a bad hair day. This is the realm where doing the right thing is often the harder thing, and adult life is filled with consequential forks in the road where intelligence lights the way forward.

This is all worth keeping in mind the next time you find yourself being an adolescent, or dealing with one who has the volume up to 11 in every domain. Sure, adolescence has its down sides, but it has its abundant pluses—the inventiveness, the optimism, the empathy. Its biggest plus is that it allows the frontal cortex time to develop. There’s no other way we could navigate the ever-increasing complexity of our social world.

Robert Sapolsky is professor of biology, neurology, and neurosurgery at Stanford University. His most recent book is Behave: The Biology of Humans at Our Best and Worst.


This post originally appeared on Nautilus and was published May 23, 2019. This article is republished here with permission.
