From Why You Truly Never Leave High School
New science on its corrosive, traumatizing effects.
By Jennifer Senior, New York Magazine, January 20, 2013
Not everyone feels the sustained, melancholic presence of a high-school shadow self. There are some people who simply put in their four years, graduate, and that’s that.
But for most of us adults, the adolescent years occupy a privileged place in our memories, which to some degree is even quantifiable: Give a grown adult a series of random prompts and cues, and odds are he or she will recall a disproportionate number of memories from adolescence. This phenomenon even has a name—the “reminiscence bump”—and it’s been found over and over in large population samples, with most studies suggesting that memories from the ages of 15 to 25 are most vividly retained. (Which perhaps explains Ralph Keyes’s observation in his 1976 classic, Is There Life After High School?: “Somehow those three or four years can in retrospect feel like 30.”)
To most human beings, the significance of the adolescent years is pretty intuitive. Writers from Shakespeare to Salinger have done their most iconic work about them; and Hollywood, certainly, has long understood the operatic potential of proms, first dates, and the malfeasance of the cafeteria goon squad. “I feel like most of the stuff I draw on, even today, is based on stuff that happened back then,” says Paul Feig, the creator of Freaks and Geeks, which had about ten glorious minutes on NBC’s 1999–2000 lineup before the network canceled it. “Inside, I still feel like I’m 15 to 18 years old, and I feel like I still cope with losing control of the world around me in the same ways.” (By being funny, mainly.)
Yet there’s one class of professionals who seem, rather oddly, to have underrated the significance of those years, and it just happens to be the group that studies how we change over the course of our lives: developmental neuroscientists and psychologists. “I cannot emphasize enough the amount of skewing there is,” says Pat Levitt, the scientific director for the National Scientific Council on the Developing Child, “in terms of the number of studies that focus on the early years as opposed to adolescence. For years, we had almost a religious belief that all systems developed in the same way, which meant that what happened from zero to 3 really mattered, but whatever happened thereafter was merely tweaking.”
Zero to 3. For ages, this window dominated the field, and it still does today, in part for reasons of convenience: Birth is the easiest time to capture a large population to study, and, as Levitt points out, “it’s easier to understand something as it’s being put together”—meaning the brain—“than something that’s complex but already formed.” There are good scientific reasons to focus on this time period, too: The sensory systems, like hearing and eyesight, develop very early on. “But the error we made,” says Levitt, “was to say, ‘Oh, that’s how all functions develop, even those that are very complex. Executive function, emotional regulation—all of it must develop in the same way.’ ” That is not turning out to be the case. “If you’re interested in making sure kids learn a lot in school, yes, intervening in early childhood is the time to do it,” says Laurence Steinberg, a developmental psychologist at Temple University and perhaps the country’s foremost researcher on adolescence. “But if you’re interested in how people become who they are, so much is going on in the adolescent years.”
If humans really do feel things most intensely during adolescence, and if, at this same developmental moment, they also happen to be working out an identity for the first time—“sometimes morbidly, often curiously, preoccupied with what they appear to be in the eyes of others as compared with what they feel they are,” as the psychoanalyst Erik Erikson wrote—then it seems safe to say this: Most American high schools are almost sadistically unhealthy places to send adolescents.
Until the Great Depression, the majority of American adolescents didn’t even graduate from high school. Once kids hit their teen years, they did a variety of things: farmed, helped run the home, earned a regular wage. Before the banning of child labor, they worked in factories and textile mills and mines. All were different roads to adulthood; many were undesirable, if not outright Dickensian. But these disparate paths did arguably have one virtue in common: They placed adolescent children alongside adults. They were not sequestered as they matured. Now teens live in a biosphere of their own. In their recent book Escaping the Endless Adolescence, psychologists Joseph and Claudia Worrell Allen note that teenagers today spend just 16 hours per week interacting with adults and 60 with their cohort. One century ago, it was almost exactly the reverse.
Something happens when children spend so much time apart from adult company. They start to generate a culture with independent values and priorities. James Coleman, a renowned mid-century sociologist, was among the first to analyze that culture in his seminal 1961 work, The Adolescent Society, and he wasn’t very impressed. “Our society has within its midst a set of small teen-age societies,” he wrote, “which focus teen-age interests and attitudes on things far removed from adult responsibilities.” Yes, his words were prudish, but many parents have had some version of these misgivings ever since, especially those who’ve consciously opted not to send their kids into the Roman amphitheater. (From the website of the National Home Education Network: “Ironically, one of the reasons many of us have chosen to educate our own is precisely this very issue of socialization! Children spending time with individuals of all ages more closely resembles real life than does a same-age school setting.”)
In fact, one of the reasons that high schools may produce such peculiar value systems is precisely because the people there have little in common, except their ages. “These are people in a large box without any clear, predetermined way of sorting out status,” says Robert Faris, a sociologist at UC Davis who’s spent a lot of time studying high-school aggression. “There’s no natural connection between them.” Such a situation, in his view, is likely to reward aggression. Absent established hierarchies and power structures (apart from the privileges that naturally accrue from being an upperclassman), kids create them on their own, and what determines those hierarchies is often the crudest common-denominator stuff—looks, nice clothes, prowess in sports—rather than the subtleties of personality. “Remember,” says Robert Crosnoe, a sociologist who spent a year doing research in a 2,200-student high school in Austin, “high schools are big. There has to be some way of sorting people socially. It’d be nice if kids could be captured by all their characteristics. But that’s not realistic.”
The result, unfortunately, is a paradox: Though adolescents may want nothing more than to be able to define themselves, they discover that high school is one of the hardest places to do it. Crosnoe mentions the 1963 classic Stigma: Notes on the Management of Spoiled Identity, in which the sociologist Erving Goffman very devastatingly defines the term in his title as “a trait that can obtrude itself upon attention … breaking the claim that other attributes have on us.” For many people, that’s the high-school experience in a nutshell. At the time they experience the most social fear, they have the least control; at the time they’re most sensitive to the impressions of others, they’re plunked into an environment where it’s treacherously easy to be labeled and stuck on a shelf. “Shame,” says Brené Brown, a researcher at the University of Houston, “is all about unwanted identities and labels. And I would say that for 90 percent of the men and women I’ve interviewed, their unwanted identities and labels started during their tweens and teens.”
Out of all the researchers who think about high-school-related topics, Brené Brown may be the one whose work interests me most. Since 2000, she has studied shame in pointillist detail. She’s written both academic papers and general-interest books on the subject; her TED lecture on shame was one of the most popular of all time. Because that’s what high school—both at the time and as the stuff of living memory—is about, in its way: shame. And indeed, when Brown and I met for breakfast this fall, she told me that high school comes up all the time in her work. “When I asked one of the very first men I ever interviewed, ‘What does shame mean to you?’ ” she recalled, “he answered, ‘Being shoved up against the lockers.’ High school is the metaphor for shame.”
The academic interest in shame and other emotions of self-consciousness (guilt, embarrassment) is relatively recent. It’s part of a broader effort on the part of psychologists to think systematically about resilience—which emotions serve us well in the long run, which ones hobble and shrink us. Those who’ve spent a lot of time thinking about guilt, for example, have come to the surprising conclusion that it’s pretty useful and adaptive, because it tends to center on a specific event (I cannot believe I did that) and is therefore narrowly focused enough to be constructive (I will apologize, and I will not do that again).
Shame, on the other hand, is a much more global, crippling sensation. Those who feel it aren’t energized by it but isolated. They feel unworthy of acceptance and fellowship; they labor under the impression that their awfulness is something to hide. “And this incredibly painful feeling that you’re not lovable or worthy of belonging?” asks Brown. “You’re navigating that feeling every day in high school.”
Most of us, says Brown, opt for one of three strategies to cope with this pain. We move away from it, “by secret-keeping, by hiding”; we move toward it, “by people-pleasing”; or we move against it “by using shame and aggression to fight shame and aggression.” Whichever strategy we choose, she says, the odds are good we’ll use that strategy for life, and those feelings of shame will heave to the surface, unbidden and unannounced, in all sorts of unfortunate settings down the road.
Like among our future families, for instance. Brown says it’s remarkable how many parents of teenagers talk to her about reexperiencing the shame of high school once their own kids start to experience the same familiar scenarios of rejection. “The first time our kids don’t get a seat at the cool table, or they don’t get asked out, or they get stood up—that is such a shame trigger,” she says. “It’s like a secondary trauma.” So paralyzing, in fact, that she finds parents often can’t even react with compassion. “Most of us don’t say, ‘Hey, it’s okay. I’ve been there.’ We say, ‘I told you to pull your hair back and wear some of those cute clothes I bought you.’ ”
And it’s not just the bullied who carry the shame of those years. Rosalind Wiseman, author of Queen Bees and Wannabes (subsequently transformed into the movie Mean Girls), points to the now-legendary Washington Post story that ran last spring, which documented Mitt Romney’s escapades as a prep-school ogre: pinning down an outcast and cutting his hair; shouting “Atta girl” to a closeted boy when he tried to speak; leading a teacher with poor eyesight into a set of closed doors. Years later, one of the victims carried that pain with him still (“It’s something I have thought about a lot since then,” he said). But even more telling, she notes, was that Romney’s co-conspirators in thuggery felt so awful about their misdeeds as boys in 1965 that they talked about them openly, on the record, as grown men in 2012. “To this day, it troubles me,” Thomas Buford, a retired prosecutor, told the Post. He carried around that shame for almost half a century.
In the fall of 2011, Tavi Gevinson, the 16-year-old force behind the web magazine Rookie, solicited a wide variety of celebrities for advice about how to survive high school. Among the wisest essays came from Winnie Holzman, the creator of My So-Called Life. “In high school,” she wrote, “we become pretty convinced that we know what reality is: We know who looks down on us, who is above us, exactly who our friends and our enemies are.” The truth of the matter, wrote Holzman, is that we really have no clue. “[W]hat seems like unshakable reality,” she concluded, “is basically just a story we learned to tell ourselves.”
There happens to be a body of contemporary research that suggests Holzman is right. Adolescents often do take a highly distorted view of their social world. In 2007, for instance, Steinberg and two colleagues surveyed hundreds of adolescents in two midwestern communities, asking them to decide which category they most identified with: Jocks, Populars, Brains, Normals, Druggie/Toughs, Outcasts, or None. They also asked a subsample of those kids to make the same assessment of their peers. Then they compared results.
Some were predictable. The kids who were identified as Druggies, Normals, or Jocks, for example, tended to see themselves in the same way. What was surprising was the self-assessment of the kids others thought were popular. Just 27 percent in one study and 37 in a similar, second study in the same paper saw themselves as campus celebrities. Yes, a few declared themselves Jocks, perhaps just as prestigious. But more were inclined to view themselves either as normal or none of the above.
Faris’s research on aggression in high-school students may help account for this gap between reputation and self-perception. One of his findings is obvious: The more concerned kids are with popularity, the more aggressive they are. But another finding isn’t: Kids become more vulnerable to aggression as their popularity increases, unless they’re at the very top of the status heap. “It’s social combat,” he explains. “Think about it: There’s not much instrumental value to gossiping about a wallflower. There’s value to gossiping about your rivals.” The higher kids climb, in other words, the more precariously balanced they feel, unless they’re standing on the square head of the totem pole. It therefore stands to reason that many popular kids don’t see themselves as popular, or at least feel less powerful than they loom. Their perch is too fragile.
It’s also abundantly, poignantly clear that during puberty, kids have absolutely no clue how to assess character or read the behavior of others. In 2005, the sociologist Koji Ueno looked at one of the largest samples of adolescents in the United States, and found that only 37 percent of their friendships were reciprocal—meaning that when respondents were asked to name their closest friends, the results were mutual only 37 percent of the time. One could argue that this heartbreaking statistic is just further proof that high school is a time of unrequited longings. But these statistics also suggest that teenagers cannot tell when they are being rejected (Hey, guys, wait for me!) or even accepted (I thought you hated me). So much of what they think they know about others’ opinions of them is plain wrong.
Deborah Yurgelun-Todd, director of the Cognitive Neuroimaging Laboratory at the University of Utah, did a well-known pilot study at McLean Hospital a few years ago asking teenagers to look at a picture of a face and identify the emotion they saw. Every adult who looked at that picture—100 percent of them—saw fear in that face. Not the teenagers. Half of them saw anger or confusion, even sadness.
It was a really small study. I wouldn’t necessarily read too much into it. But its results sum up the entire high-school experience, in my view: mistaking people’s fear for something else.
Kurt Vonnegut wrote that high school “is closer to the core of the American experience than anything else I can think of.” And it is, certainly, in the sense that it’s the last shared cultural experience we have before choosing different paths in our lives. But for years, I’d never quite understood why high-school values are so different from adult ones. In fact, whenever I spoke to sociologists who specialized in the rites and folkways of this strange institution, I’d ask some version of this question: Why is it that in most public high schools across America, a girl who plays the cello or a boy who plays in the marching band is a loser? And even more fundamentally: Why was it such a liability to be smart?
The explanations tended to vary. But among the most striking was the one offered by Steinberg, who conjectured that high-school values aren’t all that different from adult values. Most adults don’t like cello or marching bands, either. Most Americans are suspicious of intellectuals. Cellists, trumpet players, and geeks may find their homes somewhere in the adult world, and even status and esteem. But only in places that draw their own kind.
Robert Faris puts an even finer point on this idea. “If you put adults in a similar situation”—meaning airlifted into a giant building full of strangers with few common bonds—“you’d find similar behaviors.” Like reality television, for instance, in which people literally divide into tribes, form alliances, and vote one another off the island. “And I think you see it in nursing homes,” says Faris. “In small villages. And sometimes in book clubs.” And then I realized, having covered politics for many years: Congress, too. “It’s not adolescence that’s the problem,” insists Faris. “It’s the giant box of strangers.”
As adults, we spend a lot of time in boxes of strangers. “I have always referred to life as ‘perpetual high school,’ ” Paul Feig wrote me in our first e-mail exchange, later adding, when we spoke, that his wife’s first order when she landed her Hollywood dream job was to go fire her predecessor. Brown tells me she frequently hears similar things from men in finance—as a reward for outstanding quarterly earnings, they get to pick their new office, which means displacing someone else. (The corresponding shame led one to consider quitting: “I didn’t sign up to terrorize people,” he tells her in her latest book, Daring Greatly.) Today, we also live in an age when our reputation is at the mercy of people we barely know, just as it was back in high school, for the simple reason that we lead much more public, interconnected lives. The prospect of sudden humiliation once again trails us, now in the form of unflattering photographs of ourselves or unwanted gossip, virally reproduced. The whole world has become a box of interacting strangers.
Maybe, perversely, we should be grateful that high school prepares us for this life. The isolation, the shame, the aggression from those years—all of it readies us to cope. But one also has to wonder whether high school is to blame; whether the worst of adult America looks like high school because it’s populated by people who went to high school in America. We’re recapitulating the ugly folkways of this institution, and reacting with the same reflexes, because that’s where we were trapped, and shaped, and misshaped, during some of our most vulnerable years.
High school itself does something to us, is the point. We bear its stripes. Last October, the National Bureau of Economic Research distributed a study showing a compelling correlation between high-school popularity—measured by how many “friendship nominations” each kid received from their peers—and future earnings in boys. Thirty-five years later, the authors estimated, boys who ranked in the 80th percentile of popularity earned, on average, 10 percent more than those in the 20th. There are obvious chicken-and-egg questions in all studies like this; maybe these kids were already destined for dominance, which is why they were popular. But Gabriella Conti, an economist and first author of the paper, notes that she and her colleagues took into consideration the personality traits of their subjects, measuring their levels of openness, agreeableness, extroversion, and so forth. “And adolescent popularity is predictive beyond them,” she says, “which tells me this is about more than just personality. It’s about interpersonal relations. High school is when you learn how to master social relationships—and to understand how, basically, to ‘play the game.’ ” Or don’t. Joseph Allen and his colleagues at the University of Virginia just found that kids who suffer from mild depression at 14, 15, and 16 have worse odds in the future—in romance, friendship, competency assessments by outsiders—even if their depression disappears and they become perfectly happy adults. “Because that’s their first template for adult interaction,” says Allen when asked to offer an explanation. “And once they’re impaired socially, it carries forward.”
Yet even the most popular kids, the effortlessly perfect ones, the ones who roamed the halls as if their fathers had built them especially in their honor, may not entirely benefit from the experiences of the high-school years. In 2000, three psychologists presented a paper titled “Peer Crowd-Based Identities and Adjustment: Pathways of Jocks, Princesses, Brains, Basket-Cases, and Criminals,” which asked a large sample of tenth-graders which of the five characters from The Breakfast Club they most considered themselves to be, and then checked back in with them at 24. The categories were “immensely predictive,” according to Jacquelynne Eccles, one of the authors. (Criminals were still most apt to smoke pot; male jocks still had the highest self-esteem.) But one datum was interesting: At 24, the princesses had lower self-esteem than the brainy girls, which certainly wasn’t true when they were 16. But Eccles sees no inconsistency in this finding. In fact, she suspects it will hold true when she completes her follow-up with the same sample at 40. “Princesses are caught up in this external world that defines who they are,” says Eccles, “whereas if brainy girls claim they’re smart, that probably is who they are.” While those brainy girls were in high school, they couldn’t rely on their strengths to gain popularity, perhaps, but they could rely on them as fuel, as sources of private esteem. Out of high school, they suddenly had agency, whereas the princesses were still relying on luck and looks and public opinion to carry them through, just as they had at 16. They’d learned passivity, and it’d stuck.
Whether it’s for vindication or validation, whether out of self-punishment or self-appeasement, many of us choose to devote a lot of time revisiting our high-school years. That’s the crazy thing. In 2011, the Pew Research Center found that the largest share of our Facebook friends—22 percent—come from high school. Keith Hampton, a Rutgers sociologist and one of the researchers who did the analysis, says this is true for college- and non-college-educated Americans alike. In fact, Hampton suspects that Facebook itself plays a role. “Before Facebook, there was a real discontinuity between our high-school selves and the rest of our lives.” Then Mark Zuckerberg came along. “Social ties that would have gone dormant now remain accessible over time, and all the time.”
Maybe that’s what ultimately got me to that nondescript bar near Times Square last fall. Until Facebook, the people from my high-school years had undeniably occupied a place in my unconscious, but they were ghost players, gauzy and green at the edges. Now here they were, repeatedly appearing in my news feed, describing their plans to attend our reunion. And so I went, curious about whom they’d become. There were the former football players, still acting like they owned the joint, but as much more generous proprietors. There were the beautiful girls, still beautiful, but looking less certain about themselves. There was my former best pal, who’d blown past me on her way to cheerleaderhood, but nervous in a way I probably hadn’t recognized back then. I was happy to see her. And to see a lot of them, truth be told. We’d all grown more gracious; many of us had bloomed; and it was strangely moving to be among people who all shared this shameful, grim, and wild common bond. I found myself imagining how much nicer it’d have been to see all those faces if we hadn’t spent our time together in that redbrick, linoleum-tiled perdition. Then again, if we hadn’t—if we’d been somewhere more benign—I probably wouldn’t have cared.
Love makes you strong: Romantic relationships help neurotic people stabilize their personality
Science Daily, May 9, 2014
Psychologists from Jena and Kassel (Germany) found that a romantic relationship helps neurotic people stabilize their personality.
It is springtime and they are everywhere: newly enamored couples walking through the city hand in hand, floating on cloud nine. Yet a few weeks later the initial rush of romance will have dissolved, and the world will no longer appear quite as rosy. Nevertheless, love and romance have long-lasting effects.
Psychologists at the German universities of Jena and Kassel discovered that a romantic relationship can have a positive effect on personality development in young adults. The researchers report this finding in the online edition of the Journal of Personality. They focused on neuroticism, one of the five traits considered the basic dimensions of human personality, which together can be used to characterize every human being. “Neurotic people are rather anxious, insecure, and easily annoyed. They have a tendency towards depression, often show low self-esteem and tend to be generally dissatisfied with their lives,” explains Dr. Christine Finn, who wrote her doctoral dissertation within the framework of the current study. “However, we were able to show that they become more stable in a love relationship, and that their personality stabilizes,” the Jena psychologist says.
The scientists followed 245 couples between the ages of 18 and 30 for nine months, interviewing them individually every three months. Using a questionnaire, they measured degrees of neuroticism as well as relationship satisfaction. The study participants also evaluated fictitious everyday situations and their possible significance for their own partnership. “This part was crucial, because neurotic people process influences from the outside world differently,” Finn explains. For instance, they react more strongly to negative stimuli and have a tendency to interpret ambiguous situations negatively instead of positively or neutrally.
The scientists found that this tendency gradually decreases over time in a romantic relationship. On the one hand, the partners support each other, according to Christine Finn. On the other hand, the cognitive level, that is, an individual’s inner world of thought, plays a crucial role: “The positive experiences and emotions gained by having a partner change the personality — not directly but indirectly — because at the same time the thought structures and the perception of ostensibly negative situations change,” Finn emphasizes. To put it more simply: Love helps us tackle life with more confidence instead of seeing things pessimistically straight away.
The scientists observed this effect in men as well as women. “Of course everyone reacts differently, and a long, happy relationship has a stronger effect than a short one,” says Prof. Dr. Franz J. Neyer, co-author of the new publication and chair of Differential Psychology at the University of Jena. “But generally we can say: young adults entering a relationship can only win!”
For Christine Finn, the results contain yet another positive message, not only for people with neurotic tendencies but also for those who suffer from depression or anxiety disorders: “It is difficult to reform a whole personality, but our study confirms: Negative thinking can be unlearned!”
Experiencing the Thrill of Young Love Makes You Less Neurotic
by Callie Beusman, jezebel.com, May 12, 2014
Ah, the transformative power of love: that lovely phenomenon of two hearts beating in tender unison, metamorphosing this bleak and mundane world into a reality that’s actually pretty chill. According to a new scientific study, this is a measurable psychological occurrence that actually happens, and not something that the romance-industrial complex invented to trick you into seeing Endless Love in theaters.
According to a new study published in the Journal of Personality, being in Young Love (i.e., a romantic relationship when you are between the ages of 18 and 30) can have a positive effect on personality development; specifically, it’s linked to a decrease in neuroticism. As Dr. Christine Finn, one of the study’s lead authors, told Science Daily: “Neurotic people are rather anxious, insecure, and easily annoyed. They have a tendency towards depression, often show low self-esteem and tend to be generally dissatisfied with their lives.” According to her findings, however, such people “become more stable in a love relationship,” and their personalities stabilize.
The study followed 245 couples for nine months, interviewing them individually at three-month intervals in order to evaluate both relationship satisfaction and degrees of neuroticism. The researchers also had both partners evaluate fictitious everyday life situations re: their possible significance for their own partnership. Neurotic people tend to interpret ambiguous stimuli negatively and react more strongly to negative stimuli, but the scientists found that this tendency decreases over time in a romantic relationship.
… In short, LOVE GIVES US THE POWER TO FACE THE WORLD WITH CONFIDENCE. Love is a splendid thing, love can change the way you apprehend the world, love can make you unlearn your negative thought-tendencies, etc. And here I thought I stopped sending my boyfriend psycho texts because I had just gotten lazy.