After years of talk therapy, Kaitlin Bell Barnett wanted a faster cure for her depression, so at age 17 she started taking prescription antidepressants, and she’s been on meds ever since.
Some medical professionals describe Kaitlin and her peers as part of a giant uncontrolled experiment, one which uses America’s children as guinea pigs. They are the first generation of children, now adults, who were treated with psycho-pharmaceuticals from the time they were quite young.
In her new book, “Dosed: The Medication Generation Grows Up,” Barnett tells the stories of five young people, now adults, who were medicated as children.
She focuses on the psychological impact of having grown up taking these drugs, and on how they shape a person’s sense of identity.
I fall hard for coming-of-age stories, and my list of favorite books and movies contains many in this genre, from Pride and Prejudice to The Catcher in the Rye. The movie Garden State, which starred Zach Braff and Natalie Portman, also struck a chord with me when it came out in 2004.
It dramatizes a few days in the life of Andrew Largeman, a twenty-six-year-old struggling actor in Los Angeles who returns to his native New Jersey for his mother’s funeral. Andrew is nothing if not alienated: he feels disconnected from celebrity-studded Hollywood as well as from his old hometown, which he hasn’t visited since leaving for boarding school nearly a decade earlier. For the first time in sixteen years, Andrew has stopped taking the psychotropic medications his psychiatrist father prescribed after ten-year-old Andrew caused an accident that rendered his mother a paraplegic. Like the illegal drugs his high school buddies take, Andrew’s meds serve as a metaphor for the feelings of inadequacy, disappointment, and rootlessness endemic to my generation of twentysomethings. Judging from the film’s cult-hit success, its target audience of my peers apparently found the metaphor apt. When Andrew falls in love with a quirky, vibrant girl he meets in a doctor’s waiting room, she shows him how to reengage with his feelings—and the world. Presumably, he leaves the medications behind.
For several years, Garden State remained my favorite movie about my generation. It spoke to me as a young person growing up in turn-of-the-millennium America—though not as a young medicated person. In
fact, I completely forgot psychiatric drugs were even mentioned. Funny, because I myself have been taking medication since high school, and Garden State is one of just a couple of films I know of to allude to the psychological impact of growing up taking psychotropic drugs. But although it touches on this important phenomenon, the film never really examines its underlying assumptions that medications numbed Andrew’s pain and guilt, and that getting off them allows him once again
to experience the agony and ecstasy of life.
For the first time in history, millions of young Americans are in a position not unlike Andrew’s: they have grown up taking psychotropic medications that have shaped their experiences and relationships, their emotions and personalities, and, perhaps most fundamentally, their very
sense of themselves. In Listening to Prozac, his best-selling meditation on the drug’s wide-ranging impact on personality, the psychiatrist Peter Kramer observed that “medication rewrites history.”
He was referring to the way people interpret their personal histories once they have begun medication; what they thought was set in stone was now open to reevaluation. What, then, is medication’s effect on young people, for whom there is much less history to rewrite? Kramer published his book in 1993, at a time of feverish—and, I think, somewhat excessive—excitement about Prozac and the other selective serotonin reuptake inhibitor antidepressants, or SSRIs, that quickly followed on its heels and were heralded as revolutionary treatments for a variety of psychiatric problems. For most people, I suspect, medications are perhaps less like a total rewriting of the past than a palimpsest. They reshape some of one’s interpretations about oneself and one’s life but allow traces of experience and markers of identity to remain. The earlier in life the drugs are begun, the fewer and fainter those traces and markers are likely to be. All told, the psychopharmacological revolution of the last quarter century has had a vast impact on the lives and outlook of my generation—the first generation to grow up taking psychotropic medications. It is therefore vital for us to look at how medication has changed what it feels like to grow up and to become an adult.
• • •
Our society is not used to thinking about the fact that so many young people have already spent their formative years on pharmaceutical treatment for mental illness. Rather, we focus on the here-and-now, wringing our hands about “overmedicated kids.” We debate whether doctors, parents, and teachers rely too heavily on meds to pacify or normalize or manage the ordinary trials of childhood and adolescence. Often, the debate has a socioeconomic dimension that attributes overmedication either to the striving middle and upper-middle classes, or to the social mechanisms used to control poor children and foster children. We question the effectiveness and safety of treating our youth
with these drugs, most of which have not been tested extensively in children and are not government-approved for people under eighteen.
We worry about what the drugs will do to developing brains and bodies, both in the short and long term. The omnipresent subtext to all this: what does the widespread “drugging” of minors say about our society and our values? Certainly, these questions are worth debating—even agonizing over. But they ought not to constitute the be-all and end-all of our society’s conversation about young people and psychiatric drugs, particularly with millions of medicated teens transitioning into adulthood. Too much of the discussion occurs in the abstract, and drugs too easily become a metaphor, as in Garden State, for a variety of modern society’s perceived ills: the fast pace of life and the breakdown of close social and family ties; a heavy emphasis on particular kinds of academic
and professional achievement; a growing intolerance and impatience with discomfort of any sort. Far too rarely, though, do we consult young people themselves. How do they feel about taking medication? How do they think it has shaped their attitudes, their sense of themselves, their academic and career paths, their lives? How do they envision medication affecting their futures?
Focusing on people who needed therapeutic intervention early in life and who continued to use medication for an extended period of time can help us get past the are-they-or-aren’t-they nature of the “overmedicated kids” debate. By assessing what medication has actually meant for my peers, I hope to get at something more intimate and more complex than either the psychotropic true believers or the total skeptics allow themselves to consider. “A prescription becomes an event that generates fantasies, wishes, concerns, and meaning,” write the social scientists Tally Moses and Stuart Kirk in an analysis of young people’s experiences of psychotropic meds. “It structures one’s own expectations and those of others in ways that are not necessarily intended or foreseen.”
As Moses and Kirk suggest, the experience of taking psychotropic drugs is more than just popping a pill, and these drugs often become a part of the self in ways children and teenagers could not have imagined when
they swallowed their first dose. Now that the first generation of medicated kids is entering adulthood, we have an invaluable opportunity to hear about their experiences with psychotropic drugs, and their assessment of those experiences. That is what this book sets out to do, in addition to examining the family, medical, and educational circumstances in which drugs are prescribed and taken.
My cohort lives with some powerful contradictions. On the one hand, we have grown up with the idea that prolonged sadness, attention problems, obsessions and compulsions, and even shyness are brain diseases that can—and ought to—be treated with medication, just as a bodily disease like diabetes ought to be treated with insulin. The 1990s, sometimes called “the Decade of the Brain,” encompassed a period of unprecedented growth in understanding how the brain works, which generated enormous enthusiasm about the prospects for discovering the underlying mechanisms behind mental illness, enthusiasm that many say was overwrought and premature. Direct-to-consumer pharmaceutical advertising on TV, which the U.S. Food and Drug Administration authorized in 1997, has allowed drug companies to define the public’s understanding of mental illness and psychiatric medications—and this is especially true, I think, for young people who knew no other paradigm. Even as we grew up, though, immersed in the idea of an “imbalance” of particular brain chemicals—an outdated theory that has not held up to the science—we have inherited the American ideal of self-sufficiency, of solving one’s problems through one’s own resourcefulness. As we’ve sought to forge our identities, we have often struggled to reconcile the two.
• • •
It took me a while to conceive of my medicated peers as a coherent group, much less one deserving of a book-length study. Until recently, I would not even have included myself in the category of people whose formative years were shaped by psychotropic drugs, let alone shaped in any profound way. When I began taking Prozac at age seventeen, I didn’t spend a lot of time contemplating whether I’d lose some essential aspect of my identity, and I certainly didn’t consider myself a part of any larger societal phenomenon. I was only vaguely aware that various drugs like Prozac had become available in the late 1980s and through the 1990s, and that they had helped a lot of people. Before taking Prozac, I had spent two years in therapy with first one therapist and then another. To a depressed adolescent, two years feels as though it’s an eon. I thought it was time to give medication a try. I was far less concerned with the implications of taking a psychiatric drug than I was with banishing the exhausting and oppressive symptoms of my pervasive unhappiness: the apathy and hopelessness, the irritability and boredom, the pessimism and incessant self-criticism. For several years these symptoms had seemed to colonize my entire being, and when they abated after a couple of months of Prozac, I credited the medication. I figured a “chemical imbalance” must be to blame, and skipped the philosophizing. Even when Prozac seemed to stop working five years later and I began, in desperation, trying a battery of other antidepressants and antianxiety drugs, I still basically swallowed the pills and left it at that. The drugs only mattered to me insofar as they treated, or failed to treat, my symptoms.
Then, shortly before my twenty-fifth birthday, seven years into my life on medication, I read a column in the New York Times about a young woman in her early thirties who had been taking antidepressants almost continuously since she was fourteen. Richard A. Friedman, the young
woman’s psychiatrist and a frequent contributor to the science section of the Times, noted that the patient “credited the medication with saving her life.” But, he wrote, she had also begun to struggle with the “equally fundamental question” of how the drugs had shaped her psychological development and, ultimately, her identity. Noting the huge increase in the number of youths taking antidepressants and other psychotropic drugs, Friedman postulated that his patient was just one of
innumerable people entering adulthood with such concerns. With little scientific data on how drugs affect the developing body and mind, he added, these were problems for which few answers were forthcoming.
I was intrigued. Since I didn’t relate to the woman’s doubts and anxieties, I wondered whether her starting medication at a younger age than I had, at a more vulnerable and unstable period in her emotional development, had contributed to her discomfort. I can’t say the article suddenly
prompted me to start questioning who I’d be had I never taken Prozac, but it did make me intensely curious about my peers. Who are these people, I wondered, and why are they so ambivalent about their medication? I didn’t know anyone who fit the bill, so I began looking. I talked to friends of friends, current college students and recent graduates I met through campus mental health groups, and I joined online forums for mood and behavior disorders, both those devoted specifically to psychotropic medication and more general sites where people post their medical data and histories, such as PatientsLikeMe.
Because I identified myself as a journalist, many people, I suspect, didn’t want to talk to me at all, fearing public disclosure of a private online identity. Others shared my initial, uncomplicated assessment: the medication worked; therefore, they must have had a biochemical malfunction in their brains, as much a disease as diabetes and no more profound.
The more stories I heard, though, the more it seemed that even people who believed, on balance, that the drugs helped them—somehow made them more stable, motivated, focused, reliable, or upbeat—still entertained plenty of ambivalence about issues such as what side effects the medications caused and how to understand their identity while taking a drug that affected their mood, behavior, and maybe even their entire personality. Some harbored these suspicions from the get-go; others developed them over time. “At first, I thought, ‘Oh, okay, yeah, that’ll make me feel better,’ ” one young woman from outside Boston told me of her experience beginning antidepressants at age eleven. “And then it entered my consciousness more that it was something more serious . . . that had stronger effects on my being.”
Although qualms and questions exist among people of all ages who take psychiatric meds, the more I talked with young adults, the more I came to realize that many of the queries and worries were directly linked to and intensified by the process of coming of age. The same large themes kept appearing. In what ways, and to what extent, had psychotropic drugs shaped them into the people they had become? How had their problems and symptoms shifted over time, and what role had the
meds played in those shifts? Did they still need medication, or might they have outgrown their old problems? They speculated about the implications of being diagnosed and treated for a chronic disease at an age when they were supposed to be models of health. They puzzled about how taking a prescription drug had affected their body image, their sexuality, their attitudes toward alcohol and other mind-altering substances, and their sense of their own abilities to control their emotions and behaviors. One young man whom I met online put it bluntly.
“The areas that have most been affected by my medication use. Huh. All of them.” A twenty-one-year-old woman who had been taking stimulants and antidepressants since age eight explained plaintively, “I always had this idea of figuring out who I am—and not who the medication makes me.”
My interviewees’ questions and answers struck me as more compelling than I had expected, but I was still pretty sure that I didn’t have much in common with them. The writer Andrew Solomon has said that depression is “the aloneness within us made manifest,” and I would add that for many this isolation includes the unshakable conviction that you are uniquely miserable, that no one shares or could possibly grasp the complexity of your wretchedness and the plight that desolation causes.
I have certainly felt that way when I was depressed; neither Prozac nor any of the other drugs I’ve taken convinced me of the banality of my distress. As far as I was concerned, I had been uniquely unhappy—not necessarily unhappier than other people, but unhappy in a way particular to me. The miracle of medication for me was that it had erased my despair without making me feel stripped of myself.
Later, after a lot of reading and interviewing and researching, I realized my melancholy was hardly unique. But you might forgive me for having believed it was, as I didn’t discuss my medication use with anyone except my immediate family and some close friends—and even then, only in passing. What was there to discuss, now that the symptoms I suffered as a teenager had more or less disappeared? I’d experienced some side effects and even some alarming new symptoms, but how could medication—the solution to my problems—pose any kind of fundamental dilemma? Very few people, for that matter, had discussed their own medications with me. If the subject came up, it was in the abstract, a snide remark that a friend or acquaintance “really should be medicated,” or in some vaguely moralizing debate over whether stimulants such as Adderall and Ritalin gave some of our peers an unfair academic edge, or whether learning disabilities had been misdiagnosed as attention deficit hyperactivity disorder (ADHD) and might be better treated through other means. Yes, there were other people my age taking medication, but I certainly didn’t feel any sort of cheesy, self-help-group connection to them.
Even as I began talking to people in my basic situation, I remained skeptical. How could their experiences have any meaningful commonalities one could extrapolate to larger truths? To be sure, these people were all “medicated,” but they had been diagnosed with vastly different conditions, and they were taking a large array of very different types of psychotropic drugs. Several of them (echoing experts in the mental health field) cautioned me against lumping everyone together. Taking a short-acting stimulant like Ritalin, they said, was vastly different from taking a long-acting antidepressant like Prozac, which, in turn, was totally different from taking a mood stabilizer like Depakote, or a powerful atypical antipsychotic like Zyprexa.
Having been trained both as a journalist and a historian, though, I harbored a conviction that people, myself included, were products not only of their individual experience but also of their culture and their era.
There was no denying that my peers and I had lived through—had indeed been the vanguard of—the psychopharmacological revolution.
Prozac was not the first of the selective serotonin reuptake inhibitor antidepressants, but it was the first to hit the U.S. market, gaining FDA approval at the end of 1987. Thanks to national education campaigns trumpeting depression as a major public health issue, and to a period when, because of changes in FDA requirements, few new psychiatric drugs had been introduced, Prozac made a huge splash.
Other SSRIs such as Zoloft and Paxil followed a few years later. Starting in the early 1990s, new kinds of antipsychotic medications were released. Originally used for schizophrenia, these “atypical antipsychotics” were increasingly prescribed to stabilize the mood swings of childhood bipolar disorder and to quell irritability associated with autism and behavior disorders. Longer-acting formulations of the stimulant Ritalin, which had been used in children since the 1950s, appeared, as did other drugs for attention deficit hyperactivity disorder. By the mid-1990s, the prescribing of psychotropic drugs to children was front-page news in major newspapers. When I entered college in 2001, college counseling centers were reporting an overwhelming influx of patients, including growing numbers who arrived at school with a long history of mental illness and medication use.
Since reliable statistical analyses lag years behind actual shifts in medical practice, the statistics about the number of kids prescribed medication began to hit the media when these children had already entered adolescence, or even adulthood. When the data did emerge, it confirmed what people already sensed: a massive increase in the number and percentage of children being treated with psychiatric drugs. Although children and teenagers were—and still are—prescribed such drugs less frequently than adults (with the exception of stimulants), the rate of growth is remarkable. Between 1987 and 1996, the percentage of people under twenty taking at least one such drug tripled, from about 2 percent of the youth population to 6 percent, at minimum an increase of more than a million children. Between 1994 and 2001, the percentage of visits to doctors in which psychotropics were prescribed to teenagers more than doubled: to one in ten visits by teenage boys and one in fifteen visits by teenage girls between the ages of fourteen and eighteen.
In 2009, 25 percent of college students were taking psychotropic meds, up from 20 percent in 2003, 17 percent in 2000, and just 9 percent in 1994.
The prescribing of more than one medication has become far more common in child psychiatric patients in recent decades—even though, as the National Institute of Mental Health’s head of child and adolescent psychiatry research noted in 2005, there was “little empirical evidence of efficacy and safety from well-designed studies.”
Although statistics about medication use show a clear increase, nationally representative data is still severely lacking. Many children and teenagers were also facing the prospect of taking
medication for far longer than people who first encountered psychotropic drugs in adulthood. In the 1980s and 1990s, doctors tended to prescribe drugs for a limited period of time for both adults and children, except for bipolar disorder and schizophrenia, long considered intractable, lifelong conditions. But it became increasingly clear to doctors that ADHD persisted into adulthood in about two-thirds of people, and that an early bout of anxiety or depression often portended more
frequent and more severe episodes later in life. And so my peers and I found that a drug initially prescribed by a pediatrician as a stopgap measure for some alleged hormonal or developmental problem often became a long-term, perhaps indefinite commitment.
The medical profession wasn’t the only force driving the increase in prescriptions. Our parents, the ubiquitous baby boomers, are notorious for seeking medical solutions to every ailment (one book on the subject, by journalist Greg Critser, cheekily dubbed them “Generation Rx”).
The boomers also tend to be portrayed as overly indulgent parents, obsessing endlessly about their children’s fragile self-esteem and all-important academic performance. They wanted us, their children, to be not just happy, fulfilled, and confident, but also high achievers from a
young age. They worried that their children could come under the influence of—or be outright harmed by—the unhappy, disaffected kids who captured headlines in the 1990s for their dramatic suicides or school shoot-outs. The boomers tried to be cooler or more hip than their own parents, but most of them were far from “anything goes” when it came to their offspring. As one of my subjects put it, describing his parents’ and teachers’ expectations, “You can’t not function. You can’t wake up
in the morning and not be able to function.” The goal, he said, was to strike a magical balance between being “happy-go-lucky” and “efficient.” These conflicting expectations and aspirations produced some rather stressed-out children—and some parents, teachers, and doctors readily inclined toward pills to help manage the effects of that stress.
My peers and I also came of age in a time when the economics of health insurance were changing drastically. In the 1980s and 1990s, most employer-based health insurance moved toward a managed-care model, and state- and federally funded health coverage for children expanded. The government and the HMOs were both eager to keep costs down, and therefore preferred relatively cheap psychiatric drugs to long-term talk therapy (despite a growing medical consensus that
the most effective treatment for most psychiatric conditions was a combination of medication and therapy). Meanwhile, a shortage of child psychiatrists, especially in poor and rural areas, meant that many troubled children could not see a specialist.
As a result, already-busy pediatricians shouldered more of the burden of treatment: in the late 1970s, about 7 percent of all visits to pediatricians involved a child with emotional or behavioral problems, but by the mid-1990s, that rate had nearly tripled.
Increasingly, those visits involved writing a prescription. Prescribing data collected between 1992 and 1996 showed that pediatricians wrote 85 percent of psychotropic prescriptions for children.
As psychiatrists switched from forty-five-minute visits with time for psychotherapy to fifteen-minute “med checks,” prescriptions often came with little continuing discussion about how kids felt
about taking medication, or how it was affecting them. That was fine by some, but not by others. One young woman I interviewed, who received her prescriptions from time-crunched psychiatrists who scheduled fifteen-minute appointments they often cut short, wished there had been time to “talk about feelings, not just symptoms.” One young man told me that even though psychiatric medication is “so much a part of our culture,” he could “probably count on one hand” the conversations he’d
had about his medication use, or anyone else’s.
Most psychotropic drugs, it’s important to note, were and still are prescribed to children and teens without official FDA approval for the relevant condition and age group. As long as clinical trials have shown a given medication to be safe and, under certain narrow requirements,
effective for some condition, and as long as the pharmaceutical companies don’t advertise a drug for a nonapproved use in kids, doctors can legally prescribe it “off-label.” Off-label prescribing to children is nothing new, in large part because concerns about the ethics and legality of conducting drug trials on minors have plagued medical research for decades. (As the pediatrician and pharmacologist Henry Shirkey observed in a 1968 article in the Journal of Pediatrics, children were becoming “the therapeutic orphans of our expanding pharmacopoeia,” and people calling for drugs to be tested in children were still reiterating Shirkey’s formulation three decades later.)
Historically cautious, doctors used to wait a number of years after a new medication came on
the market before prescribing it to children. But with all the hoopla surrounding the introduction of new psychotropic drugs in the late 1980s and 1990s and an influx of young patients seeking treatment, fewer doctors bothered to wait.
This sharp increase in prescribing made the lack of research all the more acute. Controversies brewed. Did antipsychotic drugs cause dangerous obesity and early-onset diabetes? Did stimulants stunt growth or increase the risk of drug abuse later in life? Did taking antidepressants or stimulants before puberty predispose children to bipolar disorder in adulthood? Did SSRIs like Prozac increase the risk of a teenager attempting suicide?
In the late 1990s, responding to the precipitous rise in prescribing and what researchers called a shameful lack of safety and efficacy data, the National Institute of Mental Health began funding a series of major, multisite medication trials in children and teenagers. They led to some important findings, with trials for major depressive disorder and obsessive-compulsive disorder concluding, for example, that combined medication and cognitive-behavioral therapy (CBT), a short-term therapeutic treatment focused on reshaping thought and behavior patterns, was the optimal regimen for the greatest number of children, compared to a placebo or to either medication or CBT alone.
These so-called “multimodal” studies were groundbreaking because they were some of the first to compare the efficacy of different treatments, measuring one drug against another, drugs against therapy, and standardized, carefully managed treatment against “community care,” the treatment a
child would ordinarily receive in his or her local area. They also ran comparatively long, which produced some notable findings.
For example, children in the government’s major ADHD trial at first seemed to do best on medication alone, compared to various other combinations of treatments. But when the same children were assessed two years after the study ended, there were no differences in either ADHD symptom reduction, or improved school or family relationship functioning, among different treatment groups. Medication’s superior effects, in other words, did not last.
The vast majority of studies are not this wide-ranging or long-term. As a result, myriad issues remain unsettled and hotly debated today and continue to bedevil parents, doctors, and young people as they weigh the relative risks and benefits of embarking on psychopharmaceutical treatment.
Recently, some studies have questioned the efficacy of antidepressants for mild and moderate depression in adults, which has generated considerable public interest and raised questions in many people’s minds about the wisdom of taking the drugs at all, let alone long term.
In fact, as psychiatrist Peter Kramer pointed out in a column in the New York Times, this particular new evidence only applies to episodic, not chronic and chronically recurring, depression, and it doesn’t say anything about the drugs’ efficacy for many other conditions, including numerous anxiety disorders, severe depression, menstrual-related mood disorder, and
the depressive phase of bipolar disorder.
Other studies have raised troubling questions about how some psychotropics may affect the brain over the long term. The drugs are effective at preventing relapse for as long as they are continued, but some evidence suggests that the changes they cause may set patients up for withdrawal symptoms that look very much like relapses, fostering, some have argued, a kind of psychological dependence on the drugs.
The studies have generated considerable controversy in the mental health profession and in
the popular media. Overall, decades-long longitudinal studies tracking outcomes of medication treatment in kids remain close to nonexistent, because of the great expense and effort involved in tracking people over many years. As a result, the original group of medicated children has entered adulthood with very little information about the lasting physical, emotional, and cognitive effects of using psychiatric medication during childhood and beyond. They are left with the legacies of using medication, though in many cases they’re not quite sure what those
legacies are, or will be.
I don’t mean to suggest that every young adult who spent childhood or adolescence taking medications is preoccupied with the existential implications of that treatment. Undoubtedly, many would contend, as some of my interviewees initially did, that the effects were simple: the
drug either worked to resolve symptoms, or it didn’t. But I suspect that, as was the case for many people I interviewed, taking a little time to reflect on a drug’s role in their own coming-of-age stories would allow them to see just how wide-ranging and complicated the impact of medication has been. Thus, this book focuses on exploring the legacies of medication, while placing them in a larger cultural and sociological context. I have tried to provide a basic, clear assessment of what’s known, from a scientific and medical perspective—as well as what remains unknown—about how psychiatric drugs affect the developing brain, body, and psyche. To that end, I have interviewed medical and social science researchers, as well as clinicians who have counseled young adult patients who have been taking drugs since childhood. My thinking has been informed by historical accounts, memoirs, and academic studies from many fields, along with newspaper and magazine articles about psychopharmacology and childhood mental illness from the past three
decades. A small group of social scientists has begun studying the subjective or “lived” experience of medication use in young people. Their research, and that of sociologists and medical anthropologists interested in people’s experience of illness and treatment, has shaped my
thinking. But the backbone of this book consists of the individual stories of my peers, the people whose voices have been drowned out amidst the clamor of researchers, doctors, politicians, school administrators, and, yes, parents. No book can encompass the totality of a generation’s experience, and I don’t pretend mine will be able to do that for psychiatric medication and mental illness. I do hope that it provides an instructive range of perspectives and experiences. Since I have aimed for depth over breadth, I have chosen to focus on a few people who represent a variety of medical conditions, geographic locales, and socioeconomic backgrounds, supplementing their perspectives with those of others I interviewed.
Examining young people with long histories of psychiatric medication has to some extent skewed my focus toward more seemingly intractable mental problems. In part, this is because more serious mental illnesses tend to be diagnosed earlier; in part because the people whose experience with medication is more fraught are those who have the most to say and who wanted to talk for this book. I found most of my subjects through social media groups and chat rooms dedicated to medical and
psychiatric issues, groups that are admittedly more likely to attract people still struggling to treat their disorders, though certainly a large number of them live full, productive, and successful lives. As many of them have observed, the drugs help keep them from bottoming out, but
they don’t make everything perfect. “When you get into this issue of having people take these medications for a long period of time, at some point you have to ask, ‘What is the goal of the treatment?’ ” one of my interviewees asked. Is erasing the symptoms the goal, he wondered, or
learning how to cope with them? What is the definition of “recovery”?
“Because, as you probably know,” he added wryly, “no psychiatrist will ever tell you you’ve been cured of depression or anxiety. You’re only in remission.”
I have found considerable variation in people’s experiences and assessments of their medication and its impact on their development. Some people have stuck with one or two drugs over many years; others have cycled through multiple diagnoses trying virtually every drug available.
Some have formed intimate relationships with their therapists and psychiatrists; most have had their treatment limited to “med checks” with psychopharmacologists. Some people’s parents pushed hard for medication; others opposed it strongly. Some credit medication with saving their lives; some think it ruined portions of their lives. Just as striking are a single person’s shifting views and relationships to medication over time. The drugs influence one’s life story, of course, but experiences and circumstances, so much in flux as one comes of age, also influence how
one perceives and thinks about the drugs.
As a result, interviews with my subjects about events that happened to them ten, fifteen, or twenty years ago are necessarily colored by what they have learned and experienced since then, although I have tried as much as possible to distinguish what they knew or felt at the time from what they realized or decided later. I have tried to present my subjects as three-dimensional characters, not as medical patients whose treatment exists in a vacuum. Consequently, I discuss the individual, familial, and societal forces that led them to take medications and explore how the experience of taking these drugs has substantially affected their lives.
To some extent I take my subjects at their word, since my goal is to understand how they feel about taking medications, but I also try to step back and analyze the broader implications of their experiences. Clearly, my own perspective is influenced by the fact that I am writing about my peers—as well as the fact that I have been taking various medications for anxiety and depression for over a decade. I have referenced my own experience or point of view when I thought it would add to the narrative, or when I felt it shaped my analysis of larger trends or other people’s accounts.
Taking psychotropic drugs is an individual, sometimes lonely, act. Many have criticized it as an individual, piecemeal solution to larger societal problems better solved through collective action. For young people accustomed to near-constant digital connectivity and to sharing the most intimate personal details online, taking medication is a bit of an anomaly, a peculiarly solitary experience (I’m struck, for example, by how few young people blog about their medication, though, admittedly,
it’s hard to measure the frequency of Facebook or Twitter updates). This is not a self-help book, but it does offer some new ways of thinking about what it means to take medication from a young age. Almost all of the people who agreed to be profiled in depth for this book said they did so
because they thought it was important to help others understand what it’s like to be young and medicated. They want to inform not only their peers, but also their parents’ and grandparents’ generations, as well as young children and teens, and the parents of those kids, who are considering embarking on drug treatment, or who want to know what the experience is like and what may lie ahead. And they want to encourage others to speak out about growing up on medication, a topic much discussed by others but far too rarely discussed by the young people who have lived through it.
Excerpted from “Dosed: The Medication Generation Grows Up” Copyright Beacon Press, 2012.