After they took too many pills, after their mother found them unconscious on their bedroom floor, after paramedics couldn’t get them to respond, after being life-flighted to a children’s hospital and intubated, and after days in a medically induced coma, 13-year-old Taylor Little finally woke up.
Regaining consciousness after this first suicide attempt in Nov. 2015 was “the most immense disappointment I’d ever felt,” Little recalls. “I was so upset I was still alive.”
And “the first thing I asked after I got my tube out,” Little says, “was, ‘Can I have my phone?’”
Little, who identifies as nonbinary and uses they/them pronouns, joined Instagram when they were 11. It didn’t take long before they were introduced to self-harm videos on the platform, Little says. They spent much of the next seven years contemplating suicide, attempting it at least twice. They stabilized only after getting off social media entirely.
Now Little is among more than 1,800 plaintiffs suing major social-media companies, including Instagram and its parent company Meta, in a massive multidistrict litigation in Northern California. The plaintiffs allege these companies have been “recklessly ignoring the impact of their products on children’s mental and physical health,” and that they are “direct victims of intentional product design choices made by each defendant.” Little’s own complaint seeks to hold Instagram accountable for “knowingly unleashing onto the public a defectively designed product that is addictive, harmful, and at times fatal to children.” They allege the platform fed them a persistent stream of self-harm content that altered their brain and perpetuated constant thoughts of death.
Little acknowledges a history of anxiety and depression in their family that may have materialized even without social media. But they firmly believe that they would not have been suicidal without Instagram. “The fact that I was obsessively suicidal at the age I was, that was not just my brain chemistry. That was my brain chemistry being altered by the platform I was on,” Little tells TIME. “Social media shaped my brain.”
A spokesperson for Meta says content encouraging suicide or self-harm breaks company rules. Potential self-harm content is sent to human moderators for prioritized review, and those moderators can remove the content, direct the user to support organizations, or even call emergency services, the spokesperson says. For years, Meta has consulted with mental-health experts about how to address posts that fall into the gray area between promoting self-harm and expressing sadness or celebrating recovery, but the company’s approach to monitoring self-harm content has evolved significantly over the past decade, it says. Meta implemented new safeguards for users under 18 in Jan. 2024, blocking much of this kind of content entirely while introducing so-called Teen Accounts later that year.
“We know parents are worried about their teens having unsafe or inappropriate experiences online, and that’s why we’ve significantly reimagined the Instagram experience for tens of millions of teens with new Teen Accounts,” says Meta spokeswoman Liza Crenshaw. “These accounts provide teens with built-in protections to automatically limit who’s contacting them and the content they’re seeing, and teens under 16 need a parent’s permission to change those settings. We also give parents oversight over their teens’ use of Instagram, with ways to see who their teens are chatting with and block them from using the app for more than 15 minutes a day, or for certain periods of time, like during school or at night.”
Those changes came too late for Little. For years, their days and nights were spent on their Instagram feed, and their Instagram feed was filled with images of girls falling off buildings, videos of blades cutting into unscarred flesh, and soft music framing stylized photos of hanging bodies.
“Everything I learned about suicide,” says Little, “I learned on Instagram.”
Growing up in Butte, Mont., Little was a pretty happy kid. They loved theater, took dance and voice lessons, and had lots of friends. Then, in the summer between fifth and sixth grade, Little joined Instagram, Facebook, and Snapchat, sidestepping the platforms’ limits by lying about their age.
“As soon as I was on social media, I couldn’t put it down,” Little recalls in a recent video interview from their bedroom in Colorado Springs, Colo. At times, they spent more than 10 hours per day on Instagram. Social media eclipsed sleeping, studying, hanging out with friends, and even eating. Instagram “felt like a safety blanket,” Little says. “It felt like something I could kind of put between me and the world.”
One day on Instagram, Little recalls receiving a suggestion to “check out this account.” Clicking the link took them to a page that was a “diary of graphic self-harm,” Little says. The accompanying captions were about not being able to take the pain anymore. “I didn’t look for it,” Little says. “I clicked a notification and was shown gore.”
Little was shocked, and yet couldn’t look away. “We see a car crash on the road and want to stare at it, you know?” they say. “If gore is put in front of my face, I’m 11, I’m gonna look at it, right?”
Soon their Instagram algorithm was feeding them more and more self-harm content. “You look at one self-harm page, all you get is self harm,” Little says. “And I kept clicking on it.”
Little made it through sixth grade. Things got worse over the summer before seventh. They stopped going outside much, spending days curled up in their room, scrolling through images of girls cutting themselves. They lost weight and dyed their hair black. On Instagram, depression was “romanticized,” Little says. The self-harm content “was kind of comforting”—it felt like a twisted validation of their depression in a way.
“I was living with something so ugly and so dark,” Little says. “Being able to make it something that felt prettier or more romanticized honestly made it more bearable.” They started cutting themselves, because users they followed on Instagram “said it would make me feel better.”
By the time Little turned 12, the “Explore” page on their Instagram feed was filled with images of cutting, videos of people jumping off buildings, videos of people hanging themselves. The content could be instructional—“like a how-to guide,” Little says.
Throughout seventh grade, Little developed plans to die by suicide. In the culture of self-harm that dominated their corner of Instagram, planning a suicide was thought of as a way to survive another day. Little says they were so “severely and chronically suicidal” that if they didn’t have a suicide date somewhere in the future, “I would have literally tried to kill myself every single day.” They saw planning suicides as a “coping mechanism.”
One night in early 2015, when Little was 12, they went to the roof of a 9-story building in Butte, planning to jump. On Instagram they had watched videos of people floating through the air, set to mournful music. The idea of jumping to one’s death struck them as almost romantic. “I know it sounds like it was a kind of collective psychosis, because it felt like that a lot of the time,” Little says. But the accounts they followed seemed to promote a violent plunge as a peaceful and elegant way to die.
Little didn’t jump. But the episode jolted their family to action. “That was when my mom was like, okay, this is out of hand,” they recall. In March 2015, Little was sent to inpatient treatment for their eating disorder, where they spent six weeks. That September, the family moved to Colorado so that Little could get better mental-health treatment.
By late 2015, Little had a new home, new friends, and new doctors. But their Instagram feed remained the same. “My brain worked like an algorithm,” Little says. “It filtered out the good stuff. It filtered everything through this lens of suicidality and self-harm.”
That November, Little made their first suicide attempt. In the years that followed, they were hospitalized repeatedly for depression and anorexia, rotating through different medications and treatment plans. But nobody focused on their phone as part of the problem, Little recalls. Researchers had not yet established the connection between social-media addiction and suicidal ideation. Their parents were only beginning to realize that social media was a driver of their problems. “In their head, this was only a mental-health issue, and so they were treating it as a mental-health issue,” Little recalls. “But none of the treatment I was getting was enough to offset the social media.”
In Jan. 2018, Little attempted suicide a second time. This attempt, they say, was more serious than the first, and the prognosis for recovery was worse because they lied to the doctors about which medication they’d taken—another tip they’d learned from posts about self-harm on Instagram. But now Little qualified for mental-healthcare funding under Colorado’s Children and Youth Mental Health Treatment Act, which covered more than 90% of the cost of inpatient treatment. They stayed in the residential program for more than a year, again cycling through various treatment plans and medications.
There was one big difference this time: Little didn’t have their phone.
Suicide is the second-leading cause of death for American teenagers and young adults, according to the U.S. Centers for Disease Control and Prevention. Teen suicides increased more than 57% between 2007 and 2018, a jump that roughly corresponds to the time frame in which young Americans began spending most of their free time on social media. According to a 2021 advisory issued by U.S. Surgeon General Vivek Murthy, the share of high-school students seriously considering suicide rose 36% between 2009 and 2019.
Jean Twenge, a professor of psychology at San Diego State University and author of the forthcoming Ten Rules for Raising Kids in a High-Tech World, was one of the first researchers to offer an explanation for the sharp rise in teen suicides during this period. “Why did suicide go up between 2010 and 2019?” Twenge asks. “The rise of smartphones and social media was by far the biggest change to teens’ everyday lives during that period.”
Recent research offers some corroboration. A 2019 study published in JAMA Psychiatry found that using social media for more than three hours a day was associated with increased risk of depression and anxiety, even after adjusting for history of mental-health issues. Another 2019 study out of the U.K. examined surveys of 11,000 young people and found that 38% of teens who used social media for an average of more than five hours per day showed signs of clinically relevant depression. A 2021 Brigham Young University study, which tracked teens over the course of a decade, found that teenage girls who increased their use of social media over time were at a higher risk of suicide as young adults. And in a study published in 2025, researchers at Weill Cornell Medical College in Manhattan who followed 10-to-14-year-old social-media users over several years found that kids who were more addicted to social media were at two to three times higher risk of suicidal behavior.
“It’s the compulsive psychological feelings that the kids are attached to the phone and cannot stop using it,” says Dr. Yunyu Xiao, the lead author of the study and assistant professor of population health science at Weill Cornell. The groups that had the highest addictive use were the ones with the highest risk of suicidal behaviors, Xiao says. “It’s the addictive behavior that’s the game changer, not just the time.”
Arturo Bejar joined Facebook in 2009 as a senior director responsible for protecting users. Bejar left the company in 2015, before returning in 2019 as a consultant on Instagram’s Well-Being team. During his time away, Bejar’s 14-year-old daughter began using Instagram and started receiving unwanted sexual comments. Bejar soon realized this was a typical experience on the platform for girls his daughter’s age. And when the teenagers reported the comments to Instagram, he says, the company seemed to do nothing.
After Bejar rejoined the company, he began to think Instagram was more interested in minimizing the platform’s problems than solving them. During his time there, he alleges, only a fraction of a percent of the worst content was removed. “The people who are designing, implementing, and pushing these algorithms, as far as I can tell from the choices they make,” he tells TIME, “don’t really care to understand or reduce the harm of the content they’re delivering.”
Lewd comments aimed at teens like his daughter, Bejar says, were just the tip of the iceberg. Adolescence is a jungle of angst; the wrong content served up at the wrong moment can be catastrophic. “If a kid is having a vulnerable day and they watch some of this content, the algorithm will adapt very quickly to it. That’s very dangerous,” he says. “What does it do to a kid if they get a firehose of thousands of pieces of that kind of content if they’re considering suicide?”
Meta was aware teens were being exposed to toxic content on Instagram. According to the platform’s internal 2021 Bad Experiences and Encounters Survey, roughly 8.4% of users between the ages of 13 and 15 reported seeing self-harm content on Instagram in the previous seven days, with girls seeing it at a slightly higher rate than boys. The proportion may seem small. But because of the platform’s ubiquity, the number of kids getting fed self-harm content was likely enormous.
“Every parent should know that your kid is going to be recommended this type of content,” Bejar adds, “and just hope that your kid is not in a moment of vulnerability when that happens.”
A spokesperson for Meta says that the platform distinguishes between “user perception,” which is measured in the survey, and “prevalence,” which is the company’s own measure of content they’ve found and removed for breaking rules. User perception is subjective and variable, the spokesperson says, while prevalence measures the amount of content that’s been found to break Meta’s rules. According to a company dashboard, Instagram took action on 7.8 million pieces of self-harm and eating-disorder content in one three-month period in 2021. By 2022, 98% of this type of content was proactively removed before it was reported, according to Meta.
Before Bejar left the company for the second time in 2021, he sent an email to company executives highlighting what he called “a critical gap in how we as a company approach harm, and how the people we serve experience it.” In a second email to Instagram executives, he noted one in eight users under 16 said they had experienced unwanted sexual advances on the platform in the previous seven days. “It’s staggering to me, knowing how much harm happens on the platform, that they haven’t been more aggressive about reducing harmful experiences to teenagers,” he says.
Since then, Bejar has been outspoken about the ways he believes Meta has minimized reports of harm on its platform. He testified to a Senate subcommittee in 2023 that after he had sent detailed warnings to executives about the harms teens experienced on Instagram, their reaction was “not constructive.” Meta CEO Mark Zuckerberg, he said, didn’t even respond to his email. Most of the harm on the platform, he testified, remained unaddressed. For a tech company with enormous resources, he says, that’s a deliberate choice. “Google does a good job of putting the right protections in place,” Bejar says. “This is a very solvable problem.”
A Meta spokesperson says that Instagram relies on both user-experience reports and prevalence metrics to keep users safe, and pointed to a new report from the National Survey on Drug Use and Health, which found that teens’ suicidal thoughts declined slightly from 2021 to 2024. Starting in Nov. 2020, Instagram implemented a tool that directs users to mental-health resources if they use search terms related to suicide.
Yet even after Meta unveiled Teen Accounts, critics maintain social media is still dangerous for vulnerable kids. According to a new report from the Molly Rose Foundation—named for Molly Russell, a U.K. schoolgirl who died by suicide in 2017—Instagram’s algorithm is still recommending self-harm content to teens “at industrial scale.”
From Nov. 2024 to June 2025, researchers at the foundation, posing as 15-year-old girls, set up a series of accounts and monitored the content the algorithm recommended. On accounts that had previously viewed suicide and self-harm content, the researchers found that 97% of all the content recommended on Instagram Reels glorified suicide, depression, or self-harm, making the experience into a “near constant barrage of harmful posts.”
“These results show that Teen Accounts was effectively a PR exercise that was designed to appease lawmakers on Capitol Hill but actually has not fundamentally changed the dial,” says Andy Burrows, the CEO of the Molly Rose Foundation. Burrows, who oversaw the research, says the teenage-girl accounts he monitored this year were bombarded with videos that glorify suicide and self-harm, reference suicide methods, or contain intense themes of misery, hopelessness, and despair, and that the videos came in such quick succession that they had a “compounding effect” that could cause “cumulative harm.”
“Our results show that the measures that Instagram claims to have taken are essentially paying lip service to teen safety,” says Burrows. “When algorithms identify that a young person has potentially an interest in these topics, it takes no time at all for the taps to be turned on and huge reams of harmful material to be recommended.”
“We disagree with the assertions of this report and the limited methodology behind it,” says a Meta spokesperson.
During their phone-free stay in the treatment facility, Taylor Little started feeling better. Quitting social media cold turkey led to “withdrawal” symptoms, Little says, but over time, their suicidal ideation receded. They spent their days writing and doing art projects and making friends with the girls on their floor. “Instagram wasn’t a part of the picture,” Little says. “And that truly is what saved my life.”
Little uses a flip phone now. At 22, they still have mental-health struggles, including depression and anxiety, and take medication for bipolar disorder. But they haven’t self-harmed in four years. Their eating disorder is under control.
“Teens everywhere are struggling with a lot of issues. I think that is part and parcel of being a teenager,” says Jade Haileselassie, a South Carolina-based attorney with Motley Rice who is representing Little in the multidistrict litigation. “What social media does is it takes those struggles and it magnifies them in a way that is unprecedented.”
In joining the lawsuit, which is still in the discovery phase and could go to trial next year, Little is seeking significant financial damages and compensation for years of mental-health treatment, as well as injunctive relief compelling the companies to change their behavior. Instagram’s recent efforts to curb self-harm content are strategic, says Haileselassie, because they allow the company to claim that the issue is about specific photos and videos, rather than about algorithmic design. “They’re trying to highlight the issue of content,” she says, “over the algorithm and the design issues that the plaintiffs are actually complaining about.”
Little hopes the litigation will force social-media companies to reform their practices in a way that legislation so far has not. They aren’t impressed with Meta’s new Teen Accounts, which increase protections for younger users, or with the company’s other efforts to make Instagram safer for kids. “It’s performative on their part, and it’s in response to backlash,” Little says. “It lasts exactly as long as that backlash lasts, which is why litigation is important.”
Sometimes Little thinks about what life would have been like if they hadn’t been born in the era of social media. They know they may have had a predisposition to mental-health challenges, but they believe they would have been able to manage with medication and therapy, like they’re doing now, and would not have become suicidal if it weren’t for Instagram.
“My brain isn’t like a claw trap anymore,” Little says. “And social media made my brain a claw trap.”
If you or someone you know needs help, call or text 988 to reach the 988 Suicide and Crisis Lifeline or go to SpeakingOfSuicide.com/resources for a list of additional resources.