YouTube algorithms consistently push eating disorder and self-harm content to teen girls, new study finds


Anna Mockel was 14 years old when she suddenly became obsessed with losing weight. It was the spring of 2020, and she had just finished eighth grade through distance learning. Housebound and nervous about starting high school that fall, she spent countless hours of that summer of COVID confinement scrolling through social media apps.

Anna spent a lot of time on YouTube “not looking for anything in particular,” just watching whatever appeared in her feed. She remembers the spiraling thoughts that began when she watched videos featuring girls who were slightly older and invariably thin. The more Anna watched, the more targeted these videos became, and the more determined she was to look like the girls in them.

As she clicked and tapped, YouTube’s “Up Next” panel of recommended videos began to transform based on her viewing, filling with how-to weight loss content featuring skinny girls. Diet and exercise videos began to dominate Anna’s account. As she continued to watch, she says, the content intensified, until her feed was flooded with videos glorifying skeletal-looking bodies and tricks for maintaining a 500-calorie daily diet. (The recommended daily caloric intake for adolescent girls is about 2,200.)

“I didn’t know this was even an online thing,” Anna says of the eating disorder content she was recommended. “A lot of it just popped up in my feed, and then I gravitated towards it because it’s what was already happening for me.”

Anna copied what she saw, restricting her diet and losing weight at an alarming rate. At age 14, she says, she was aware of eating disorders but “didn’t connect the dots” until she was diagnosed with anorexia. Over the next few years, she endured two hospitalizations and three months in a residential treatment center before beginning her recovery at age 16.

Now 18 and in high school, she says social media, YouTube in particular, perpetuated her eating disorder.

“YouTube became this community of people who are competitive with eating disorders,” she says. “And it kept me in that mindset that [anorexia] wasn’t a problem because a lot of other people online were doing the same thing.”

Now, new research suggests that the content served to Anna was no accident. A report released Tuesday by the Center for Countering Digital Hate (CCDH) claims that when YouTube users show signs of interest in dieting and weight loss, nearly 70 percent of the videos the platform’s algorithm then recommends contain content likely to worsen or create anxieties about body image.

In addition, the videos average 344,000 views each, nearly 60 times more than the average YouTube video, and carry ads from major brands like Nike, T-Mobile, and Grammarly. It’s not clear whether those companies are aware of the ad placements.

“We cannot continue to let social media platforms experiment with new generations as they come of age,” says James P. Steyer, founder and CEO of Common Sense Media, a nonprofit organization dedicated to educating families about online safety.

He says these platforms are designed to keep viewers’ attention, even if that means amplifying content harmful to minors.

The report, titled “YouTube’s Anorexia Algorithm,” examines the first 1,000 videos a teen would receive in the “Up Next” panel when watching weight loss, diet, or exercise videos for the first time.

To collect the data, CCDH researchers created a YouTube profile for a 13-year-old girl and conducted 100 searches on the video-sharing platform using popular eating disorder keywords such as “ED WIEIAD” (eating disorder, what I eat in a day), “ABC diet” (anorexia boot camp diet), and “safe foods” (a reference to foods with few or no calories). The research team analyzed the top 10 recommendations that YouTube’s algorithm surfaced in the “Up Next” panel for each search.

The results indicated that almost two-thirds (638) of the recommended videos pushed the hypothetical 13-year-old user toward eating disorder or problematic weight loss content. A third (344) of YouTube’s recommendations were considered harmful by the CCDH, meaning the content promoted or glamorized eating disorders, contained weight-based bullying, or showed copycat behavior. Fifty of the videos involved self-harm or suicide content.
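The arithmetic behind those fractions follows directly from the study design: 100 searches times the top 10 recommendations each yields 1,000 labeled videos. As a rough illustration only (this is not the CCDH’s analysis code, and the category labels are hypothetical), the tally can be reproduced in a few lines of Python:

```python
# A minimal sketch of the report's tally, assuming hypothetical category
# labels; the counts come from the report's published figures, everything
# else is illustrative.
from collections import Counter

# 100 searches x top-10 "Up Next" recommendations = 1,000 labeled videos.
labels = (
    ["eating_disorder_or_weight_loss"] * 638  # "almost two-thirds"
    + ["other"] * 362
)

counts = Counter(labels)
total = len(labels)  # 1,000 recommendations in all

for category, n in counts.most_common():
    print(f"{category}: {n} ({n / total:.0%})")
# eating_disorder_or_weight_loss: 638 (64%)
# other: 362 (36%)
# The 344 videos rated harmful and the 50 involving self-harm content
# are counted within the same 1,000 recommendations, not added on top.
```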

“There is this anti-human culture created by social media platforms like YouTube,” says Imran Ahmed, founder and CEO of the Center for Countering Digital Hate. “Kids today are essentially being re-educated by algorithms, by companies teaching them and persuading them to starve themselves.”

Ahmed says the study illustrates the systemic nature of the problem: Google-owned YouTube is violating its own policies by allowing such content on the platform.

According to the Pew Research Center, YouTube is the most popular social media site among teenagers in the United States, ahead of TikTok and Instagram. Three-quarters of American teens say they use the platform at least once a day. YouTube does not require a user to create an account to view content.

The Social Media Victims Law Center, a Seattle-based law firm founded in 2021 in response to the Facebook Papers, has filed thousands of lawsuits against social media companies, including YouTube. More than 20 of those lawsuits allege that YouTube is intentionally designed to be addictive and to perpetuate eating disorders among its users, especially teenage girls.

The law firm connected 60 Minutes with a 17-year-old client. Her experience mirrors Anna’s.

“YouTube taught me how to have an eating disorder,” says the 17-year-old, whose lawsuit accuses YouTube of perpetuating her anorexia nervosa. She says she created a YouTube account when she was 12, logging on to watch videos of dogs, challenges, gymnastics, and cooking. Then, she says, she started watching videos of girls dancing and working out, which gave way to dieting and weight loss videos.

She says her feed became a funnel for eating disorder content, a stream of influencers promoting extreme diets and ways to “stay thin.” She spent five hours a day on YouTube, learning terms like “bulimia” and “ARFID” (avoidant/restrictive food intake disorder). She learned what it meant to “purge” and to “restrict” food, and became deeply concerned about her caloric intake and her BMI (body mass index).

When she was in seventh grade, she stopped eating. Soon after, she was diagnosed with anorexia, and for the next five years, she says, she spent more time out of school than in it. Now in high school, she has been hospitalized five times and has spent months in three residential treatment centers trying to recover from her eating disorder.

“It practically took my life,” she reflects.

When asked why its algorithms are used not to protect young users but to recommend eating disorder content to them, YouTube declined to comment.

The video-sharing site says it “continually works with mental health experts to improve [its] approach to content recommendations for teenagers.” In April 2023, the platform expanded its policies on eating disorder and self-harm content, adding the ability to age-restrict videos that feature eating disorder content in an “educational, documentary, scientific, or artistic” context or that discuss “details that may be putting viewers at risk.” Under this policy, such videos may be unavailable to viewers under the age of 18.

YouTube has taken steps to block certain search terms like “thinspiration,” a word used to find images of emaciated bodies. However, the CCDH study found that these videos still appear in the “Up Next” panel. And users have learned that by entering a zero for the letter “O” or an exclamation mark for the letter “I,” those terms remain searchable on YouTube. One video flagged in the report for glorifying skeletal body shapes had 1.1 million views at the time of analysis; it now has 1.6 million.
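This kind of evasion works because a naive blocklist compares the raw query string against banned terms, so a single swapped character slips through. A minimal sketch in Python, assuming a hypothetical blocklist and substitution map (not YouTube’s actual filter), shows both the evasion and the normalization step that would close it:

```python
# Hypothetical illustration of look-alike character evasion; neither the
# blocklist nor the substitution map reflects YouTube's actual filtering.

BLOCKED_TERMS = {"thinspiration"}  # hypothetical blocked search term

# Translate common look-alike substitutions back to their base letters:
# "0" for the letter "O", "!" (or "1") for "I", as described in the report.
LOOKALIKES = str.maketrans({"0": "o", "!": "i", "1": "i"})

def is_blocked(query: str) -> bool:
    """Normalize look-alike characters before matching the blocklist."""
    normalized = query.lower().translate(LOOKALIKES)
    return normalized in BLOCKED_TERMS

print(is_blocked("thinspiration"))       # True: the exact term matches
print(is_blocked("th!nsp!rat!0n"))       # True: caught after normalization
print("th!nsp!rat!0n" in BLOCKED_TERMS)  # False: a raw exact match is evaded
```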

As part of the investigation, CCDH flagged 100 YouTube videos that promoted eating disorders, contained weight-based bullying, or showed copycat behaviors. YouTube removed or age-restricted only 18 of them.


