This spring, Reset Australia and the Tech Transparency Project (TTP) showed how Facebook allowed ads for things like alcohol, drugs, and extreme weight loss to target teens as young as 13, revealing a disturbing facet of the company’s ad business.
In response, Facebook said it was “investigating why some of these violating ads were not detected,” and the company later announced that it would no longer allow advertisers to target users under 18 based on their interests and activity.
In September, TTP ran the ads experiment again to see whether Facebook is doing any better at protecting kids from harmful messages. The results show a repeat failure.
TTP submitted the same set of six test ads to Facebook for approval, promoting pill abuse, alcoholic drinks, anorexia, vaping, dating services, and gambling. They were aimed at users aged 13 to 17 in the U.S., and just like before, all were approved quickly—most of them in less than an hour.
Following its recently announced changes, Facebook no longer gave the option to target teens based on their perceived interest in topics like “alcoholic beverages,” “pharmaceutical industry,” and “extreme weight loss.” But Facebook continues to allow advertisers to target teens broadly as a group, meaning TTP’s latest round of test ads had the potential to spread harmful messages to more than 9 million young people across the U.S.
TTP didn’t allow any of these test ads to actually run, canceling them before the scheduled publication date. But the results highlight Facebook’s ongoing problems protecting children, and they undermine statements by Facebook’s global head of safety, Antigone Davis, at a Senate hearing on Thursday. Davis testified that Facebook does not allow alcohol or weight-loss ads to be shown to young people—a policy the company clearly doesn’t enforce, according to TTP’s findings.
The Senate hearing also focused on a bombshell series of reports in the Wall Street Journal about Facebook’s knowledge of the harms it poses to users, particularly Instagram’s negative impact on the mental health of teenage girls. Facebook said the Journal mischaracterized internal company documents that formed the basis for the reports, while at the same time announcing it would “pause” plans to build a version of Instagram for kids under 13.
Facebook’s poor track record on teen-based advertising, as revealed by TTP, raises yet more questions about the company’s handling of vulnerable young users.
Faulty ad reviews
According to Facebook’s advertising policies, the company’s ad review system relies “primarily” on automated tools to scan for violations, with ads subject to re-review after they go live.
The company’s heavy reliance on artificial intelligence to police its platform has come under intense scrutiny amid its persistent problems identifying content that violates its policies. A September report from researchers at New York University’s Stern Center for Business and Human Rights recommended that Facebook and other tech giants “double the size of their human content-moderation corps and make moderators full-fledged employees,” noting that bringing moderators in-house would lead to “more careful analysis.”
TTP’s latest study found that such careful analysis was clearly lacking in Facebook’s review of ads directed at teens.
Take the test ad for pills. It features a backdrop of colorful pills, with the text, “Throw a skittles party like no other.” That’s a reference to pill parties where teens pilfer their parents’ pharmaceuticals, mix them in a communal bowl, and take turns ingesting them. Addiction specialists have been warning about the innocent-sounding but dangerous “skittles party” trend for years.
TTP targeted the pill ad at all 13- to 17-year-old users in the U.S. And just like before, the ad was quickly approved, this time in 51 minutes—despite Facebook’s rules prohibiting ads that promote the use of prescription and recreational drugs.
In its original experiment in May, TTP targeted the pill ad at a subset of teens that Facebook categorized as interested in drug-related topics like “capsule (pharmacy),” “drugs.com,” and “pharmaceutical industry,” yielding a potential reach of about 340,000 users.
Facebook removed those targeting options before TTP’s second experiment, conducted in mid-September. But the company still approved the pill ad to broadly target 13- to 17-year-olds, giving it an audience of up to 9.1 million users.
At Thursday’s Senate Commerce hearing, Davis, the Facebook global head of safety, told lawmakers that “we actually don’t allow weight loss ads to be shown to people under the age of 18 already.” But TTP’s study found that Facebook still approved an ad promoting the eating disorder anorexia nervosa to teens.
The ad, which shows the skinny waist of a young woman, provides an “AnaTip” that reads, “When you’re craving a snack, visit pro-ana sites to feed your motivation and reach your goal.” (“Ana” is a slang term for anorexia that’s been identified in media reports.)
Back in May, TTP targeted the “Ana” ad to teens interested in topics like “diet food” and “extreme weight loss.” Those interest categories were removed ahead of the September test, but the anorexia ad still had the ability to target all 13- to 17-year-old users—again, up to 9.1 million of them.
The results cast doubt on Facebook’s promise—now more than two years old—to crack down on content that promotes eating disorders. As recently as this February, Facebook was touting its work with experts on negative body image issues.
See the teen test ads that got a green light from Facebook—a second time.
Profiling teens
The other test ads approved by Facebook promote alcohol (“Summer is coming. Get ready with these cocktail recipes”); gambling (“Feeling lucky? Win cash, prizes, and more!”); and dating services (“You look lonely. Find your partner now to make a love connection”). Another ad shows an image of a young woman vaping.
The cocktails and vaping ads are particularly noteworthy, given assurances by Facebook’s Davis at the Senate hearing that “We don’t allow tobacco ads at all. We don’t allow them to children either. We don’t allow alcohol ads to minors.”
Following the initial reports by Reset Australia and TTP, Facebook did limit ad targeting of teens, removing the “interest categories” that Facebook developed for 13- to 17-year-olds based on their online activities. That prevents advertisers from hypertargeting teens—though, notably, Facebook never promised to stop collecting information on teens and amassing profiles of them. Instead, the company merely pledged to remove “interest category” targeting of young users until they turn 18. That means Facebook may still be tracking teens’ movements online and their interest in harmful topics like drugs, alcohol, and extreme weight loss.
In what may be a sign of that continued tracking, TTP observed that Facebook still allows advertisers to see the potential pool of 13- to 17-year-olds who fall into categories like “alcoholic beverages,” “online gambling,” and “electronic cigarette,” even though they can’t be targeted on that basis—yet.
When asked at a Sept. 21 congressional hearing if Facebook should be able to advertise to teenagers at all, Steve Satterfield, Facebook’s vice president of privacy and public policy, told lawmakers the company has adopted “appropriate” measures, adding, “We’re going to continue to look into ways to keep teenagers safe on our services while being able to support them with advertising.”