YouTube announced in October 2020 it would ban misinformation about Covid-19 vaccines on its platform, removing any videos that “contradict expert consensus from local health authorities or the World Health Organization.”
Seven months later—at a time when U.S. public health officials are trying to convince millions of vaccine holdouts to get inoculated—YouTube is allowing conspiracy theorists to spread an array of false and misleading claims about the Covid vaccines, according to a new Tech Transparency Project (TTP) investigation.
TTP found numerous YouTube videos making baseless allegations that COVID-19 vaccines have been linked to deaths and miscarriages, contain fetal byproducts, or can transmit harmful side effects to the unvaccinated. Some of the videos even ran with ads, allowing purveyors of vaccine misinformation—and YouTube owner Google—to make money off these lies.
It’s likely that many advertisers aren’t aware their ads are appearing next to vaccine misinformation videos on YouTube. In one example spotted by TTP, an American Lung Association ad encouraging vaccine adoption ran alongside a video claiming that Pfizer and Moderna vaccines contain harmful substances that could “cost you your life.”
YouTube’s failure to take down such false vaccine claims threatens to undermine U.S. and global efforts to expand vaccinations, persuade skeptical individuals to get the shot, and bring the pandemic under control. It also highlights YouTube’s repeated pattern of ineffective enforcement against content violations, from election misinformation to hate speech.
Bill Gates and ‘vaccine shedding’
Under YouTube’s October 2020 policy changes, the company prohibited false claims that Covid vaccines contain fetal byproducts or “cause death, infertility, miscarriage, autism, or contraction of other infectious diseases.”
But TTP found videos on YouTube pushing exactly these kinds of theories. Many lacked the COVID-19 “information panel” that YouTube has promised to append to videos about the pandemic to direct people to authoritative health guidance.
Dozens of videos on YouTube warn that vaccines, including the COVID-19 shot, are part of a Bill Gates-led “depopulation” plan to sterilize or kill certain groups of people. For more than a year, baseless conspiracy theories have depicted Gates, the billionaire philanthropist and Microsoft co-founder, as the mastermind behind the pandemic.
Other videos identified by TTP trade in half-truths, taking factual information out of context to imply that the vaccines are far more dangerous than studies have shown.
For example, some videos use the Vaccine Adverse Event Reporting System (VAERS), a database maintained by the U.S. Department of Health and Human Services, to claim that thousands of people have died from the COVID-19 vaccine—a claim that’s been debunked by fact checkers. The database allows anyone to report incidents, which are not verified, and VAERS makes clear the adverse events may be entirely coincidental and not caused by a vaccine. As of May 24, the Centers for Disease Control and Prevention had not established a causal link between any of the deaths reported in the VAERS database and the COVID-19 vaccines.
Another video suggests, without evidence, that VAERS is underreporting the number of “spontaneous miscarriages” caused by COVID-19 vaccines. This too is a willful misinterpretation of the data.
“Vaccine shedding,” the unfounded theory that vaccinated people can transmit side effects to unvaccinated individuals with whom they come into contact, is a particularly popular topic among vaccine skeptics on YouTube. In one video, anti-vaccine advocate Sherri Tenpenny warns vaccine shedding can cause “blood clots, menstrual irregularities, heavy bleeding, [and] cardiovascular side effects.” YouTube allows ads to run alongside this video, helping the person who posted it to profit from disinformation, with Google taking a cut as well. On one viewing, Tenpenny’s video appeared after an ad for vacation rental website Vrbo.
In another video that ran without YouTube’s Covid vaccine information panel, Mikael Cromsjö, who identifies himself as part of the Swedish Freedom Movement, says it’s possible for people who have received mRNA vaccines like those from Pfizer and Moderna to pass along harmful agents to unvaccinated people.
Meanwhile, a YouTube channel called “Yoga and Healing with Elvira” has posted at least three videos on the supposed dangers of vaccine shedding. In one video, the host conveys what she says is a divine message about herbal remedies to protect from vaccine shedding. In a conversation in the comments, she states that vaccinated individuals can reduce the amount of time that they “shed” symptoms or side effects through “homeopathy & herbal medicine + energy healing + clean diet.”
Magnetic chips and false fetal claims
Some YouTube videos identified by TTP claim that vaccines cause side effects because of their contents. In a video describing the vaccines as a “rushed product” that could have life-threatening consequences, a vlogger with over 35,000 subscribers says the currently available vaccine has “byproducts in it, and other chemicals” that pose health risks. Perversely, an American Lung Association ad encouraging viewers to get the vaccine ran alongside this video during our tests.
YouTube’s policies ban claims that an approved COVID-19 vaccine will contain substances that are not on the vaccine ingredient list. But that hasn’t stopped such claims from popping up all over YouTube. Another viral YouTube video, with over 12,000 views, shows a woman claiming that her vaccine injection point has a magnetic field, saying, “We’re chipped. We’re all f--ed!” The video’s title and descriptive text suggest that “nanobots” are to blame.
No rumored vaccine ingredient gets more airtime on YouTube than fetal remains. YouTube provides safe haven for conspiracy theorists who claim that the vaccines contain the byproducts of an elective abortion, despite the platform’s explicit ban on claims that the vaccines contain unlisted ingredients “such as biological matter from fetuses (e.g. fetal tissue, fetal cell lines).”
These rumors stem from a half-truth. Fetal cell lines were used in the testing phase of the Pfizer and Moderna vaccines, and in the production of the Johnson & Johnson vaccine, but as fact checkers have pointed out, the vaccines themselves contain no human byproducts.
One YouTube video deceptively splices together clips of factual information about the use of fetal cell lines in vaccine development to set up a monologue in which the vlogger claims the vaccine is “made from aborted babies.” “You’re getting a shot with babies inside the shot,” he tells viewers.
Evasive maneuvers
Some vaccine skeptics use YouTube to promote banned content that is hosted on other platforms. These posts often pair innocuous video content with descriptive text that spreads misinformation or links to third-party sites devoted to vaccine conspiracy theories.
For example, YouTube banned the channel belonging to the anti-abortion outlet LifeSiteNews for spreading misinformation about the pandemic, including a video that cast doubt on the effectiveness of “abortion-tainted” Covid vaccines. But the group’s co-founder, John-Henry Westen, uses his personal YouTube channel to direct viewers to the same LifeSiteNews video, now hosted on Rumble, an alternative video platform popular with right-wing users. Westen’s message on YouTube redirecting viewers to the banned video has more than 13,000 views.
Other YouTubers use unrelated video content to advertise misinformation on other platforms. For example, a post by YouTube user Allyson Jayne contains a video of a vintage ad for Barbie dolls, but the video description refers to the vaccine as the “covid kill shot” and links to a video on a third-party site claiming that contact with vaccinated individuals is dangerous. The linked video, hosted on the “free speech” video platform UGETube, is titled “Is their shot putting you at risk? Side effects for YOU when THEY get the shot!”
Similarly, the YouTube channel circleofmamas posts innocuous videos alongside misinformation about vaccine deaths and links to a website filled with conspiracy theories about “vaccine injury stories.” In one post, circleofmamas shared a video of MSNBC commentator Midwin Charles alongside text suggesting that Charles’s recent death was due to complications from the Pfizer vaccine. The video description says, “Her family announced her death April 7th. She received the first dose of Pfizer on March 1. Healthy, young people do not just die,” and links to a page amplifying that speculation. Charles’s family has not announced the cause of death, but she has become a target for conspiracy theorists looking for evidence of vaccine fatalities.
In a similar post, circleofmamas shared an innocuous video of a doctor receiving the vaccine, alongside text stating that the doctor “died suddenly of a heart attack on April 7” after receiving the vaccine in early January. Like the Midwin Charles post, circleofmamas ended the video description with the tagline “Healthy, young people do not just die” and a link to the group’s webpage containing vaccine disinformation.
The above examples show that YouTube allows vaccine conspiracy theorists to flout its rules and spread misinformation that could pose a threat to public health. The platform even allows some posters to monetize this dangerous content by running ads. YouTube’s lax enforcement is all the more alarming given recent reports that a public relations firm working for an unknown client has attempted to enlist European YouTubers in a baseless “information campaign” claiming that the Pfizer vaccine has caused hundreds of deaths.
Despite YouTube’s claims that it is getting better at removing content that breaks its rules, TTP's study shows that the platform still has a long way to go.