Hi, it’s Mark Bergen. Erin Elizabeth was kicked off Facebook. Elizabeth, owner of an alternative health blog, is one of the “disinformation dozen” named in a recent high-profile report on social media’s anti-vaccination problem. Soon after that report, she was purged from Facebook and Instagram, losing access to millions of followers.
I learned this because Elizabeth said so on her YouTube channel.
Over the weekend, President Joe Biden and fellow Democrats ripped into social media for letting vaccine fictions spread. Biden accused social networks of “killing people,” then clumsily walked that back, blaming instead the prolific anti-vaccine social accounts. It’s not clear what the president’s regulatory strategy is here, but it’s clear who his nemesis is: Facebook Inc. Like countless sagas since 2016, most of the chatter and coverage of this episode focused on Facebook. Some mentioned Twitter, its smaller rival. YouTube, which has over two billion monthly visitors, four times Twitter’s revenue and huge cultural reach, was mentioned as an afterthought, if at all.
Why does YouTube get a pass? For one, the site, tucked into Alphabet Inc.’s Google, is harder to track. People angry about vaccines often post concise, shareable text or memes on Instagram and Twitter. On YouTube, they may drop vaccine skepticism or lies forty minutes into an hour-plus clip.
The Center for Countering Digital Hate, a British nonprofit, released the report on the “disinformation dozen” that the White House has cited. Of the dozen named, only a few have active YouTube channels. YouTube took down two of the accounts and “removed a number of videos from the others” when they violated the company’s “three-strikes” rule, a spokesperson said.
Yet most of these people appear regularly elsewhere on YouTube, as talking heads on vlogs, podcasts or news clips. Some YouTubers repackage conspiracies as business motivational videos, making them harder to detect, the CCDH noted in an earlier report. For its “disinformation dozen” assessment, the CCDH looked at Facebook and Twitter data because it was easily accessible. It doesn’t have a comparable method for YouTube, the group said in an email.
Facebook is also a bigger, easier target. The social network’s reputation evokes privacy failures, Russian bots, fake news—take your pick.
But many advocates who have gone after Facebook see similar patterns on YouTube. The video site now bars anything that contradicts health authorities about Covid-19 or vaccines. However, that rule doesn’t work smoothly in practice, said Jessica González, co-CEO of Free Press, an advocacy group. She sees the problem as particularly acute with videos watched by the Latino community, where YouTube is very popular. “Tons of stuff is slipping through. It’s not a well-enforced policy,” González said. “YouTube has really flown under the radar.”
YouTube would say it has flown under the radar because it has acted. Since March 2020, the company says, it has removed more than 900,000 videos for violating Covid-19 misinformation policies. Some vaccine dissenters with big followings elsewhere, like Elizabeth, have smaller audiences on YouTube. (Joseph Mercola, another of the “dozen,” has close to 400,000 subscribers there, but he’s got more followers on Facebook.)
And YouTube argues that its search results mostly elevate “authoritative” videos about health, a claim some academic studies support. Still, YouTube’s algorithmic system, like Facebook’s, remains a black box to most outside researchers, regulators and users.
Certainly, YouTube can be less combative—and, arguably, more savvy. After Biden’s recent comments, for instance, Facebook came out with a finger-wagging retort. YouTube instead introduced a new set of health features led by Garth Graham, a physician the company hired earlier this year. One of Graham’s recent moves, just a few months ago, was to recruit YouTube stars to make videos promoting vaccinations with Dr. Anthony Fauci and President Biden. —Mark Bergen