
12 Things You Can't Post About the Coronavirus on Facebook

The social media network targets misinformation surrounding COVID-19, vaccines

Facebook is cracking down on misinformation about the coronavirus and COVID-19 vaccines with sweeping new rules about what can and can't be posted on its social media platforms.

The company now has an exhaustive list of more than 50 specific false claims about the coronavirus it does not allow, ranging from saying the virus is manmade to posting that it's safer to get the disease rather than the vaccine. The rules also extend to Instagram, which Facebook owns.

Facebook has come under scrutiny during the pandemic for allowing conspiracy theories and anti-vaccine rhetoric to spread.

“The original idea was that Facebook was a public square where you can come in and say anything you want,” says Bhaskar Chakravorti, an economist who studies digital technology use and dean of global business at the Fletcher School at Tufts University. “Now they're realizing if they're creating a health hazard, they need to put on some constraints.”

Research shows that falsehoods spread significantly faster than the truth on social media, and those age 65 and older are particularly vulnerable to misinformation. A 2019 study published in Science Advances found that older adults are seven times as likely as younger people to share fake or misleading content on Facebook. The researchers hypothesized that some older adults may not have the digital media literacy and experience to recognize untruths.

Rules tighten during pandemic but are tough to enforce

Facebook has gradually stepped up its efforts to combat harmful content related to COVID-19. Early in the pandemic, it announced a policy to promote posts with accurate coronavirus information, to put warning labels on misinformation and to push it lower in people's feeds. At that point, the platform said it would remove false information “that could lead to imminent physical harm.” (Disclosure: News and information related to the pandemic published by AARP appears in Facebook's Coronavirus (COVID-19) Information Center.)

In October, Facebook banned ads discouraging vaccines. Two months later, it began removing posts with vaccine misinformation that had been debunked by public health experts. Then, in early February, the tech giant took its strongest stance yet, expanding the list of false claims it would not allow, and threatening to ban users, groups or pages that repeatedly spread misinformation.

In an email, Facebook declined to say how many posts, pages and groups it has taken down under its newest rules, but it noted that it removed over a million pieces of content with harmful COVID-19 misinformation from Facebook and Instagram in the fourth quarter of 2020.

Chakravorti says the platform's long list of rules offers clarity both for the public and for the platform's internal army of fact-checkers about what's permissible and what crosses the line.

Still, with billions of new posts every day, it's impossible to keep all misinformation off the platform, says Anna-Sophie Harling, who searches for and reports fake news as NewsGuard's managing director for Europe.

“They obviously have a massive scaling issue,” Harling says. “It is still way too easy to find false vaccine and COVID misinformation online.”

Harling says as long as the platform's algorithm prioritizes user engagement and viral content over credible information, some users will continue to see sensational headlines and fake news.

Coronavirus, COVID-19 vaccine posts that aren't allowed

In its vaccine policy statement, Facebook says it generally allows “claims that are expressing a personal anecdote or experience,” unless they promote or advocate harmful action. For some questionable content, Facebook adds a warning label and downranks it to reduce the number of people who see it. But under the new rules, it will remove misinformation that has been repeatedly debunked by independent fact-checkers.

Here are 12 examples of the types of posts that are not allowed:

1. “Sure, you can take vaccines, if you don't mind putting poison in your body.” Facebook does not allow posts that say the COVID-19 vaccines or their ingredients are toxic, poisonous, harmful or dangerous. It also prohibits any content that calls for, advocates or promotes others not getting the shot.

2. “The COVID-19 vaccines were not tested against a placebo during clinical trials.” Facebook will remove inaccurate claims about how the vaccine was developed or its ingredients. That includes claims that the vaccine contains toxic or harmful ingredients, microchips, animal products or anything not on the vaccine ingredient list. Also prohibited: claims that the vaccine was not tested, or that people died as a result of the Pfizer/BioNTech vaccine during clinical trials.

3. “It's safer to just get the disease rather than the vaccine.” Claims that building natural immunity is safer than vaccine-acquired immunity are barred.

4. “The COVID-19 vaccines won't protect you.” Facebook prohibits posts that say the vaccines do not provide any immunity or that they're not effective in preventing COVID-19. The authorized vaccines were found to be about 95 percent effective in clinical trials.

5. “The COVID-19 vaccine turned me into a monkey.” It's OK to post about your vaccine side effects, but Facebook does not allow claims about side effects that are “incredulous or irrational.”

6. “COVID-19 is no more dangerous than the flu.” Posts that deny the existence of COVID-19 or downplay it are banned. Any claim that the number of deaths from COVID-19 is much lower than the official figure must include additional information or context.

7. “The coronavirus is actually a bioweapon.” Facebook does not allow content that says COVID-19 is manmade, manufactured or bioengineered, or that it was created by an individual, government or country.

8. “This herbal remedy will prevent COVID-19.” You can't tout an unapproved product as a way to cure or prevent COVID-19, or say that a specific activity or treatment results in immunity. In addition, Facebook won't let you sell medical or respiratory face masks or COVID-19 test kits on the platform.

9. “Did you know COVID-19 was actually patented (or predicted) many years ago?” Facebook will delete any posts that say COVID-19 has been patented or that it was predicted, including during the “Event 201” pandemic simulation hosted by Johns Hopkins University in 2019.

10. “The COVID-19 vaccine causes infertility.” Facebook forbids anything that says the COVID-19 vaccines kill or seriously harm people. You also cannot make assertions that the vaccines cause autism, that they change people's DNA or that they infect people with the coronavirus.

11. “Face masks don't help prevent the spread of COVID-19.” Content that discourages mask wearing is banned, including posts that say face masks are connected to 5G technology, that masks can make the wearer sick or that health authorities do not recommend that healthy people wear masks.

12. “Hospitals kill patients to increase their COVID numbers and get more money!” Facebook will remove content that inaccurately represents the access, availability or eligibility of health services such as hospitals. That includes claims that a specific hospital is closed or that only certain people are allowed to receive medical care for COVID-19.

Michelle Crouch is a contributing writer who has covered health and personal finance for some of the nation's top consumer publications. Her work has appeared in Reader's Digest, Real Simple, Prevention, The Washington Post and The New York Times.