By Naveen Athrappully
Meta’s Oversight Board, an independent panel that the parent company of Facebook and Instagram selected to deliberate its content decisions, has asked the tech firm to “increase transparency around government requests” regarding suppressing COVID-19 content.
In a statement published on April 20, the Oversight Board noted that concerns had been raised about Meta reviewing COVID-19-related content at the behest of governments during the height of the pandemic.
“This is particularly problematic where governments make requests to crack down on peaceful protesters or human rights defenders, to control conversations about the origins of the pandemic and to silence those criticizing or questioning government responses to the public health crisis,” it said. “The United Nations has raised concerns that some governments have used the pandemic as a pretext to erode the tenets of democracy.”
The Oversight Board’s observations come as the Biden administration is caught up in a lawsuit alleging collusion between Big Tech and the government to censor disfavored viewpoints and users related to the COVID-19 issue.
In the lawsuit, the states of Missouri and Louisiana claim that at least 67 Biden administration officials and agencies urged companies such as Facebook, Twitter, and Google to suppress alleged misinformation.
“The proof of this sprawling federal ‘Censorship Enterprise’ is voluminous and overwhelming,” the March 6 court filing said. “This enterprise is highly effective—it has stifled debate and criticism of government policy on social media about some of the most pressing issues of our time.”
The Biden administration filed a motion to dismiss the case, claiming that the states lacked standing to bring the claims. However, a federal court in Louisiana denied the motion last month.
In January, Louisiana’s Republican Attorney General Jeff Landry released a document to support his claim that the White House pressured Facebook to take action against Fox News host Tucker Carlson due to the anchor’s position that COVID-19 vaccines “don’t work.”
Transparency and Fairness
In the blog post, the Oversight Board asked Meta to be “transparent” about the matter and regularly report on state requests that ask for reviewing content under the company’s “Misinformation about health during public health emergencies” policy.
Under that policy, Meta removes content it identifies as misinformation during public health emergencies, provided public health officials affirm that the content is false and may contribute to “the risk of imminent physical harm.”
Between March 2020 and July 2022, Meta removed 27 million pieces of COVID-19 content from Facebook and Instagram for alleged misinformation. The company later restored 1.3 million pieces of content following appeals.
The Oversight Board asked Facebook to “increase fairness, clarity, and consistency around the removal of COVID-19 misinformation.” It said the company must explain why it deems a claim false, as well as “create a record of any changes made to the list of claims that it removes.”
According to Facebook policies, the platform will remove any content it deems false information regarding COVID-19, including content that “undermine[s] the severity of COVID-19.”
Content that downplays the severity of the pandemic includes discouraging vaccination or “questioning the efficacy of vaccines,” insisting that COVID-19 is no more dangerous to human beings than the cold or common flu, promoting the idea that getting a flu vaccine is more likely to kill a person than COVID-19, and claiming that “the number of COVID-19 caused deaths are much lower than the official figure.”
Content stating that wearing face masks or observing social distancing doesn’t help in preventing the spread of COVID-19 may be removed from the platform.
“Claims that COVID-19 vaccines kill or seriously harm people” could be removed as well.