Meta on Tuesday took a step towards abandoning its policy of removing misinformation about COVID from its platform.
The company, which owns Facebook and Instagram, is asking its oversight board for an advisory opinion on whether measures to eliminate dangerous COVID-19 misinformation should be continued or revised.
In an online posting, Meta’s president for global affairs Nick Clegg explained that the company’s harmful information policies were expanded at the start of the pandemic in 2020 to remove entire categories of false claims worldwide. Prior to that time, content was removed from Meta’s platforms only if it contributed to a risk of imminent physical harm.
“As a result,” Clegg wrote, “Meta has removed COVID-19 misinformation on an unprecedented scale. Globally, more than 25 million pieces of content have been removed since the start of the pandemic.”
However, Meta suggests that it may be time to change its COVID misinformation policy.
“We are requesting an advisory opinion from the Oversight Board as to whether Meta’s current measures to address COVID-19 misinformation under our Harmful Health Misinformation Policy are appropriate, or whether this misinformation should be addressed through other means, such as labeling or demoting it, either directly or through our third-party fact-checking program,” Clegg said.
Meta’s COVID misinformation policies were adopted during a state of emergency that called for drastic measures, explained Will Duffield, a policy analyst at the Cato Institute, a Washington, D.C. think tank whose vice president, John Samples, sits on Meta’s Oversight Board. “Now, three years later, the spirit of emergency has faded,” he told TechNewsWorld.
“There’s a lot of health information out there,” he said. “If people believe ridiculous things about the efficacy of vaccines or certain treatments, it’s more on them now and less of a consequence of a mixed information environment where people don’t yet know what’s true.”
“This was an unprecedented step to entrust the policy to global health organizations and local health authorities,” he said. “At some point, some of it had to be clawed back. You can’t have a state of emergency that lasts forever, so this is an effort to start unwinding the process.”
Is the unwinding starting too soon?
“In the developed world, vaccinations are almost universal. As a result, while caseloads remain high, the number of serious illnesses and deaths is quite low,” said Dan Kennedy, a professor of journalism at Northeastern University in Boston.
“But in the rest of the world, in countries where Facebook is a bigger deal than it is in the US, the emergency is nowhere close to being over,” he told TechNewsWorld.
“While many countries are taking steps to return to more normal life, this does not mean the pandemic is over,” said Beth Hoffman, a postdoctoral researcher in the Department of Behavioral and Community Health Sciences at the University of Pittsburgh’s School of Public Health.
“A major concern is that removing the current policy will particularly harm areas of the world with low vaccination rates and fewer resources to respond to a surge in cases or the rise of new variants,” she told TechNewsWorld.
Clegg acknowledged the global implications of any policy changes Meta makes. “It is critical that any policy Meta implements is appropriate to the full range of circumstances that countries find themselves in,” he wrote.
A line in the sand
Meta wants to draw a line in the sand, maintained Karen North, director of the Annenberg Program on Online Communities at the University of Southern California. “They’re saying there is no imminent physical harm the way there was at the beginning of the pandemic,” she told TechNewsWorld.
“If there is no imminent bodily harm, they don’t want to set a precedent for stern action,” she said.
Clegg noted in his posting that Meta is fundamentally committed to free expression and believes its apps are an important way for people to make their voices heard.
“But resolving the inherent tension between free expression and safety is not easy, especially when faced with unprecedented and fast-moving challenges, as we have been during the pandemic,” he continued.
“That’s why we are seeking the advice of the Oversight Board in this matter,” he wrote. “Its guidance will also help us respond to future public health emergencies.”
Meta says it wants to balance free speech against the spread of misinformation, so it makes sense that it would rethink its COVID policy, said Mike Horning, an associate professor of multimedia journalism at Virginia Tech.
“While they remain concerned about misinformation, it is also good to see that they are concerned about how the policy could affect free speech,” he told TechNewsWorld.
Backlash from content removal
Horning said rethinking how it handles COVID misinformation could improve Meta’s image among some of its users. “Removal policies can be effective in slowing the spread of misinformation, but they can also create new problems,” he said.
“When people have their posts deleted, conspiracy theorists see that as confirmation that Meta is trying to suppress certain information,” he continued. “So removing the content may limit the number of people viewing the misinformation, but it also leads some to view the company as unfair or biased.”
The effectiveness of removing COVID misinformation may also have passed its expiration date. “One study found that when COVID misinformation controls were first implemented, there was a 30% reduction in the distribution of misinformation,” Duffield said.
“Over time, misinformation peddlers shifted to talking about other conspiracy theories or found coded ways to talk about COVID and COVID skeptics,” he continued. “So initially it had an effect, but over time that effect lessened.”
North noted that some methods of controlling misinformation may seem weaker than removal but can be more effective. “Removing content can backfire: the content gets taken down, so people try to post it in a different way to trick the algorithms,” she explained.
“When you de-index it or reduce its exposure,” she continued, “it’s very hard for a poster to know how much exposure it’s getting, so it can be very effective.”
Profiting from misinformation
While Meta professes the noblest of motives for changing its COVID misinformation policy, there may also be some bottom-line concerns influencing the move.
“Content moderation is a burden for these companies,” said Vincent Raynauld, an assistant professor in the Department of Communication Studies at Emerson College in Boston.
“Any time you remove content from your platform, there’s a cost associated with that,” he told TechNewsWorld. “When you leave content up, you are more likely to get more content creation and engagement with that content.”
“There are a lot of studies showing that misinformation generates a lot of engagement, and for these companies, user engagement is money,” he said.