Facebook Was Slow To Counter Misinformation About COVID-19

WASHINGTON – In March, as reports spread on social media claiming that COVID-19 vaccines were ineffective or even dangerous, undermining efforts to contain the virus, some Facebook employees thought they had found a way to help. Adjusting how the platform ranked vaccine-related posts could limit the amount of misleading information users saw and surface content from legitimate sources such as the World Health Organization.

“Given these results, I suppose we will try to do it as quickly as possible,” one Facebook employee wrote in March in response to an internal report on the study.

Facebook, however, set the study’s recommendations aside and did not make some of the changes until April. When another Facebook researcher suggested in March that comments on vaccine posts be disabled until the platform found a way to better handle that content, that proposal was ignored as well.

Critics say Facebook was slow to act because it feared any change would hurt its profits. “Why wouldn’t they remove the comments? Because the only thing they care about is keeping people engaged,” said Imran Ahmed, CEO of the Center for Countering Digital Hate, a group that tracks online misinformation. “They hook the user, and that generates money.”

In an email, Facebook said it has made “considerable progress” this year in tackling misinformation circulating on the app.

Facebook’s internal discussions were revealed in disclosures that former employee Frances Haugen made to the U.S. Securities and Exchange Commission. Written versions provided to Congress were obtained by a consortium of news organizations, including The Associated Press. The documents show that, amid the COVID-19 pandemic, Facebook investigated in depth the role its platforms played in spreading misinformation about life-saving vaccines. They also show that employees regularly proposed solutions to counter that misinformation, and that those proposals went unheeded.
The Wall Street Journal reported last month on some of Facebook’s efforts to deal with anti-vaccine comments. The failure to take action has left many wondering whether Facebook prioritized controversy and division over the health of its users.

“These people are selling fear and outrage,” said Roger McNamee, a Silicon Valley investor and former Facebook shareholder who is now a critic of the company. “It is not a coincidence. It is a business model.”

Normally, Facebook ranks posts by the engagement they generate: the number of likes, dislikes, comments and shares. That approach works well for innocuous posts, such as recipes and photos of dogs. But Facebook’s own documents show that when it comes to divisive topics such as vaccines, engagement-based ranking fuels polarization, disagreement and doubt.

To study ways of reducing vaccine misinformation, Facebook researchers changed how posts were ranked for more than 6,000 users in the United States, Mexico, Brazil and the Philippines. Instead of ranking vaccine posts by engagement, the company ranked them by trustworthiness. The results were striking: views of posts that fact-checkers had debunked dropped by almost 12%, while views of posts from authoritative sources such as the WHO or the U.S. Centers for Disease Control and Prevention rose 8%.

Many Facebook employees wondered why the company did not adopt their recommendations. Facebook, for its part, said it did implement many of the study’s suggestions. But it lost a month at a critical moment in the vaccination campaign.

In a statement, Facebook spokesperson Dani Lever said the internal documents “do not represent the considerable progress we have made since that time in promoting reliable information about COVID-19 and expanding our policies to remove harmful misinformation about COVID and vaccines.”
The company noted that it took time to study and implement the changes. But the delay came just as vaccines were being offered to the most vulnerable: the elderly and the sick. Public health officials were alarmed: only 10% of the population had received a first dose, and a third of Americans said they did not intend to get vaccinated, according to a poll by The Associated Press-NORC Center for Public Affairs Research.

Facebook employees admitted they had “no idea” how widespread vaccine hesitancy was in comments on the platform. An internal company study from February, however, found that 60% of the comments on vaccine posts were anti-vaccine or expressed reluctance to get vaccinated. “Our ability to detect (vaccine hesitancy) in comments is bad in English, and basically nonexistent elsewhere,” another internal report, dated March 2, said.

Facebook CEO Mark Zuckerberg announced on March 15 that the company would begin labeling vaccine posts with information describing vaccines as safe. That approach allowed Facebook to keep attracting the users who made negative comments about vaccines, along with the money they generate, according to Ahmed of the Center for Countering Digital Hate. “Facebook took decisions that led to people receiving misinformation that caused them to die,” Ahmed said. “At this point, there should be manslaughter investigations.”