Who Fact Checks Health and Science on Facebook?

31-5-2021 | SGT Report | 1076 words

by Sayer Ji, Green Med Info:



Overwhelming pressure from governments and the public has compelled social media platforms to take unprecedented action on what users share online in the pandemic. But who fact checks the fact checkers?


In a move likened to the way governments have assumed emergency powers in response to the covid pandemic, Facebook has removed 16 million pieces of its content and added warnings to around 167 million. YouTube has removed more than 850 000 videos related to “dangerous or misleading covid-19 medical information.”



While a portion of that content is likely to be wilfully wrongheaded or vindictively misleading, the pandemic is littered with examples of scientific opinion that have been caught in the dragnet, resulting in their removal or de-prioritisation, depending on the platform and context. This underscores the difficulty of defining scientific truth, prompting the bigger question of whether social media platforms such as Facebook, Twitter, Instagram, and YouTube should be tasked with this at all.


“I think it’s quite dangerous for scientific content to be labelled as misinformation, just because of the way people might perceive that,” says Sander van der Linden, professor of social psychology in society at Cambridge University, UK. “Even though it might fit under a definition [of misinformation] in a very technical sense, I’m not sure if that’s the right way to describe it more generally because it could lead to greater politicisation of science, which is undesirable.”


How fact checking works


The past decade has seen an arms race between users who peddle disinformation (intentionally designed to mislead) or unwittingly share misinformation (which users don’t realise is false) and the social media platforms that find themselves charged with policing it, whether they want to or not.1


When The BMJ questioned Facebook, Twitter, and YouTube (which is owned by Google) they all highlighted their efforts to remove potentially harmful content and to direct users towards authoritative sources of information on covid-19 and vaccines, including the World Health Organization and the US Centers for Disease Control and Prevention. Although their moderation policies differ slightly, the platforms generally remove or reduce the circulation of content that disputes information given by health authorities such as WHO and the CDC or spreads false health claims that are considered harmful, including incorrect information about the dangers of vaccines.


But the pandemic has seen a shifting patchwork of criteria employed by these companies to define the boundaries of misinformation. This has led to some striking U-turns: at the beginning of the pandemic, posts saying that masks helped to prevent the spread of covid-19 were labelled "false"; now it's the opposite, reflecting the changing nature of the academic debate and official recommendations.


Twitter manages its fact checking internally. But Facebook and YouTube rely on partnerships with third party fact checkers, convened under the umbrella of the International Fact-Checking Network, a non-partisan body that certifies other fact checkers, run by the Poynter Institute for Media Studies, a non-profit journalism school in St Petersburg, Florida. Poynter's top donors include the Charles Koch Institute (a public policy research organisation), the National Endowment for Democracy (a US government agency), and the Omidyar Network (a "philanthropic investment firm"), as well as Google and Facebook. Poynter also owns the Tampa Bay Times newspaper and the high profile fact checker PolitiFact. The Poynter Institute declined The BMJ's invitation to comment for this article.


For scientific and medical content the International Fact-Checking Network involves little known outfits such as SciCheck, Metafact, and Science Feedback. Health Feedback, a subsidiary of Science Feedback, handpicks scientists to deliver its verdict. Using this method, it labelled as "misleading" a Wall Street Journal opinion article2 predicting that the US would have herd immunity by April 2021, written by Marty Makary, professor of health policy and management at Johns Hopkins University in Baltimore, Maryland. This prompted the newspaper to issue a rebuttal headlined "Fact checking Facebook's fact checkers," arguing that the rating was "counter-opinion masquerading as fact checking."3 Makary hadn't presented his argument as a factual claim, the article said, but had made a projection based on his analysis of the evidence.


A spokesperson for Science Feedback tells The BMJ that, to verify claims, it selects scientists on the basis of “their expertise in the field of the claim/article.” They explain, “Science Feedback editors usually start by searching the relevant academic literature and identifying scientists who have authored articles on related topics or have the necessary expertise to assess the content.”


The organisation then either asks the selected scientists to weigh in directly or collects claims that they’ve made in the media or on social media to reach a verdict. In the case of Makary’s article it identified 20 relevant scientists and received feedback from three.


“Follow the science”


The contentious nature of these decisions is partly down to how social media platforms define the slippery concepts of misinformation versus disinformation. This decision relies on the idea of a scientific consensus. But some scientists say that this smothers heterogeneous opinions, problematically reinforcing a misconception that science is a monolith.


This is encapsulated by what’s become a pandemic slogan: “Follow the science.” David Spiegelhalter, chair of the Winton Centre for Risk and Evidence Communication at Cambridge University, calls this “absolutely awful,” saying that behind closed doors scientists spend the whole time arguing and deeply disagreeing on some fairly fundamental things.


He says: “Science is not out in front telling you what to do; it shouldn’t be. I view it much more as walking along beside you muttering to itself, making comments about what it’s seeing and making some tentative suggestions about what might happen if you take a particular path, but it’s not in charge.”


The term “misinformation” could itself contribute to a flattening of the scientific debate. Martin Kulldorff, professor of medicine at Harvard Medical School in Boston, Massachusetts, has been criticised for his views on lockdown, which tack closely to his native Sweden’s more relaxed strategy.4 He says that scientists who voice unorthodox opinions during the pandemic are worried about facing “various forms of slander or censoring . . . they say certain things but not other things, because they feel that will be censored by Twitter or YouTube or Facebook.” This worry is compounded by the fear that it may affect grant funding and the ability to publish scientific papers, he tells The BMJ.


Read More @ GreenMedInfo.com