Facebook gives money-making tools to Covid-19 cranks


Jasper Jackson and Alexandra Heal 

A video opens with a question: “Could this be bio-terrorism?”

Talking directly into the camera, the unshaven young man continues in a sarcastic American drawl: “I’m sure that there’s no possible way that somebody could play Frankenstein and create the monster that they didn’t mean to, or in many cases, did mean to.”

He moves on to reports in late 2019 that “thousands of CEOs have been stepping down over the last year… Did these CEOs have insight that the global economy was going to tank and something was going on? … It seems like they had a heads up … I think they had a heads up on the economic collapse and they got out of the way.”

The man is An0maly, a “news analyst & hip-hop artist” with more than 1.5m Facebook followers. The video from March 2020, the early days of the coronavirus pandemic, is one of at least three videos and posts on An0maly’s page that Facebook’s fact-checkers have flagged for containing false or partly false information about the pandemic. Yet even today a strap appears under the videos inviting viewers to pay to “Become a supporter” and “Support An0maly and enjoy special benefits”.

An0maly, real name AJ Feleski, runs one of 430 Facebook pages – followed by 45 million people – identified by the Bureau of Investigative Journalism as directly using Facebook’s tools to raise money while spreading conspiracy theories or outright misinformation about the pandemic and vaccines.

The Bureau’s findings, which likely represent a small snapshot of the vast amount of monetised misinformation on Facebook, show how the site enables creators to profit from spreading potentially dangerous false theories to millions. Some of the posts seen by the Bureau could harm uptake of coronavirus vaccines or lead users to believe the pandemic is a hoax. While Facebook generally does not take a cut of this income – although there are times when it does – it benefits from users engaging with this content and staying on its services, where they are exposed to more adverts.

Facebook’s policies for creators using monetisation tools include rules against misinformation, especially medical misinformation. In November, Facebook, along with Google and Twitter, agreed a joint statement with the UK government committing to “the principle that no user or company should directly profit from Covid-19 vaccine mis/disinformation. This removes an incentive for this type of content to be promoted, produced and be circulated.”

The Bureau’s findings suggest Facebook has failed to adequately implement this agreement and appropriately enforce its own policies. Separately, some parts of the business have seemingly been left without proper oversight of misinformation.

The methods of monetisation vary from page to page – more than two dozen, including An0maly’s, use Facebook-designed “creator” or “fundraiser” tools to receive income directly from their Facebook audience. Hundreds more use the social network’s shopping facilities to sell everything from t-shirts to tarot readings.

More than 260 of the pages identified by the Bureau have posted misinformation about the coronavirus vaccine. The remainder include false information about the pandemic, vaccines more broadly, or some combination of the two. More than 20 of the pages have even been “verified” by Facebook, gaining a blue tick signalling authenticity.

Organisations including the UN, the WHO and Unesco said in September that online misinformation “continues to undermine the global response and jeopardizes measures to control the pandemic”. Vaccine hesitancy is seen as a significant threat to efforts to return to some semblance of normality, with misinformation on social media cited as a key driver of anti-vaccine sentiment.

Facebook told the Bureau: “Pages which repeatedly violate our community standards – including those which spread misinformation about Covid-19 and vaccines – are prohibited from monetising on our platform. We are reviewing the pages shared with us and will take action against any that break our rules.”

The company has already closed down a small number of pages as a result of that review and said it had removed 12m pieces of Covid-19 misinformation between March and October, and placed fact-check warning labels on 167m other pieces of content.
