Facebook will send notifications to users who like, share, or comment on COVID-19 posts that violate the company's terms of service, according to a report from Fast Company.
The new feature works like this: if a user interacts with a post that's later removed, Facebook sends the user a notification saying the post was taken down. If the user clicks the notification, they're taken to a landing page with a screenshot of the post and a brief explanation of why it was removed. The landing page will also feature links to COVID-19 educational resources and actions the user can take, like unfollowing the group that posted it.
This is an expansion of Facebook's earlier attempts to fight misinformation. Before this, the company displayed a banner on the News Feed urging users who had engaged with removed content to "Help Friends and Family Avoid False Information About COVID-19." But users were often confused about what the banner was referring to, a Facebook product manager told Fast Company. The hope is that the new approach is more direct than the banner, while still avoiding scolding users or re-exposing them to misinformation.
Facebook's revised approach is arriving nearly a year into the pandemic, which is a little late. The notifications don't debunk claims made in removed posts. They also don't apply to posts that later have fact-checking labels placed on them, Fast Company writes. That means less dangerous misinformation still has a chance to spread.
Facebook has been slow to act on misinformation the company doesn't consider dangerous. Though conspiracy theories about COVID-19 vaccines have spread for months, Facebook only began removing COVID-19 vaccine misinformation in December. The question now is: is this too little, too late?