Image: Facebook
If you’ve interacted with confirmed coronavirus misinformation, Facebook might share some facts with you
Facebook is stepping up its response to the spread of coronavirus misinformation on its platform. The company has announced that it will begin displaying anti-misinformation messages in people’s News Feeds if they have liked, reacted to, or commented on content Facebook has since removed due to factual inaccuracies.
Announcing the change in a blog post, Facebook’s VP of Integrity, Guy Rosen, said: “Stopping the spread of misinformation and harmful content about COVID-19 on our apps is also critically important. That’s why we work with over 60 fact-checking organizations that review and rate content in more than 50 languages around the world. In the past month, we’ve continued to grow our program to add more partners and languages.”
He adds: “Once a piece of content is rated false by fact-checkers, we reduce its distribution and show warning labels with more context. Based on one fact-check, we’re able to kick off similarity detection methods that identify duplicates of debunked stories.”
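To give a rough sense of what “similarity detection” of debunked stories can mean in practice, here is a minimal, purely illustrative sketch that compares word-shingle sets with Jaccard similarity. Facebook has not published its actual method; the claim text, candidate posts, and threshold below are invented for the example.

```python
# Illustrative sketch only: flag posts that closely resemble an already
# debunked claim by comparing overlapping word windows ("shingles").
# This is NOT Facebook's actual system, which is not publicly documented.

def shingles(text: str, n: int = 3) -> set:
    """Return the set of n-word shingles (overlapping word windows) in text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: intersection size divided by union size."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical example: one debunked claim and two candidate posts.
debunked = "Drinking hot water every 15 minutes kills the coronavirus"
candidates = [
    "Doctors say drinking hot water every 15 minutes kills the coronavirus!",
    "The WHO has published new hand-washing guidance for COVID-19",
]

debunked_shingles = shingles(debunked)
for post in candidates:
    score = jaccard(debunked_shingles, shingles(post))
    # A real system would tune this threshold and rely on far richer signals.
    label = "likely duplicate" if score > 0.3 else "not flagged"
    print(f"{score:.2f}  {label}: {post}")
```

In this toy run, the first post shares most of its shingles with the debunked claim and is flagged, while the unrelated second post is not; a production system would of course combine many signals beyond raw text overlap.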
But Facebook clearly doesn’t think that’s enough to stop people from believing things they read on Facebook, so Rosen also explained that messages (such as the ones above and below) will begin appearing in the News Feeds of people who’ve interacted with misinformation.
The tone does not appear to be judgemental; the messages are a gentle, timely way to raise awareness, and they may direct users to the World Health Organisation website or other fact-checking bodies.
Image: Facebook
Facebook said the messages would start appearing in “the coming weeks”.