Coronavirus: How Facebook, TikTok and other apps tackle fake claims

BBC Technology 01 Feb 2020 02:41
By Zoe Thomas Technology reporter

Social media networks are taking steps to address false information about the coronavirus on their sites.

Facebook, Twitter and TikTok are among the firms working to provide links to accurate information.

The number of posts containing misinformation about the spread and alleged cures for the coronavirus has soared.

So far more than 250 people have died as a result of the outbreak and cases have been reported in 22 countries.

The World Health Organization (WHO) has declared the coronavirus a public health emergency.

Most social media networks have rules banning the posting of hateful or defamatory information. But following a backlash against firms such as Facebook and Twitter for allowing fake news to spread during the 2016 US presidential election, networks began taking action.

False information on social media has led to mob violence in several countries and has also helped spread unfounded fears about the safety of vaccines.

So what are they doing?

Facebook says it will limit the spread of false information about the coronavirus by removing "false claims or conspiracy theories".

Facebook said it was focusing on "claims that are designed to discourage treatment" including posts about false cures.

Last year, Facebook-owned WhatsApp announced measures to prevent users from forwarding messages to more than five people or groups. It also adds a tag to heavily forwarded messages, which in some places have been linked to sparking mob violence.


Twitter has launched a prompt that appears when users search for coronavirus, encouraging them to use official channels - the World Health Organization or the US Centers for Disease Control and Prevention - for information.


Video sharing network TikTok has added a link to the WHO's website and a reminder to users to report information they think might be harmful.

The platform is owned by Chinese firm ByteDance. It has recently been criticised for allowing doctors and nurses to post videos giving medical advice on other health issues.


YouTube - the streaming video site owned by Google - has been investing in making sure accurate and authoritative information appears most often in searches.

YouTube takes down videos when they contain hate speech, harassment, messages that incite violence or scams - all of which violate its community guidelines.

Reddit is a platform made up of community-based discussion groups, where users can vote the comments and links posted by other users up or down. Reddit says this design helps protect the platform from false information.

Reddit has also "quarantined" one of its user communities because of the large amount of false and misleading information being posted there. This means users are shown a warning about the type of content in the community before they enter it.

Snapchat also said the structure of its platform protects it from the spread of false information.

The site doesn't have a public news feed that anyone can post on.
