Facebook earlier today revealed that it took down 1.5 million videos depicting the attack on mosques in Christchurch, New Zealand, earlier this week. According to a series of tweets by the company, over 1.2 million of these videos were blocked at upload.
The company has also removed other versions of the video, even those that do not contain graphic content. Mia Garlick, a spokesperson for Facebook New Zealand, stated that this was done out of respect for those affected by the mass shooting and in line with the concerns of authorities.
In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload…
— Meta Newsroom (@MetaNewsroom) March 17, 2019
The spokesperson also pointed out that Facebook is still actively removing such content from its service. Facebook had previously stated that it removed the attack video, along with the shooter’s Facebook and Instagram accounts, as soon as it was alerted by the police. Any praise or support for the attack and the shooter has been removed as well.
Alongside the moves by Facebook and YouTube, the Malaysian Communications and Multimedia Commission has also urged members of the public not to share such videos, reminding them that doing so is a violation of the Communications and Multimedia Act 1998.