Facebook, YouTube and Twitter go to extraordinary lengths to take down mosque massacre videos

Key Points
  • Tech companies scramble to remove copies of a video of Friday's attack on two mosques in New Zealand. Officials said 50 people died in the attack.
  • Facebook says it removed 1.5 million videos of the attack in the first 24 hours.
  • Copies of the video were altered in ways that YouTube's automated systems couldn't detect, according to The Washington Post.

In the hours after a shooting suspect in New Zealand broadcast his mosque rampage across social media, internet companies worked quickly to remove versions of the video that continued to pop up on their platforms.

Facebook said Saturday it removed 1.5 million videos of the attack in the first 24 hours after it was originally livestreamed. Facebook said 1.2 million of those videos "were blocked at upload." Facebook did not immediately respond to CNBC's inquiry about the number of people who viewed the videos of the attack prior to their removal.


Google-owned YouTube, Twitter and Reddit also took steps in the hours after the attack to remove copies of the video. Reddit banned a forum where a video of the attack had been posted, saying it violated its policies by "glorifying or encouraging violence." But hours after the massacre, which took 50 lives and was declared an act of terrorism by New Zealand's prime minister, the videos were still available online even as tech companies worked to delete duplicate versions.

YouTube deleted tens of thousands of videos from its platform following the attacks and removed human review from its usual content moderation process in order to more quickly take down violent content related to the massacre, according to a spokesperson. The company also "terminated hundreds of accounts created to promote or glorify the shooter," the spokesperson said in a statement.

"The volume of related videos uploaded to YouTube in the 24 hours after the attack was unprecedented both in scale and speed, at times as fast as a new upload every second," the YouTube spokesperson said. "In response, we took a number of steps, including automatically rejecting any footage of the violence, temporarily suspending the ability to sort or filter searches by upload date, and making sure searches on this event pulled up results from authoritative news sources like The New Zealand Herald or USA Today. Our teams are continuing to work around the clock to prevent violent and graphic content from spreading, we know there is much more work to do."

YouTube has previously taken steps to prioritize news reports during a trending event, rather than videos that could potentially spread misinformation. But some of the copied videos of the New Zealand shooting were altered in ways that YouTube's automated systems couldn't detect, The Washington Post reported. The YouTube spokesperson said the company suspended the ability to sort searches by upload date in order to make the violent videos harder to find while it worked to remove them, though it's unclear how quickly this step was taken.
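To see why altered copies can slip past automated matching, consider perceptual hashing, one common family of techniques for recognizing re-uploads of known imagery. The article does not say which methods YouTube uses, so the sketch below is purely illustrative: a minimal "difference hash" of a single frame, using the Pillow imaging library, with a hypothetical usage comparing a known frame against an uploaded one.

```python
# A minimal sketch of difference hashing ("dHash"), one common way to match
# re-uploaded copies of a known image or video frame. Illustrative only --
# the article does not describe YouTube's actual systems, which are far
# more sophisticated.
from PIL import Image


def dhash(image: Image.Image, hash_size: int = 8) -> int:
    """Compute a 64-bit difference hash of an image."""
    # Shrink and desaturate so the hash ignores resolution and color detail.
    gray = image.convert("L").resize((hash_size + 1, hash_size), Image.LANCZOS)
    pixels = list(gray.getdata())
    bits = 0
    # Each bit records whether a pixel is brighter than its right-hand neighbor.
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical usage (the filenames are placeholders): treat two frames as
# a match if their hashes differ in only a few bits.
# known = dhash(Image.open("known_frame.png"))
# upload = dhash(Image.open("uploaded_frame.png"))
# is_match = hamming_distance(known, upload) <= 10
```

The weakness this sketch exposes is the same one the Post's reporting points to: edits such as cropping, mirroring, adding overlays or re-filming a screen can push an altered copy's hash past any fixed matching threshold, forcing platforms to chase variants one by one.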

Twitter and Reddit declined to share the number of videos removed from their platforms following the attack.


Watch: 'See something, say something' to tackle online extremism: Stratfor