In an apparent effort to ensure their heinous actions would “go viral,” a shooter who murdered at least 49 people in attacks on two mosques in Christchurch, New Zealand, on Friday live-streamed footage of the assault online, leaving Facebook, YouTube and other social media companies scrambling to block and delete the footage even as other copies continued to spread like a virus.

The original Facebook Live broadcast was eventually taken down, but not before the 17-minute video had been viewed, replayed and downloaded by users. Copies of that footage quickly proliferated to other platforms, like YouTube, Twitter, Instagram and Reddit, and back to Facebook itself. Even as the platforms worked to take some copies down, other versions were re-uploaded elsewhere. The episode underscored social media companies’ Sisyphean struggle to police violent content posted on their platforms.

“It becomes essentially like a game of whack-a-mole,” says Tony Lemieux, professor of global studies and communication at Georgia State University.

Facebook, YouTube and other social media companies have two main ways of checking content uploaded to their platforms. First, there’s content recognition technology, which uses artificial intelligence to compare newly uploaded footage to known illicit material. “Once you know something is prohibited content, that’s where the technology kicks in,” says Lemieux. Social media companies augment their AI technology with thousands of human moderators who manually check videos and other content. Still, social media companies often fail to recognize violent content before it spreads virally, letting users take advantage of the unprecedented and instantaneous reach offered by the very same platforms trying to police them.
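The article doesn’t detail how that comparison works, but the general technique is hash-based matching: known prohibited footage is reduced to a compact fingerprint, and new uploads are flagged when their fingerprints land close to a known one. The toy sketch below illustrates the idea with a simple “difference hash” over tiny grayscale frames and a Hamming-distance comparison; the frames, threshold, and function names are invented for illustration, and production systems use far more robust perceptual hashes over many video frames.

```python
# Toy sketch of hash-based content matching (illustrative only; not any
# platform's actual algorithm). A "difference hash" encodes each frame as
# bits comparing horizontally adjacent pixels, so small re-encoding noise
# changes only a few bits.

def dhash(pixels):
    """Difference hash: one bit per horizontally adjacent pixel pair."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_known_match(frame, known_hashes, threshold=4):
    """Flag a frame whose hash is within `threshold` bits of known material."""
    h = dhash(frame)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# Hypothetical 4x4 grayscale frames (pixel values 0-255).
original = [[10, 200, 30, 40], [50, 60, 70, 80],
            [90, 10, 11, 12], [13, 14, 15, 16]]
# A re-encoded copy with slight pixel noise still hashes nearby.
reencoded = [[12, 198, 31, 39], [51, 59, 71, 79],
             [88, 12, 10, 13], [12, 15, 14, 17]]
unrelated = [[1, 2, 3, 4], [4, 3, 2, 1],
             [1, 2, 3, 4], [4, 3, 2, 1]]

known = {dhash(original)}
print(is_known_match(reencoded, known))  # True: near-duplicate is caught
print(is_known_match(unrelated, known))  # False: different content passes
```

The whack-a-mole problem the article describes arises exactly where this sketch breaks down: cropping, mirroring, or re-filming the screen can push a copy’s hash outside the match threshold, forcing platforms back onto human review.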

Neither YouTube, Facebook nor Twitter answered questions from TIME about how many copies of the Christchurch video they had taken down. New Zealand police said they were aware the video was circulating on social media, and urged people not to share it. “There is extremely distressing footage relating to the incident in Christchurch circulating online,” police said on Twitter. “We would strongly urge that the link not be shared.” Mass shooters often crave notoriety, and each horrific event brings calls to deny assailants the infamy they so desire. (Four arrests were made after the Christchurch shooting, and it remains unclear whether the shooter who live-streamed the attack acted alone.)

Facebook said that the original video of the attack was only taken down after the company was alerted to its existence by New Zealand police, suggesting that its automated detection systems had not flagged the footage on their own.

“We quickly removed …

Source: Time – Technology


‘A Game of Whack-a-Mole.’ Why Facebook and Others Are Struggling to Delete Footage of the New Zealand Shooting
