New Zealand Prime Minister Jacinda Ardern says she was one of many who saw horrifying footage of the March 15 terrorist attacks in Christchurch when the video of it started autoplaying in her social media feed. In the wake of the violence in which 51 people were killed, New Zealand immediately imposed new gun control measures and introduced legislation that would ban most semi-automatic firearms.
Now Ardern is turning her efforts toward another factor in the violence that day: the social media platforms on which the gunman livestreamed his attack.
She's in Paris on Wednesday at a meeting of digital leaders of the Group of Seven nations, working with French President Emmanuel Macron to push an initiative dubbed the "Christchurch Call." The agreement asks governments and Internet companies to do more to prevent the live broadcast of terrorist attacks and to make sure such content is removed quickly when it does appear.
On Tuesday night, Facebook announced that it would take steps to prevent such videos from reaching its platform and work to find effective ways to take them down if they are posted. The company said that users who break certain rules (for instance, "someone who shares a link to a statement from a terrorist group with no context") will be blocked for a set period of time from broadcasting to Facebook Live.
It plans to extend other restrictions in the coming weeks, including preventing the same people from creating ads on Facebook.
The company says it will spend $7.5 million to partner with three universities to develop tools to prevent modified versions of terrorist videos from being reposted. In the first 24 hours after the Christchurch attack, Facebook removed the shooter's video 1.5 million times as people continuously uploaded it.
In a New York Times opinion column Saturday, Ardern wrote of the balance that must be struck: "Social media connects people. And so we must ensure that in our attempts to prevent harm that we do not compromise the integral pillar of society that is freedom of expression. But that right does not include the freedom to broadcast mass murder."
Officials from the U.S., Canada and Britain are expected to be at the summit, as well as Twitter CEO Jack Dorsey and staff from Facebook, Amazon and Google, The Washington Post reports.
A number of nations are expected to sign the Christchurch Call, the Times reports, but the U.S. is not among them, citing concerns about free speech.
"The pledge does not contain enforcement or regulatory measures. It will be up to each country and company to decide how to carry out the commitments, according to two senior New Zealand officials involved in the drafting, who spoke on the condition of anonymity because the exact wording of the pledge was still being finalized," according to the Times. "Social media companies will be left with the thorny task of deciding what constitutes violent extremist content, since it is not defined in the accord."
In an email to Reuters, Ardern called Facebook's new limits on livestreaming "a good first step to restrict the application being used as a tool for terrorists."
"This call to action is not just about regulation, but instead about bringing companies to the table and saying, 'You have a role, too, and we have expectations of you,'" Ardern told CNN.