AILSA CHANG, HOST:
As it's been reported, the alleged shooter in the New Zealand attack recorded the massacre on video and streamed it live on Facebook. Even though Facebook took down that video, people who had seen it made copies - millions of copies - and posted them all over the Web, which means social media platforms have been scrambling since Friday to get rid of all of them.
To give us an inside look at this challenge, we are now joined by Neal Mohan. He's chief product officer at YouTube. Welcome.
NEAL MOHAN: Thank you for having me.
CHANG: So give us a sense of the scale of this challenge. Facebook said it removed 1 1/2 million videos of this attack in the first 24 hours. How many did you guys take down at YouTube?
MOHAN: Yeah. So as I think it's been widely reported, the volume at which that content was being copied and then re-uploaded to our platform was unprecedented in nature. To give you a little bit of an idea of that, at some points during, you know, the first few hours, we saw on the order of one upload a second...
CHANG: Wow.
MOHAN: ...To our platform.
CHANG: And how many copies of this video or versions of this video did YouTube eventually take down in the first 24 hours?
MOHAN: It was one single video as the starting place, but that video had multiple permutations and combinations where it was sliced and diced - not just the video stream itself but, you know, the Facebook page on which that video was being streamed.
And so in terms of bringing that content down from our platform, we had to deal with not just the original video and its copies but also all the permutations and combinations. And we brought down on the order of tens of thousands of those.
CHANG: And can you explain how that intensely complicates the task of taking these videos down - when a video gets sliced and diced or when a user adds a watermark to it or resizes it or animates it?
MOHAN: Yeah. So as with many of these challenges, we use a combination of technology - machine learning algorithms - and also human beings to be able to make decisions that tend to be a bit more nuanced.
CHANG: But I understand in the middle of all this, YouTube opted to rely solely on artificial intelligence and bypass the human moderators. Why was that? Why did you go completely with AI?
MOHAN: Really, to put it simply, it was because of the unprecedented volume at which these uploads were coming to our platform and all the permutations and combinations. And so a few hours into the incident, we made the decision to remove all those videos that were being flagged by our algorithms.
And we have a process by which an uploader can appeal that decision. And we understood that, obviously, if we were taking a step like this, legitimate news organizations' content would also get caught up in this.
CHANG: Right.
MOHAN: And so the mechanics for them are to appeal that decision, and then their video goes up.
CHANG: A lot of experts who track the Islamic State have talked about how much progress YouTube has made over the past couple years in removing ISIS-related content. Why do you think YouTube seems to have more trouble when it comes to a different type of terrorism - when it's related to, say, white supremacy?
MOHAN: Oftentimes, what happens in the case of those ISIS videos is they're being used for propaganda purposes, and you see things like branding or logos or other clues that might be in the video that allow us to find copies and permutations of them. In the case of an incident like this, literally a split second before the incident happens, there is no reference file for it. Every single one of these is different.
In this case, this was particularly different, given the nature of how it was produced from a first-person standpoint. And so our algorithms are having to learn literally on the fly the second the incident happens, without the benefit of, you know, lots and lots of training data to learn from.
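Mohan's point about a "reference file" can be made concrete with a small sketch: once a known copy of a video exists, per-frame fingerprints taken from it can be compared against new uploads to catch re-uploads and lightly edited permutations. The example below is a generic average-hash illustration, not YouTube's actual matching system; the frame data, thresholds, and function names are assumptions made for the sake of the example.

```python
# Minimal sketch of reference-file matching, assuming a simple per-frame
# average hash. This is illustrative only; the real systems described in
# the interview are far more sophisticated.

from typing import List

def average_hash(frame: List[List[int]]) -> int:
    """Fingerprint one grayscale frame: one bit per pixel, set if the pixel
    is brighter than the frame's mean. Tolerant of mild re-encoding."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def looks_like_reference(upload: List[List[List[int]]],
                         reference_hashes: List[int],
                         max_bit_diff: int = 4,
                         min_matching_frames: int = 1) -> bool:
    """Flag an upload if enough of its frames are near-duplicates of any
    reference frame. Watermarks or resizing flip a few bits, not most."""
    matches = 0
    for frame in upload:
        h = average_hash(frame)
        if any(hamming(h, r) <= max_bit_diff for r in reference_hashes):
            matches += 1
    return matches >= min_matching_frames

# Toy usage: a 4x4 "frame" from a reference video and a slightly altered copy.
reference_frame = [[10, 200, 10, 200],
                   [200, 10, 200, 10],
                   [10, 200, 10, 200],
                   [200, 10, 200, 10]]
altered_copy = [[12, 198, 10, 200],   # small pixel changes, e.g. re-encoding
                [200, 12, 200, 10],
                [10, 200, 14, 200],
                [200, 10, 200, 12]]
ref_hashes = [average_hash(reference_frame)]
print(looks_like_reference([altered_copy], ref_hashes))  # True: near-duplicate
```

The point Mohan makes is that this kind of matching only works once a reference exists and the system has seen examples to learn from; a never-before-seen first-person livestream provides neither at the moment it starts.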
CHANG: Right. So how confident are you at this point that YouTube has removed every single video of the massacre in New Zealand?
MOHAN: Well, remember, there's still news content up on our platform.
CHANG: Right. Apart from that?
MOHAN: Right. Our technology gets better. We learn from every single one of these incidents. In this case, we learned a lot that, hopefully, we will incorporate into our technology and systems. I believe that we have a handle on the re-uploads at this point. But when you ask a question about a hundred percent or not, I think it's hard for me to give you a number like that.
CHANG: Neal Mohan is chief product officer for YouTube. And we should note YouTube is among NPR's financial sponsors. Thank you very much for joining us.
MOHAN: Thank you, Ailsa.

Transcript provided by NPR, Copyright NPR.