Why Apple's Anti-Child Sex Abuse Features Could Be Dangerous

(SOUNDBITE OF AD)

UNIDENTIFIED PERSON: When you're using apps on your iPhone, you may start to see this.

SCOTT SIMON, HOST:

That's from Apple's ad campaign touting its commitment to privacy. In this case, it's an option to ask apps not to track your activity online. Companies do track you to create an online profile and sell it to others to target ads.

(SOUNDBITE OF AD)

UNIDENTIFIED PERSON: This has been happening without your knowledge or permission.

SIMON: Apple also announced it would scan photos uploaded to its iCloud for child sex abuse material. Just this week, the company confirmed it's been scanning iCloud Mail for child exploitation material since 2019. While the goal of protecting children isn't controversial, there are concerns.

Jonathan Mayer teaches computer science at Princeton. He and a colleague created a system similar to Apple's. We should say here that Apple is a financial supporter of NPR. Professor Mayer joins us. Thanks so much for being with us.

JONATHAN MAYER: Thanks for having me.

SIMON: What are your concerns about this system?

MAYER: My introduction to this space was, as a computer scientist, trying to build a system very similar to Apple's. And what we found was we could solve the hard technical problem of matching known child sexual abuse materials that a user sent or stored in a way where the user wouldn't know if there was a match and the user wouldn't know the contents of the database of known materials, which is an important property for protecting law enforcement methods and making it difficult to evade the system.
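To make the property Mayer describes concrete, here is a minimal sketch of server-side hash matching in Python. It assumes SHA-256 as a stand-in for the perceptual hash a real system would use (Apple's design uses NeuralHash, which tolerates minor image edits; SHA-256 only matches byte-identical files), and the function names are illustrative, not Apple's API:

```python
import hashlib

# The provider's private set of hashes of known abusive images.
# The client never sees this list, which protects law enforcement
# methods and makes the system harder to evade.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def client_upload(image_bytes: bytes) -> str:
    """The client hashes the image locally and sends only the hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def server_check(uploaded_hash: str) -> bool:
    """The server compares against its private list. The result is
    never returned to the client, so a user can't probe for matches."""
    return uploaded_hash in KNOWN_HASHES

# The server learns whether there was a match; the client learns
# neither the outcome nor the database contents.
if server_check(client_upload(b"some-photo-bytes")):
    print("flag account for human review")  # hypothetical follow-on step
```

Apple's actual protocol is more elaborate (on-device matching with private set intersection and a threshold of matches before anything is revealed even to Apple), but the sketch shows the two properties Mayer names: the user learns neither the match result nor the database.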

What we didn't see how to solve was the follow-on problems. I think there are very serious concerns about the system that Apple needs to address. And I think Apple has been very irresponsible in not clearly addressing them from the get-go and not engaging with stakeholders in the relevant communities - in privacy, in civil liberties.

SIMON: Mr. Mayer, I think a lot of people just have this simple worry. If a loving couple takes a picture of their 18-month-old child in the bathtub on their iPhone, are they going to hear a heavy knock on their door from the police investigating child abuse?

MAYER: They're not. These systems do not depend on detecting nudity or detecting children; they only match against images already known to be abusive.

SIMON: So you're concerned, but you're not calling for the system to be scrapped.

MAYER: I think if Apple moves forward, there are absolutely some technical changes they need to make and some process changes they need to make, right? So, you know, this is a path forward for society - balancing security against other security concerns, privacy concerns and free speech concerns.

SIMON: But, I mean, that, again, raises the question - their priority is rooting out child abuse, and that certainly is laudable. But their priority is not civil liberties.

MAYER: That's correct.

SIMON: When they say they'll protect children, you can win any argument that way. But what about the possibility that this technology can be misused to not protect children at all but to identify political dissidents, to identify ethnic minorities and put them in camps? I could go on with other ugly examples.

MAYER: This is why it's so puzzling that Apple didn't have good answers to those questions. That was the takeaway from our academic research, that we could answer an initial hard technical question, but we couldn't answer hard follow-on questions like what happens when the Chinese government comes knocking. And Apple just doesn't seem to have had its act together in developing firm answers to those questions beyond, you should trust Apple.

SIMON: Well, maybe - forgive me. Maybe they don't want a firm answer. Maybe they just want to go ahead with this system. Maybe they don't want to say no to the Chinese government if it comes to that.

MAYER: It's certainly the case that Apple has had trouble saying no to the Chinese government in the past. And it does a lot of business in China. It's their No. 2 market. It's been an engine of growth for the company's stock. It's where they build a large number of their devices. And so, you know, again, Apple is making a big bet that it, as a company, can withstand potentially very serious pressure from that government, given all of the leverage that government has.

SIMON: Forgive me. You say they're making a bet. I don't see where they're making a bet at all. What I see is that they're leaving themselves a big, fat out to say, yeah, we would've preferred the Chinese government not do it, but, you know, in the end, it's not up to us to decide what a government should do. And besides, look at the number of child pornographers we've been able to identify.

MAYER: I think that that's a...

SIMON: I say this with an iPhone in my hand.

MAYER: I at least have been very much willing to take at face value that Apple's incentives start from protecting children, just like ours did in our research project, but that they didn't think through the very serious consequences of what they were building. There's that old adage of, you know, not attributing to malice what you might attribute to incompetence. And there are a bunch of factors specific to Apple that explain this in my mind much more persuasively than, you know, they're building some new capacity for the Chinese government. So I'm not that cynical about it.

SIMON: Jonathan Mayer, a Princeton University professor, thanks so much for being with us.

MAYER: Thank you.

Transcript provided by NPR, Copyright NPR.
