When Facebook founder Mark Zuckerberg appears before Congress this week, he's kicking things off with an apology — an expansive one.
Facebook didn't do enough to prevent its platform from being used to do harm, and that goes for "fake news, foreign interference in elections, and hate speech, as well as developers and data privacy," Zuckerberg says. "We didn't take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I'm sorry."
Zuckerberg's prepared testimony for his appearance in front of a House of Representatives committee has been released online. He speaks in front of a joint meeting of two Senate committees on Tuesday, and then the House Energy and Commerce Committee on Wednesday.
Lawmakers are expected to grill him with tough questions over Facebook's handling of user data, and the use of the platform to manipulate elections.
This is the Facebook CEO's first time in front of Congress. But, as Zeynep Tufekci noted in Wired last week, it's hardly his first apology — he's expressed regrets and remorse many times since Facebook's founding in 2004.
Zuckerberg is treading familiar territory when he apologizes for the website's privacy practices, in particular.
But in his prepared testimony and recent public statements, he's also trying something new — a shift in how Facebook describes its relationship with content on the site.
For years, he was adamant that Facebook was a tech company, not a media company — a platform, not a publisher. It was a way of arguing that Facebook was not responsible for the user-created content on the site. (In 2016, NPR's media correspondent, David Folkenflik, called it "an extremely disingenuous stance.")
Now Zuckerberg is willing to be held accountable for what users say and do on Facebook. In fact, he believes he has a duty to do far more than just clear falsehoods from the site.
"It's not enough to just connect people, we have to make sure those connections are positive," Zuckerberg says in his prepared testimony. "It's not enough to just give people a voice, we have to make sure people aren't using it to hurt people or spread misinformation. It's not enough to give people control of their information, we have to make sure developers they've given it to are protecting it too. Across the board, we have a responsibility to not just build tools, but to make sure those tools are used for good."
In addition to the broad claim of responsibility, Zuckerberg directly addresses the Cambridge Analytica scandal, which finally brought him before Congress.
"Over the past few weeks, we've been working to understand exactly what happened with Cambridge Analytica and taking steps to make sure this doesn't happen again," he says. However, as Zuckerberg acknowledges, Facebook has known about the Cambridge Analytica information-sharing for years.
The company changed its platform to limit the information apps could access, and later banned researcher Aleksandr Kogan's app from the Facebook marketplace — all without telling users that their information had been shared improperly.
Zuckerberg also addresses the 2016 election, explaining the website's efforts to block "traditional threats" like hacking and malware, and the company's failure to identify or prevent disinformation campaigns and the coordinated use of fake accounts.
"There's no question that we should have spotted Russian interference earlier, and we're working hard to make sure it doesn't happen again," he says, listing efforts to take down fake accounts, invest in security reviews and add new measures to promote transparency about who is paying for advertising.
You can read his full testimony on the House website.
On Monday, Facebook also announced a new initiative to make data available to "select scholars" who are researching "the role of social media in elections, as well as democracy more generally."
You can read more about the program here.
Copyright 2021 NPR. To see more, visit https://www.npr.org.