MICHEL MARTIN, HOST:
After the attack on the U.S. Capitol earlier this month, Facebook, like several other social media companies, suspended President Trump's account. At the time, the company said the account would be suspended indefinitely, at least until the peaceful transfer of power had taken place.
So now that that's happened, what's next for the former president's account, an account that he used to spread misinformation throughout his four years in office? Facebook has asked its independent oversight board to take this question on and decide whether Trump should have access to his Facebook and Instagram accounts going forward. The independent oversight board was created last year to make these kinds of complicated decisions, and this will be one of its first big cases.
So we're joined now by the co-chair of the oversight board, Jamal Greene. He's a professor at Columbia Law School, focusing on constitutional law regulation and public policy. Mr. Greene, welcome back to the program. Thank you for joining us once again.
JAMAL GREENE: Thank you for having me.
MARTIN: And I do want to mention here that Facebook is one of NPR's financial supporters. So first, let's go back. What went into the decision behind suspending Trump's account on January 7? Was the oversight board involved in making that decision?
GREENE: The oversight board was not involved in the initial decision to suspend. That was a decision made by Facebook on their side. And Facebook says they did it in response to some posts that - that President Trump put up on the site. And the oversight board will have to decide whether that was an appropriate response.
MARTIN: And is that the way it's supposed to work? I thought that the - so the oversight board is a reactive body, not a prescriptive body?
GREENE: That's right. The oversight board does have the power when asked to help Facebook make certain policy decisions. But the kind of meat of our diet is - a content decision is made by the company or by Instagram. And the board decides whether that was appropriate or not and whether the content should go up - back up onto the platform or stay down.
MARTIN: I see. Understood. OK. So a Washington Post article recently reported that research - that the research from Zignal found that online misinformation about the election fell significantly, some 73% after sites like Facebook banned then-President Trump. So I understood from your reporting just now that the oversight board was not involved in the initial decision. But are you willing to say now that it was the right decision, knowing what we know now?
GREENE: So I'm not in a position yet to judge the case. I'm not going to prejudge it until we actually hear the case. I think it's quite important to make sure that we have all the facts and are in a proper deliberative posture before we make those kinds of decisions. So, you know, we'll work on it as quickly as we can and with as much principle as we can and see where it goes.
MARTIN: So talk to me a bit more, if you would, about what those principles are. Can you sort of describe how the board is approaching this question?
GREENE: Sure. The board, which is a diverse body of - right now, we're 20 members. We sit in panels, and the panels deliberate about the question of whether content should be removed or not, and in this case, whether a suspension was appropriate or not. And we draw on both the company's own kind of terms of service, which they refer to as their community standards, and whether they were properly applied. But, also, Facebook has what are called values. Voice is a - is one of those values. Safety is one of those values. Dignity is one of those values. And we are also charged with applying those values.
And then there's also kind of an external source of information we rely on, which is international human rights law. So that has standards for when and how freedom of expression can be regulated. Facebook has committed to acting consistent with those standards, and so the board is set up to try to apply those international human rights norms to the behavior of the company.
MARTIN: So specifically to Donald Trump, does his civilian status - how does his civilian status weigh now? Does the fact that he's no longer president and that he is now a private citizen - does that factor into the decision-making?
GREENE: So potentially - and I don't want to get into exactly what factors the board will rely on in the substantive case before we actually, you know, sit and deliberate about it. But any number of kind of contextual factors can matter in cases when freedom of expression has been limited.
MARTIN: And is it accurate that you've been asked to put together broad policy recommendations for Facebook on suspensions of political leaders going forward? And - because I'm sure you know that there's been a lot of criticism that Facebook hasn't applied its policies evenly and that, you know, other world leaders, like, for example, President Rodrigo Duterte of the Philippines - the criticism is that he has violated certain Facebook rules as well. So is that also part of your portfolio to put together recommendations going forward for political leaders like president - like the former president, heads of state at that level?
GREENE: Yes, that's right. So Facebook, as part of this case, has asked for policy advice that they are obligated to consider as to whether - how their content standards should relate to political leaders. And this is something that has been a challenge for the company and for other platforms in the past, given that political leaders are very differently situated than ordinary citizens. And so that's going to be one of the things that we look into.
MARTIN: And I would say that this is the first big decision that your oversight board has had to make. Do you agree?
GREENE: I think that this decision will certainly get a lot of attention. We've been working on a number of cases that should come out pretty soon in terms of the opinions that we're going to be issuing. And those are cases from around the world, you know? Facebook has almost 3 billion users, so there are lots of things that happen in the United States that get a lot of attention. But there are also things that happen elsewhere that get lots of attention. So I wouldn't - I don't know that I'd say it's the first big case we've had. All of the cases are important and raise important issues. But I think it's fair to say that this will get a lot of attention.
MARTIN: I take your point. And I do recognize your desire not to inflate your own importance in the world. And I also recognize your desire to give international matters the weight that they deserve. But I am just wondering if this feels like a particularly weighty decision and how you are thinking about that.
GREENE: Yeah, so we're aware, certainly, of the attention paid to the case. We're aware of how many eyes will be on it. We're aware of the visibility of the case. So I wouldn't - and it is indeed important. It's not just a matter of optics; it's an important decision. And we'll take it with the weight that it deserves. But I think it is also important to point out that we take cases from around the world, and those are important in the corners of the world that they impact.
MARTIN: Before we let you go, is this decision binding? Does Facebook have to take your decision, whatever it is? Or is this merely advisory?
GREENE: So Facebook has committed to taking the decision of the oversight board as to whether the content should have been removed or not and whether the suspension was appropriate or not. As to whatever policy advice we give to Facebook, it's committed to respond to that in some way, but it's not obligated necessarily to implement exactly what we say as a policy matter.
MARTIN: Well, we await your decision. That was Jamal Greene, co-chair of Facebook's oversight board and professor at Columbia Law School. Professor Greene, we thank you so much for joining us once again. We look forward to talking further.
GREENE: Thank you.

Transcript provided by NPR, Copyright NPR.