WASHINGTON, DC – This morning, at a hearing to examine “Social Media Platforms and the Amplification of Domestic Extremism and Other Harmful Content,” expert witness Dr. Nathaniel Persily, Co-Director of the Stanford Cyber Policy Center, agreed with U.S. Senator Rob Portman (R-OH), Ranking Member of the Senate Homeland Security and Governmental Affairs Committee, that transparency and independent third-party review of social media company data are necessary to inform evidence-based policy solutions. Portman is currently working on this legislation with Senator Chris Coons (D-DE), Chair of the Senate Judiciary Subcommittee on Privacy, Technology, and the Law, which would require the largest online platforms to share data with nonpartisan researchers and scholars so that Congress can use their reviews to create evidence-based policy solutions.
A transcript of the exchange can be found below and a video can be found here.
Portman: “Thank you, Mr. Chairman, and thank you to the witnesses. So this is a really complicated area, and I’m glad that Dr. O’Neil gave us a little 101 on algorithms and how they work. And ultimately, she came down to the conclusion that this is about money. This is about what works, kind of along the lines of what Ambassador Kornbluh just talked about: what works is what sells more advertising, and these algorithms are at the core of that. In other words, they are determining what we want to hear as online participants and amplifying that.
“I will say two things that I just want to try to stipulate here at the start, and some may not agree with me, but it seemed to me, Ambassador Kornbluh, in your comments in particular, you were focused on right-wing extremism as being the problem. It’s not. It’s everything. And I hope we can agree to that because, again, of the work that we did early on in this Committee with regard to what ISIS was doing online in terms of recruiting and spreading violence, and in terms of what happens today, even. I mean, there is, as you know, a lot of concern about these platforms not allowing what happened at the Wuhan lab, the leak, to come out, or concerns about what Hunter Biden is doing or not doing being blocked, or other things that lead me to believe that, whether it’s Antifa on one side or white supremacy on the other side, we need to look at this as a problem that is impacting ideologies across the spectrum. So I’m just going to stipulate that, because I want to get to some questions. Some of you may not agree with me on that, but I think it’s really important for us not to make this a partisan exercise.
“Second is the First Amendment, and this is impossible: how do you figure out what is speech that is a peaceful expression of a point of view that we should be encouraging, and what is content that ought to be filtered in some way? And there are lots of examples of this. Recently, parents were at a school board meeting, and, I guess, it was the National School Boards Association who said these are domestic terrorists. They aren’t domestic terrorists. These are parents, and I think they [the National School Boards Association] later apologized for saying that, but parents expressing their strongly held views about their kids’ education. So you have got to be sure that we’re not taking content, which is people expressing political views that are peaceful, and somehow filtering it out. Any thoughts on that as we start, for any of the panelists, either with us or virtually?”
The Honorable Karen Kornbluh, Director, Digital Innovation and Democracy Initiative and Senior Fellow at The German Marshall Fund of the United States: “I just want to agree with you. The algorithms aren’t partisan. The algorithms are economically driven, as Dr. O’Neil said, and they’re just trying to keep us all online. I think it’s really important to keep that in mind. And as for the First Amendment, freedom of speech, and freedom of association, it’s extremely important that the government not be in the business of deciding what’s true and what’s not true. That is instead why I think some of these revelations from the whistleblower are so important: she focuses upstream of the content, on the mechanics of the platform and how it’s driving this content just to serve itself and its ad revenues. And if we focus on those design elements, and we focus on transparency especially, that furthers First Amendment concerns; it furthers freedom of speech and freedom of association. Transparency is such an important principle. If we let consumers know who’s behind what they’re seeing and what interest it is serving, that’s more important.”
Portman: “Well, and that leads me to my core question today. Really, again, just stipulating this is a hard area, and we’ve talked about a couple of those issues. But it seems to me that, as I said earlier in my opening statement, getting under the hood and looking into what these design elements are that you talk about, the transparency issue you talk about, is really important. Again, it’s proprietary information, I understand that; these are private companies, and this is not an easy issue for government to be involved with. But everybody’s talking about regulation right now. I mean, everybody. Facebook is talking about it. Google is talking about it. Twitter is talking about it. We’re talking about it. Everybody has a different view on what that regulation might be, but shouldn’t it be based on better data? Because we really don’t know what we’re trying to regulate if there is a lack of transparency as to what that design is or how these algorithms are derived. So we talked a little, again in your testimony, about this. Dr. Persily, in particular, you talked about your thoughts on how to give access to impartial researchers to be able, I assume, to publish about what is actually behind all this. What is the content-directing mechanism, and how does it work? I’m intrigued by that. I don’t know if that’s the answer. And you mentioned that the FTC, the Federal Trade Commission, could play a role in this. But can you amplify that a little bit and talk about what you think could lead us to more transparency and better understanding?”
Dr. Nathaniel Persily, Co-Director, Stanford Cyber Policy Center and James B. McClatchy Professor of Law at Stanford Law School: “Thank you for that question. So the model that I’ve put out there is that the FTC, working with the National Science Foundation, would vet researchers who would not need the permission of the company, but the company would have to basically develop research portals for outside, independent research to study all of these societal problems that we are saying are caused by social media. The key features of this are simply that the firms have no choice on who has access to the data, and then we have some way of vetting the researchers to prevent, say, another Cambridge Analytica and the like. And so we need to have some process in place so that someone other than those who are tied to the profit-maximizing mission of the firm gets access to this data. As you mentioned, it’s proprietary data, but it’s data about us, right? It is essentially data about everything that’s happening in American politics and, frankly, around the world. And so we need to figure out some way for the firms to be opened up so that outsiders can see it. My view is that they shouldn’t turn the data over to a government agency; that would pose real privacy and surveillance problems. We want to make sure that there’s a vetted, independent third party that’s able to do this kind of research on all of these questions that have come up today, so that we can get answers to some of the questions that you asked earlier about the propensity on the left and the right to engage in hate speech or in violent content, as well as potential bias in content moderation by the platforms, and the like. Only if we have access to the data can we really answer those questions.”
Portman: “Otherwise it is very hard to come up with regulation, which is what, again, everybody is talking about. I know there are different views on what that means, but it seems to me that there should be a consensus that if we’re going to try to regulate this, we need to have better information as to what the actual design is, what the intentions are, and what the impact is. So this hearing is helpful, I think, in that regard. I’m at the end of my time; I’ll hopefully be back for a second round. I have so many other questions for this team. But again, I thank you for your expertise today.”
###