Portman Highlights Need for Transparency By Social Media Platforms to Ensure User Safety

WASHINGTON, DC – Today, U.S. Senator Rob Portman (R-OH), Ranking Member of the Senate Homeland Security and Governmental Affairs Committee, delivered opening remarks at the first panel of a Senate Homeland Security and Governmental Affairs Committee hearing examining social media’s impact on homeland security. Portman highlighted the need for transparency by social media platforms to ensure user safety and discussed his bipartisan Platform Accountability and Transparency Act (PATA), legislation that would require social media companies to provide vetted, independent researchers and the public with access to certain platform data.

Portman also expressed serious concern about the Chinese Communist Party’s access to TikTok’s data on American users, which establishes a gateway for China to extend its espionage campaign.

A transcript of his opening remarks can be found below and a video can be found here.

“Thank you, Mr. Chairman, and I thank the experts for being here. We look forward to hearing from you; it’s going to be an interesting hearing. This past Sunday, we observed the 21st anniversary of the tragic 9/11 terrorist attacks. And over these past couple of decades, our country has adapted to combat the most pressing threats to our nation’s security, and that’s good, but the advent of social media has added a new dimension to the ever-evolving threat landscape and created new considerations for combatting terrorism, human trafficking, and many other threats.

“During last October’s hearing on how algorithms promote harmful content, I focused on how social media business models contribute to the amplification of terrorism and other dangerous activities. Since then, the Committee has identified ways in which social media companies’ product development processes tend to conflict with user safety. Whistleblower testimony has revealed that on numerous occasions, the leaders at social media companies were aware that certain platform features increased threats to user safety and chose not to mitigate such concerns. We’ll hear about that today.

“It’s unfortunate that the American public must wait for whistleblower disclosures to find out about the ways in which platforms are knowingly and unknowingly harming their users. The lack of transparency in the product development process, the obscurity of algorithms, and misleading content moderation statistics create an asymmetric information environment in which the platforms know all, yet users, policymakers, and the public know very little.

“One consequence of this lack of transparency relates to China. I have serious concerns about the opportunities that the Chinese Communist Party has to access TikTok’s data on American users. There are now over 100 million Americans, including 40 million under the age of 19, who use TikTok. This TikTok data remains vulnerable to the Communist Party of China both as the CCP tries to exploit its access to U.S. data and as it exerts influence over the content that U.S. users see. For example, despite moving U.S. user data to servers in the United States, TikTok and ByteDance employees in China retain the ability to access this data. If that’s not true, we’d like to hear about that today.

“Also, we learned yesterday from Senator Grassley’s opening statement in a Senate Judiciary Committee hearing with the Twitter whistleblower that Twitter failed to prevent Americans’ data from being accessed by foreign governments. In fact, Senator Grassley spoke about how several Twitter employees were actually foreign agents of India, China, and Saudi Arabia, which is concerning and speaks to why Congress needs more information from platforms on how they secure user data. 

“Another consequence of poor transparency relates to content moderation. While I recognize that content moderation is a key component of creating safe platforms for users, it cannot be the only thing. Transparency reports released by companies often detail the amount of content that has been removed for violating company policy. However, these reports do not account for violating content that is left up on the platform and goes undetected.

“These reports also don’t account for content that is incorrectly censored, as we often see with many conservative voices on social media. I, like many of my colleagues, have been critical of the political biases held by Big Tech platforms, which have resulted in systematic takedowns of accounts that hold ideologies with which the left and liberal media disagree. We’ll hear about that today, but these takedowns are often done under the guise of combatting ‘misinformation’ when, in fact, they’re really just combatting conservative viewpoints that conflict with their own. Any steps taken to address the impact of social media on homeland security must account for First Amendment protections, of course, and safeguard free speech.

“For us to have a responsible conversation about the impact of harmful content on American users and homeland security, we need to talk about how current transparency efforts have or have not worked. Congress must enact legislation that will require tech companies to share necessary data so that research can be done to evaluate the true extent of how harms from social media impact Americans. As some of you know, I have been working with Senator Coons on bipartisan legislation to do just that. The Platform Accountability and Transparency Act would require the largest tech platforms to share data with vetted, independent researchers and other investigators so that we can all increase our understanding of the inner workings of social media companies and regulate the industry based on good information that we simply don’t have now but can learn through this process.

“So again, I thank the witnesses for being here. I look forward to having your expertise help guide us on these complicated issues, and I thank the Chairman for holding this hearing.”

 

###
