In a report released today, the Transatlantic High Level Working Group on Content Moderation Online and Freedom of Expression urges adoption of a flexible regulatory framework to curb online hate speech, violent extremism, and viral deception.

The group calls for greater transparency and accountability from digital platforms, as well as a redress system for promptly dealing with user complaints.

The report, “Freedom and Accountability: A Transatlantic Framework for Moderating Speech Online,” is the culmination of a yearlong investigation by members with diverse perspectives, including legislators, government officials, tech executives, civil society leaders, and academics from North America and Europe.

The Transatlantic Working Group (TWG) is co-chaired by Susan Ness, a former U.S. Federal Communications Commission member, and Marietje Schaake, international policy director of the Stanford Cyber Policy Center and a former member of the European Parliament (Netherlands).

“Freedom of expression is the fundamental right that both promotes individual liberty and holds governments accountable,” said Ness, the TWG project convener. “Working together across the Atlantic, we propose a path forward to address online issues while protecting free speech. If democracies cannot define a coherent set of fundamental principles and governance frameworks, the field will be defined by political powers with very different ideals, or by private sector interests without accountability.”

“The status quo in which technology giants govern much of the online information ecosystem without oversight is unacceptable,” Schaake said. “Closing the accountability gap, offering transparency over content moderation, and empowering internet users are urgent. The U.S. and the EU should aspire to develop a democratic governance model that protects fundamental rights online.”

The Transatlantic Working Group is a project of the Annenberg Public Policy Center (APPC) of the University of Pennsylvania, in partnership with The Annenberg Foundation Trust at Sunnylands and the Institute for Information Law (IViR), which is affiliated with the Faculty of Law of the University of Amsterdam. The TWG is also supported by the Embassy of the Kingdom of the Netherlands in Washington, D.C.


Regulating transparency, not speech

The COVID-19 pandemic, global protests against racial inequality, and early attempts at election interference underscore both the extent to which the public relies on digital companies for information, communication, and connection, and the fertile online ecosystem that enables the viral spread of hatred, violence, and manipulated information.

The TWG report is being released as governments and courts on both sides of the Atlantic diverge on how to address these problems. The European Commission is drafting the Digital Services Act, a comprehensive proposal to regulate online platforms. France has enacted a law to penalize platforms that fail to rapidly take down illegal hate speech. In the United States, bills are pending in Congress to regulate platforms, while President Donald Trump has issued an executive order attempting to curb the power of social media companies.

The report does not provide a one-size-fits-all solution, but offers a set of principles and a regulatory framework that can be adapted by free societies. The TWG did not address competition policy or privacy legislation, which were outside its purview.

The report proposes five pillars:

  • Regulate on the basis of transparency: Transparency isn’t an end in itself, but it enables governments to develop evidence-based policies for oversight of tech companies, pushes firms to examine problems they would not otherwise address, and empowers citizens.
  • Establish an accountability regime to hold platforms to their promises: A transparency framework should be supervised by a regulator who has the power to set baseline standards, require efficient and effective ways for users to seek redress for problems, and sanction repeated failures.
  • Create a three-tier disclosure structure: Separate tiers of access to platform information, for users, researchers, and regulators, would enable evidence-based policies.
  • Provide efficient and effective redress mechanisms: Social media councils, independent external oversight bodies, can make consequential policy recommendations, set content moderation standards, or decide appeals from moderation decisions. An e-court system could be staffed by specially trained magistrates to adjudicate online cases involving potential violations of free expression and human rights.
  • Use an ABC framework to combat viral deception, or disinformation: Distinguish between bad Actors, deceptive Behavior, and harmful Content. In broad-based campaigns involving manipulated information, it can be more effective to address the online behavior employed by bad actors before addressing the content itself.

“The Annenberg Public Policy Center is proud to have played a role in bringing the important work of this transatlantic group to fruition at a time when the need for these practical solutions could not be greater,” said Kathleen Hall Jamieson, director of the Annenberg Public Policy Center. Ness, the project leader, is an APPC distinguished fellow.

The Annenberg Public Policy Center (APPC) was established in 1993 to educate the public and policy makers about communication’s role in advancing public understanding of political, health, and science issues at the local, state, and federal levels. Learn more: www.annenbergpublicpolicycenter.org