Four questions for Marietje Schaake
The international policy director at Stanford’s Cyber Policy Center discusses the EU’s latest rule for the internet, the Digital Services Act.
Just days after former U.S. President Barack Obama delivered a keynote speech at Stanford on disinformation that urged greater regulation of media and tech companies, Twitter announced that it had agreed to a takeover bid by Elon Musk that would make the company private.
The sudden acquisition by Musk – a self-proclaimed “free speech absolutist” – sets him on a potential collision course with the European Union, which just passed the kind of digital regulations Obama argued for. Known as the Digital Services Act (DSA), the law aims to hold platforms like Twitter more responsible for content posted on their services.
Stanford international policy scholar Marietje Schaake recently moderated a panel on media and tech companies’ role in ensuring quality, access, and participation at the one-day symposium headlined by Obama.
Here, she discusses what makes the DSA so remarkable and ways the EU law could still affect American consumers.
The Digital Services Act is the EU’s first binding law that will spell out what responsibilities technology firms have in terms of moderating content. Previous voluntary schemes between the European Commission and tech platforms were considered insufficiently effective, so the binding rule had been a long time coming.
Comparing this with the U.S., we see how the recently announced takeover of Twitter by Elon Musk reminds people of the power of companies to govern the parameters of public debate. The Digital Services Act seeks to make these decisions more democratically legitimate and accountable.
Between DSA, the Digital Markets Act (DMA), and General Data Protection Regulation (GDPR), the EU is a leader in reining in the influence and power of big tech firms. What is motivating the EU, and not the U.S., to take such a pivotal role in setting standards for tech regulation?
Many of the harms that are surfacing in the U.S., such as illegal discrimination, voter manipulation, hate speech, and conspiracy theories, have been a cause for concern in the EU for somewhat longer. When I listen to the discussions about platform companies in the U.S., I hear echoes of where discussions in the EU were five years ago.
The EU has a more recent historical memory of what the abuse of power by governments means for people’s liberties – state security services under communism used disinformation and surveillance against their own citizens, for example. This has made people more personally and acutely aware of what is at stake when it comes to the value of a resilient democratic society.
What’s more, the EU considers the protection of privacy a fundamental right, giving states a positive obligation to protect citizens. The same historical experiences and anchors in fundamental rights do not exist in the same way in the U.S. Here, the alleged economic benefits of the tech sector have led Democratic and Republican politicians alike to take a hands-off approach, with high trust in market forces. But with the harms to democracy, cybersecurity, and minority populations surfacing in a significant way in the United States, I am convinced there will be significant changes ahead to rebalance the outsized market power of technology companies with more countervailing checks and balances.
What is needed to ensure platforms comply with the DSA?
Good enforcement is always needed for laws to be successful and impactful in practice. They should be more than a paper tiger. I see many lessons learned from the past lack of effective enforcement of the GDPR now informing what needs to happen to ensure the DMA, the DSA, and the upcoming AI Act are more successful. Stronger sanctions are another way to raise the price of non-compliance for companies.
Will American consumers benefit from regulation abroad?
It may well be that companies will end up making the DSA and the DMA their global standards. We saw this happen after the GDPR as well. Companies may prefer complying with a single standard, even a strict one, over a fragmented set of rules. Additionally, I hear growing worry in the U.S. that the rules are being made elsewhere, so the desire of American regulators and legislators not to be left behind in making laws for the digital world will hopefully lead to more policies and oversight in the U.S. too.
—As told to Melissa De Witte
Schaake is the International Policy Director at Stanford’s Cyber Policy Center and an International Policy Fellow at the Institute for Human-Centered Artificial Intelligence. Between 2009 and 2019, Schaake served as a Member of the European Parliament for the Dutch liberal democratic party, where she focused on trade, foreign affairs, and technology policies.
Four questions for … is a Stanford News series where Stanford experts answer four questions on a topic they are knowledgeable about.