In the coming months, new laws – the first of their kind anywhere in the world – will come into force in Europe to hold Big Tech companies accountable to the societies where they operate and do business. At this point, we have all heard about the dangers that large online platforms pose to our lives, to our democracies, to our children’s mental health, and to economic competition. Now, the European Union is doing something about it.
With each of these threats, the same basic processes are at work. Algorithms narrow down conversations to small groups of data-determined “friends,” while gatekeepers narrow down online markets to benefit themselves. With such narrowing comes the risk that we will lose track of the broader world, and the broader market, around us.
For decades, tech platforms were left mostly free to do as they wished, and there was very little legislation to limit them as they seized ever-greater control of the world’s information channels. But that began to change a few years ago, when the EU spearheaded a global effort to restore some balance to the digital economy, by ensuring fairness and basic protections for people.
Privacy was the first issue of concern. With major platforms boosting their revenues to record highs by amassing user data, it was clear that our conception of privacy needed to be modernised. Privacy thus became a non-negotiable right for everyone in Europe. As citizens, we – and only we – now set the boundaries of what we do or do not share about ourselves.
This understanding of privacy as a fundamental right was enshrined in the EU’s 2016 General Data Protection Regulation. With the GDPR, Europe was setting a course for democracy to catch up with technology. Today, there is no going back to how things were before the law. The EU’s landmark legislation has since inspired similar frameworks in other jurisdictions around the world.
Following closely on the heels of this initial data-privacy initiative came the Cambridge Analytica scandal, when we learned that Facebook had shared 87 million user profiles with a researcher who then provided that data to a political consultancy working for Donald Trump’s 2016 presidential campaign. Suddenly, we all began to wonder whether our digital lives were safe, and to what extent we were being surveilled, influenced, and manipulated online.
Brick by brick, the wall of pseudo-neutrality that the platforms had hidden behind – often claiming that they were mere “pipes” for passing along information – was being dismantled. It was becoming increasingly obvious that the Big Tech companies should take responsibility for the content that they and their algorithms disseminate to the body politic. We responded by establishing this responsibility loud and clear in the Digital Services Act, which was first presented in December 2020.
The DSA is the central piece of EU legislation that will soon regulate how content is treated on major digital platforms. It requires platforms to remove all illegal content, while also ensuring that their users’ freedom of expression remains untouched. It also addresses how platforms use algorithms to determine what we do and do not get to see. We are currently in the process of designating which large platforms and search engines will be subject to these DSA provisions before they enter into force this fall.
The last major issue that the EU’s new digital legislation will tackle is the lack of healthy competition in the tech sector. Over the past few years, regulators have pursued important cases against large online platforms, some of which have increased public awareness about the platforms’ undue market power. But as digital markets have become more complex, we have needed new systemic tools to supplement the usual antitrust instruments.
The Digital Markets Act was drafted to address this need. It features a list of “dos and don’ts” aimed at preventing so-called gatekeeper platforms from abusing their position in digital markets, and at creating space for new entrants to compete with incumbents on their merits. Just as the DSA will officially articulate platforms’ responsibilities toward their users, the DMA will establish their responsibilities toward other – often smaller – market participants. The result will be a more vibrant, more innovative, and fairer tech market.
We passed this legislation in record time. Throughout the process, we made sure that our work was guided by values, rather than by the underlying technology. This matters, because while technologies change all the time, values do not.
We are proud that Europe has become the cradle of tech regulation globally. It has been gratifying to see similar laws being drafted in countries that share our democratic and humanist values, and we remain eager to coordinate our own regulatory and rule-making efforts with others. The EU-US Trade and Technology Council, launched in 2021, was one early example of how we can deepen international cooperation to ensure that technologies work for everyone. We have now established similar partnerships with India, Japan, Singapore, and South Korea.
For democracy to thrive, it needs open spaces where people can talk, disagree, contradict one another, and find common solutions. In the past, we had public squares, elected chambers, universities, and cafes. When the internet first arrived, it held the promise of expanding these forums globally. But the rise of large platforms got in the way, fragmenting our conversations into a constellation of opaque, walled-off spaces, thus posing a threat to our democracy.
It is now the job of citizens everywhere to tear down those walls.
Margrethe Vestager is Executive Vice President of the European Commission. Copyright: Project Syndicate, 2023, published here with permission.