
How the E.U.'s Sweeping New Regulations Against Big Tech Could Have an Impact Beyond Europe


Big tech companies could face multibillion-dollar fines in Europe and the threat of being broken up unless they comply with sweeping new regulations announced by the European Commission on Tuesday.

After years of wrangling in the U.S. over whether to hold tech companies accountable for data practices and anticompetitive behavior, the new rules in the E.U.—which has a total population of some 450 million across 27 countries—could force tech companies to change their practices globally.

The E.U. regulations come in the form of two new laws, one called the Digital Services Act and another called the Digital Markets Act. Both still need to undergo a consultation period and then be passed by European lawmakers, a process which could take years.

The Digital Services Act (DSA) would introduce new obligations on platforms to reveal information and data to regulators about how their algorithms work, how decisions are made to remove content, and how adverts are targeted at users. Many of its provisions only apply to platforms with more than 45 million users, a threshold surpassed by several services including Facebook, YouTube, Twitter and TikTok.

“With size comes responsibility,” said Margrethe Vestager, the European Commissioner for competition, on Tuesday.

Fines for failing to comply with the rules can be up to 6% of a company’s annual revenue—a sum that, in Facebook’s case, would amount to some $4.2 billion, based on the roughly $70 billion in revenue the company reported for 2019.

The U.K. government is also expected to announce a similar law on Tuesday; if passed, it would include fines of up to 10% of annual global turnover for platforms that fail to remove illegal content.

The other law announced by the European Commission on Tuesday, the Digital Markets Act (DMA), is closer to antitrust legislation. It aims to give smaller companies a greater ability to compete with big tech platforms, which some European lawmakers have long regarded as monopolistic. Fines for anticompetitive behavior could reach 10% of a company’s annual turnover, and repeat violations could lead to big tech companies being broken up, Vestager said, with the E.U. attempting to break up offenders fined three times within five years.

Vestager during a virtual press briefing at the EU Commission headquarters in Brussels, Belgium, on November 26, 2020. Photo by Thierry Monasse/Getty Images

Experts say the regulation could have consequences for tech companies beyond the borders of the E.U. “Europe increasingly sees itself as a trailblazer for what it sees as tech-savvy, human rights-proof tech regulations,” says Mathias Vermeulen, the public policy director at data rights consultancy AWO. “If you’re a global company, and you have to deal with new obligations in one very big and crucial market, then similar features could be taken up elsewhere even though there’s no hard requirement to do so.”

What is the Digital Services Act?

The DSA is aimed at improving what many European lawmakers see as the lack of oversight over large tech companies.

“The problem is that now tech companies can say they’ve been taking measures, but there’s absolutely no independent third party who can verify their figures,” Vermeulen says. “The idea with the Digital Services Act is to exercise more democratic control over how our rights are being affected by the products of these companies.”

The act obliges platforms with more than 45 million users in the E.U. to tell users in plain language the “main parameters” used in the algorithms that rank content. In a measure that appears tailored to let users opt out of having algorithms serve up content based on their past activity, it says platforms must offer users at least one option that is “not based on profiling.”

When it comes to illegal content, the act upholds the existing E.U. principle, shared with the U.S., that platforms should not be held accountable for illegal content posted by users. But it does state that platforms could be held liable if they do not act quickly to make that content inaccessible once they are made aware of it.

The act does not expand the definition of illegal content. But it does lay out the categories of already-illegal content that the new regulations apply to, including illegal hate speech, terrorist content, images of child sexual abuse, non-consensual sharing of private images, stalking, counterfeit products, and copyright violations.

The act doesn’t spend much time discussing details of what content should be counted as illegal. Instead, it states that companies must carry out their own “risk assessments” about how their services could be used to spread illegal content or allow manipulation that has “an actual or foreseeable negative effect on the protection of public health, minors, civic discourse … electoral processes and public security.” It obliges platforms to then act to “mitigate” those risks.

“Before, the trend was to hold companies responsible for specific pieces of content that could still be found on their site,” says Vermeulen. “But this is more of a holistic vision, looking at what these companies are doing to address the risks their systems are posing.”

Amid pressure on social media companies for their role in amplifying political extremism across the globe, the act will give European regulators more power to demand information from tech companies about how both their moderation teams and their algorithms work at scale. The threat of large fines, officials hope, will force companies to roll out new institutional practices even before a single fine is imposed.

Alongside fines of 6% of turnover for failure to comply with the regulations, the DSA requires platforms to give regulators insight into how their systems work, with fines of up to 1% of annual revenue for platforms that “supply incorrect, incomplete or misleading information” to regulators or “refuse to submit to an on-site inspection,” according to the act.

The DSA says that those obligations will be accompanied by the ability to enforce changes to tech companies’ services, including “discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources.”

During the Brexit referendum and the 2016 U.S. election, so-called “dark ads” were common on social media platforms: adverts without any accompanying information about who funded them or why they were targeted at particular users. The DSA introduces a legal requirement for platforms to maintain libraries of historical ads and to give people who see ads more detailed information about why they are being targeted. “Recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling,” the act states. (Platforms including Facebook and Twitter have already introduced measures along these lines.)

What is the Digital Markets Act?

With the development of the technology sector, economic activity increasingly happens online—but in some cases “online” means within the bounds of specific services designed by big tech companies. Think buying an app through the Apple App Store, or buying goods on Amazon.

Systems like that open the door to anticompetitive behavior, E.U. officials say. In November, the European Commission said it believed Amazon was acting anticompetitively by collecting data on independent sellers that use the Amazon platform and using that data to benefit Amazon’s own competing products.

Amazon CEO Jeff Bezos testifies via video conference during an antitrust hearing in Washington, D.C., on July 29, 2020. Photo by Graeme Jennings-Pool/Getty Images

Just as its sister legislation only applies to platforms with more than 45 million users, the Digital Markets Act only targets “gatekeeper” companies: those that will, at some point in the future, be formally designated as dictating the terms of a marketplace.

The act makes three main provisions: requiring “gatekeeper” companies to act fairly by not using competitors’ data to disadvantage them; enforcing interoperability, so that users can take their data elsewhere and still interact with those services; and barring gatekeepers from treating their own services more favorably than competitors that use their platforms.

“The business and political interests of a handful of companies should not dictate our future,” wrote Vestager and Thierry Breton, the two European commissioners leading on the regulations, on Sunday. “Europe has to set its own terms and conditions.”

Along with fines of up to 10% of global turnover for violations of antitrust law, the European Commission threatens that it could break up the businesses of repeat offenders—a step further than anything the incoming Biden Administration has pledged to do. “If a gatekeeper breaks the rules … several times repeatedly, we can also impose structural remedies, divestiture, that sort of thing,” Vestager said on Tuesday. (The E.U. has fined big tech companies for antitrust violations before, the most expensive instance being a €4.3 billion fine against Google in 2018.)

“The Digital Markets Act imposes new obligations on these so-called gatekeeper companies, which have enormous power to control the markets they are operating in,” says Vermeulen. “This is mainly geared at marketplaces, search engines, operating systems and cloud services. It would prohibit companies from giving preferential treatment to their own products and services in, for instance, search rankings.”

Write to Billy Perrigo at billy.perrigo@time.com