The Digital Services Act will make particular demands regarding the protection of children. Photograph: David Bagnall/Alamy

Digital Services Act: inside the EU’s ambitious bid to clean up social media


The legislation aims to tackle problems as wide-ranging as misogyny, disinformation and consumer fraud

Nearly two decades after the birth of Facebook ushered in the social media era, the EU is introducing ambitious legislation designed to clean up the world’s biggest online forums.

Intended to tackle misogyny, protect children, stop consumer fraud, curb disinformation and protect democratic elections, the Digital Services Act (DSA) is wide-ranging. The UK is introducing its own statute, the online safety bill, but the EU’s rules are likely to have a bigger impact because they cover a bigger market, and the EU is more influential as a regulatory power.

“The Digital Services Act is groundbreaking legislation that will set a worldwide standard for content regulation and the protection of users from online harms,” said Peter Church, a technology lawyer at Linklaters.

And the DSA already has tech firms in its sights. Twitter’s new owner, Elon Musk, has been warned his platform is not ready for the new rules, which could come into force for major platforms next summer. Thierry Breton, the EU commissioner overseeing the legislation, has told Musk he has “huge work ahead” to ensure that Twitter is complying with the act.

On Friday, in the wake of the suspension from Twitter of a group of US tech journalists, the commission’s vice-president for values and transparency, Věra Jourová, chimed in. She reminded Musk the DSA “requires respect of media freedom”.

News about arbitrary suspension of journalists on Twitter is worrying. EU’s Digital Services Act requires respect of media freedom and fundamental rights. This is reinforced under our #MediaFreedomAct. @elonmusk should be aware of that. There are red lines. And sanctions, soon.

— Věra Jourová (@VeraJourova) December 16, 2022

Here is a guide to the DSA and what it means for tech platforms and their users.

What is the DSA?

The DSA applies within the EU and regulates, in typical legislative jargon, digital services that act as “intermediaries” in their role of connecting consumers with content, goods and services. This means the act covers not only the likes of Facebook and Google but also Amazon and app stores.

Its provisions include: protecting children from being profiled for advertising purposes by social media sites; giving users a means of appealing against content takedowns; ensuring products sold on online marketplaces such as Amazon are not fake; and taking action against risks such as disinformation and “cyber-violence” towards women.

Violations carry the threat of a fine of 6% of global turnover and, in the most serious cases, a temporary suspension of the service. The EU can also demand that sites take immediate action to deal with problems. Users will be able to seek compensation for any damage caused to them by a breach of the act.

When does it come into force?

The act splits tech firms into tiers. The most heavily regulated tier covers Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLSEs) that have more than 45 million monthly active users. The EU will determine which platforms qualify for this category early next year. For this tier – which could include Facebook, Instagram, Google, TikTok and Amazon – there are tougher requirements. For VLOPs and VLSEs the act could come into force in the summer; for the rest, at the beginning of the following year.

The larger operators must carry out an annual risk assessment outlining the risks of harmful content such as disinformation, misogyny, harms to children and election manipulation, and must put in place measures to mitigate those risks. All the major social media platforms and search engines already have content moderation teams, but those systems will now be vetted by the EU.

The big platforms will also have to publish an independent audit of their compliance with the act, as well as disclose how many people they employ in content moderation. They must also provide regulators with details of the workings of their algorithms, which curate what users see online by recommending content. Independent researchers will also be allowed to monitor compliance with the act.

The large marketplace platforms such as Amazon, which are not producers of social media content, will still have to run risk assessments and publish independent audits.

Sharing the workings of algorithms is a big step for tech firms. Recommendation algorithms are a “black box” where the criteria for prioritising content can be opaque. One reason companies guard the details is to protect themselves from hackers, spammers and hostile actors.
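
For readers who want a concrete picture of what those “workings” might look like, below is a minimal, purely illustrative Python sketch – the weights, feature names and boost values are all invented for this example and reflect no real platform’s code – of how a recommendation system can combine weighted criteria into a single ranking score.

```python
# Purely illustrative: a toy content-ranking function with invented
# criteria and weights, sketching the kind of algorithmic "workings"
# the DSA asks platforms to disclose. It reflects no real platform.
from dataclasses import dataclass

@dataclass
class Post:
    predicted_engagement: float  # estimated chance of a like or share
    hours_since_posted: float    # how old the post is
    from_followed_account: bool  # does the viewer follow the author?

def score(post: Post) -> float:
    """Combine weighted criteria into a single ranking score."""
    s = 3.0 * post.predicted_engagement         # engagement weighted heavily
    s += 1.0 / (1.0 + post.hours_since_posted)  # fresher posts score higher
    if post.from_followed_account:
        s += 0.5                                # small boost for followed accounts
    return s

# Rank a toy two-post feed, highest score first
feed = [Post(0.9, 2.0, False), Post(0.4, 0.5, True)]
for post in sorted(feed, key=score, reverse=True):
    print(round(score(post), 2), post)
```

In a disclosure under the act, it is the choice and weighting of criteria like these – rather than the code itself – that regulators and researchers would be expected to scrutinise.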

How will it protect children?

Social media platforms will be banned from building profiles of child users for companies to target them with ads. Those platforms that can be reached by minors – in effect, most social media platforms – must put in place measures to protect their privacy and keep them safe. The major platforms must also carry out risk assessments of content that harms children and take measures to prevent that content reaching under-18s. Separate proposed EU legislation is intended to cover the removal of online child sexual abuse material.

What about small platforms?

Smaller platforms – as well as large ones, of course – must give users a right to complain about removal of their content, plus the further option of an out-of-court appeals process if they do not like how the complaint was handled. However, users who repeatedly post illegal content must be suspended.

They must also be transparent about ads and algorithms. Under the DSA, platforms must give users information about why an advert was shown to them. They must also give details of the algorithms used to guide a user’s online experience.
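
As an illustration only – the DSA prescribes what must be disclosed to users, not how platforms store it – an ad-transparency notice of this kind might be represented by a simple structure such as the following Python sketch, in which every field name and value is invented.

```python
# Illustrative only: one possible shape for a "why you are seeing
# this ad" notice. All field names and values are invented; the DSA
# mandates the disclosure itself, not any particular data format.
ad_explanation = {
    "advertiser": "Example Shoe Co",   # who paid for the ad
    "targeting_criteria": [            # why this user was selected
        "location: Germany",
        "interest: running",
    ],
    "uses_profiling": True,            # whether profiling was involved
}

for criterion in ad_explanation["targeting_criteria"]:
    print(f"You are seeing this ad because of: {criterion}")
```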


All online marketplaces will have to keep an eye on the traders using their platforms, making sure they are traceable in order to ward off rogue sellers. They will need procedures in place for removing unsafe or counterfeit goods, including random checks on whether goods have been deemed illegal. Public authorities will also be able to order the removal of unsafe products.

Cloud hosting businesses, which store other people’s content, will need a mechanism for users and third parties to flag potentially illegal content, and for the complainant to be notified of the action taken. They must also inform the relevant authorities if they notice content that may be criminal or indicate a criminal offence has occurred, although this applies only to threats to life or safety.

Church says: “The main content regulation obligations fall on the very largest online platforms. Smaller operators will be subject to much lighter regulation, though they will have to issue and enforce terms of use, and will have to operate a notice and takedown procedure for illegal content.”
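
As a rough sketch of what such a notice and takedown procedure involves – the act sets out obligations rather than implementations, and every name in this Python example is hypothetical – the flow reduces to flagging, deciding and notifying.

```python
# Hypothetical sketch of a notice-and-action flow of the kind the DSA
# requires: a user flags content, the host records a decision, and the
# complainant is told what happened. All names here are invented.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Notice:
    content_id: str
    reason: str
    decision: Optional[str] = None  # filled in after review

class NoticeAndTakedown:
    def __init__(self) -> None:
        self.notices: list[Notice] = []

    def flag(self, content_id: str, reason: str) -> Notice:
        """A user or third party flags potentially illegal content."""
        notice = Notice(content_id, reason)
        self.notices.append(notice)
        return notice

    def decide(self, notice: Notice, remove: bool) -> str:
        """Record the decision and build the complainant's notification."""
        notice.decision = "removed" if remove else "kept up"
        return f"Your notice about {notice.content_id} was reviewed: content {notice.decision}."

queue = NoticeAndTakedown()
notice = queue.flag("post-123", "suspected counterfeit listing")
print(queue.decide(notice, remove=True))
```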

How is it different from the UK’s online safety bill?

The DSA does not define illegal content – that is either defined at EU member-state level or via other legislation covering terrorist content.

The UK bill, which is still going through parliament, creates new offences such as cyberflashing and encouraging self-harm, and carries a list of illegal content that users must not encounter. It also requires pornography websites to carry out age checks, a layer of detail that is not reflected in the DSA.

However, both require major platforms to carry out risk assessments of harmful content appearing on their platforms and to explain how they will mitigate those risks. Both pieces of legislation aim to ensure that online platforms have the right structures in place to head off, and detect, harm. The online safety bill carries similar punishments, although the maximum fine is higher under the UK legislation, at 10% of global turnover.

A related piece of legislation, the Digital Markets Act, is also being introduced by the EU with the aim of preventing anti-competitive behaviour by powerful tech firms.

Elon Musk has been told by the EU Commission that Twitter must try significantly harder to ‘pass the grade’ under the DSA. Photograph: Jim Watson/AFP/Getty Images

Could Twitter be banned under the DSA?

Last month, Breton fired a shot across the bows of Twitter and its preparations for the DSA. He said Twitter would have to significantly increase efforts to “pass the grade”, implying the platform was in danger of non-compliance with the act. That could mean fines and ultimately being banned from the EU.

The threat has been made against the backdrop of upheaval at Twitter since it was taken over by Elon Musk, including the firing of thousands of staff and the reinstatement of previously banned accounts.

According to a spokesperson for the European Commission, the act also contains provisions that deal with Friday’s journalist account suspensions, including a requirement that when users and content are penalised it must be in a “diligent and proportionate manner, with due regard to fundamental rights”.

But the question also depends on whether Twitter is classified as a VLOP. Being a VLOP carries the biggest risk of being deemed in breach of the act because of the regulatory demands it places on qualifying platforms. Either way, it is possible that the EU will hit Twitter, and other big tech platforms, with preliminary investigations or requests for information once the act goes live.
