The EU’s latest crackdown on big tech begins before the end of the week. Starting on Friday, 19 major online services must adhere to the sweeping rules of the Digital Services Act (DSA).
Essentially, the DSA is a landmark content moderation rulebook, designed to empower and protect users online against harmful or illegal content, disinformation, and the violation of privacy and free speech.
The services listed are not only the first required to comply, but also the ones facing the act’s strictest and most far-reaching measures. That’s because they each reach at least 45 million active European users per month, which, according to the EU, translates to a “significant societal and economic impact.”
The legislation will eventually apply to all businesses providing digital services within the bloc, and is expected to come fully into force in February 2024. Violations could result in fines of up to 6% of a company’s global revenue, or even a temporary ban from operating in the union.
“The whole logic of our rules is to ensure that technology serves people and the societies that we live in — not the other way around,” said Margrethe Vestager, Executive Vice-President of the European Commission.
“The Digital Services Act will bring about meaningful transparency and accountability of platforms and search engines and give consumers more control over their online life.”
Who’s on the naughty list?
Ranging from social media platforms to online marketplaces and search engines, the list so far includes: Facebook, TikTok, X (formerly Twitter), YouTube, Instagram, LinkedIn, Pinterest, Snapchat, Amazon, Booking, AliExpress, Zalando, Google Shopping, Wikipedia, Google Maps, Google and Apple’s mobile app stores, Google’s Search, and Microsoft’s Bing.
5 key DSA obligations big tech has to follow
1. Remove illegal content
The designated companies are required to identify and remove from their platforms any content that is illegal under EU or national law.
In the case of online marketplaces, this also means tracing sellers and conducting random checks on existing product databases to ensure protection against counterfeit and dangerous goods or services.
2. Ban some types of targeted ads
The big tech giants can no longer use targeted advertising based on the profiling of minors, or on sensitive personal data such as ethnicity, sexual orientation, or political views.
3. Increase user empowerment
Users will gain a set of new rights, such as flagging illegal content, contesting decisions made by online platforms if their own content is removed, and even seeking compensation for rule breaches. They’ll also be able to receive information about a platform’s advertising practices — including whether and why an ad targets them specifically — with the option to opt out.
4. Constrain harmful content and disinformation
The selected companies will further have to perform an annual risk assessment and take corresponding measures to mitigate disinformation, election manipulation, hoaxes, cyber violence, and harm to vulnerable groups — while balancing freedom of expression. These measures are also subject to independent audits.
5. Be transparent
In an unprecedented move, the platforms will need to disclose long-guarded information on their data, systems, and algorithms to authorities and vetted researchers. They’ll also have to provide public access to their risk assessment and auditing reports alongside a repository with information about the ads they run.
“Complying with the DSA is not a punishment – it is an opportunity for these online platforms to reinforce their brand value and reputation as a trustworthy site,” Commissioner Thierry Breton said in a statement.
Who has complied so far?
Among the social media platforms, TikTok is introducing an “additional reporting option” for European consumers that allows them to flag illegal content, including advertising. It will further provide them with information about its content moderation decisions and allow them to turn off personalisation. Targeted advertising for minors aged 13-17 will stop.
Snapchat has made similar changes. For instance, personalised advertising for minors is no longer allowed, and adult users have a higher level of transparency and control over the ads they see. Meanwhile, Meta has launched non-personalised content feeds on Facebook and Instagram.
Among the online marketplaces, Zalando has introduced content flagging systems on its website, while Amazon has opened a channel for flagging illegal products and is now providing more information about third-party merchants.
Nevertheless, both companies have taken legal action against the EU, claiming they have been “unfairly” added to the list.
The DSA’s potential impact
Historically, the rules for data sharing and online content moderation have been determined by big tech. The DSA aims to change that by setting an unprecedented benchmark, much like the EU’s regulatory efforts with the GDPR and the upcoming AI Act.
“The European Digital Services Act is trying to respond to online corporate practices that are considered inappropriate by the European Union,” David Frautschy Heredia, Senior Director of European Government and Regulatory Affairs at Internet Society (ISOC) told TNW.
“The impact of the act is being closely watched. By nature, corporate organisations operate across jurisdictions, and so their potentially damaging behaviour is not limited to a single region. Moreover, the EU has come to be widely regarded as the benchmark authority for digital regulation and as the example to follow.”
But as parts of the act and its implementation are still to be defined, experts are also pointing to potential risks.
“It is of crucial importance to ensure that these new obligations do not have unintended consequences, or they may be inadvertently mirrored across the globe,” Frautschy Heredia noted, adding that misaligned policy could lead to the “fragmentation” of the internet.
Meanwhile, Mozilla and 66 civil society organisations from across the globe are urging the Commission to ensure that the DSA will not lead to censorship and the violation of fundamental rights.