Tech companies face the prospect of substantial fines or a European Union-wide ban under new rules aimed at tackling manipulative practices and harmful content
Groundbreaking regulations compelling 19 online giants, including Facebook, X, Google, and TikTok, to enhance their content moderation within the EU are set to be enforced on August 25. What does the legislation entail, and how will regulators ensure compliance?
Digital Services Act (DSA)
The DSA is a landmark piece of legislation that applies to any digital business serving users in the EU, making such companies legally responsible for a spectrum of issues, from misinformation and the manipulation of consumers to Russian propaganda and criminal activity, including child abuse.
Applicable to both large and small operators, the regulations are tiered, with the most stringent obligations imposed on 17 companies designated as “very large online platforms,” including Facebook and Amazon, and two “very large online search engines,” Google and Bing.
Companies that fail to comply face penalties including fines of up to 6% of global annual turnover, potentially amounting to hundreds of millions of euros, and, for the most serious breaches, a ban from operating across the European Union.
So what are the new rules?
They revolve around five central themes, from the sale of illegal products to online disinformation and other societal risks.
1 Illegal products
Platforms will be required to combat the sale of illegal products and services, a measure that affects marketplaces such as Amazon and Facebook Marketplace.
2 Illegal content
The new measures aim to address illegal content, encompassing Russian propaganda, election interference, hate crimes, and online harms such as harassment and child abuse. They are intended to ensure the protection of fundamental rights recognized by law across Europe, including freedom of expression and data protection.
3 Protection of children
For parents, who cannot monitor every aspect of their child’s online experiences, this set of rules is likely the most crucial.
Platforms will be barred from targeting children with personalized advertising based on their personal data or cookies. Major social media companies must overhaul their systems to ensure a “high level of privacy, security, and safety for minors,” and they must demonstrate this to the European Commission.
Additionally, platforms will need to redesign their content recommendation systems to minimize risks to children. They are also required to conduct a risk assessment regarding the potential negative impacts on children’s mental health and submit it to the commission by August.
Last year, the world’s largest social media companies faced accusations of “monetizing misery” after an inquest found that harmful online content played a role in the death of the British 14-year-old Molly Russell.
4 Racial and gender diversity
Social media firms will be prohibited from using sensitive personal information, such as race, gender, and religion, to target advertising at users.
5 Ban on “dark patterns”
For consumers, this offers protection against the deceptive interface designs commonly used to pressure users into buying things they do not need or want.
An examination of 399 online stores conducted by the commission and national consumer authorities this year discovered that 40% relied on “manipulative practices to exploit consumers’ vulnerabilities or trick them.”
The EU justice commissioner disclosed that 42 sites employed fake countdown timers featuring fabricated deadlines for purchases, while 70 sites “concealed” crucial information such as delivery costs or the availability of a more affordable option.
According to the audit, 23 sites obscured information with the intention of “manipulating consumers into entering a subscription.”
Under the new regulations, these practices should be eliminated.
Anything else?
Consumers will be able to report violations and can expect further changes: tech companies are prohibited from giving preferential treatment to their own services over rivals’, and must allow users to uninstall pre-loaded software and apps without obstruction.
What are the big tech companies doing about the new law?
“For very large online platforms, the societal implications are more significant, and the regulations encompass additional measures to evaluate and mitigate societal risks associated with their advertising systems, particularly those related to disinformation,” a spokesperson for the commission said.
The commission’s voluntary code of practice, seen as a way for companies to prepare their internal systems for the new regulatory framework, has already won the backing of 44 major tech companies.