Decoding the DSA: A Closer Look at New Regulations for Tech Giants and How They Translate for Agencies
Since August 25th, 19 of the world’s largest social media companies, e-commerce platforms and search engines have been required to comply with the Digital Services Act (DSA) — or else face sweeping fines of up to 6 per cent of their global annual revenue.
What do the DSA provisions entail for very large online platforms and search engines?
1. Remove illegal content
Companies like Facebook and TikTok will have to “expeditiously” remove illegal content — as defined by European and national EU laws — when notified by national authorities or individuals. Online platforms will need to offer clear and easy-to-use mechanisms for users to flag content they believe is illegal.
Platforms will have to suspend users who repeatedly post illegal content, but only after giving them a warning.
Online marketplaces like Amazon and AliExpress will have to make their “best efforts” to vet their online traders in a bid to stamp out illegal products — everything from fake luxury shoes to dangerous toys. If they become aware that consumers have bought an illegal product, they will have to warn them or, failing that, make the information public on their website.
2. Keep a lid on harmful content like disinformation and bullying
In an unprecedented measure, online platforms and search engines will have to hand the Commission a detailed annual report on the so-called systemic risks they pose to Europeans.
Companies from Snapchat to X will have to determine how the cogs in their systems — like the algorithms recommending content and ads to people — potentially contribute to the spread of illegal content and disinformation campaigns. They will also have to assess whether their platforms open the door to cyber violence, undermine fundamental rights like freedom of expression, or adversely affect people’s mental health.
They will then have to implement measures to limit the risks they have identified. These could include adjusting their algorithms, creating tools that let parents control what their children see and verify users’ ages, or labelling content such as photos or videos generated by artificial intelligence tools.
Companies will be scrutinised by the Commission, vetted researchers and auditing firms. The auditors will specifically review the risk assessments and mitigation measures, either approving the companies’ work or making further recommendations.
3. Give power to their users
Very large online platforms and search engines will need to have easily understandable terms and conditions — and apply them in a “diligent, objective and proportionate manner.”
Companies must inform users if they remove their content, limit its visibility or stop its monetisation, and tell them why. Platforms, including Elon Musk’s X, will also need to warn users and explain any suspension (as in the case of journalists temporarily banned from Twitter). Users will be empowered to challenge the platforms’ decisions with the company itself, before out-of-court dispute settlement bodies and, finally, in court.
Tech companies must explain the parameters behind their algorithms’ content recommendations and offer at least one algorithmic option that doesn’t recommend content based on people’s personal data.
4. End targeted ads to minors and ads based on sensitive personal data
Platforms will be banned from targeting people with online ads based on sensitive personal data, including their religion, sexual orientation, health information and political beliefs. They also won’t be allowed to use children’s and teenagers’ personal data to show them targeted ads.
So-called dark patterns — manipulative designs nudging people into agreeing to something they don’t want, like consenting to be tracked online — will also be outlawed.
5. Reveal closely guarded information about how they operate
Every six months, platforms will have to open up and provide long-guarded information, including details about the staff moderating their content — such as team size, expertise and the European languages they speak.
They must disclose their use of artificial intelligence to remove illegal content, along with its error rate. They will also have to make public their risk assessment reports and the auditors’ reports on how they have limited serious risks to society, including threats to freedom of speech, public health and elections. In addition, they will need to maintain a repository of information about the ads that have run on their platforms.
Regulators can access the companies’ data and algorithms, inspect their offices, and request sensitive business documents. Vetted researchers will also be empowered to access platforms’ internal data for specific projects.
What will these changes entail for agencies?
“The implementation of the DSA provisions does not directly impact agencies.
However, agencies will be indirectly affected by the consequences of restricting the possibilities for targeting users based on specific personal data. In other words, it will be very interesting for agencies to observe how the DSA provisions affect the effectiveness of digital advertising campaigns and, in cooperation with their clients, to develop strategies for optimising ad campaigns based on the early learnings from these newly applicable provisions.
On this point, it will be paramount to understand how alternative targeting technologies will compensate for the ban on processing specific types of personal data, and to what extent the granularity of customer and prospect data will be affected. Given the high level of innovation by platforms and ad tech companies in developing alternative, more privacy-friendly targeting technologies, there is reason to be optimistic about the future and to eventually better reconcile privacy concerns with targeted advertising without compromising the effectiveness of ad delivery.”
Alexis Bley, Public Affairs Manager @ EACA
Find out more about the major platforms’ updated DSA policies below:
Legal notice: The information presented herein is for informational purposes only and does not constitute legal advice. EACA can in no way be held liable for any type of re-use of this publication.
©EACA 2023