
A Digital Service? The EU's response to Big Tech - Darren Trisno

Trolls. Misinformation campaigns. Exponential growth of child sexual abuse material. Manipulation by bot farms. The sale of illegal goods. Hate speech. Curtailed freedom of expression. The dangers of digital spaces are well-known. What was once a revolutionary invention with limitless potential to connect the world and make accurate information available to all is now seen as a threat to fundamental rights and even to democracy itself.


The Digital Services Act (DSA), which becomes fully applicable next year, is the EU's response. The aim? To create a safer digital space where users' rights are protected, to establish an accountability framework for online platforms, and to promote innovation and growth within the single market.


In this article, I analyse the application of the DSA and the key obligations it imposes on providers, and highlight the role of law firms. I conclude by considering whether the DSA will achieve its objectives.


The DSA


The obligations that the DSA imposes on what the Act calls online intermediary services differ according to their role, size and impact. There are three sub-types: hosting services, online platforms, and very large online platforms or search engines. For example, whereas all three have obligations to report criminal offences, only very large online platforms and search engines must share data with authorities and researchers, follow codes of conduct, and carry out an annual assessment to identify systemic risks. The European Commission will also gain new supervisory powers over their activities and be empowered by a new “crisis mechanism” to intervene in times of threats to public security or public health.


Besides these, the DSA contains a wide range of further measures. For example, it introduces a mechanism for users to easily flag illegal content and for platforms to cooperate with ‘trusted flaggers'. To help counter the sale of illegal goods, there are also new obligations on the traceability of business users. Transparency requirements are a key theme, particularly concerning the algorithms used for recommendations. Finally, all online platforms are banned from targeting adverts at children, and from targeting based on special characteristics of users, such as race, sexual orientation or political views.


As for content moderation, the DSA stipulates a process that must be followed for each item removed. Platforms must notify the affected user, specifically stating the reason for the removal (Article 15). Users who receive these notices can contest the decisions, either seeking another review by the platform (Article 17) or bringing the dispute to new out-of-court adjudication bodies at the platform's expense, whatever the outcome (Article 18). Platforms must report these decisions to the Commission for inclusion in a new database (Article 15), and publicise the data in transparency reports (Article 13). Finally, very large platforms must also maintain records for inspection by auditors, regulators, and researchers (Articles 28 and 31).


Needless to say, law firms will be vital in helping their clients to comply with these varied and onerous obligations. The costs of non-compliance are high: the Act empowers authorities to impose fines of up to 6% of annual worldwide turnover, or even bans on operating within the EU, for the most serious breaches. Companies, then, will lean on firms for clear guidance on which category they belong in and which processes they must put in place.


Will it work?


The DSA has been extensively praised. The Washington Post expects the Act to influence policy in the US, whilst Mathias Vermeulen, a co-founder and policy director at the data rights agency AWO, sees it as setting the 'gold standard for regulating online platforms for any regulator in the world'. Further, Shoshana Zuboff, professor emerita at Harvard Business School, praised the DSA's emphasis on 'information integrity', labelling it 'the beginning of a multi-stage democratic resurgence' which 'dismantles the narrative of tech inevitability and invincibility'.


Others are less convinced. Methodologically, the Act's focus on individual content decisions has its detractors. Given the scale and speed of online speech, there is debate as to whether content moderation should be seen as simply the aggregation of many individual adjudications. Critics instead advocate a more dynamic and continuous system of oversight.


The Act has also been criticised as excessively rigid and expensive, and as over-burdening smaller platforms. SMEs are not exempted from all of the obligations by default; they must apply for a waiver and prove that they do not represent a 'significant systemic risk'. Further, as one Pinsent Masons lawyer puts it, 'even medium-size platforms and interfaces will face unprecedented requirements on the processes they must put in place and the information they will need to report'. For many, this is a disproportionate burden.


Perhaps the most convincing criticism that can be levelled at the DSA is that it may hamper competition and growth. The regulatory divergence between online platforms and very large ones ties much tougher regulation and higher costs to expansion, which is hardly an incentive for large platforms to challenge very large ones. In this way, the first two of the EU's aims for the Act may clash with the third: the new obligations and accountability framework may themselves become obstacles to competition and growth. Of course, the Digital Markets Act, part of the EU's broader package to curb the power of Big Tech, contains a number of pro-competition measures, such as preventing so-called gatekeepers from favouring their own services. Still, instead of reducing the power of very large platforms, it is just possible that the DSA will serve to entrench their hegemony.
