The DSA Newsletter #2

Here is a roundup of useful resources on the DSA from November and December 2022.

Verfassungsblog symposium

The symposium that I mentioned last time led to a wonderful collection of essays by scholars working on the DSA. I definitely recommend all the pieces, especially Folkert Wilman’s piece on where the DSA liability chapter fits into the existing case-law, Daphne Keller’s on the likely global impact, Nicolo Zingales’ on the shift in the regulatory approach, Alessandro Mantelero’s on human rights assessments, and Julian Jaursch’s on how to build a good national Digital Services Coordinator.

DSA meets DMA: Apple to weaken its grip on app moderation

Apple is rumored to be preparing to allow third-party app stores on the iPhone in anticipation of the DMA. According to this article on The Verge, “Apple may still keep some hands on the reins. The company is apparently considering “mandating certain security requirements,” verifying outside apps in some way (..)”. This is a very interesting example of how the DSA and DMA interact. Here is what I write about it in a draft chapter of the upcoming book, Principles of the DSA:

The example of app stores is quite illustrative of the ordoliberal thinking behind the DSA and DMA. App stores are an important lever for content moderation. By imposing basic rules on their apps, they de facto meta-regulate content moderation practices on other services. Some apps that tried to offer unmoderated experiences, such as Parler, a social media service, were temporarily removed from the app store due to a failure to take “adequate measures to address the proliferation” of threats of violence and illegal activity. Because Apple is an important gatekeeper, Parler could not afford to ignore its requests. Softening the economic grip of big app stores on the distribution of apps, one of the goals of the DMA, weakens this power to meta-regulate. This means that individual app providers regain some of the control from Apple. And this is where the DSA standards become key. The DSA introduces minimum content moderation rules for apps like Parler. If they fail to live up to them, action can be taken by public and private enforcers. Instead of praising Apple and relying on its economic power over the ecosystem to enforce the rules of diligence, the DMA tries to weaken it, and the DSA to shift it back to the state (and people). The basic insight is that while economic power can be convenient, as it supplements the state, too much of it is corrosive in the long run. Thus, the state sometimes re-asserts itself and takes back some of the responsibilities for the private ecosystem in order to protect its own people.


eBay says it is not a VLOP!

eBay published an update on its website in which it claims that it will not be a VLOP – at least for the time being. The company did not publish the numbers, but says that it reached this conclusion after examining its monthly active users, excluding bots, scrapers, and repeated visits by the same persons. The company’s verdict: “during [May to November 22] the average number of monthly active recipients of our marketplace in the EU was well below the 45 million user threshold for being designated as a VLOP.” Yikes!

Just as a reminder, online platforms and search engines are asked to report their average monthly users on the basis of Article 24(2) by 17 February 2023. After that, the European Commission will designate the VLOPs and VLOSEs, and the show can begin. While the VLOSEs are fairly clear (Google and [maybe] Bing), the beauty contest for VLOPs is still quite open due to the unsettled methodology. Some European champions (read: porn sites) might even be reluctant to come forward with their numbers.
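To make the counting exercise a bit more concrete, here is a minimal sketch of how a provider might compute its average monthly active recipients in the spirit of eBay’s description: bots and scrapers excluded, and each person counted once per month regardless of repeat visits. The event format, field names, and bot flag are my own assumptions for the illustration; the only fixed number is the DSA’s 45 million designation threshold.

```python
# Toy illustration only: average monthly active recipients, with bot traffic
# removed and each person counted once per month despite repeat visits.
# The event format and the is_bot flag are hypothetical; 45 million is the
# DSA's VLOP designation threshold.
from collections import defaultdict

VLOP_THRESHOLD = 45_000_000

def average_monthly_active_recipients(events):
    """events: iterable of (month, user_id, is_bot) tuples."""
    unique_per_month = defaultdict(set)
    for month, user_id, is_bot in events:
        if is_bot:
            continue  # exclude bots and scrapers
        unique_per_month[month].add(user_id)  # repeat visits count once
    if not unique_per_month:
        return 0
    return sum(len(u) for u in unique_per_month.values()) / len(unique_per_month)

sample = [
    ("2022-05", "alice", False), ("2022-05", "alice", False),  # repeat visit
    ("2022-05", "crawler-7", True),                            # excluded
    ("2022-06", "bob", False),
]
avg = average_monthly_active_recipients(sample)
print(avg, "below the threshold" if avg < VLOP_THRESHOLD else "VLOP territory")
```

The real difficulty, of course, lies in everything this sketch assumes away: what counts as “active”, how to recognise the “same person” across devices, and how to treat logged-out visitors, which is precisely why the methodology question is not yet settled.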

Some new scholarly work

Besides my and Irene’s DSA Primer, there is now another one, written by a leading expert, Folkert Wilman, from the Legal Service of the European Commission. Readers might know Folkert’s book and his numerous articles.

João Pedro Quintais, Naomi Appelman, and Ronan Ó Fathaigh (all from the University of Amsterdam) published an article on terms and conditions and fundamental rights (aka Article 14 DSA). Their main conclusion is that Article 14 will enable the indirect horizontal effect of fundamental rights in the relationship between online platforms and their users, but that the provision must be operationalized within the framework of international and European fundamental rights standards to be useful.

Heleen Janssen, Michelle Seng Ah Lee, and Jatinder Singh (all from the University of Cambridge) published an interesting piece on fundamental rights assessments. They propose a practical, four-phased framework to assist organisations with performing fundamental rights impact assessments. Their discussion is highly relevant for the DSA risk assessments (Articles 34-35 DSA).

Alessandro Mantelero (Polytechnic University of Turin) recently published his book, Beyond Data. It is now freely available for download from Springer’s website. Alessandro’s book highlights the limitations of data protection in dealing with the challenges of AI and proposes an integrated approach to risk assessment that focuses on human rights and encompasses contextual social and ethical values.

Three economists, Yi Liu, T. Pinar Yildirim, and Z. John Zhang (all from the University of Pennsylvania), published an interesting economic theory paper about content moderation. They model different revenue models for platforms to find out what the optimal moderation strategies for firms are. They show that a platform under an advertising revenue model is more likely to moderate its content than one under a subscription model, but does so less aggressively. This is because a platform under advertising is more concerned about expanding its user base, while a platform under subscription is also concerned with users’ willingness to pay.

The Mastodon question

Obviously, everyone on Twitter, not just me, was recently interested in how the DSA applies to Mastodon. For those who want to learn about content moderation in the fediverse, I highly recommend this article by Alan Z. Rozenshtein (University of Minnesota). The most obvious DSA questions are: a) is Mastodon considered to be a single service, or only a protocol connecting multiple instances that are each separate services, and b) are the instances regulated as online platforms?

For a), given that the protocol is open and Mastodon does not control who can create an instance, I think the answer is that only the instances (servers) are regulated as services. For b), they can be online platforms, but keep in mind that the DSA also has a size threshold that excludes micro and small enterprises (Article 19 DSA). Thus, for many instances, the likely obligations come only from being a hosting service. These are mostly manageable common-sense obligations (points of contact, etc.) and those that any instance would fulfil anyway to avoid its own liability (Article 16). The biggest pain can be the obligation to issue statements of reasons (Article 17) and to notify certain suspected crimes (Article 18). Mastodon instances do not have to worry about annual reports as long as they are micro or small enterprises (Article 15(2) excludes them). In practice, this means that unless the instances make serious money (EUR 10 million) or hire sizeable staff (50 employees), they do not have to do much unusual, as long as they can implement the two obligations above. Given the overall popularity of Mastodon, the worry that any single instance might become a VLOP with 45 million monthly active users seems far-fetched. Oh, and don’t even ask me how to count the users for a particular instance – I don’t think anyone knows.
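For what it’s worth, here is a minimal sketch of that size-based logic, purely illustrative of my reading above and certainly not legal advice. The thresholds are the ones mentioned in the paragraph (EUR 10 million turnover, 50 employees, 45 million monthly active EU users); the function and field names are invented for the example.

```python
# Illustrative only: which DSA obligations an instance operator likely faces,
# following the reading above. Thresholds and article numbers as discussed in
# the text; the function and field names are hypothetical.
from dataclasses import dataclass

@dataclass
class Instance:
    annual_turnover_eur: float    # yearly turnover of the operator
    employees: int                # headcount of the operator
    eu_monthly_active_users: int  # average monthly active recipients in the EU

def likely_dsa_obligations(i: Instance) -> list[str]:
    # Baseline hosting-service duties that apply regardless of size
    obligations = [
        "points of contact",
        "notice-and-action mechanism (Art. 16)",
        "statements of reasons (Art. 17)",
        "notification of suspected crimes (Art. 18)",
    ]
    micro_or_small = i.annual_turnover_eur <= 10_000_000 and i.employees < 50
    if not micro_or_small:
        obligations.append("transparency reports (Art. 15)")
        obligations.append("online platform duties (cf. Art. 19)")
    if i.eu_monthly_active_users >= 45_000_000:
        obligations.append("VLOP duties, incl. risk assessments (Arts. 34-35)")
    return obligations

# A typical hobby instance: two admins, no revenue, a few thousand users
print(likely_dsa_obligations(Instance(0.0, 2, 12_000)))
```

For a typical hobby instance, the output stays at the four baseline hosting duties, which is the practical point of the paragraph above.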

One interesting question concerns what to do about instance-level moderation. If I understand the system correctly, since Mastodon works pretty much like an advanced public emailing system, one instance can block or moderate other instances, and thereby all the users on those instances (basically, as if a filter on Gmail would flag or block all emails [here, short messages] from a particular domain name). I need more time and reading to think this through, but I welcome any thoughts from you.
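To make the email analogy slightly more tangible, here is a rough sketch of what such a server-wide block amounts to. All names and data shapes are made up, and this obviously compresses away everything interesting about how defederation actually works in practice.

```python
# A rough illustration of the "filter on a whole domain" comparison above:
# one admin-level blocklist silently removes everything from a blocked
# instance for every local user at once. All names here are invented.

BLOCKED_INSTANCES = {"spam.example"}  # set by the instance admin, not by users

def visible_posts(timeline):
    """timeline: iterable of (author_handle, text) pairs, e.g. ('alice@mastodon.social', 'hi')."""
    for author, text in timeline:
        domain = author.rsplit("@", 1)[-1]
        if domain in BLOCKED_INSTANCES:
            continue  # disappears for all local users, not just one
        yield (author, text)

posts = [
    ("alice@mastodon.social", "hello"),
    ("mallory@spam.example", "buy now"),
]
print(list(visible_posts(posts)))  # only alice's post survives
```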

In the meantime, I highly recommend this episode of Arbiters of Truth by Kate Klonick (St. John’s University) and Alan Z. Rozenshtein (University of Minnesota). Besides explaining the problems well, it also draws interesting analogies between content moderation and Elinor Ostrom’s work on the commons.

DSA’s structures as an incentive to join forces

In my Verfassungsblog piece, Will the DSA Work?, I make the point that the DSA can only work if we invest money and effort. I argue that the crucial piece of the DSA is the set of societal structures that the law can only foresee and incentivize but cannot build; only people can.

These structures, such as local organisations analysing threats, consumer groups helping content creators, and communities of researchers, are the only ones that can give life to the DSA’s tools. They need to be built bottom-up, and sometimes locally in each Member State. In the course of my writing, I put together a table of the different requirements for the DSA players: representatives helping with the DSA processes, such as content moderation (Article 68), trusted flaggers (Article 22), collective redress bodies for consumers (Article 90) and businesses (P2B Regulation), and out-of-court settlement bodies (Article 21). The important takeaway from the table is that these structures can sometimes be embedded in the same organisations. They are not mutually exclusive. In fact, the only incompatible matches are (a) between the out-of-court settlement bodies and the other bodies, and (b) between business and consumer collective redress bodies. All other bodies are mutually compatible. Trusted flaggers can also represent users in the DSA processes and can seek collective redress in courts. Here is what I write in the draft of the book, Principles of the Digital Services Act (thinking about the concept of Trusted Content Creators and the current Twitter controversies showing the need for them):

One of us has argued that the DSA should act as an invitation to online communities to start self-organising into Trusted Content Creators. The above structures offer tools that these communities can seize and use when representing their interests. For instance, journalists can create communities that help them push back against account restrictions and content removals when it concerns their profession. The same organisations can represent journalists professionally in content disputes with providers, before out-of-court settlement bodies, and, if necessary, seek redress before courts. Such an organisation could also seize the opportunity to qualify as a trusted flagger and protect its members from hate speech and other illegal behaviour against journalists. Finally, once the organisation exists, it could start negotiating with the platforms about how to further protect journalists on their services. It could join the conversations about risk assessments and risk mitigation and offer to self-police the quality of its members’ content in exchange for stronger privileges, such as more protection against account suspensions. These arrangements could be formalised through the DSA’s official Codes of Conduct, which would make their content subject to the Commission’s own monitoring.

LSE Course on the DSA still has some seats

My course on the Digital Services Act starts in February. Do not miss the chance to join if you can. See more details here.

That’s it. Have a lovely Christmas break.