The DSA Newsletter #4
A lot has happened since July 2023. Here is your (ir)regular summary of DSA news.
Biggest News
The European Commission just designated three porn sites as additional VLOPs: Stripchat, Xvideos and Pornhub. This move is hardly surprising, given that the porn industry’s numbers did not add up from the very start. Civil society advocated for this move several weeks before the disclosures. The Commission also finally published its first batch of designation decisions (previously, we only had the names, not the text of the actual decisions).
The biggest news is undoubtedly the opening of the investigation into X’s DSA compliance. The reasoning behind the suspicions is not public (and is unlikely to be disclosed while the investigation is pending), but the press release shows that the Commission suspects that X might be violating the following provisions: Article 16(5/6) [submission interfaces & handling of illegal content notices], Article 25(1) [potential misleading character of blue checkmarks], Article 39 [the failure to implement proper ad archives], Article 40(12) [failing to allow accessibility of public data for research through APIs or scraping], Articles 34(1/2) and 35 [risk management]. While Breton’s rhetoric in his infamous Palestine/Israel letters to TikTok, X, Facebook, and YouTube focused a lot on “disinformation”, the scope of this investigation is much more sensible (here is the European civil society’s justified response to those letters). The investigation goes after the obvious failures of X to live up to DSA standards.
The Commission will now collect evidence to build a potential infringement case. Its case for most of the claims is rather strong. My only doubt concerns the alleged violation of Article 35 DSA – insufficient risk mitigation measures. While Article 34 is a more procedural rule that can easily be violated if other parts of the DSA are infringed, the substantive obligations under Article 35 are different. If the claim here is self-standing (something extra), the Commission will need substantial evidence to build a case. The Commission’s press release describes the potential problem as follows:
The effectiveness of measures taken to combat information manipulation on the platform, notably the effectiveness of X’s so-called ‘Community Notes’ system in the EU and the effectiveness of related policies mitigating risks to civic discourse and electoral processes.
The key question will be what evidence establishes such (in)effectiveness. But then, this is only the opening of an investigation. I think it is very important that the Commission act swiftly. This is how it builds its reputation, and it will undoubtedly incentivise companies to comply better with the DSA. In my forthcoming book, I draw on the work of Axelrod, Scholz, Ayres and Braithwaite to argue that the Commission needs to act quickly and decisively whenever it sees clear violations. By practising a tit-for-tat strategy, it can achieve long-term reciprocity, and thus cooperation.
The Commission also adopted its delegated regulation on supervisory fees (Commission Delegated Regulation (EU) 2023/1127 of 2 March 2023) and has already notified companies of their bills. The companies can still seek judicial review. I have not found any public information about which company pays how much. However, loss-making companies (you can guess who they are) and non-profits like Wikipedia will pay zero in fees.
Finally, the Commission has just published a useful summary of its data access consultation, expertly prepared by Paddy Leerssen (see here). It has tons of useful recommendations for the upcoming delegated act on data access.
Big Tech Transparency
Online platforms issued another round of user-number disclosures in August 2023. I decided to archive some of them because some companies seem to be changing these public disclosures ex post. You can find the February 2023 disclosures here, and the August 2023 disclosures here.
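Since these figures can quietly change, keeping your own timestamped snapshots is trivial. Here is a minimal sketch of the kind of archiving I have in mind (the URL is a placeholder, not a real disclosure page):

```python
# Minimal sketch: save a timestamped copy of a public disclosure page.
# The URL is a placeholder for illustration only.
import datetime
import pathlib
import requests

URL = "https://example.com/dsa-user-numbers"  # placeholder URL

response = requests.get(URL, timeout=30)
response.raise_for_status()

# Name the snapshot after the UTC time of retrieval.
stamp = datetime.datetime.now(datetime.timezone.utc).strftime("%Y%m%dT%H%M%SZ")
out = pathlib.Path(f"disclosure-{stamp}.html")
out.write_text(response.text, encoding="utf-8")
print(f"Saved {len(response.text)} characters to {out}")
```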
The Commission also launched its database for statements of reasons (SoR). The SoR database is a work in progress in many respects, but I think it surprised many that it will exceed 1 billion notifications within a few days. A group of Italian researchers (Amaury Trujillo, Tiziano Fagni, Stefano Cresci) just published the first analysis of the disclosures, finding the following (a small exploration sketch follows their quoted conclusions):
Based on our results, we conclude that large social media adhered only in part to the philosophy and structure of the database. Specifically, while all platforms met the requirements of the DSA, most omitted important (yet optional) details from their SoRs. This limits the usefulness of the database and hinders further analysis of this data by scholars and policymakers alike. At the same time, the structure of the database turned out to be partially inadequate for the platforms’ reporting needs. For example, the current structure lacks the possibility to explicitly report moderation decisions targeted at accounts rather than content. To this regard, our results can inform future developments of the database.
Social media platforms exhibited marked differences in their moderation actions, in terms of restriction and content types, timeliness, and automation. Part of the differences are due to the platforms’ varying degrees of adherence to the database. However, part are also due to the unique characteristics of each platform, which pose challenges when it comes to harmonizing their moderation practices, as envisioned by the DSA. Furthermore, we also found that a significant fraction of the initial database data is inconsistent. This resulted from both internal validity checks on the data itself, as well as from external checks against statements and data extracted from the platforms’ transparency reports. Of all the considered social media, X presents the most inconsistencies, as reflected by striking discrepancies found in multiple aspects of its moderation actions. Finally, we conclude that the self-reported nature of the database, and the widespread inconsistencies that we found, raise concerns on the reliability of the data contained therein. This begs caution when analyzing such data to make inferences and draw conclusions on the moderation practices of large social media platforms.
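If you want to poke at the database yourself, the kind of per-platform breakdown the authors perform is easy to reproduce on a local dump. Here is a toy sketch along those lines; note that the file name, column names and category values are my assumptions for illustration, not the database’s actual schema, so check the official documentation first:

```python
# Toy exploration of a local SoR database dump.
# File name, column names, and values are illustrative assumptions.
import pandas as pd

df = pd.read_csv("sor-dump-sample.csv")  # hypothetical local dump

# How many statements of reasons did each platform submit?
print(df["platform_name"].value_counts().head(10))

# What share of each platform's decisions was fully automated?
# (Column name and value are assumed, not the real schema.)
fully_automated = df["automated_decision"].eq("AUTOMATED_DECISION_FULLY")
print(
    df.assign(fully_automated=fully_automated)
      .groupby("platform_name")["fully_automated"]
      .mean()
      .sort_values(ascending=False)
      .round(2)
)
```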
VLOPs/VLOSEs also published their first round of transparency reports under Article 42. Tremau has a nice tracker here. Stay tuned for an analysis of these disclosures, which include a lot of new information, such as country-by-country breakdowns of user numbers and order numbers, indicators of accuracy, and the qualifications and linguistic competence of staff.
Finally, the Commission launched a repository of terms and conditions, which it produces by scraping companies’ websites. Going forward, this can become a useful resource for a quick overview of which businesses are regulated by the DSA, as the obligation under Article 14 falls on all DSA-regulated services.
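Anyone can run a similar change-detection exercise on a service’s terms. A rough sketch, again with a placeholder URL, that hashes the page so a later run reveals whether the text was altered:

```python
# Sketch: detect changes to a terms-and-conditions page by hashing it.
# The URL is a placeholder; real monitoring should strip dynamic page
# elements (timestamps, session tokens) before hashing.
import hashlib
import requests

URL = "https://example.com/terms"  # placeholder URL

html = requests.get(URL, timeout=30).text
digest = hashlib.sha256(html.encode("utf-8")).hexdigest()
print(f"{URL} -> {digest}")
# Store the digest; a different value on the next run means the terms changed.
```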
Big Tech RecSys Opt-outs
VLOPs have started complying with the opt-out obligation for their recommender systems (Article 38: “at least one option for each of their recommender systems which is not based on profiling”). Since companies can decide what type of option they offer, it will be interesting to watch how they comply. I have not yet had time to do a proper survey of the adopted solutions. I have seen TikTok announcing some form of regional newsfeed as an opt-out, and Meta announcing a chronological newsfeed for Facebook/Instagram. If you have more information about other VLOP services, just drop me an email.
DSA before Courts
After being designated, Amazon and Zalando filed lawsuits against their designation decisions (Amazon v EC (T-367/23), and Zalando v EC (T-348/23)); both cases are still pending. Zalando only seeks annulment of the designation. Amazon also sought a preliminary injunction, which was granted in part (T‑367/23 R): the President of the General Court rejected the suspension of Article 38 (opt-outs) but suspended the application of Article 39, which imposes an obligation to create ad archives, arguing that:
62 Consequently, for the purposes of the present examination of the condition relating to urgency, the information at issue must be regarded as being confidential. On the other hand, the question of whether the advertisement repository provided for in Article 39 of Regulation No 2022/2065 will require the disclosure of confidential information of both the applicant and its advertisers falls within the scope of the assessment of a prima facie case in the third plea relied on by the applicant in support of its action against the contested decision.
63 In that context, the question of the extent to which the disclosure of allegedly confidential information will cause serious and irreparable harm depends on a combination of circumstances, such as, inter alia, the professional and commercial importance of the information for the undertaking seeking its protection and the utility of that information for other market participants which are liable to examine and use it subsequently (see, to that effect, order of 10 September 2013, Commission v Pilkington Group, C‑278/13 P(R), EU:C:2013:558, paragraph 42).
64 In the present case, it must be held that the obligations relating to the advertisement repository, which provides information about the advertisements on the applicant’s platform, enable third parties to access significant trade secrets concerning the advertising strategies of the applicant’s advertising customers. It reveals strategic information such as campaign duration, campaign reach and targeting parameters. By doing so, it will allow competitors and the applicant’s advertising partners to draw market insights on an ongoing basis, to the detriment of the applicant and its advertising partners.
65 The evidence put forward by the applicant therefore makes it possible to establish that it cannot await the outcome of the main proceedings without suffering serious harm.
I think the court applied a very weak standard here. Basically, it invites all future VLOPs to seek suspension of Article 39. I would be surprised if the newly designated porn sites did not use the opportunity to seek suspension of Article 39 through judicial review of their designations. It is not clear to me how Amazon is more affected than any other VLOP, or why advertisers whose campaigns must already be disclosed on other platforms will suffer so much when they are disclosed on Amazon. I can see some specifics for marketplaces, but the court does not mention any.
Disclosure: I represent a consumer organisation in Zalando v Commission seeking permission to intervene.
Something else
I will publish an update with new academic work in early January. Feel free to send me tips. The good news is that my book is already with the publisher, and so is the upcoming OUP commentary on the DSA by Saulius Kaleda, Paul-John Loewenthal and Folkert Wilman.
All DSA enthusiasts should also mark 15–16 February 2024 in their calendars. Just a day before the DSA becomes fully applicable, the Institute for Information Law is organising a big conference on the DSA. As a keynote speaker, I am obviously biased, but the conference programme looks very interesting. I encourage you to join us.
Finally, I am again running my LSE course on the Digital Services Act; this is the third edition, and we still have some spots. Former participants include professionals from regulators, providers, civil society, auditing firms, etc. Just get in touch if you are interested.