Trusted Content Creators
The Digital Services Act reignited debates about how to treat professional journalists in the sea of user-generated content, which is often of doubtful quality. Some even go as far as to suggest that journalistic content should be entirely carved out of platform responsibilities, or that such content should enjoy near must-carry status.
I am not persuaded by these proposals.
At the same time, I do think that certain qualities of content should be recognized and privileged. But there is no reason to limit this to professional journalists. Any privilege should be continuously earned through a track record.
Here is how it can be done.
The extended version of this idea is described here.
The DSA could encourage creators to enter associations of trusted content creators (TCCs). TCCs would be continuously evaluated on how well they police the content of their members. If they do a good job, they become trusted for as long as they keep such a track record. TCCs could be associations of journalists, but also of YouTubers, activists, or academics.
TCCs should then be privileged in the notice and takedown process. For instance, they should not be subject to takedowns without prior discussion, and the accounts of their members should not be terminated as easily. Such benefits should:
- encourage the creation of associations of trusted content creators who band together to represent some quality standards, even in the context of professions that normally lack such structures (e.g. YouTubers);
- drive interest in membership, as members of TCCs enjoy a special status of quality/trust and thus a stronger position vis-a-vis platforms;
- incentivize internal quality control within TCCs: if one member accumulates mistakes, the other members will act to remove that member, since consistent mistakes can cost everyone the privileged status (this keeps bad actors out);
- ensure bad actors gain nothing by setting up their own associations, as they will be judged on their track record rather than formal status (e.g. being an accredited journalist); as a consequence, if they fail to maintain aggregate quality among the members, they all lose the status immediately.
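The accreditation logic described above can be made concrete with a small sketch. This is purely illustrative: the class names, the 5% error-rate threshold, and the use of upheld complaints as the quality signal are my assumptions, not anything specified in the DSA.

```python
# Hypothetical sketch of the aggregate track-record mechanism for TCCs.
# Threshold and metrics are illustrative assumptions, not DSA requirements.
from dataclasses import dataclass, field

@dataclass
class Member:
    name: str
    items_published: int = 0
    upheld_complaints: int = 0  # moderation decisions that went against the member

@dataclass
class TCC:
    name: str
    members: dict = field(default_factory=dict)
    max_error_rate: float = 0.05  # assumed accreditation threshold

    def add_member(self, member: Member) -> None:
        self.members[member.name] = member

    def expel(self, name: str) -> None:
        # Internal quality control: members remove a peer whose
        # record endangers the whole association's status.
        self.members.pop(name, None)

    def error_rate(self) -> float:
        published = sum(m.items_published for m in self.members.values())
        complaints = sum(m.upheld_complaints for m in self.members.values())
        return complaints / published if published else 0.0

    def is_trusted(self) -> bool:
        # Trust follows the aggregate track record, not formal status.
        return self.error_rate() <= self.max_error_rate
```

The point of the sketch is the incentive structure: one unreliable member drags the aggregate `error_rate()` over the threshold for everyone, so the remaining members have a direct reason to expel that member and restore the trusted status.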
Such policies strengthen the position of trusted content creators ex ante and reduce over-blocking before takedown, de-platforming, or de-amplification occurs. Since TCCs are collective entities, their members have to keep each other in check, but they also bargain and deal with platforms collectively. This strengthens their position and shifts the focus from individual mistakes to the aggregate characteristics of quality that such TCCs represent. Moreover, it gives TCCs a mandate to focus on much more than just the truthfulness of the content produced by their members, as trust is about many other things (e.g. think of advertising issues among influencers). As with Trusted Flaggers, it leaves it to society at large to develop its own institutions that can be relied on to articulate trust, without concentrating such power with platforms.
To keep some public oversight, the DSA could extend the accreditation mechanism used for Trusted Flaggers and adjust it for these purposes. What needs to be worked out is the following:
- What is the threshold for being trusted?
- How broad of a mandate should TCCs get?
- How significant should the benefits for TCCs be?