“The DSA is nothing more than digital censorship”

This week, the European Parliament voted on amendments related to the so-called “Digital Services Act” (DSA), proposed EU legislation to regulate online content. In doing so, MEPs added a whole range of extra restrictions on top of the ones already included by the European Commission in its proposal.

The new rules, which still need to be negotiated with EU member states in the next few months, constitute a big change, largely covering how platforms like Facebook and Twitter deal with disinformation and illegal content, along with other aspects of how such platforms operate.

Targeted ads

Amendments approved by the EP ban online platforms from using sensitive data (including political views, biometric and genetic information and information on sexual preferences) when deciding which ads to show someone. They also ban targeting advertising at minors when this is based on any of their personal data, and mandate that providers scrap “tracking walls”, meaning that people should be granted “other fair and reasonable options to access the online platform” if they do not want their personal data to be used for advertising purposes. Graying out or hiding “I do not consent” buttons will also be illegal, while porn sites will be required to register the identities of users uploading material.

Reacting to the amendments approved by the European Parliament, IAB Europe, the European-level association for the digital marketing and advertising ecosystem, stated:

“MEPs voted last night to accept amendments to the DSA which give cause for concern for all those who support an efficient, secure and prosperous European digital market. (…) The use of personal data in advertising is already tightly regulated through the GDPR. What’s needed now is proper enforcement. (…) MEPs have decided to pass amendments that not only overlap with the GDPR and existing consumer law but risk undermining these rules, as well as the entire ad-supported digital economy.”

Encroaching on freedom of speech

Romanian EPP MEP Eugen Tomac points out that the DSA also aims to address so-called “disinformation”, stating: “platforms must become more accountable and better equipped in supporting democracy, addressing illegal content concerns, countering disinformation”.

German newspaper Handelsblatt goes more into detail on how the DSA intends to go about this:

“The Meta corporation knows that its Facebook and Instagram products make people mentally ill and turn entire peoples against each other. The responsibility has become too great to be left to Meta alone. Not to mention the conspiracy accelerator Telegram.

With the DSA, EU states will interfere more when it comes to which content needs to be deleted. Above all, users will be able to find out which parameters the platform uses to sort content. And they will have the option of having content sorted purely chronologically.

In the future, the platforms will need to make public which advertisements have been shown to which target groups. This is a lesson from election campaigns, where different user groups were provided with different messages. This enables scientists to research what else the algorithm does to us and gain deeper insights into this.”

In its Q&A on its original proposal, the European Commission recalls that the DSA includes:

  • “measures to counter illegal content online, including goods and services, such as a mechanism for users to flag such content, and for platforms to cooperate with ‘trusted flaggers’”
  • “obligations for very large online platforms to prevent abuse of their systems by taking risk-based action, including oversight through independent audits of their risk management measures”
  • “oversight structure to address the complexity of the online space: Member States will have the primary role, supported by a new European Board for Digital Services; for very large online platforms, enhanced supervision and enforcement by the Commission.”

The Commission also claims that “the new rules will harmonise due diligence obligations for platforms and hosting services, and the conditions for liability exemptions for online intermediaries. It will not touch upon national or EU laws that specify what is illegal.”

However, in a reaction to Brussels Report, Dutch MEP Rob Roos (JA21 – ECR) responds:

“The DSA proposal further increases the pressure to moderate and remove ‘harmful content’ from online platforms. It provides little to no support to guarantee the right to free speech. Big Tech is fine with it, as long as they see their business models protected. (…)

What is problematic is that the DSA requires procedures to be put in place for “trusted flaggers”. This comes down to institutionalising the practice of ‘fact checking’, whereby left-wing journalists or NGOs impose what would be “true” and “false”.

A proposed amendment that foresaw an exemption for media – or information outlets that call themselves media – from these “anti-disinformation rules” was rejected.

According to Swedish MEP Jessica Stegrud (SD – ECR), the new legislation “leaves power in the hands of arbitrary tech giants”, causing “ordinary users to see their comments deleted or their accounts blocked, without any explanation, for views that may challenge those in power”. According to her, “what is legal to express publicly in the streets and on squares should also be legal online.”

When it comes to the issue of arbitrary enforcement of the policies of social media platforms, she notes that “in the past, Facebook has been both permissive and forgiving towards users with obvious links to Islamic State, amongst others, who openly spread their messages via social media. Another problematic example is Iranian leader Ali Khamenei, who was allowed to spread blatant anti-Semitism via social media”.

The NGO European Digital Rights (EDRi) also came out critically against this aspect of the Commission’s initial proposal, stating that the legislation would entail the following:

“After the legislative disasters of the Copyright Directive and the Terrorist Content Online Regulation, the Commission now seems to be more aware of the risks that regulation of the freedom of expression entails. The DSA proposal thus maintains the current rule according to which companies that host other people’s digital content are not liable for that content unless they actually know it is illegal.

Unfortunately, however, the Commission appears to have created a far-reaching exception to that rule: As soon as anybody on the internet flags any content as potentially illegal, liability kicks in and would require the hosting company to “expeditiously” remove or disable access to the content in question. Removing or disabling content that has been flagged therefore becomes the most commercially reasonable action for companies in order to avoid the legal liability risk that comes with an actual legality assessment.

This heavy-handed approach would create a system of privatised content control with arbitrary rules beyond judicial and democratic scrutiny. We will work closely with the European Parliament and the Council to insert the safeguards needed to better protect human rights. The DSA proposal follows the principle “delete first, think later”.”

Crisis protocols

Dutch MEP Rob Roos also notes:

“Another calamity is the introduction of so-called crisis protocols. When there are ‘extraordinary circumstances affecting public security or public health’, the Commission would be allowed to impose that certain information is prominently disseminated. During the Covid crisis, we have seen how Big Brother techniques ensure that controversial but true information is removed from the internet.”

A research paper by Ruairí Harrison, a legal scholar at Utrecht University School of Law, seems to confirm that these “crisis protocols”, enshrined in article 37 of the Commission proposal, are meant to enable the European Commission to affect the content of what appears on online platforms. He writes:

“It is hoped that this protocol will also allow the Commission to […] tackle the potential associated infodemic.”

MEP Rob Roos further comments:

“Among others, the ECR, ID Group and LIBE Committee, led by Patrick Breyer, member of the Pirate Party, tabled amendments to substantially improve the text. The main amendment protecting free speech (537), which required Big Tech to amend the terms and conditions of social media platforms so that legal content – that did not go against the purpose of the service – could no longer be removed, was voted down by a large majority.

Other amendments that would increase the role of the courts in content removal or that would minimise the rules for small online businesses also failed.”

He concludes:

“As a result, the DSA is nothing more than a digital censorship law. After Big Tech, Brussels is now ruling the conversation.”