Note: this article was translated from French by an automated tool.

The political consensus regarding the Digital Services Act (DSA) was announced on April 23, but the terms of the agreement are still under discussion. The purpose of the DSA is to regulate the activity of companies that, regardless of their place of establishment, offer digital services to persons established in the European Union. Digital services include services offering a network infrastructure, such as the provision of Internet access, storage services in the cloud or on the web (Amazon Web Services or OVH Cloud), and online platforms connecting users, such as marketplaces (Amazon, Vinted), application stores (Apple Store), collaborative economy platforms (Uber, AirBnB) and social networks (Facebook, YouTube, Twitter).

Will the DSA fundamentally change the rules for tech companies? 

It is true that the DSA was first designed to revise and update the provisions of the E-Commerce Directive 2000/31/EC of June 8, 2000. It therefore incorporates the principle that digital service providers are immune from liability for the illegal content they host when they were unaware of its illegal nature, or when, upon becoming aware of it, they acted promptly to remove it or disable access to it. But the DSA qualifies and supplements this principle with new prohibitions and very restrictive obligations.

A first draft of the text had been proposed by the Commission on December 15, 2020. Overall, the current version seems relatively close to the text initially proposed by the Commission, although some substantial modifications have been made. We do not yet have the final text, which must be officially approved by Parliament and the Member States and is still hotly debated.

What obligations will digital service providers now face?

The DSA subjects digital service providers to obligations that can be summarized under three headings: transparency, the fight against illegal content and practices, and user protection.

With regard to transparency, the obligations are substantial: clear terms and conditions, information on advertising practices and recommendation parameters, details of the automated processing used to manage content, and regular reports on moderation practices.

The DSA also organizes procedures for an effective fight against illegal content online. Users must be able to easily report illegal content, and platforms must give priority to reports from trusted third parties (“trusted flaggers”). The fight against illegal content must, however, respect the rights of users, in particular their right to freedom of expression: moderation decisions must be reasoned, and users must be able to lodge internal appeals or bring their disputes before an alternative dispute resolution body.

Finally, the text includes provisions protecting users. “User traps” (“dark patterns”) designed to manipulate Internet users by nudging them into performing an action almost unconsciously (for example, subscribing to a service) are thus prohibited. Profiling based on sensitive data (political opinions, religious beliefs, sexual orientation) is prohibited. Minors will no longer be shown targeted advertisements based on their personal data. Users of very large search engines or platforms must be able to choose recommendation systems that are not based on profiling.

In the same spirit, the text aims to protect consumers against the practices of professional users by making platforms accountable. Platforms serving as intermediaries (marketplaces, collaborative economy platforms) will have to verify the reliability of the professional users of their service and design their interface so as to guarantee compliance with consumer law. They will also have to make reasonable efforts to prevent the sale of illegal products or services and, a major novelty, will themselves be liable in the event of illegal practices when an average and reasonably well-informed consumer could believe that the information, good or service was offered by the platform itself or by a user acting under its authority or control.

On all these points, the obligations of very large technology companies are particularly reinforced. Indeed, the DSA constitutes “asymmetric” regulation, in the sense that the obligations imposed vary according to the characteristics of the companies concerned. In particular, very large platforms with at least 45 million active users per month in the European Union (“very large online platforms”) – joined by very large search engines during the negotiations – are subject to specific obligations because, owing to their size, they present particular risks. These actors will therefore have to assess the risks of their activity and take measures to mitigate those risks. They will also have to appoint a compliance officer and undergo, once a year, independent audits of their compliance with the obligations provided for by the text. They will have to give access to their data to approved researchers working on online risks, as well as to the authorities when the latter need it to ensure compliance with the Regulation.

Finally, very large platforms and search engines may be called upon in the event of a crisis (war, terrorist attack, natural disaster, pandemic). The war in Ukraine has, in fact, led negotiators to introduce into the DSA a crisis response mechanism that allows the Commission, on the recommendation of the European Board for Digital Services, to ask very large platforms to take specific measures, such as the removal of war propaganda.

Do the oversight and sanction procedures provided for by the DSA guarantee that technology companies will effectively comply with the Regulation?

The Regulation generally leaves it to state authorities to monitor compliance with its provisions, in accordance with the country-of-origin principle. Under this principle, the law applicable to a provision of services is that of the Member State in which the company providing the service has its registered office, regardless of the Member State in which the service is provided.

Each Member State will have to appoint a “Digital Services Coordinator”, who will be specifically responsible for the proper implementation of the Regulation at the national level. In France, this could be ARCOM (the audiovisual and digital communication regulatory authority), although several aspects of the Regulation fall within the competence of the DGCCRF (the directorate-general for competition, consumer affairs and fraud control). The national coordinators will meet within the “European Board for Digital Services”, an independent advisory body responsible for supporting the national coordinators and the Commission.

However, the text gives the Commission responsibility for overseeing the activity of very large search engines and very large platforms with more than 45 million users. The Commission may carry out investigations and inspections directly. To finance this oversight, the companies concerned will have to pay the Commission a supervisory fee of up to 0.05% of their annual worldwide turnover. The Commission intends to hire approximately 150 experts to monitor the application of the Regulation, a number that may nevertheless seem low, particularly compared with the resources available to the large technology companies.

Regulators will be able to impose fines of up to 6% of the annual turnover of the sanctioned company, a ceiling reduced to 1% for minor infringements. For companies that repeatedly fail to comply with the DSA, a ban on operating could be considered.

The DSA is expected to effectively enter into force by the first quarter of 2023. The companies concerned will then have 15 months to comply with their new obligations, but the very large search engines and platforms may have only 4 months. In France, however, many of the measures contained in the DSA are already in force: the law of August 24, 2021 introduced a number of its provisions in advance into the law of June 21, 2004 on confidence in the digital economy (new Article 6-4 LCEN).
