Note: this article was translated from French by an automated tool.

On December 15, the European Commission made public two regulatory proposals intended to significantly transform the legislative framework applicable to platforms. This legislative package had long been expected: the President of the Commission announced it upon taking office (Ursula von der Leyen, Political Guidelines for 2019-2024, 2019), and the Commission made the official announcement when presenting its digital strategy last February (European Commission, "Shaping Europe's Digital Future", February 19, 2020). In June, the Commission officially announced the upcoming adoption of a Digital Services Act. A public consultation, which ended on September 8, focused on several avenues under consideration: the revision of the provisions of the E-Commerce Directive 2000/31/EC of 8 June 2000, the creation of a new regulation on companies controlling access to the market ("gatekeepers"), and the possible adoption of a new competition law instrument.

The very principle of adopting new digital legislation commands broad consensus in Europe, as the current rules appear inadequate and insufficient in the face of the unprecedented difficulties caused by the emergence of platforms so powerful that they seem to compete with States (see F. G'sell, What is digital sovereignty?). However, it is not only their size or their transnational character that makes the issue of platform regulation particularly acute, but also the specificity of the functions they perform and of their business model. At a time when citizens inform themselves and debate online, fake news and hate speech, whose effects are amplified on platforms, pose unprecedented difficulties. At the same time, a small number of well-identified technology companies enjoy a dominant position in structurally concentrated markets, which explains why the Commission particularly wishes to supervise the activity of the most powerful platforms.

The announced legislative package now comes in two quite distinct texts: the Digital Services Act, which updates and supplements the provisions of the E-Commerce Directive on the regulation of illegal content, and the Digital Markets Act, which imposes strict obligations on platforms controlling access to digital markets ("gatekeepers"). We will discuss the Digital Services Act here and examine the Digital Markets Act in another blog post (see F. G'sell, A new regulation imposed on gatekeepers: the Digital Markets Act).

The proposed Digital Services Act targets all companies, regardless of their place of establishment, that offer digital services to persons established in the European Union. These companies will have to designate a single point of contact (Article 10) or, when they are not established in the Union, a legal representative (Article 11), so that the national and European authorities responsible for ensuring compliance with the Regulation have designated contacts. In essence, the proposed Regulation takes up and clarifies the principle according to which digital service providers cannot be held liable for illegal content published or transmitted by third parties (1), while attaching this immunity to new obligations relating to the fight against illegal content and practices as well as to moderation strategies (2). Compliance with these obligations will be monitored mainly by the authorities of the Member States, although the Commission will have jurisdiction over very large platforms (3).

1. The reaffirmation of platforms' immunity from liability for the information they transmit, store or host

The proposed Regulation provides for the repeal of Articles 12 to 15 of the E-Commerce Directive 2000/31/EC of 8 June 2000, whose principles it incorporates concerning the immunity of providers of mere conduit services (Article 3), caching services (Article 4) and hosting services (Article 5) with respect to the illegal content they transmit, store or host. In particular, the text recalls that hosting providers are not liable for the information they host when they are unaware of its illegal nature or when, upon obtaining such knowledge, they have acted promptly to remove it or to disable access to it (Article 5). The principle that these service providers are not required to carry out general monitoring of the content posted online or to perform checks is also reiterated (Article 7). The proposed Regulation takes care to specify that hosting providers which take the initiative to moderate content or to carry out checks on the information published on their services do not thereby become liable for it (Article 6).

2. The introduction of new obligations for the purpose of combating illegal content

The immunity granted to technology companies is now accompanied by new obligations intended to strengthen the fight against illegal behavior (i), to provide guarantees to users of digital services (ii) and to control the risks presented by very large platforms (iii). These new obligations are imposed asymmetrically, meaning that they vary according to the characteristics of the digital service providers concerned. To this end, the text targets four main categories of players: 1/ services offering network infrastructure (Internet service providers, domain name registrars), 2/ hosting services (cloud and web hosting), 3/ online platforms connecting users (marketplaces, application stores, collaborative economy platforms, social media) and 4/ very large platforms with at least 45 million monthly active users which, because of their size, are considered to present systemic risks (Article 25).

i) The fight against illegal content and practices

The new obligations imposed by the proposed Regulation reflect a clear change in the Commission's strategy in the fight against illegal content, compared with the mere Code of Practice on Disinformation, which platforms may join on a voluntary basis. The text requires all companies covered by the Regulation to cooperate with the competent authorities. They must respond to any order from national judicial or administrative authorities to act against illegal content, stating the measures taken (Article 8). They will also have to communicate to these same authorities the information relating to a user that may be requested from them in accordance with national or European legislation (Article 9). In addition, the text requires providers of the last three categories (hosting services, online platforms, very large platforms) to set up accessible mechanisms allowing their users to report content they consider illegal (Article 14). In this context, the platforms of the last two categories will have to treat as a priority the reports submitted by "trusted flaggers" designated as such by the national Digital Services Coordinator (Article 19). Once content has been reported, the providers concerned may be held liable if they do not act promptly to remove what is illegal. The platforms must, moreover, promptly inform the competent authorities of any content giving rise to a suspicion of the past, present or future commission of a serious criminal offense involving the life or safety of persons (Article 21). Users who frequently publish illegal content may receive a warning and then have their account suspended for a reasonable period (Article 20(1)).

Beyond organizing the fight against illegal information, the proposed Regulation includes several provisions intended to ensure compliance with Union law, and in particular consumer law, by professional users of the platforms. Thus, Article 5(3) introduces a very important limit to the immunity of digital service providers where they allow the conclusion of online contracts with consumers. Until now, the provider incurred liability only where it was proven that the professional user at fault was acting under its authority or control (Article 14(2) of Directive 2000/31/EC). The text now provides that the provider itself is liable when an average and reasonably well-informed consumer could believe that the information, good or service was offered by the provider itself or by a user acting under its authority or control. These provisions respond to the debates prompted by the increasingly frequent challenges to e-commerce platforms, in a context where certain American courts have qualified Amazon as a "seller" of products sold online by professional users (see F. G'sell, Is Amazon a distributor of products sold by third parties on its marketplace?). It should be noted, however, that these provisions are primarily aimed at consumer protection: they are not intended, for example, to make it easier for a company that is the victim of counterfeiting to act against a platform, as in the famous eBay case (CJEU, 12 July 2011, Case C-324/09, L'Oréal v eBay), where the platform's liability was upheld, or in a more recent case in which Amazon's liability was excluded (CJEU, 2 April 2020, Case C-567/18, Coty v Amazon). In all cases, platforms allowing the conclusion of online contracts must obtain certain information about their professional users (Article 22), verify its reliability, and design their interface in such a way as to ensure compliance with consumer law as regards pre-contractual information and information obligations relating to product safety (Article 22 bis).

ii) Guarantees offered to users of digital services

Several provisions of the proposed Digital Services Act aim to provide guarantees to users.

On the one hand, users must be fully informed of their service providers' moderation practices. All providers covered by the Regulation must be transparent and clearly specify, in their terms and conditions, any restrictions that may affect the use of their services. The text specifies that they will have to act in a diligent, objective and proportionate manner when implementing the restrictions provided for (Article 12). These providers, with the exception of small businesses, will also have to publish annual reports (Article 13) presenting: 1/ the orders received from national or European authorities, in particular regarding the removal of content or the communication of information relating to a user, 2/ the notices received and the follow-up given to them, 3/ the moderation actions carried out at the provider's own initiative, and 4/ the complaints received from users concerning moderation practices. Platforms must provide additional information on disputes with their users, the suspensions decided during the year and the use of automated processing for content moderation (Article 23).

On the other hand, moderation decisions must be reasoned and open to challenge. Hosting service providers and online platforms must communicate the reasons that led them to remove or disable access to certain content (Article 15). Platforms, with the exception of small businesses (Article 16), will also have to set up a system for challenging decisions to remove content or to suspend or terminate the user's account or access to the service (Article 17). Finally, platforms must allow their users to bring their disputes before an alternative dispute resolution body certified by the Digital Services Coordinator (Article 18). However, dishonest users will not be able to abuse the reporting and challenge options offered to them. Platforms may decide, after a warning, to stop processing notices from a user who regularly submits unfounded reports (Article 20(2)). Likewise, users who frequently raise unfounded complaints may be deprived of this right to challenge (Article 20(2)).

Finally, the platforms' business practices must be transparent. Platforms that display online advertising will have to ensure that the recipients of advertising messages can understand that the message shown is an advertisement, know the identity of the person on whose behalf it is displayed, and obtain meaningful information on the main parameters used to determine the recipients of the advertising in question (Article 24). Very large platforms will have to publish detailed information about the advertisements they display online (Article 30). They will also have to specify, in their terms and conditions, the characteristics and parameters of their recommender systems and the conditions under which users may, where applicable, modify them (Article 29).

iii) Controlling the risks created by very large platforms

Very large platforms with at least 45 million users are subject, under the text, to relatively heavy additional obligations given the systemic risks they generate (Article 25). They will be required to identify, analyze and assess these risks, in particular the risks caused by the dissemination of illegal content. They will also have to consider the negative effects of the functioning of their platform on fundamental rights, as well as the actual or foreseeable effects of intentional manipulation of their services on the protection of public health, minors, public debate, electoral processes and public security (Article 26(1)). In particular, they will need to consider the extent to which their moderation, recommendation and ad-targeting systems may affect the systemic risks under consideration (Article 26(2)). In light of these assessments, they must take reasonable and effective measures to mitigate the risks identified (Article 27).

In addition, very large platforms will have to designate a "compliance officer" (Article 32). Once a year, they will be subject to independent audits of their compliance with the obligations set out in the Regulation (Article 28). In this regard, they will have to draw up specific reports to be communicated to the competent authorities (Article 33).

3. Monitoring platforms' compliance with their obligations

The proposed Regulation leaves it to national authorities to monitor compliance with the obligations it lays down. Member States will therefore have to designate the competent authorities in this area; among them, one will act as the Digital Services Coordinator, specifically responsible for the proper application of the Regulation at the national level (Article 38). The Digital Services Coordinators will have powers of investigation and the power to order certain measures and impose sanctions (Article 41). They may also receive complaints from users of the services concerned (Article 43). The Coordinators will meet within the European Board for Digital Services, an independent advisory body tasked with supporting the Coordinators and the Commission.

The penalties applicable in the event of non-compliance with the Regulation will be laid down by national law (Article 42). These penalties must be effective, proportionate and dissuasive. They may not exceed 6% of the annual turnover of the sanctioned company. For certain infringements (supply of incorrect or incomplete information, failure to rectify incorrect or incomplete information, refusal to submit to an inspection), the ceiling is instead set at 1% of annual turnover. It will also be possible to impose periodic penalty payments, which may not exceed 5% of the company's average daily turnover.

While the proposed Regulation leaves it to the Member States to ensure compliance with the obligations it lays down, the European authorities will be able to intervene against very large platforms with more than 45 million users. In that case, the Digital Services Coordinators of the place where the platforms are established (or where their legal representative in the European Union is established) will remain competent, but the European Board for Digital Services and the Commission will be able to intervene, where necessary on their own initiative. The Commission may intervene directly, in certain cases, to carry out investigations and inspections (Articles 51 to 57). Those investigations may conclude with a non-compliance decision (Article 58), in which the Commission may impose a fine not exceeding 6% of the turnover of the previous financial year, or 1% for certain infringements.

In the end, insofar as it concerns all digital service providers, the proposed Regulation is not, or not only, the instrument for reining in GAFAM that is regularly evoked. It nevertheless imposes significant additional obligations on large platforms with more than 45 million users, which will be added to those weighing on gatekeepers under the Digital Markets Act. In this respect, these two texts, together with the proposed Data Governance Act published on November 25 (see F. G'sell, Towards the European data market: the Data Governance Act), do indeed tightly regulate the activities of large technology companies.
