Note: this article was translated from French by an automated tool.

On May 5, 2021, Facebook's Oversight Board (FOB) issued its decision on the merits of the suspension of Donald Trump's account, decided by the platform on January 7 in the wake of the events at the Capitol. This long-awaited decision provides valuable insight into how this new body intends to guide Facebook in its moderation practices and review the decisions it takes. It is limited, however, in that it does not settle the question on the merits: it merely gives Facebook six months to take a position on the final sanction of Donald Trump's actions. The Council has in fact left it to Facebook to choose between permanently deactivating his account or imposing a temporary suspension that would eventually allow Trump to return to the platform. The decision also illustrates the difficulties raised by the status of this body and by the standards it applies, which still need to be clarified. Finally, it shows that the Oversight Board (or FOB) intends to act less as a counter-power than as a body guaranteeing the proper application of the terms and conditions decided by Facebook, all in compliance with very general international standards on freedom of expression.

First, let us review the facts. On January 6, 2021, activists seeking to prevent Congress from certifying the 2020 presidential election entered the U.S. Capitol by force, causing five deaths and many injuries. On the same day, Facebook, after deleting several of Donald Trump's publications, decided to prevent him from publishing on Facebook and Instagram for 24 hours. The next day, January 7, 2021, Mark Zuckerberg announced the decision to suspend Donald Trump's Facebook and Instagram accounts for an indefinite period, on the grounds that the President was trying to prevent the peaceful and lawful transition of power to his elected successor, Joe Biden: in this context, the risks created by his presence on the network were too great. Facebook's decision was followed, on January 8, by Twitter's decision to permanently delete Donald Trump's account, on the view that his tweets were likely to incite others to repeat the violent acts committed at the Capitol.

At the time, Trump's "deplatforming" prompted many reactions, ranging from relief to unease. "This event will be remembered as a turning point in the battle for digital speech control," Edward Snowden tweeted, while Alexey Navalny denounced "an unacceptable act of censorship" and German Chancellor Angela Merkel saw "a problematic act." A few days later, Jack Dorsey, while regretting the network's inability to produce a healthy conversation, simply concluded: "If people don't agree with our rules and their application, they can simply go to another internet service." Two months later, Dorsey was the only one to acknowledge clearly, before Congress, the role played by social networks in the events at the Capitol. For its part, Facebook decided in January to refer its suspension decision to its Oversight Board (FOB).

What is the Facebook Oversight Board, or FOB? It is a body conceived in 2018 by Noah Feldman, a law professor at Harvard, who suggested the idea of a "supreme court" for Facebook. Designed to settle disputes, Facebook's Oversight Board is reminiscent of an arbitration body, except that only one of the parties, Facebook, has fixed its composition, the conditions of referral and the procedure followed. The Council has a quasi-judicial role insofar as it examines appeals and renders decisions that are binding on the platform, but it also has a (small) political role, since it can formulate general recommendations in an advisory capacity.

Established in May 2020, the Oversight Board is made up of twenty members recruited from among widely recognized figures such as Helle Thorning-Schmidt, former Prime Minister of Denmark; Catalina Botero Marino, former Special Rapporteur for Freedom of Expression of the Inter-American Commission on Human Rights; Alan Rusbridger, former editor of the Guardian; and Tawakkol Karman, the activist who received the Nobel Peace Prize in 2011 for her role in the Arab Spring protests in Yemen. Five of the twenty members are American, three are European, and the rest come from all over the world. Since last fall, this body has been examining appeals against Facebook's moderation decisions and can take decisions that Facebook has committed, in advance, to implement.

It is in this context that the decision taken by Facebook on January 7 with regard to Donald Trump was submitted to it. Specifically, Facebook put two questions to the Council. The first concerned the suspension decision adopted on January 7. The second amounted to asking the Council about the strategy to be adopted toward the accounts of political leaders in general. While the Council gave a clear, though not definitive, answer to the first question, it gave only an imperfect answer to the second. We will confine ourselves, in this blog post, to a few general remarks on this very instructive decision. Readers may also refer to Evelyn Douek's well-informed commentary, in English.

1. First remark: the Oversight Board does not resolve the issue of Trump's deplatforming on the merits and refers the problem back to Facebook

The Oversight Board decided unambiguously to confirm Facebook's decision of January 7, 2021 to suspend Donald Trump's account, recalling the risks of violence raised by the former President's publications. On the other hand, the Council sharply criticizes Facebook for having imposed an indefinite sanction, which, it emphasizes, is not provided for by the platform's terms and conditions, which allow only three sanctions: removal of publications that violate Facebook's rules, a time-limited account suspension, or permanent deactivation of the page and the account. Consequently, the Council gives Facebook six months to take a final decision on Donald Trump's account, which must, in any event, be justified.

The Oversight Board therefore hands back to Facebook the responsibility for deciding the thorniest question: what to do with Donald Trump? And this even though the very purpose of Facebook's referral to the Council was to have it rule on that point! One gets the impression that the Council did not wish to risk taking a position on such a controversial subject. Paradoxically or not, it states in its decision that, by referring the matter to the Council after taking a decision not listed in its terms and conditions, Facebook sought to evade its responsibilities... We can certainly welcome the Council's wish to reason rigorously by strictly examining the application of Facebook's terms and conditions. But, beyond that, we have little indication of its position on Donald Trump's possible return to the platform. Admittedly, the decision gives some indication of the circumstances to be taken into consideration to justify a possible permanent deactivation of Trump's account. It specifies, for example, that Facebook will have to analyze the context inside Facebook and Instagram, but also outside it, as Facebook has done in the past by drawing on information disseminated by the Department of Homeland Security (DHS) regarding terrorist threats. Several criteria were mentioned by a minority of Council members, such as Trump ceasing to make unfounded claims about alleged electoral fraud (which is currently far from the case!), acknowledging the violations committed and committing to abide by the platform's rules in the future, or ceasing to support those involved in the Capitol riots. These scattered considerations, however, seem largely indicative. The responsibility for deciding the final fate of Donald Trump's account is here clearly referred back to Facebook, even if it is asked to justify its position.

2. Second remark: the Council sticks to the rules set by Facebook 

One thing is certain: the Oversight Board intends to exercise strict control over Facebook's application of its own terms and conditions, within the framework set by the platform. Because it is not provided for by those terms and conditions, the "indefinite" suspension is characterized as a "vague and non-standardized sanction," imposed by Facebook in an "arbitrary" manner. By criticizing Facebook on this point, the Council functions very much like a court engaging in a form of legality review, and in doing so imposes a "principle of legality" according to which the sanctions imposed must be provided for in advance by the platform's rules. The review is also a proportionality review: the Council verifies that Facebook's decision was a proportionate response, taking into account the circumstances and the risks involved. The parallel with judicial review is all the more striking in that the Council here remits the decision on the merits back to Facebook, much as the French Court of Cassation refers a dispute back to the lower courts after exercising its review. The presentation on the Oversight Board's website moreover speaks of an "appeal procedure" and touts an "independent judgment."

The parallel with a traditional court must, however, be very strongly qualified. For the "legality," the norm that the Council is charged with upholding here, consists above all of the terms and conditions of a private company, Facebook. The Council says so expressly: the review of "legality" is carried out with regard to "Facebook's community standards," which is enough to make any lawyer shudder who is accustomed to seeing the law passed by Parliament as the norm par excellence. No state norm is meant to apply here, even if the decision mentions the First Amendment to the American Constitution, whose content is considered equivalent to that of Article 19 of the International Covenant on Civil and Political Rights. But we hardly step outside the ordering created and designed by the platform. Facebook has, in fact, decided from the outset (and quite logically) to exclude national state laws from the standards applied by the Council. However, the platform has also chosen to comply with international standards protecting human rights and freedom of expression, such as the UN Guiding Principles on Business and Human Rights. This is the only circumstance that allows the Council to step outside the ordering of the platform's rules, in order to verify that the interference with freedom of expression resulting from the suspension of Trump's account complies with the applicable international standards.

There would be much to say about Facebook's choice to submit to standards of international law which, in principle, are addressed to States. These international principles certainly constitute a common core of standards shared by a large number of countries, but are they really appropriate here? Are they not too vague, too general? Do they correspond to the reality of the contemporary world, that of immense transnational digital platforms within which hate speech can spread virally? Nor do we know what place and scope these principles of international law are really given here, and in particular whether they will necessarily prevail over Facebook's terms and conditions. Admittedly, the Council criticizes, in the decision, the platform's rules on the grounds that they lack the clarity required by international standards. In another recent decision, it held that a Facebook standard restricts freedom of expression in an unnecessary and disproportionate way from the standpoint of international law (while also specifying, conveniently, that this standard is not in line with Facebook's "values" either). However, whatever its criticisms, the Council has no power other than to settle the dispute submitted to it and to rule on the possible restoration of the contested publication. For the rest, it can only recommend that the platform modify the standards it considers incompatible with international law, a recommendation that Facebook is not required to follow.

In the present case, the Council recalls that while "political speech enjoys high protection under human rights law because of its importance for democratic debate," international human rights standards allow restrictions on freedom of expression where there is a serious risk of incitement to discrimination, violence or any other unlawful act. From an international law perspective, the Council says, rules restricting freedom of expression must meet three requirements: they must be clear and accessible, pursue a legitimate objective, and be necessary and proportionate to the risk of harm. The Council therefore reviews Facebook's terms and conditions, and their implementation, on these three counts. On the first, it holds that Facebook's rules are sufficiently clear but criticizes the vague and uncertain nature of the "indefinite" sanction imposed on Trump. On the second, it considers that Facebook's rules pursue a legitimate aim. On the third (necessity and proportionality), it relies on the criteria adopted in the Rabat Plan of Action, and in particular on a six-factor test developed within that framework, to assess a speech's capacity to create a serious risk of incitement to discrimination, violence or any other unlawful action. It concludes that Facebook's decision was necessary and proportionate given the events of January 6, while noting that a minority of Council members considered it appropriate also to take into account Donald Trump's behavior in the months preceding his suspension (such as the statement, made during the events following the death of George Floyd, which caused such shock and which Facebook had refused to delete: "When the looting starts, the shooting starts").

We can certainly welcome the fact that the Council reviews, on the basis of international standards, Facebook's implementation of its terms and conditions. This review nonetheless creates a strange, unprecedented, somewhat baroque legal situation in which Facebook's terms and conditions are assessed as if they defined a state legal order. Legality on the platform, determined by "Facebook's community standards and Instagram's community guidelines," constitutes a sort of unsurpassable horizon. In this respect, the decision does not merely apply the guarantees provided by international law; it also enshrines this somewhat strange ordering constituted by the platform's rules. And it is only because Facebook has so decided that international standards are invoked here at all, as if, for the rest, the platform were subject to nothing.

3. Third remark: the Council does not consider it useful to grant a special status to political leaders

The Council, in its decision, very quickly brushes aside the particular status of the applicant, who was still, at the time of his suspension, a political leader of the first rank. Little is made here of the overwhelming power of the big tech companies, which have the ability to silence the outgoing President of the United States, who had just won 74 million votes, on the sole basis of their terms of use. This is understandable: created and appointed by Facebook, the Oversight Board readily accepts the idea that the platform, through its moderation policy, becomes an arbiter of democratic debate by controlling what elected politicians may say publicly.

Facebook had specifically asked the Council to make recommendations on the policy to be adopted toward political leaders, whether senior officials, elected officials or candidates for election. The Council contents itself here with very general recommendations. Above all, it does not believe it useful to draw a real distinction between political leaders, on the one hand, and people who are particularly influential on social networks but hold no political mandate, on the other. The reasoning the Council uses to reach this conclusion is based exclusively on risk: people with a large audience can, says the Council, generate serious risks, regardless of whether or not they hold official functions. "What is important is the degree of influence." The Council therefore insists on the risks and harms generated by the speech of political leaders while brushing aside their legitimacy and the need for citizens to be aware of their words. If there is a risk of harm under international human rights standards, the Council insists, Facebook must suspend the account. But it is up to the platform to assess, alone, whether a suspension is warranted! The political leader concerned will only be able, at a later stage, to turn to the Oversight Board...

It is regrettable that the Council refuses to draw a distinction between users officially authorized to represent the State and everyone else. Its position is all the more surprising given that Facebook, like other platforms, has adopted special rules for influential people and politicians, toward whom there is greater tolerance on the grounds that their speech is of "media interest" ("newsworthiness"). The decision also notes that high-profile accounts are subject to "cross checks" in order to minimize moderation errors. This does not prevent Facebook from acting in the face of repeated violations of its terms and conditions: at the end of March 2021, Facebook decided to bar the Venezuelan President, Nicolás Maduro, for 30 days from posting or commenting on the network, on the grounds that he had repeatedly violated the terms and conditions by touting a drug against Covid-19 whose effectiveness has not been demonstrated. In such circumstances, one can only deplore the Council's refusal to rule more specifically on how political leaders should be treated.

There are nonetheless a few recommendations that deserve approval. The Council suggests entrusting the moderation of political content to specialized staff able to assess the political context. It also encourages greater transparency regarding the rules applicable to influential users.

4. Fourth remark: Facebook still has some progress to make from a transparency point of view

The Council repeatedly emphasizes that Facebook's rules lack clarity and that their implementation is not sufficiently transparent, in particular with regard to sanctions. It should also be noted that Facebook refused to answer certain questions from the Council, such as those relating to the design of the platform and its impact on the visibility of Donald Trump's publications, or to how political leaders are treated in general. Once again, we can welcome the Council's recommendations in favor of greater transparency, even if they remain very general. It should also be noted that the Council calls for a genuine investigation into Facebook's potential contribution to the accusations of electoral fraud and to the tensions they generated.

5. Fifth remark: the Council appears less as a counter-power than as a body responsible for ensuring the satisfactory implementation of the platform's policy

In its decision, the Council sticks to its mandate as defined by Facebook. It reviews the platform's decisions against the platform's own terms and conditions and against the international standards with which Facebook has chosen to comply. But this review remains very limited, confined to a "legal order" whose contours are defined by Facebook and into which no state law can penetrate. No constitutional or legislative norm is meant to be taken into account here, beyond Facebook's rules and very general international principles, about which we do not yet know whether they will really prevail over the platform's rules. Is all this sufficient to guarantee the rights of people operating on the network? While we can welcome Facebook's initiative and the way in which the Council intends, as best it can, to fulfill its mission, the fact remains that the platform is not a State. The quasi-judicial review mechanism set up here is exercised with regard to, and within the framework of, a private enterprise that is subject to no democratic control and does not answer for its acts before the people. From this point of view, it is difficult for a French observer to see in the Council's decision an equivalent of Marbury v. Madison, as some commentators have written. It remains to be seen whether, over time and within its limited scope, the Council will succeed in making fundamental guarantees of transparency, integrity and freedom prevail. That is the condition for the Council to be able to assert its authority and one day deal, as it wishes, with challenges brought before it concerning other social networks.

Finally, we must not forget that the legitimacy of the Council raises questions. It is made up of members appointed exclusively by Facebook, according to criteria of its own choosing, and paid, albeit indirectly, by the platform (through a trust in which Facebook has placed $130 million). In addition to this problem of independence, there is the question of the representativeness of the Board's members, who include a large (excessive?) number of Americans (five). Can we, under these conditions, hope that global principles on online speech will emerge, or will it simply be a matter of applying Western, and particularly American, conceptions of freedom of expression? One could just as easily see in this Oversight Board a body meant to reinforce Facebook in its original strategy, which is particularly favorable to freedom of expression. It is moreover this fear that led a group of experts to create an informal and competing body, the "Real Facebook Oversight Board," which includes, for example, former Estonian President Toomas Hendrik Ilves, Professor Shoshana Zuboff, and Derrick Johnson, President of the NAACP.

In reality, the importance attributed across the Atlantic to the Oversight Board and its quasi-judicial function derives largely from the legal void created by the inability of US users to invoke the First Amendment against moderation decisions or to hold the platforms liable, given the immunity guaranteed by Section 230 of the Communications Act of 1934. This void, which explains Facebook's stated objective of building a body of precedents (FOB Charter, Article 2, Section 2), is not felt in the same way in Europe and in France, where legislative texts currently in preparation are designed precisely to grant new guarantees to users. The proposed European Digital Services Act Regulation (Articles 17 and 18) and the French bill reinforcing respect for the principles of the Republic and combating separatism (Article 19 bis) strengthen the possibilities of internal and external appeals against moderation decisions, which may be brought before alternative dispute resolution bodies or before the courts. There is no doubt, moreover, that the decisions of the Oversight Board may, one day or another, be challenged before the competent national courts.

In any event, the ball is now in Facebook's court: by June 4 it must disclose its position on the Council's non-binding policy recommendations, and then, within six months, finally take its final decision on Trump's account.

