The Gambia v Facebook: Obtaining Evidence for Use at the International Court of Justice (Part I)

On September 22, 2021, a US magistrate judge ordered Facebook to disclose materials relating to the propagation of ethnic hatred against the Rohingya Muslim minority in Myanmar. The application was made by The Gambia, which seeks further evidence to support its claims in the pending action against Myanmar at the International Court of Justice (ICJ). The Facebook decision is relevant to The Gambia’s efforts to establish genocidal intent at the ICJ, as well as to broader debates about free expression, privacy, access to information, and the rights and obligations of social media companies. Whether the decision will survive further legal challenge from Facebook, in whole or in part, remains a significant question.

Why did The Gambia sue Facebook?

The Gambia initiated the ICJ proceedings against Myanmar in November 2019. The Gambia claims that Myanmar is in breach of its obligations under the 1948 Convention on the Prevention and Punishment of the Crime of Genocide because of its treatment of the Rohingya minority, including violent military campaigns in 2016 and 2017 that caused as many as 10,000 deaths and forced more than 700,000 people to flee to Bangladesh. Prior to the filing of the ICJ case, the Independent International Fact-Finding Mission on Myanmar reported that the Rohingya in Myanmar had suffered gross human rights violations—including torture, sexual assault, and killings—and that reasonable grounds supported an inference of genocidal intent (2018 Detailed Findings, paras 1440-41). In January 2020, the ICJ indicated provisional measures requiring Myanmar to prevent the commission of genocidal acts against the Rohingya and to prevent the destruction of evidence (for analysis see here).

The connections between Facebook and anti-Rohingya hate speech in Myanmar are well documented (see here and here). The UN Fact-Finding Mission paid considerable attention to how Facebook was used by state and non-state actors to promote hateful and violent rhetoric (paras 1342-1354). Following Facebook’s own investigation into the use of the platform in Myanmar, the company announced in August 2018 the removal of 425 Facebook pages, 17 Facebook groups, 135 Facebook accounts, and 15 Instagram accounts for engaging in ‘coordinated inauthentic behavior’. It also banned 20 individuals and entities from Facebook, including Senior General Min Aung Hlaing (who later led the February 2021 military coup) and the military-controlled Myawady television network. Facebook conceded that it had been ‘too slow to act’ but asserted that it was making progress with ‘better technology to identify hate speech, improved reporting tools, and more people to review content’ (on Facebook’s belated turn to content moderation in Myanmar, see here).

Facebook’s removal (or ‘deletion’) of these accounts and pages made them inaccessible to Facebook users and the general public, but Facebook retained the material internally. This set the stage for The Gambia’s litigation against Facebook in US federal court.

The Dispute Surrounding The Gambia’s Request

On June 5, 2020, The Gambia filed a request pursuant to 28 USC §1782 in the District Court for the District of Columbia. That statute authorizes a US federal court to order testimony or the production of documents ‘for use in a proceeding in a foreign or international tribunal’ pursuant to a request by such a tribunal or ‘any interested person’. As a party to the proceedings at the ICJ, The Gambia was an ‘interested person’. Its request covered (1) public and private communications associated with the deleted content (which it limited to material containing hate speech or violent content); (2) documents associated with Facebook’s internal investigation that explain how Facebook identified the deleted content; and (3) a Rule 30(b)(6) deposition of a Facebook representative.

Facebook opposed the request. First, Facebook asserted that the Stored Communications Act (SCA) (18 USC §2702) barred it from disclosing the material. Secondly, Facebook argued that the request was overly burdensome and that the information could be sought through other channels, such that the court should exercise its discretion to deny it.

The Stored Communications Act and the Non-Disclosure Obligation

The SCA stipulates that an entity providing an electronic communication service to the public (a description that covers Facebook) ‘shall not knowingly divulge to any person or entity the contents of a communication while in electronic storage by that service’. As Orin Kerr has explained, the SCA ‘creates a set of Fourth Amendment-like privacy protections by statute, regulating the relationship between government investigators and service providers in possession of users’ private information’ (p. 1212). However, the SCA was enacted in 1986, well before web browsers and social media became ubiquitous, which can make its application to the content moderation decisions of social media companies an awkward fit.

A key question was whether the documents and communications linked to the specific Facebook accounts that Facebook had removed from the platform were ‘in electronic storage’ and therefore subject to the SCA’s non-disclosure rule. The statute defines ‘electronic storage’ to cover temporary, intermediate storage incidental to transmission and storage ‘for purposes of backup protection’ (18 USC §2510(17)). Specifically, the court focused on whether content that a service provider deletes (that is, makes ‘permanently unavailable to the user’)—but nonetheless retains—is held in ‘backup storage’ (Order, p. 13). The court found that it is not. In particular, the court reasoned that ‘backup storage’ presupposes that a corresponding ‘original’ version exists; a file copy maintained for some other purpose, in the absence of the original, serves no backup function. Because Facebook had deleted the content in question from the site, ‘no back-up copy can exist’ and the non-disclosure rule no longer applied (p. 14).

The potential implications of this interpretation of the statute seem astonishingly broad. As Orin Kerr put it on Twitter, the idea that SCA protections hinge on a provider’s ‘reason for storage of a file copy’ appears ‘wrong’ as a matter of law. It would suggest, for example, that a company providing e-mail services could unilaterally terminate a user’s account—for whatever reason—and then voluntarily disclose the contents of the account to law enforcement, a foreign government, or anyone else. Such a scenario runs counter to the idea that the SCA was enacted to promote the use of new technologies and electronic communications by ensuring that basic privacy protections would not be eviscerated by sending information via a third-party provider. From another angle, the court’s interpretation might disincentivize content moderation if the removal of odious content would expose providers to a higher volume of disclosure requests and court orders. Social media companies already face pressure to adopt a human rights-based approach to content moderation; the court’s decision in The Gambia v Facebook might inadvertently complicate that objective.

However, the court did seek to limit the potential scope of its ruling. First, the court emphasized that only content that has been permanently removed from an online platform may be divulged, not material that remains ‘in purgatory . . . de-platformed but not yet subject to a decision about permanent deletion’ (p. 14). This invites a ‘fact-intensive task of determining whether a provider has reached a final decision on de-platforming’ (p. 14). Exactly how such a standard could be applied in practice remains unclear. Secondly, the court pointed out that content deleted by the user of an electronic communications service—as opposed to provider-deleted content—would not be disclosable because ‘permanently deleted content by the user appears not to be retained by providers’. Accordingly, ‘whether user-deleted content is protected is likely a non-issue’ (p. 13). This is hardly an iron-clad assurance, and it raises an obvious question: what happens if user-deleted content is, contrary to the court’s speculation, retained by a provider?

Facebook had argued that a narrow interpretation of ‘backup storage’ would have ‘sweeping privacy implications’ (p. 18). Rather than addressing those broader concerns, the court described the privacy implications in the instant case as ‘minimal given the narrow category of requested content’ (p. 20). It emphasized that Facebook’s terms of service, which a user agrees to when setting up an account, provide that an account may be deleted if it violates those terms—which, on the court’s reasoning, places that account outside SCA protection. As the court put it, ‘Congress empowered [electronic communication services] to denature parts of the SCA’ (p. 19). Again, this raises questions. It seems to ignore the possibility that a provider might wrongfully delete a user’s account for a purported terms-of-service violation, whether inadvertently, by misjudgment, or for more nefarious reasons. The prospect that SCA protections simply cease to exist in those scenarios again runs counter to the idea that the non-disclosure rule was meant to protect the privacy interests of users. Notwithstanding the court’s efforts to limit the scope of its interpretation, the statutory basis for those suggested limits seems missing in action.

The ‘Lawful Consent’ Exception to the Non-Disclosure Rule

In any event, the court did not necessarily need to decide that ‘deleted-but-retained’ content falls outside the SCA’s non-disclosure rule. At least with respect to much of The Gambia’s request, the court could have relied solely on one of the statutory exceptions that permit disclosure. The Gambia invoked two such exceptions, but the court focused only on ‘lawful consent’: a provider ‘may divulge the contents of a communication . . . with the lawful consent of the originator’ (18 USC §2702(b)(3)).

First, the court addressed a threshold issue. Some US federal courts have held that the word ‘may’ in this provision means that disclosure of a communication falling within the exception remains ‘purely voluntary’ on the part of the provider (p. 20). The court rejected that view and found that such disclosure may also be compelled by court order. Given the conflicting case law, the court’s determination on this point—which, on its face, seems reasonable—is likely to be challenged on appeal.

Secondly, the court found that a user who posts a communication ‘with a reasonable basis for knowing that it will be available to the public’ implicitly consents to its disclosure, obviating the need to obtain express consent (p. 21). In the case at hand, ‘much of the content’ sought by The Gambia had been publicly accessible before its removal by Facebook, and Facebook agreed that it had discretion to divulge such ‘public content under the consent exception’ (p. 21). Indeed, Facebook has already been providing material on this basis to the Independent Investigative Mechanism for Myanmar (IIMM), a body established by the UN Human Rights Council (Transcript, Nov. 18, 2020, p. 66). One might wonder why Facebook and The Gambia were unable to reach agreement on a voluntary disclosure covering at least those posts and pages that were indisputably public before Facebook deleted them.

However, the classification of material posted on social media as ‘public’ or ‘private’ is not always straightforward. The court pointed out that some Facebook posts are ‘set to be public to anyone at any time, while others are shared only to members of a group or followers of a page’, and a private group may require that an administrator grant access (p. 21). The court reasoned, however, that if access to a private group on Facebook is granted automatically or if ‘private’ posts are ‘available to the general public’, such posts are ‘effectively public’ (p. 22). Courts must therefore (again!) conduct ‘a fact-intensive inquiry’ aimed at establishing whether a user intended a communication to be ‘public’ or ‘private’ in nature. This is another ambiguous and difficult-to-apply standard, and the decision did not discuss the extent to which a user’s subjective intent might be inferred from objective factors. As Alexa Koenig has explained, the court’s approach creates ‘a potentially slippery slope that could become dangerous if not carefully guarded’.

Applying this standard, the court found that while some of the material sought by The Gambia concerned pages that were ‘nominally private’, Myanmar officials had ‘intended their reach to be public, and in fact they reached an audience of nearly 12 million followers’ (p. 22). The court described this as ‘the rare case’ in which ‘the authors nakedly displayed their intent to reach the public’ (p. 22). As a result, Facebook could lawfully disclose the material sought by The Gambia on the basis of the lawful consent exception, except for user-to-user private messages (which The Gambia and Facebook agreed were beyond that exception).

Compared to the court’s narrow reading of ‘backup storage’ described above—with its broader potential privacy implications—the ‘lawful consent’ exception appeared to provide a firmer legal basis on which to order Facebook to disclose public (or quasi-public) communications that were deleted but retained. However, as noted above, this approach introduced its own set of problems and did not necessarily cover the entirety of The Gambia’s request (since it excludes user-to-user private messages).

The Court’s Discretion to Grant The Gambia’s Request

Having established that the SCA posed no bar to Facebook’s disclosure of deleted-but-retained material, the court considered whether to exercise its discretion to grant The Gambia’s request.

First, the court rejected Facebook’s argument that the request was overly burdensome. It noted that the request specifically identified seventeen individuals, four entities, and nine Facebook pages and was otherwise ‘limited to the information Facebook de-platformed’ following Facebook’s own investigation (p. 24). The court also rejected Facebook’s argument that disclosure should extend back only to 2016, agreeing instead with The Gambia’s request for material dating back to 2012. It also pointed out that Facebook has the necessary resources and expertise to identify materials within the deleted content that relate to hate speech or incitement to violence, even if doing so may pose challenges.

Secondly, the court considered whether the requested material was likely to be ‘useful’ in the ICJ proceedings. The court did not delve into the details of how a party might establish genocidal intent, but it concluded that evidence relating to what Facebook has described as a ‘coordinated campaign of hate against the Rohingya’ by Myanmar authorities went to questions that were ‘the gravamen of the ICJ inquiry’. Accordingly, the court deemed the records sought by The Gambia ‘highly relevant’ to the case (p. 27).

Thirdly, the court considered Facebook’s arguments that The Gambia should have pursued other channels, such as a request pursuant to a mutual legal assistance treaty, an executive agreement under the 2018 CLOUD Act, or cooperation with the IIMM. These arguments were a ‘non-starter’ for the court because §1782 does not include a ‘quasi-exhaustion requirement’ (p. 27). As for the IIMM, the court noted that Facebook had made only limited disclosures to that body, and The Gambia could not reasonably expect to obtain from the IIMM the same materials that it sought directly from Facebook.

As for The Gambia’s request for records relating to Facebook’s internal investigation, the court agreed that such records ‘will illuminate how Facebook connected the seemingly unrelated inauthentic accounts to Myanmar government officials’ and whether certain pages or accounts were operated by the same officials (p. 29). The theory here appeared to be that evidence demonstrating the concealed and coordinated propagation of anti-Rohingya hate speech or incitement to violence by state actors—including content and non-content metadata from the deleted Facebook pages and posts—will support the argument that genocidal intent is ‘the only inference that could reasonably be drawn’ (see Croatia v Serbia (2015), para 148) from the other evidence of the acts falling under Article II of the Genocide Convention. If seemingly innocuous Facebook pages with names such as ‘Beauty and Classic’, ‘Young Female Teachers’, ‘Let’s Laugh Casually’, and ‘We Love Myanmar’ (to name a few) were in fact surreptitiously controlled by the Myanmar military—and if Facebook established that such pages were engaged in ‘coordinated inauthentic behavior’ on the basis of the non-content metadata associated with those pages—the relevance of that information to The Gambia’s claims at the ICJ seems clear. The court therefore granted the request to compel Facebook to produce the documentation relating to Facebook’s own investigation, subject to the usual rules on privilege.

However, the court’s decision to grant The Gambia’s request for the investigation materials was undercut by its decision to deny the related request to depose a Facebook representative. The court viewed the deposition as ‘too much to demand’ because The Gambia could make sense of the documents that Facebook must now disclose by examining them itself (p. 29). This was surprising in light of the rest of the decision. While a Facebook witness might have little to add, it is not difficult to imagine that the investigation documents will contain gaps or references intelligible only to Facebook insiders, which a witness could be asked to address. The court may have feared that the opportunity to depose a witness would devolve into a more far-reaching examination of Facebook’s own role in the conflict, even though Facebook’s conduct is not at issue before the ICJ.

In sum, the court granted The Gambia’s request for ‘de-platformed content and related internal investigation documents’ and denied the request to depose a Facebook representative. Facebook’s concern had been that ‘the SCA prevented the requested disclosure’, but the court determined that this was not the case (p. 31). According to the court, Facebook has placed great importance on ‘remediation efforts for its role in what happened in Myanmar’. The decision now compels Facebook to make good on that commitment (p. 32).

[The second installment of this post will consider some lingering questions and further observations surrounding the court’s decision in The Gambia v Facebook.]
