Facebook's secret blacklist of Dangerous Individuals and Organizations (DIO) was leaked on October 12, 2021. Among the 4,000 banned entities on the list, which includes hate groups, criminals, and terrorist organizations, is the Oromo Liberation Army (OLA), which Facebook categorized as a “violent Non-state actor.”
Many Oromo Facebook users claim that Facebook is also censoring and silencing Oromo activists who are not members of the OLA. Some say that they cannot even mention the group's name on the platform. Facebook's categorization of the OLA as a dangerous group has made many Ethiopian Oromo Facebook users feel marginalized.
Ethiopia is composed of several ethnolinguistic communities and the Oromo are the largest. According to research, the Oromos have long considered themselves a marginalized group, sidelined from politics and the economy despite their numerical majority.
The OLA is a militant splinter group of the Oromo Liberation Front (OLF) that is fighting the Ethiopian government. The group has been accused of human rights violations and massacres targeting the Amhara in Oromia. The OLA has denied the accusations and welcomed an independent investigation into the atrocities it is blamed for.
Facebook's Dangerous Individuals and Organizations Policy
Facebook is the world's biggest social network platform, with roughly 2.89 billion monthly active users as of 2021. There were 7.4 million Facebook users in Ethiopia in September 2021, which is around 6.3 percent of its entire population.
The DIO policy is a recent addition to Facebook's Community Standards which officially outlines the types of posts allowed and not allowed on Facebook. The DIO policy prohibits praising, supporting, or representing individuals and organizations that Facebook considers dangerous. In its explanation about the DIO, Facebook said the policy is “an effort to prevent and disrupt real-world harm” and stated that it will not “allow organizations or individuals that proclaim a violent mission or are engaged in violence to have a presence on Facebook.”
Facebook also indicates that the entities will be assessed based on their online and offline behavior, including their ties to violence. According to the policy, the organization classification will be divided into three tiers. Facebook stated that Tier 1 will have the most extensive enforcement since Facebook believes the entities have the most direct ties to offline harm.
Civic organizations have undertaken efforts to moderate and monitor harmful Facebook activity. One example is the Oversight Board, a quasi-judicial body that reviews Facebook's content moderation decisions and calls for just and transparent moderation policies from Facebook.
Critics suggest that Facebook's policy is overly broad and opaque and, due to a lack of accountability and clear guidelines, discriminates against marginalized groups.
According to experts, with the advent of traditional media and then social media platforms like Facebook, TikTok, Twitter, and YouTube, there is a worldwide correlation between violent and hateful content online and offline violence.
A brief look at the publicly available reports of social networking platforms gives a global overview of this phenomenon and shows that it is becoming increasingly common. Hence, social networking platforms are devising policies and developing tools to reduce hateful and violent content on their platforms.
One of the recent developments in this regard is Facebook's Dangerous Individuals and Organizations (DIO) policy. One of its most prominent applications was against former US President Donald Trump in January 2021, after he was accused of inciting violence. The controversial decision to ban the former president has been the subject of much discussion among interested parties.
In August 2021, the media was flooded with news that social media giant Facebook was removing accounts maintained by or on behalf of the Taliban, and that the company would also prohibit praise, support, or representation of the Taliban. The policy has been criticized for being opaque, overbroad, and for targeting political speech by Muslim users. The Oversight Board also expressed concerns about the clarity of the policy.
The DIO in Ethiopia
Regarding Facebook's decision to designate the OLA as a dangerous organization, OLA international spokesperson Odaa Tarbii tweeted the following in Afaan Oromo (the Oromo language) on August 23, 2021, with a caption that can be translated into English as:
A lot of people asked me why the OLA does not have a proper Facebook page. The reason is that Facebook has designated the OLA under its “Dangerous Individuals and Organizations” list.
In the same tweet, he states that Facebook will prohibit praise, support, and representation of the OLA.
“Maaliif WBOn fuula Feesbuikii sirnaawaa hin qabu?” jedhanii namoonni gaafatan danuudha. Sababaan isaa Feesbuukiin WBO toora “Namootaa fi Jaarmayoota bala’amoo,” jedhuutti sirnaan farrajee waan jiruufi.
Issa armaan gadii dubbisaa. pic.twitter.com/RVBe6v5Mjo
— Odaa Tarbii (@OdaaTarbiiWBO) August 23, 2021
Though there have been no independent investigations into the violent incidents the OLA has been blamed for, Facebook believes the OLA poses a greater threat of real-world harm than any other armed political actor in Ethiopia.
Many Oromo citizens objected to the censorship of Oromo Facebook users. An example of such complaints is a Facebook post by Henok G. Gabisa, an Oromo activist with over 203,000 followers on Facebook, in which he expressed his concern about the knowledge and cultural gap between Oromo users and Facebook.
An Oromo advocacy group, the Oromo Legacy Leadership and Advocacy Association, recently claimed that Oromo activists are being targeted for making comments against Abiy Ahmed, the Ethiopian Prime Minister. The group also claims that Facebook is silencing Oromo activists while allowing government-sponsored misinformation to spread.
As critics say, the DIO policy raises more questions than it answers and has led to uneven application against various ethnicities. Critics call for greater explanation and transparency about how the policy is applied, including an explanation of how a targeted group poses a greater threat than other armed political actors in the country.
As an illustration, the allegations and evidence of atrocities against the warring parties in Ethiopia's Tigray conflict are more serious and substantiated than those against the OLA. The Tigray conflict has been declared a situation that “threatens the peace, security, and stability of Ethiopia and the greater Horn of Africa” by the United States, Facebook's home country. The parties to the conflict have also been accused by international organizations and the media of widespread violence, atrocities, the use of rape as a weapon of war, and other serious human rights violations.
The question of why Facebook chose to apply its DIO policy solely to the OLA and not to other armed political actors in Ethiopia remains unanswered. Many Ethiopians online are now questioning whether the policy is really aimed at preventing real-world harm. Does the OLA pose a greater threat of real-world harm in Ethiopia than entities already involved in the disastrous Tigray conflict?