Still flattening the curve?: Increased risk of digital authoritarianism after COVID-19

Image by Etienne Girardet used with an Unsplash licence.

The coronavirus pandemic has encouraged states to employ digital tools to monitor the spread of the virus and surveil public health data. Expanding state access and control over digital platforms quickly became the new normal, facing little to no objection because of pandemic anxiety. Although the expansion was a strategy to “flatten the curve,” it now carries the risk of digital authoritarianism since governments persist in deepening their digital control capabilities. 

One of the emerging risks we face now is state-led technological surveillance. Regardless of regime type, states employed a variety of surveillance tools during the pandemic. For example, police departments in many places, such as California, Florida, and New Jersey, used drones to watch people and warn them to follow physical distancing rules when necessary. The Israeli government likewise launched a contact tracing app, HaMagen (The Shield), to monitor whether or not users crossed paths with anyone who tested positive for COVID-19. While citizens voluntarily became users of these technologies for the sake of their health, they unwittingly opened the door to future human rights violations.

The main rationale for increasing state surveillance was to tackle the pandemic effectively and save people’s lives. Yet, states are not enthusiastic about abandoning these digital tools, even though the pandemic is winding down. Instead, they are determined to preserve their surveillance capacities under the pretext of national security or preparation for future pandemics. In the face of increasing state surveillance, however, we should thoroughly discuss the risk of digital authoritarianism and the possible use of surveillance technologies to violate privacy, silence political opposition, and oppress minorities. For example, South Korea’s sophisticated contact tracing technology, which draws on surveillance camera footage, cell-phone location data, and credit card purchases, has disclosed patients’ personal information, such as nationality. This raised privacy concerns, particularly for ethnic minorities, and underlined the risk of technology-enabled ethnic mapping and discrimination.

In 2020, UN experts released a statement on the privacy impacts of the pandemic, warning that “all-pervasive surveillance is no panacea for COVID-19.” According to the statement, the pervasive use of contact tracing technology during the pandemic paved the way for uncontrolled surveillance in many countries, which the document calls a disturbing trend. The UN statement also emphasizes the need for legal regulations to ensure that states employ surveillance technologies to an extent proportional to the pandemic situation in these words: “If a state decides that technological surveillance is necessary as a response to the global COVID-19 pandemic, it must make sure that, after proving both the necessity and proportionality of the specific measure, it has a law that explicitly provides for such surveillance measures. The law must include safeguards, which, if not spelled out in sufficient detail, cannot be considered adequate under international law.”

Access restriction and censorship are two other risks exacerbated throughout the pandemic. Governments often limited or blocked access to independent online news sources under the guise of preventing the spread of false news about the coronavirus. In many countries, while pro-government news outlets continued to spread false information about the pandemic, the governments detained people on spurious charges of misinformation. For example, in the early months of 2020, Vietnam initiated 654 legal cases and sanctioned 146 people, and Cambodia detained over 30 activists and opposition figures, for spreading false news. Citizen Lab’s research also points to state cooperation with technology firms for authoritarian purposes. The Lab’s research on how the Chinese government managed pandemic-related information on social media shows that China-based WeChat censored content, including criticism of the Chinese government’s policies on the pandemic.

In light of these incidents, we can conclude that digital repression peaked both in democracies and in nondemocracies with poor human rights records. In the face of state surveillance, control, and censorship, we need a solid plan to take our digital rights back and to develop proper political and legal regulations that protect them during future crises and emergencies. A key strategy for achieving this goal is political transparency.

Openness and transparency are two main pillars of democratic policymaking processes since they guarantee accountability and build trust between policymakers and the public. If we ensure transparent policymaking, we can prevent future violations before they happen. Another strategy is improving technological literacy to help citizens have knowledge of their digital rights. Despite their quick adoption of technological developments, most people do not have a clear understanding of their digital rights, the extent of state access to their data, or the ways to claim their digital rights. This lack of knowledge highlights the need for technological literacy for people of all ages and backgrounds. To that end, schools, research centers, civil society organizations, and volunteer groups can play a critical role by organizing training activities and public events and producing educational materials. Increased levels of technological literacy can also create public awareness to resist or lessen the degree of digital authoritarianism through civic action and initiatives.


Please visit the project page for more pieces from the Unfreedom Monitor.
