How to develop accountability and transparency recommendations for under-scrutinized digital platforms
Editor's Note: This blog post was authored by Farzaneh Badiei, founder of Digital Medusa.
The policies and products of major tech platforms such as Facebook, Instagram, Twitter, and YouTube receive a significant amount of attention and engagement from researchers, journalists, and civil society organizations alike. The National Democratic Institute (NDI) has previously engaged these platforms to recommend interventions for ending online violence against women in politics and to advocate for robust data access for non-academic researchers, among other topics. However, there are other digital platforms—including those that might be smaller in scale, are commonly used in only a few countries or by specific communities, or are relatively new to the market—that are also important to political processes around the world.
NDI is exploring how lessons learned from engagement with the aforementioned “legacy” platforms can inform recommendations to help other platforms ensure their policies and products make a positive impact for democracy. As the larger, better-resourced platforms walk back their commitments to protecting users from disinformation and online harassment, advocacy to encourage “alternative” or “emerging” platforms to uphold (or even factor into their design) the Democratic Principles for the Information Space is more important than ever.
NDI recently organized a roundtable discussion with civil society representatives and researchers to gather feedback about the risks and opportunities these platforms present in diverse contexts, including during pivotal democratic moments such as elections.
Which platforms should we focus on?
During the discussion, participants generally agreed there is significant value in dedicating time and resources toward researching and engaging with under-scrutinized platforms. However, the group grappled with which platforms to prioritize and how to develop terminology for talking about these platforms that is inclusive but not overly broad. NDI distributed a survey prior to the roundtable that asked respondents about their use of a range of platforms: audio-based apps such as Clubhouse and Discord, apps with primary user bases in one country or region such as Line and KakaoTalk, recently developed apps such as BeReal and Lemon8, encrypted messaging apps like Telegram, and widely popular but relatively new (compared to legacy platforms) apps like TikTok. These platforms differ significantly in their user bases, longevity, and primary functions, which makes assessing them as a whole all the more challenging.
The terms “alternative” and “emerging” were considered as potential classifiers, but not all of these platforms are “emerging” in the sense that they are new to the market or even rising in popularity, and a platform that is “alternative” in one context may be mainstream in another. The majority of these apps are social media or communication platforms, but participants also considered how other digital products like cloud services could be used in contexts where access to these platforms is restricted. Though no consensus was reached on the scope of platforms under consideration or the best terminology to use, it was evident throughout the discussion that any recommendations attempting to target a variety of platforms should be appropriately nuanced to facilitate adoption across a range of contexts.
How should we engage with these platforms?
One characteristic that unifies these platforms is their relative inexperience in building systems and policies compared to the legacy platforms, as well as a lack of diverse regional expertise (though the regional expertise of legacy platforms arguably leaves much to be desired). Channeling engagement through coalitions may be a useful strategy, as these platforms’ capacity to engage with civil society organizations and researchers around the world may be limited. Established trust and safety associations, such as the Digital Trust & Safety Partnership, the Trust & Safety Professional Association, and the Integrity Institute, offer different models for information sharing and collective action. Some coalitions may facilitate direct participation from platforms themselves, though the willingness of platforms to voluntarily commit to engagement may vary depending on the platforms’ resources and the political context in the country where the platform is based. Connections between a platform and government authorities may also shape how the platform approaches engagement with civil society and researchers on topics like content moderation, data privacy, and election integrity policies.
Different modalities of engagement will likely be required depending on a platform’s user base (whether national, regional, or global), whether a platform’s moderation teams are open to having discussions about identified threats, and the existing rules a platform has in place. A decision tree may be a useful tool to help civil society organizations determine which method of engagement is most effective and which recommendations to prioritize in advocacy to a given platform.
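The decision tree suggested above could be prototyped in code. The sketch below is purely illustrative: the attribute names, categories, and recommended engagement modes are hypothetical placeholders, not criteria NDI or the roundtable participants have endorsed.

```python
from dataclasses import dataclass

@dataclass
class Platform:
    """Illustrative platform attributes; all categories are hypothetical."""
    user_base: str             # "national", "regional", or "global"
    moderation_dialogue: bool  # are moderation teams open to discussing threats?
    has_public_rules: bool     # has the platform published its own rules?

def engagement_path(p: Platform) -> str:
    """Walk a simple decision tree and return a suggested engagement mode."""
    if p.moderation_dialogue:
        # A direct channel exists: tailor the approach to the platform's reach.
        if p.user_base == "global":
            return "direct engagement via an international coalition"
        return "direct engagement via local or regional civil society"
    # No dialogue channel: fall back to indirect pressure.
    if p.has_public_rules:
        return "public accountability advocacy citing the platform's own rules"
    return "coalition advocacy for baseline policy commitments"
```

A real tool would need far richer branches (political context, platform resources, regulatory exposure), but even a minimal tree like this can make prioritization criteria explicit and debatable.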
How can we incentivize platforms to take up recommendations proposed by civil society?
In addition to direct engagement with platforms, the roundtable participants also considered other mechanisms to incentivize platforms to incorporate recommendations from civil society into their policies and products. For example, pressuring investors to comply with international human rights standards could be an effective strategy for incentivizing smaller platforms funded by Silicon Valley venture capital groups. App stores and payment processors could also be potential levers for incentivizing platforms to take certain actions, but there is a risk of app stores arbitrarily blocking apps (including in compliance with government requests) without transparency around the decision. Litigation against platforms is becoming increasingly common, but may be abused in illiberal contexts to entrench state power by imposing restrictions on free expression.
Platforms are not immune to misuse and abuse just because they have a smaller user base or have not received as much attention from the international research community. After Meta's Oversight Board recommended that the Cambodian Prime Minister's Facebook account be suspended for posting a video threatening his political opponents with violence, the Prime Minister announced he would be leaving the platform, relying instead on Telegram to share his message (the Prime Minister also has a TikTok account). Tech companies of all shapes and sizes need to be prepared to mitigate the risk of bad actors and harmful content migrating to their platforms. NDI will leverage insights from this roundtable discussion, one-on-one conversations with relevant stakeholders, and desk research as it continues to refine its approach to these important questions.