What we learned at RightsCon 2023: Centering Best Practices, Defending the Information Space, and Tackling AI

By Alexandra Smith | September 18, 2023

DemTech Director Moira Whelan joins a panel discussion at RightsCon

The spotlight at RightsCon 2023 was on the potential risks that emerging technologies pose to democracy, but we also want to reflect on other takeaways and lessons learned. Discussions were particularly illuminating on how to integrate new technologies, systems, and best practices into our work, with an emphasis on advancing democratic principles and strengthening resilience. Key topics included the human rights community's responsibility in supporting appropriate content moderation, the development of effective norms and tools to counter online violence against women in politics, and the strategic regulation of artificial intelligence to harness its full potential.

Here are some of our key takeaways:

The Human Rights Community’s Role in Content Moderation
In response to mounting concerns about information integrity on social media platforms, there is growing pressure on tech companies to establish user appeal mechanisms and fortify their defenses against harmful misinformation campaigns.

One particular focus at RightsCon was Meta’s Oversight Board, a body of 22 experts who review appeals of Facebook’s content decisions and make recommendations based on user input. With ongoing threats to information integrity and platforms abandoning best practices, RightsCon attendees, including the Center for Digital Resilience, PEN America, and the Digital Rights Foundation, along with Access Now, which hosted the event, presented emerging alternatives that let users report and identify disinformation, such as Meedan’s Check, which helps platforms collect flagged information and perform accuracy checks. Tech for Democracy, a Danish-led coalition that brings together governments, multilateral organizations, civil society actors, and tech industry leaders to develop content moderation strategies and promote multi-stakeholder collaboration, led a session on implementing this approach in Ukraine, including the work of the Atlantic Council’s Digital Forensic Research Lab, which has documented online Russian campaigns to undermine information integrity. Other sessions highlighted the importance of employing more human content moderators, since misinformation, hate speech, and other harmful content can often bypass automated systems and algorithmic checks.

Defending the Information Space for Women in Politics 

One issue that has received increasing attention over the last several years is technology-facilitated gender-based violence (TFGBV) and the threats that women face online. Speakers highlighted the work of Luchadoras, a feminist collective and digital advocacy group that provides resources for politically active women, who face higher levels of gender-based violence than their male counterparts. The UN Population Fund also hosted a session on ways to mitigate this violence, including the creation of data collection systems to track and analyze reports of TFGBV. Sara, another tool, uses artificial intelligence to provide resources to women facing TFGBV in Central America. Other sessions focused on the need for institutional structures to tackle harassment on Twitter, Facebook, Reddit, and other platforms.

Additionally, in light of the difficulties of regulating online platforms, Chatham House and Global Partners Digital convened a panel on how best to tackle human rights issues in an internationally fragmented regulatory space. The panel reinforced the need to build on existing frameworks like the EU’s Digital Services Act and to adapt legislation to local contexts.

Activists and speakers at RightsCon elevated many of the issues that complicate this mission, including the intersections of gender, digital accessibility, and regulation that shape how approaches to online violence are formulated. With women all over the globe reporting harassment on social media platforms, speakers highlighted how activists, policymakers, and tech companies can tackle online violence and ensure a safe digital space for all.

Capitalizing on AI Opportunities and Regulating AI Threats

With new and emerging technologies increasingly taking center stage and demanding regulation, several panels focused on the need for international governance and collaboration on regulating artificial intelligence (AI). The panels highlighted the challenges AI poses for democracy advocates and citizens, ranging from data protection to disinformation and deepfakes, as well as the threat that the growing use of AI poses to privacy. This conversation was part of a broader theme of reining in tech companies and ensuring platform accountability.

Over the course of the week, other sessions expanded on this point, focusing more specifically on the ways geopolitics will inform and influence the digital space and AI regulation in the years to come. With narratives of an emerging digital cold war and of the internet as the new frontier for political confrontation, panelists across sessions argued that increased attention to digital security and multi-stakeholder approaches centered on establishing international norms and practices will be necessary. As governments attempt to address the threats posed by AI, it is vital to increase international cooperation, coordinate best practices, and establish dialogue among stakeholders on how to protect citizens and capitalize on opportunities.

With the tech and democracy space at a critical juncture, the conference provided insights into how activists and practitioners can utilize digital tools for good and reform existing frameworks to improve the accessibility, safety, and usability of online platforms. NDI is committed to operationalizing these insights and collaborating with other stakeholders to create a safer, more inclusive digital space.
