Social Media Monitoring for Democracy (Part I): Getting Comfortable with the Concepts
Social media monitoring is an essential tactic for many pro-democracy initiatives. To find out what is being said on social media and by whom, detect coordinated information campaigns, and identify and respond to online threats in a timely manner, democratic actors need tools and methods for collecting and analyzing data from social media platforms.
DemTech Wrapped 2022
DemTech Wrapped 2022 is here. All year we’ve worked to make tech work for democracy. As we look forward to 2023, we wanted to share with you what we’ve been up to.
Weekly Roundup 6 July, 2022
The Taiwan National Communications Commission has approved the draft Digital Intermediary Service Act, legislation which would require platforms to improve content moderation, transparency, and localization.
Weekly Roundup 13 June, 2022
Hackers linked to the Chinese Communist Party are exploiting a recently discovered Microsoft vulnerability to target external dissident groups, most notably the Tibetan government in exile.
Weekly Roundup 31 May through 8 June, 2022
Persian-language content moderators for Instagram have come forward alleging that moderation of Persian-language content is biased due to pressure from the Iranian government.
Weekly Roundup 24 May, 2022
The UN Security Council (UNSC) convened a technology and security briefing on Monday of this week, focusing in particular on technology’s application to maintaining international peace and security.
Weekly Roundup 16 May, 2022
To address a spike in online child sexual abuse material (CSAM) during the pandemic, the European Commission has published a proposal that would allow courts to order internet service providers to block sites containing CSAM. The law would also require platforms to scan content to detect and remove CSAM, report annually on how much content they remove, and meet increased obligations to prevent online grooming.