Scale efforts to detect and curtail abuse at its source

Though increased capacity for content-level moderation is important, it is not enough to deter and root out coordinated abuse. Bad actors spend months, if not years, building networks of online assets, including accounts, pages and groups, that allow them to manipulate the conversation. These inauthentic networks continue to pose a major risk in places like Myanmar and are responsible for the overwhelming majority of problematic content.

  • Social media companies should invest significantly more resources in the detection and investigation of coordinated inauthentic behaviour. They should drastically expand their investigation teams and prioritise the development of detection systems that can help trigger such investigations (see the sketch after this list).
  • Social media companies should consider expanding their security bounty model to encompass the detection of information operations and of new means of weaponising their platforms.
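
A common starting point for such detection systems is to look for accounts that repeatedly post identical or near-identical content within short windows of one another. The sketch below is a minimal, hypothetical illustration of that idea in Python; the data format, thresholds and function names are assumptions for illustration, not any platform's actual pipeline.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical post records: (account_id, text, unix_timestamp).
# A real pipeline would normalise text and use fuzzy matching.
posts = [
    ("acct_a", "Vote no on the referendum!", 1_600_000_000),
    ("acct_b", "Vote no on the referendum!", 1_600_000_045),
    ("acct_c", "Vote no on the referendum!", 1_600_000_090),
    ("acct_d", "Lovely weather today",       1_600_000_100),
]

WINDOW_SECONDS = 300  # assumed threshold: identical posts within five minutes
MIN_CLUSTER = 3       # assumed threshold: at least three accounts acting together

def coordinated_clusters(posts, window=WINDOW_SECONDS, min_cluster=MIN_CLUSTER):
    """Return (text, accounts) pairs where `min_cluster` or more accounts
    posted identical text within `window` seconds of each other."""
    by_text = defaultdict(list)
    for account, text, ts in posts:
        by_text[text].append((account, ts))

    flagged = []
    for text, entries in by_text.items():
        cluster = set()
        for (a1, t1), (a2, t2) in combinations(entries, 2):
            if abs(t1 - t2) <= window:
                cluster.update((a1, a2))
        if len(cluster) >= min_cluster:
            flagged.append((text, cluster))
    return flagged

for text, accounts in coordinated_clusters(posts):
    print(f"Possible coordination on {text!r}: {sorted(accounts)}")
```

Real systems would add network signals such as shared infrastructure, account creation dates and audience overlap, but even a crude signal like this can usefully prioritise cases for human investigation.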

Preserve evidence of abuse

Although it is now widely accepted that coordinated efforts to weaponise the Facebook platform for political ends took place in Myanmar, independent researchers and accountability bodies remain heavily constrained in understanding the full scope and scale of the operations, because both Facebook and the actors themselves continue to remove the relevant assets and content.

  • Social media companies should consider retaining all content removed for policy violations for a period of at least two years (one hypothetical record format is sketched after this list).
  • Social media companies should further consider making available to researchers and accountability bodies all public assets, such as accounts, pages and groups, identified as being part of a coordinated inauthentic behaviour operation, including content that was previously removed for other policy violations.
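
To make the retention recommendation concrete, the sketch below shows one hypothetical shape such a preserved-takedown record could take. Every field name and the `PreservedTakedown` class are assumptions about what researchers would need, not an existing platform schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from typing import Optional

RETENTION_PERIOD = timedelta(days=730)  # the recommended minimum of two years

@dataclass
class PreservedTakedown:
    """Hypothetical record kept after content is removed for a policy violation."""
    content_id: str
    asset_id: str                  # the account, page, or group that posted it
    content_snapshot: str          # removed text, or a pointer to archived media
    policy_violated: str           # e.g. "hate speech"
    removed_at: datetime
    linked_operation: Optional[str] = None  # set when the asset is tied to a CIB network
    prior_removals: list = field(default_factory=list)  # earlier takedowns on the same asset

    def eligible_for_deletion(self, now: datetime) -> bool:
        # Purge only once the retention window has elapsed.
        return now - self.removed_at > RETENTION_PERIOD

record = PreservedTakedown(
    content_id="c-001",
    asset_id="page-001",
    content_snapshot="(archived post text)",
    policy_violated="hate speech",
    removed_at=datetime(2018, 8, 1, tzinfo=timezone.utc),
    linked_operation="operation-x",  # hypothetical CIB network label
)
print(record.eligible_for_deletion(datetime.now(timezone.utc)))
```

Linking each record to a named operation is what would let researchers reconstruct content removed earlier for other policy violations, as the second recommendation asks.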

To civil society

Standardise, systematise and consolidate evidence collection

When presented with evidence of abuse in Myanmar from 2013 through early 2018, Facebook often dismissed concerns as anecdotal and defaulted to one-off fixes. The more civil society was able to consolidate evidence and show patterns, however, the better able it became to push for systemic solutions.

  • Wherever possible, civil society should standardise, systematise and consolidate its evidence collection. As a general rule, if a problem appears widespread or recurring, civil society should consider complementing qualitative approaches with quantitative ones. A growing body of work on methodologies and ways to structure data collection can be built on to shape such research (a brief quantitative example follows this list).
  • Civil society should also work together across countries to identify shared patterns, as well as differences in treatment across markets, and to engage in joint advocacy.
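
As one hedged illustration of what complementing qualitative approaches with quantitative ones can mean in practice, the sketch below tallies structured incident records by type and month and computes a simple enforcement-gap figure. The record fields, categories and dates are assumptions for illustration.

```python
from collections import Counter
from datetime import date

# Hypothetical standardised incident records collected by monitors.
incidents = [
    {"date": date(2017, 8, 3), "type": "hate speech", "reported": True, "actioned": False},
    {"date": date(2017, 8, 9), "type": "hate speech", "reported": True, "actioned": True},
    {"date": date(2017, 9, 1), "type": "incitement",  "reported": True, "actioned": False},
]

# Monthly counts turn scattered anecdotes into a visible pattern.
by_month = Counter((i["date"].strftime("%Y-%m"), i["type"]) for i in incidents)
for (month, kind), n in sorted(by_month.items()):
    print(f"{month}  {kind}: {n} incident(s)")

# Enforcement gap: share of reported incidents that were never actioned.
reported = [i for i in incidents if i["reported"]]
gap = sum(not i["actioned"] for i in reported) / len(reported)
print(f"Unactioned share of reported incidents: {gap:.0%}")
```

Even a tally this simple is the kind of consolidated, quantitative evidence that is harder to dismiss as anecdotal than individual screenshots.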

Focus on enforcement and not just policy

To date, much civil society criticism of social media has focused on questions of policy and bias: where should the line between hate speech and freedom of expression fall? To what extent are Facebook's decisions driven by its moderators' biases? Though these areas certainly require attention, it is important for civil society to also scrutinise the enforcement capacity of platforms like Facebook. Civil society's experience in Myanmar has shown that enforcement capacity varies significantly from country to country and language to language. For languages where Facebook has no text-based AI, for example, enforcement relies almost exclusively on reports to the company from users and civil society. The triaging of these reports can also be flawed, and the language-specific human capacity to review them is not always sufficient, which can result in widespread enforcement errors.

  • Civil society should advocate for improvements to enforcement, in terms of both resourcing and accuracy.
  • Civil society should develop technical expertise, and/or partnerships with technical experts, to be able to independently probe social media platforms as technical products made up of databases, workflows and algorithms. These technical aspects can differ greatly between contexts and do not always work adequately, or as intended (a minimal audit sketch follows this list).
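
One way groups with technical partners can probe enforcement independently is to log when a piece of public content is reported and periodically re-check whether it remains reachable. The sketch below is a minimal, hypothetical version of that audit loop; the URL, the dates and the use of an HTTP 200 response as a "still live" signal are simplifying assumptions.

```python
import time
from datetime import datetime, timezone

import requests  # third-party: pip install requests

# Hypothetical audit log: public post URLs that were reported, and when.
tracked = [
    {"url": "https://example.com/post/123",
     "reported_at": datetime(2018, 4, 2, tzinfo=timezone.utc)},
]

def still_live(url: str) -> bool:
    """Treat an HTTP 200 as 'still publicly reachable' (a simplification:
    platforms may geo-block, soft-delete, or require login instead)."""
    try:
        return requests.get(url, timeout=10).status_code == 200
    except requests.RequestException:
        return False

for item in tracked:
    if still_live(item["url"]):
        print(f"{item['url']} still live after report")
    else:
        latency = datetime.now(timezone.utc) - item["reported_at"]
        print(f"{item['url']} removed; report-to-removal latency ~ {latency}")
    time.sleep(1)  # be polite: rate-limit checks
```

Aggregating such report-to-removal latencies across languages would put numbers on the country-to-country enforcement gaps this section describes.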
 