The board, dubbed the ‘supreme court’, will make decisions about how content on the platform is moderated.
Background: In November 2018, Facebook’s CEO Mark Zuckerberg set out his initial vision of how content should be governed on the platform. One of the pillars of that vision was to establish independent governance and oversight, so that the company would no longer make decisions about safety and freedom of expression on its own.
What does it look like? Facebook has now disclosed more detail on the structure of the Oversight Board it wants to create, and has published a charter establishing its membership, governance, and decision-making authority. As a first step, the charter sets out guidelines on how to prioritise the most difficult cases for referral to the board, according to criteria of ‘significance’ (severity, scale, public discourse) and ‘difficulty’ (disputed, uncertain, competing). The charter also outlines a process for content referral, involving users, Facebook staff, and the board.
What next? This announcement is only the beginning, as Facebook still needs to select board members, finalise the bylaws to complement the charter, and set up a website for the board in multiple languages. The company plans to have the board ready to deliberate on its first cases by early 2020.
More resources against violence and extremism: Alongside the announcement of the new board, Facebook also shared updates on how it is tackling violent and terrorist content. The company has enhanced its AI-based detection systems, used in combination with human expertise, and is expanding the internal teams that deal with these issues.