The Democracy and Digital Technologies Committee has published the findings of its inquiry into digital technology and trust, along with a set of proposed remedies.
A year-long inquiry comes to an end: In June 2019, the Committee on Democracy and Digital Technologies of the UK House of Lords launched an inquiry into democracy and digital technologies. The inquiry gathered evidence on the ways in which social media platforms’ algorithms have shaped democratic debate; on the role education should play in a healthy, digitally literate democracy; and on the need for greater transparency in online political campaigning. Other issues included whether organisations are deliberately undermining trust in democracy through social media, and what steps can be taken to reduce the impact of misinformation online.
The inquiry found a worrying lack of trust: On 29 June 2020, the Committee published its report on the inquiry, finding democracy to be under threat from a ‘pandemic of misinformation’ online. People no longer trust the information they receive, partly because of the unchecked power of digital platforms. To address this, the report calls on the Government to ensure that tech companies are held accountable for online harms to individuals and to wider society.
The proposals: The report includes a set of no fewer than 45 recommendations. In particular, it calls on the Government to introduce its draft Online Harms Bill and to clarify that disinformation and misinformation fall within its scope. Ofcom should have the power to hold digital platforms legally responsible for content they show to large audiences. Political advertising should be brought into line with other advertising through a requirement for truth and accuracy, with the Advertising Standards Authority involved in overseeing a code of practice and removing any advertising that breaches it. There should be real-time databases of political advertising on online platforms, and an increase in the fines the Electoral Commission can impose on campaigners. Finally, the report proposes the introduction of a digital ombudsman for content moderation, as a point of appeal for people who want to challenge companies’ decisions to take down content, or their failure to do so.