UK Government aims to lead efforts in tackling online harm

The UK Government recently published a white paper setting out a broad set of proposals to tackle online harm. These include a new regulatory framework that would, for the first time, establish a duty of care for online companies, and the creation of a dedicated regulator to enforce the new rules. While regulation of internet companies has been on the horizon for some time, the UK is the first country to propose such a comprehensive and detailed framework; to this end, the Government appears to have listened to the concerns raised by parliamentary committee inquiries (the DCMS Committee’s inquiry on fake news, and the Lords’ Technology Committee on internet regulation, in particular). A consultation runs until 1 July 2019, after which the Government will develop a final legislative proposal.

Platform liability is increasingly likely

One of the key starting points of the Government’s white paper is that existing voluntary initiatives to address online harm do not go far or fast enough. Other countries and institutions are developing regulatory approaches to online harms, as Assembly’s Platforms and Big Tech Tracker shows, but none has yet established a framework as broad as the UK Government’s proposals; the Government now aims for the UK to lead internationally on this front. The paper sets out a vision in which a “free, open and secure internet” exists alongside rules that discourage harmful behaviour, and in which citizens understand the risks of online activity and challenge unacceptable behaviour. It also calls for a “global coalition of countries” to take coordinated steps on online safety.

As a result, the white paper proposes a new statutory duty of care to make internet companies take more responsibility for online harms related to their services. The scope of the new framework would cover any company that allows users to share or discover user-generated content, or enables users to interact with each other online, regardless of size; it will be up to a new regulator to take a ‘risk-based and proportionate’ approach. All companies will be required to fulfil their duty of care and comply with information requests; the Government is consulting on ways to differentiate regulation for ‘private communications’, which would be exempt from requirements to scan or monitor content. Companies will have to demonstrate they are fulfilling their duty of care; the new regulator will set out how to do this through codes of practice. Businesses will still be able to comply in ways other than those set out in the codes, but will have to explain and justify how their alternative approach delivers similar or greater benefit. The regulator will also ensure that companies grant independent researchers access to their data, subject to appropriate safeguards, and that effective and easily accessible complaint mechanisms are implemented, with companies required to respond to complaints within an appropriate timeframe. On top of that, the Government’s consultation seeks input on an independent review mechanism to ensure that users’ concerns are treated fairly. One option is the designation of bodies empowered to make ‘super complaints’ to the new regulator, to defend users effectively when necessary.

A new regulator would oversee internet companies, with strong enforcement powers

The task of implementing and overseeing the regulatory framework would be assigned to an independent regulator. It is currently unclear whether this would be an existing body or a newly created agency; the Government’s consultation seeks input on this point. It has been reported that Ofcom could initially be in charge of implementing the framework, with a new body set up at a later stage. The regulator will need to be equipped with the necessary resources and expertise; it will work closely with UK Research and Innovation (UKRI) and other partners to improve its evidence base, and will be legally required to pay due regard to innovation, users’ rights, privacy, and freedom of expression. The Government is explicit that the regulator will not be responsible for ‘policing truth and accuracy’; in other words, it will not engage in combating fake news. The body will be funded by industry; the Government is exploring options such as fees, charges or a levy to put it on a sustainable footing. This could fund the full range of the regulator’s activity, including producing codes of practice, enforcing the duty of care, preparing transparency reports, and any education and awareness activities.

The Government is determined to ensure the regulator’s powers can be effectively exercised by giving it strong enforcement capabilities. These will include substantial fines, as well as powers to ‘disrupt the business activities’ of non-compliant companies, such as blocking non-compliant services and establishing personal liability for individual members of a firm’s senior management. The regulator will also be required to ensure a level playing field between UK-based companies and foreign firms. Finally, the framework will aim to increase firms’ responsibility while remaining compatible with the EU’s e-Commerce Directive, which limits their liability for illegal content unless they have knowledge of its existence and have failed to remove it from their services in good time. At this stage, the Government does not specify whether this would change once the UK leaves the European Union; it can be assumed the Government aims to retain regulatory consistency with the EU on this front, to avoid fragmentation and additional costs for business.

Technology and media literacy will be part of the mix as non-regulatory solutions

The Government’s white paper highlights the role technology can play in ensuring online safety; it wants the UK to be a world leader on this front, and to ensure companies of all sizes have access to innovative solutions. Investment is already being made in several projects, such as the GovTech Catalyst scheme and its ‘challenge fund’, which has helped develop technology to detect online terrorist propaganda. This year, the leading proposals will receive up to £500k to develop and test a prototype. The Government is also investing £300k to fund up to five innovative projects to disrupt child abuse online. Once the new regulator is set up, it will use its position to encourage the development of new technologies and the sharing of best practice among companies in the market. In the meantime, the Government plans to work with industry bodies (Tech Nation and Tech UK, among others) to promote innovation and the scale-up of safety products; it will also help create a ‘Safety by Design’ framework to help companies incorporate safety into the development of their services. The Government envisages a role for AI in analysing and countering hate speech, alongside trained moderators.

Alongside technology, the white paper sees a key role for improved digital literacy in empowering users against online harms. The Government recognises that current support is insufficient, and pledges to develop a new online media literacy strategy. Recent independent reports have also highlighted the specific need for improved digital literacy, including the DCMS Select Committee’s report into disinformation and the Cairncross Review on a sustainable future for journalism. The Government foresees setting the strategy against four key objectives: improving user resilience against disinformation; equipping people to recognise and tackle deceptive behaviour online; including people with disabilities in digital literacy education; and developing approaches to tackle online violence against women. The Government notes it has already spent £1m in 2018–19 on two initiatives (the ‘RESIST’ toolkit and a public campaign against disinformation). The UK Council for Child Internet Safety also found that children want more education about online safety, and more support from tech companies. As part of the new transparency obligations they could face, companies will also have to report to the regulator on their activities to raise awareness and improve education.