The regulator will monitor how online platforms fulfil the statutory duty of care that new legislation will impose on them.
Background: The UK Government has been planning legislation in this area for some time. Following the recent general election, the Queen’s Speech setting out the executive’s work programme included legislation to “improve internet safety for all”. This followed on from the Online Harms White Paper published by the Government in April 2019, which proposed a new duty of care owed by online companies to end users, and the establishment of an independent regulator to oversee the framework.
Ofcom expected to gain new powers: On 12 February 2020, the Government published its initial response to the consultation on the White Paper. The response lacks detail on many important areas of the legislation the Government intends to pass. However, the Government says it is minded to empower Ofcom to oversee the framework rather than setting up a new agency. The regulator will ensure companies have the systems and processes in place to remove harmful content and minimise the risk of it appearing online.
A light-touch approach could be on the cards: In its initial response, the Government gives little indication of what the duty of care will look like, or of the penalties companies will face if non-compliant. However, the Government says it will require companies to state what content and behaviour are acceptable and to enforce those standards consistently on their platforms – something social media services have largely had in place for some time. It is therefore unsurprising that the initial responses from Facebook and Google were reminders of the steps those companies have already taken to remove harmful content and prevent it from appearing. The Government aims to publish a full response, with more detail on the future legislation, in the next few months.