
Facebook’s privacy practices will now face thorough scrutiny

What started as criticism of Facebook for not doing enough to tackle the spread of disinformation online is quickly escalating into inquiries about the way in which the company allows third parties to access its users’ data. Allegations that Facebook data were improperly obtained and used to profile approximately 50m users for targeted political campaigns have now prompted the Federal Trade Commission (FTC) to investigate the company’s privacy practices. At the moment, the investigation is 'non-public', though it could lead to further action. The Cambridge Analytica scandal also prompted the UK Parliament’s DCMS Committee to hear evidence from the whistleblower Christopher Wylie, whose remarks exposed the need for more powerful instruments in the hands of regulators.

For Facebook, the fake news issue is turning into a data protection issue

At the start of the debate around the spread of disinformation online, social media companies were criticised for not doing enough to tackle the problem. During 2017, legislators across the world set up committees and inquiries to gather information and identify next steps. The events of March 2018, particularly those involving Facebook and the use of its data for profiling purposes, are now likely to put the issue into a different, more comprehensive perspective. What was initially a matter of controlling disinformation and hate speech through instruments such as fact-checking and the removal of fake accounts is now also a matter of data protection.

Profiling for advertising purposes (including political advertising) is not a problem per se; however, transparency is required to ensure that end users know which data they are disclosing, to which organisations, and for which purposes. Where relevant, this also means that individuals must give consent in a meaningful way. To this end, the upcoming GDPR in the EU is likely to be sufficient to address the problem; legislation elsewhere, however, is likely to need strengthening - particularly in the US, where an overarching data protection framework is still lacking.

The scale of the potential violation means a fine running into trillions of dollars is theoretically possible

The FTC will now investigate whether Facebook has violated a settlement it reached with the regulator in 2011 (known as the 'consent decree'). Back then, the company agreed to settle the FTC’s charges that it had deceived consumers by telling them that they could keep their information on Facebook private, and then repeatedly allowing it to be shared and made public. The settlement required Facebook to live up to its promises thereafter, including by giving consumers clear and prominent notice and obtaining their express consent before their information is shared beyond the privacy settings they have established. If found in violation of the settlement, Facebook could face a fine of up to USD40,000 per violation; if the FTC were to find violations for most of the 50m users involved, the total could, in theory, run into trillions of dollars.
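To illustrate the order of magnitude, on the simplifying assumption that each affected user counts as a single violation: 50 million users × USD40,000 per violation = USD2 trillion.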

It will therefore be crucial for the company to avoid the fine, while at the same time taking steps to restore the confidence of both users and regulators. Facebook’s CEO Mark Zuckerberg has pledged to build a better Facebook, and has opened the door to some form of regulation. Legislators have been considering imposing rules for some time, and will no doubt jump at the opportunity now that the green light has come from the company itself; however, such regulation does not necessarily need to take the form of prescriptive rules. Given the fast pace at which technology evolves, well-monitored guidelines could be more effective and flexible, as Zuckerberg noted with regard to Facebook’s processes for tackling hate speech.

Regulators will need more powers and more expertise to tackle these problems effectively

However heavy fines may be, regulators still need more support if they are to enforce data protection rules effectively. This was one of the key remarks made by the Cambridge Analytica whistleblower Christopher Wylie during his hearing for the inquiry on fake news carried out by the UK Parliament’s DCMS Committee. He noted that the UK data protection authority, the ICO, needs more staff - and more technical staff in particular. This will be a crucial step in making sure that authorities have the instruments to investigate properly.

These regulators will also need more powers. Some of them will come as soon as the GDPR comes into force, as noted by the ICO’s chief, Elizabeth Denham, in a previous hearing of the same inquiry; others might require a reform of the processes for obtaining search warrants. Strikingly, it took several days for the ICO to obtain and execute a warrant against Cambridge Analytica, after an initial request for access addressed directly to the company was not satisfied.