DCMS committee unhappy with social media’s approach to fake news

The hearing held by the UK's Digital, Culture, Media and Sport (DCMS) parliamentary committee on 8 February 2018 in Washington DC with executives from three of the main global social media platforms (Google, Twitter, Facebook) showed how much distance currently separates tech companies from regulators and policymakers when it comes to tackling "fake news", hate speech, and related illegal activities.

While all witnesses seemed to agree with the long-term objectives, it is clear that, on the one hand, companies are not yet doing all they can to discourage these practices, as their takedown processes are often slow and insufficiently transparent; on the other hand, policymakers have very high expectations of the platforms' ability to tackle the spread of false information, which can be inherently difficult to identify and block without hindering the right to free speech. Future activities are likely to focus on identifying fake accounts associated with the dissemination of false stories, and on tracking the money behind political advertising - a front on which the platforms questioned showed concerning weaknesses.

Google was criticised over its search engine's algorithm

The hearing started with representatives from Google and YouTube, and soon showed the committee's willingness to probe witnesses with thorough questions. Richard Gingras of Google argued that misinformation is an issue close to Google's heart - indeed, a crucial one for what it does, since retaining users' trust is central to Google's business model. Nonetheless, these reassurances were not enough to satisfy the committee, which argued that not enough is being done to tackle misinformation and hate speech. One example raised by a member of the committee concerned Google's search engine, which surfaces hate content when certain keywords are typed (e.g. "Jew"). Gingras reassured the committee that Google is looking into this through an extensive team that continues to train the system. However, he added, "this is a continuous effort" and no one can expect the algorithm to ever become perfect.

YouTube also faced strong criticism. While Juniper Downs, YouTube's Global Head of Public Policy, maintained there was no evidence of Russian advertising on the platform aimed at influencing the UK elections, she could not dispel doubts about the presence of content with the same aim - though she committed to cooperating on this front.

The committee also pointed out that takedown processes seem to work very efficiently against copyright breaches, whereas they are far slower when it comes to spam and abuse. Prompted on this point, Downs said YouTube will make this "a top priority" this year by investing tens of millions of dollars, but failed to say what percentage of YouTube's advertising revenues goes toward it.

Facebook’s monitoring of sources of political advertising should improve

Facebook's appearance was possibly the most heated of the meeting. The committee questioned the social network's takedown process, particularly with regard to the US election and, more specifically, the way Facebook has dealt with accounts purchasing political advertising. In the committee's view, checks are only carried out after the ads have been purchased, and are not robust enough to prevent illegal activities, including, for example, money laundering.

The committee was particularly dissatisfied with the answers given by Simon Milner, UK and MEA Policy Director, when asked whether Facebook could identify whether political ads had been paid for by foreign persons (which would be illegal in the UK). Milner replied that Facebook could identify who bought a specific ad space, but could not track where the money came from. In a heated exchange, the committee concluded that Facebook should do much more on this front, at least by sharing information about purchasers of political advertising with the relevant authorities.

Twitter was accused of not doing enough against hate speech

Despite being the smallest of the three social media platforms questioned, Twitter could not avoid criticism similar to that faced by Google and Facebook during the meeting. Nick Pickles, Twitter's UK Head of Public Policy and Philanthropy, argued that it is not accurate to portray the company as "unregulated" (given that it is still subject to existing regulations on hate speech and defamation), and that he does not see tech companies as responsible for taking down content and information based on what is true. Pickles also noted that the aim should be to elevate "credible voices" rather than shut down accounts, and that Twitter cannot take down accounts just because they are bots: many bots operate in a perfectly legitimate way and provide useful information to their followers.

However, the committee maintained that Twitter's standards are still "inadequate", pointing to the recent example of US President Donald Trump retweeting content published by a racist UK political group.