Cambridge Analytica, GDPR - 1 year on - a lot of words and some action

The first half of 2018 saw two major privacy moments: in March, the Facebook/Cambridge Analytica scandal broke, followed in May by the EU General Data Protection Regulation ("GDPR") taking effect. The Cambridge Analytica scandal, as it has become known, grabbed the attention and outrage of the media, the public, parliamentarians and regulators around the world, demonstrating that yes, people do care about violations of their privacy and abuses of power. This scandal has been one of many that illustrate that privacy is also about the autonomy, dignity, and self-determination of people, and a necessary precondition for democracy. At the same time, GDPR, which was years in the making, finally took effect across the EU on 25 May 2018, bringing with it more stringent obligations for those using personal data and stronger rights for individuals, both within and outside the EU.

These two events have been a catalyst for debate regarding the lack of sufficient safeguards, oversight measures and enforcement to adequately protect our data from exploitation. 

A year on, the world - including regulators and legislatures - has begun to wake up to the nature and the scale of the problem and how to grapple with it.

To recap - The Facebook Cambridge Analytica scandal in March 2018

On March 17, 2018, the Guardian and New York Times simultaneously published stories exposing how the personal data of over 50 million Facebook users ended up in the hands of Cambridge Analytica, a company which then sought to increase support for the 2016 Trump presidential campaign. The company's work had been reported before (its use of Facebook data for US Senator Ted Cruz's campaign was covered in 2015), but the March revelations propelled the company to worldwide attention, perhaps due to the scale of the harvesting and its potential links with the 2016 Brexit referendum and the 2016 US presidential election.

Cambridge Analytica was a consulting and data analytics company funded by right-wing American billionaire Robert Mercer and headed by Steve Bannon, then executive chairman of Breitbart News, before he left to serve as chief executive of the 2016 Trump campaign. Reporting covered how Cambridge Analytica used data to profile and target individual voters with the aim of predicting and influencing their voting decisions, and further revealed that the company also supported the Brexit campaign in the UK. According to the Guardian and the New York Times, by late 2015 Facebook was aware that Cambridge Analytica had exploited its users' data, but it failed to inform the people affected and engaged in limited and ineffective efforts to recover the data. Facebook later admitted that the number of people affected was much higher than what the Guardian and New York Times had initially reported: the data of 87 million users had in fact been shared. The scandal came to light thanks to the persistence and dedication of a number of individuals, including investigative journalists such as Carole Cadwalladr, researchers, the whistleblower Christopher Wylie (a former employee of Cambridge Analytica), Shahmir Sanni (a volunteer with the Vote Leave campaign in the UK Brexit referendum), and David Carroll, a New York-based professor who has engaged in a lengthy battle to obtain his data from Cambridge Analytica.

The story that broke in March 2018 was neither the beginning nor the end: in the year since, more information, and more questions, have emerged.

Furthermore, Cambridge Analytica's role was by no means limited to the UK and US; it was involved in elections around the world. Privacy International had previously looked at the role of Cambridge Analytica in the Kenyan elections. As the revelations unfolded we published an update, and our Kenyan partner CIPIT further examined the role of the company.

How did PI respond?

The Facebook Cambridge Analytica scandal highlighted issues with data exploitation that Privacy International fights against. In the days following the Cambridge Analytica revelations, Privacy International highlighted that the companies involved are part of an industrial sector that exploits personal data and called on policy makers to move swiftly.

At the core of the scandal was a disregard for the protection of data, alongside concerns surrounding the profiling of individuals. Cambridge Analytica is by no means the only company that operates in the shadows: Tactical Tech has documented an entire industry in its project "Who's working for your vote?".

Over the past year we have used data protection law to investigate, and seek to hold to account, the linchpins of the advertising industry, namely data brokers and AdTech companies, submitting complaints to data protection authorities in the UK, France and Ireland. We are continuing to follow up with regulators and are pleased to see that our efforts, together with those of others, have helped put AdTech squarely on the agenda of data protection authorities.

We have continued to examine Facebook and the data it collects, including revealing the large-scale transfer of data from apps to Facebook (whether or not an individual has a Facebook account) and the use of Facebook and other data in the fintech sector, for example by Lenddo.

What has happened since?

In the year since the Guardian and New York Times broke this story, much has been said and some has been done. Cambridge Analytica went into administration, but its parent company SCL is still around and has been succeeded by Emerdata. Here is a by no means exhaustive snapshot of some of the responses from regulators, parliamentarians and Facebook. These developments are taking place in the context of a wider debate regarding misinformation, disinformation, electoral interference and more, which is prompting regulation and action around the world, not covered here.

Data Protection authorities are taking action

In the UK, the Information Commissioner's Office (ICO) was already conducting an investigation into data analytics for political purposes and, in response to the Cambridge Analytica scandal, eventually obtained a warrant to inspect the Cambridge Analytica premises. The challenges in responding immediately to the Cambridge Analytica reports led, in large part, to the ICO being granted new and stronger powers through the UK Data Protection Bill (now the Data Protection Act 2018) as it made its way through Parliament. In July 2018, the ICO published its report Democracy Disrupted. In its update on the investigation into data analytics in political campaigns, the ICO announced its intention to fine Facebook for lack of transparency and security issues relating to the harvesting of data, in breach of the Data Protection Act 1998 (the UK data protection law in place at the time). The ICO also issued an enforcement notice against AggregateIQ, requiring it to cease processing the personal data of UK or EU citizens obtained from UK political organisations or otherwise for the purposes of data analytics, political campaigning or any other advertising purpose. In the same report, the ICO announced its intention to bring criminal proceedings against SCL Elections, Cambridge Analytica's parent company, for failing to comply with an enforcement notice requiring it to deal properly with Professor David Carroll's request to access his data. Then in October 2018, the ICO fined Facebook £500,000 for breaching the UK's prior data protection law.

In discussing the numerous reasons for imposing the maximum fine, the ICO noted that "the personal information of at least one million UK users was among the harvested data and consequently put at risk of further misuse." This fine was the maximum allowable under the previous law; had its replacement, the GDPR, been in place at the time of the Cambridge Analytica data breach, the ICO could have fined Facebook 4% of the company's total worldwide annual turnover, which would have been over £1 billion. Facebook is currently appealing this fine. In November 2018, the ICO published its report to Parliament on the use of data analytics in political campaigns; among its findings, the ICO highlighted a disturbing disregard for voters' personal privacy by players across the political campaigning ecosystem, from data companies and data brokers to social media platforms, campaign groups and political parties. The report sets out that the ICO is continuing to investigate Cambridge Analytica and is analysing materials it has seized in the course of this investigation. In January 2019, the ICO fined Cambridge Analytica's parent company, SCL Elections, for failing to comply with an ICO enforcement notice, and in March 2019 it issued fines to Vote Leave.

The ICO is not the only DPA looking at this issue and others around the world have also investigated, for example:

Italy

Cambridge Analytica reportedly worked with "a resurgent Italian political party last successful in the 1980s". It was reported that of the 87 million Facebook users who had their data illicitly transferred and analysed by Cambridge Analytica, some 214,000 were Italians. In April 2018, both the Italian Data Protection Authority and the Antitrust Authority opened investigations into what exactly happened with the data, both in terms of individual privacy and "alleged improper commercial practices". In February 2019, the Italian DPA announced that it was ready to impose sanctions.

Germany

The Facebook-Cambridge Analytica scandal also reportedly affected approximately 300,000 German Facebook users. The Hamburg Commissioner for Data Protection and Freedom of Information, Johannes Caspar, was reported to have initiated legal proceedings against Facebook in April 2018 on the basis of the "collection of data without a legal basis". However, the proceedings were later closed because the claims were time-barred.

Canada

In April 2019, the Office of the Privacy Commissioner of Canada and the Information and Privacy Commissioner for British Columbia found that Facebook had violated Canada's privacy laws, as part of their report on their investigation into the Cambridge Analytica revelations. The Commissioners also highlighted the pressing need for legislative change to protect the rights of Canadians, as well as Facebook's refusal to address the deficiencies identified. They announced that the Office of the Privacy Commissioner of Canada plans to take the matter to Federal Court to seek an order forcing the company to correct its privacy practices.

Electoral laws need reform and authorities need more powers

As well as data protection law, electoral law is relevant to data exploitation in the electoral context. As with data protection law, there have been breaches, and questions as to whether the current legal frameworks are sufficient.

In the UK, the Electoral Commission investigated Vote Leave as well as campaign spending relating to Facebook and Cambridge Analytica services. In July 2018, the Electoral Commission determined that five payments that various Leave campaign groups made to a Canadian data analytics firm, AggregateIQ, violated campaign funding and spending laws. The Electoral Commission fined Vote Leave and referred it to the police for breaking electoral law; Vote Leave has since dropped its appeal against the fine. AggregateIQ has been linked to Cambridge Analytica: it had a contractual relationship with Cambridge Analytica's parent company in its work for the Leave campaign groups, and the Electoral Commission found that AggregateIQ and Facebook had "used identical target lists for Vote Leave and BeLeave ads."

The Electoral Commission has called for more powers to increase transparency and available sanctions in relation to digital campaigning. 

Politicians are angry

Politicians around the world have sought answers from Facebook, Cambridge Analytica and big tech more widely, being met largely with frustrating responses. The EU, in May 2018, finally pinned down Mark Zuckerberg. In November 2018, parliamentarians from across the world united to grill Facebook on its practices as part of an 'international grand committee' on disinformation and 'fake news'. In Canada, the Standing Committee on Access to Information, Privacy and Ethics has looked into and reported on the Cambridge Analytica scandal. The UK Parliament has recognised that Facebook has resisted efforts to expose and regulate it: through the Digital, Culture, Media, and Sport Committee, it scrutinised Facebook and other platforms to examine how people's privacy and political choices could be compromised by online disinformation and interference in the democratic election cycle. In February 2019, the Committee concluded that while "Facebook seems willing neither to be regulated nor scrutinised," "[c]ompanies like Facebook should not be allowed to behave like 'digital gangsters' in the online world, considering themselves to be ahead of and above the law."

There may be consequences in the US, too

Among other developments in the United States, the Department of Justice, the Federal Bureau of Investigation, the Securities and Exchange Commission, and the Federal Trade Commission have been examining Facebook's data sharing and protection practices. This includes a new investigation into the Cambridge Analytica revelations. On April 24, 2019, it was reported that Facebook expected to be fined up to $5 billion by the Federal Trade Commission for privacy violations, which would be a record fine imposed by the FTC against a technology company. Facebook disclosed this amount in its quarterly financial results, saying that it expected a one-time charge of $3 billion to $5 billion. Furthermore, federal prosecutors in California are still actively investigating the Facebook-Cambridge Analytica scandal.

Facebook fails again and again

Throughout the aftermath of this scandal, Facebook has been criticised for denying wrongdoing and seeking to deflect blame. The day before the initial Guardian article was published, Facebook threatened to sue the Guardian, a move that Facebook has since expressed regret over. In November 2018, the New York Times revealed that Facebook had hired a public relations firm to attack and discredit its critics by linking them to George Soros, cast criticism of the company as anti-Semitic, and shift attention to Facebook rivals such as Google. After these revelations were met with public outcry, Facebook terminated its relationship with the PR firm. The UK Parliament's Digital, Culture, Media, and Sport Committee accused Facebook co-founder and CEO Mark Zuckerberg of showing contempt towards the UK Parliament by failing to appear before the Committee, and noted that "Facebook used the strategy of sending witnesses who they said were the most appropriate representatives, yet had not been properly briefed on crucial issues, and could not or chose not to answer many of our questions. They then promised to follow up with letters, which—unsurprisingly—failed to address all of our questions."

Mark Zuckerberg announced in early March 2019 that Facebook would be building a “privacy-focussed messaging and social networking platform,” but he was criticised for failing to address whether the company would stop purchasing information from data brokers, collecting browsing data, collecting data from people who are not even on Facebook, and micro-targeting users. Facebook has yet to answer these questions. Calls by Zuckerberg for regulation to protect privacy and election integrity gave rise to some scepticism.

Next Steps

Privacy International is continuing to work to expose and challenge data exploitation, in political campaigning and advertising more generally. We will follow developments closely, push for the enforcement of existing legal protections and advocate for stronger protections where these are absent. We will demand that those involved are more transparent, implement their obligations and commitments, and are held accountable. 

In May 2019, a year on from the Cambridge Analytica scandal and GDPR taking effect, we have reinforced our commitment to this issue through our focus on Defending Democracy and Dissent, with work dedicated to challenging data exploitation in democratic societies.