A look into Facebook and Cambridge Analytica: where things went wrong and how they could have been worse
How did the personal data of more than 50 million people, collected for the sole purpose of the platform’s usability, end up as a political tool? In this article, we identify the areas where Facebook and Cambridge Analytica failed in their roles as data controller and data processor respectively. These failures led to a shocking scandal that is now being watched very closely by both the authorities and the world.
The major violations that Facebook and Cambridge Analytica are held accountable for are breaches of Article 5(1), Article 6(1) and Article 17(1) GDPR. In particular:
Article 5(1) GDPR
Personal data shall be:
- processed lawfully, fairly and in a transparent manner in relation to the data subject (‘lawfulness, fairness and transparency’);
- collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes; further processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes shall, in accordance with Article 89(1), not be considered to be incompatible with the initial purposes (‘purpose limitation’);
Article 6(1) GDPR
Processing shall be lawful only if and to the extent that at least one of the following applies:
- the data subject has given consent to the processing of his or her personal data for one or more specific purposes;
Article 17(1) GDPR
The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay where one of the following grounds applies:
- the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed;
Facebook is known as a platform on which a wide range of third-party apps function. Upon installing or using these apps, users grant third-party developers like Kogan (the professor whom Cambridge Analytica hired to build a database of users’ personal data) access to their own data. Up to this point, Kogan had done nothing wrong, because he was collecting data for the stated purpose of his app, which, as represented to Facebook and its users, was academic research.
That changed the moment Kogan passed the data on to Cambridge Analytica, which intended to use it to influence voters’ behaviour. In doing so, he changed the purpose for which the data was collected or otherwise processed: from academic purposes to political purposes. This was not what the users had agreed to, unless Kogan could argue that he had included that purpose in the terms & conditions and the privacy policy as well. For that reason, Kogan infringed all three of the articles mentioned above by:
- Further processing the data for an illegitimate purpose.

Kogan sent the data to Cambridge Analytica so that it could use the data to manipulate the outcome of the US elections. The purpose is illegitimate because such data should not have been used so extensively for events of significant political importance. Moreover, it was agreed between Facebook, Kogan and the users that the data was to be used, in general, for business purposes and nothing else. Processing for political purposes is entirely different from processing for business purposes; after all, as people say, politics and business don’t mix. Because the purposes differ, the purpose for which Kogan passed the data on to Cambridge Analytica is an illegitimate one.

- Further processing the data for a purpose other than the one specified in the Terms & Conditions and/or Privacy Policy.

The purpose obviously changed the moment Kogan passed the data on to a company that intended to process it for political purposes instead of business purposes. That purpose is inconsistent with the one the users and Kogan had agreed on.

- Not deleting the personal data once it was no longer necessary for the academic research purpose.

Under Article 17, the controller is obliged to erase personal data once the purpose for which it was collected has been fulfilled. In Kogan’s case, he did not: instead, he retained the personal data and passed it on to Cambridge Analytica for the purpose of building a political tool.
Meanwhile, as much as Kogan was responsible for the breach, Facebook also bore a fair share of the responsibility for containing it. Even though Facebook discharged its duties in a legal manner, it could have done better to protect its users’ personal data from misuse.
Firstly, it should not have made its users’ data so readily accessible that third-party developers could obtain it with a snap of their fingers. Secondly, Facebook should have made it clear that its users’ data must not be misused: in this case, Facebook did not aggressively follow up on Cambridge Analytica even after suspecting foul play by Kogan and Cambridge Analytica. It is worth noting that the whole scandal could have been avoided had Facebook been more proactive in protecting its users’ data.
Yet, Facebook could have been in a far worse situation had the GDPR commenced about two years earlier. Fines of up to €20 million or 4% of annual global turnover (whichever is greater) are not the only consequence; Facebook would also have been obliged to inform the authorities and the affected users. According to Article 33 and Article 34 GDPR, which I quote in part:
Article 33
- In the case of a personal data breach, the controller shall without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the supervisory authority competent in accordance with Article 55, unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where the notification to the supervisory authority is not made within 72 hours, it shall be accompanied by reasons for the delay.
Article 34
- When the personal data breach is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall communicate the personal data breach to the data subject without undue delay.
Legally, Facebook was not obliged to reveal the breach, but from an ethical perspective it should have. From 25th May 2018 onwards, however, any company in a similar position will have to report the breach to the authorities and the affected users; failure to do so will only put the company in a worse position to defend itself during the investigation.
Having explained the circumstances that led to the scandal, what is next for Facebook and other companies? What could help them avoid such circumstances?
In the short term, Facebook and other companies should maintain data mapping documents, and these should not be restricted to their own organisations but should extend to any partners that have access to users’ data. Better still, companies and partners could establish a mutual data mapping repository recording who will access their customers’ data before agreeing to work together. Any data shared beyond the boundaries stated in these data mapping documents would be treated as a contractual breach. This short-term measure gives companies a better sense of, and ultimately control over, where their customers’ data will be shared.
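To make the idea concrete, here is a minimal sketch of how a data mapping document could be enforced in code. All names, fields and purposes below are hypothetical illustrations, not Facebook’s actual API: each partner app declares which data fields it may access and for which purposes, and any request outside those boundaries is flagged as a breach of the agreement.

```python
# Hypothetical data mapping repository: for each partner app, the fields it
# may access and the purposes it may process them for. Any request outside
# these boundaries would count as a contractual breach.
DATA_MAP = {
    "thisisyourdigitallife": {
        "allowed_fields": {"name", "likes", "location"},
        "allowed_purposes": {"academic_research"},
    },
}

def check_access(partner: str, fields: set, purpose: str) -> bool:
    """Return True only if the request stays within the agreed data map."""
    entry = DATA_MAP.get(partner)
    if entry is None:
        # Unknown partner: no data sharing was ever agreed.
        return False
    return fields <= entry["allowed_fields"] and purpose in entry["allowed_purposes"]
```

Under this sketch, a request for names and likes for academic research would pass, while the same request for political profiling would be refused, because the purpose falls outside what the data mapping document records.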
What happens when the data is already being used for purposes other than those specified in the agreement? How should the company follow up? This is another short-term strategy companies should develop. Response procedures give employees a clear idea of how to contain such breaches and take action to improve the situation. This matters, because Facebook’s lack of aggressiveness in following up on Cambridge Analytica was probably due to the lack of response plans its employees could follow in the event of a breach.
In the long run, companies like Facebook ought to design, or re-design, their systems so that the system’s features are consistent with a GDPR compliance framework. One option is to include a mechanism that logs every access to users’ data, recording who accessed it and when. Another is to design the system so that data cannot be easily exported onto another platform and subsequently risk being used for other purposes.
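The two options above can be sketched together: an audit log that records every access to a user’s data, and an export check that refuses bulk extraction once an accessor exceeds a limit. This is an illustrative sketch only; the function names, record fields and the limit are all hypothetical.

```python
import datetime

AUDIT_LOG = []           # every access event is appended here
EXPORT_LIMIT = 100       # illustrative cap on records one party may pull

def access_user_data(accessor: str, user_id: int, purpose: str) -> None:
    """Record who accessed whose data, for what purpose, and when."""
    AUDIT_LOG.append({
        "accessor": accessor,
        "user_id": user_id,
        "purpose": purpose,
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })

def export_allowed(accessor: str) -> bool:
    """Refuse further exports once an accessor exceeds the limit."""
    accesses = sum(1 for event in AUDIT_LOG if event["accessor"] == accessor)
    return accesses < EXPORT_LIMIT
```

With such a log in place, bulk harvesting of millions of profiles by a single app would show up immediately as an anomalous volume of access events, rather than being discovered years later through a whistleblower.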
What can we learn from Facebook and Cambridge Analytica?
A single wrong move can cause a person or company to run afoul of the GDPR. To avoid such circumstances, organisations ought to implement GDPR-compliant policies and reinforce a culture of data protection. This means companies have to read up on the GDPR, re-design their systems to be compliant and, finally, re-train their employees to be operationally compliant with it.