Who: ICO
Where: United Kingdom
When: 10 April 2015
Law stated as at: 3 June 2015
What happened:
On 10 April 2015, ICO published a summary of the feedback on its Big Data and data protection report published in July 2014 (the “Report”), and its comments in response to that feedback (the “Response”).
ICO notes in the Response that there appears to be a consensus that the general approach it put forward in the Report is on the right lines. Nonetheless, the Report did elicit some interesting comments, particularly in relation to:
(i) the legitimate interest condition for processing;
(ii) the difficulties of providing appropriate privacy notices in the big data context; and
(iii) the importance of having an adequate framework in place when starting a data project for predicting and mitigating any privacy risks that might arise.
Legitimate interest condition
Feedback on the Report suggested that ICO had placed too much emphasis on big data analysts obtaining consent to process personal data, which is not always practical in the big data context, as opposed to relying on other routes to fair and lawful processing, such as the “legitimate interests” condition in Schedule 2, paragraph 6 of the Data Protection Act 1998 (read with s.4(3)).
In the Response, ICO recognised that it had dealt with consent at greater length than the other conditions. It explained that this was only because consent is a subject of current debate in the context of big data, and that by doing so it did not mean to imply that consent was the only, or the most important, condition for fair and lawful processing.
ICO confirms that organisations can of course process personal data without consent where it is in their “legitimate interests” to do so, provided, as stated in Schedule 2, paragraph 6, that the processing in question does not unduly prejudice the rights and freedoms of individuals.
Privacy notices
In its Report, ICO considered the argument that big data requires a regulatory focus on how data is used, rather than how it is collected. ICO confirmed, however, that it is still necessary in the big data context to tell people about processing through privacy notices and where necessary, to obtain consent.
Feedback on the Report raised concerns about the practical difficulties of doing this. In its Response, ICO helpfully recognises this and commits itself to continuing to look for innovative examples of how to provide privacy notices in a big data context. ICO has also started a review of its Privacy notices code of practice, which will consider how the code can further reflect the issues around transparency in the context of big data.
Privacy engineering and privacy impact assessments
In the Report, ICO asked respondents to provide feedback on what practical measures and tools can help to protect data privacy in the context of big data analytics. Feedback in this area revolved around privacy engineering (so-called “Privacy by Design”) and privacy impact assessments (“PIAs”).
In the context of PIAs, the Response reports that one theme of the feedback was that the significance of big data processing depends on how it is used: for example, using big data to offer a product to a consumer in a more targeted way would likely be seen as less significant or sensitive than using it to make a decision about an application for life assurance.
Encouragingly, ICO says in the Response that it “broadly agrees with this point.”
Further, with regard to Privacy by Design and PIAs, both respondents and ICO acknowledged their role in ensuring that organisations have considered and mitigated privacy risks at the outset of a data project.
On the topic of privacy engineering, in the Response ICO reports feedback to the effect that Privacy by Design is not just a legal question but an engineering one. There is therefore a role for privacy engineering, which would involve bringing an organisation’s legal and policy people together with technical experts to develop “ethical approaches to designing systems.”
ICO feels it can and should play a role in this and says it has committed to work with external technical experts to develop privacy enhancing technologies, to encourage the recognition of privacy and data protection issues in university IT and information management courses, and to liaise with key stakeholders to discuss the development of more specific PIA guidance on big data that uses the ICO PIA code as a framework.
What next?
ICO will re-issue the Report in summer 2015. It expects to have concluded its review of the Privacy notices code of practice by the end of June 2015, which will be followed by a seminar on privacy and big data later in 2015.
Why this matters: While ICO confirms that the same rules apply to those engaging in big data analytics as to those dealing with more modest volumes of personal data, it recognises the practical difficulties of complying with them in the big data context. Better still, ICO commits to assisting key stakeholders with the development of innovative means of complying with data protection legislation, and with specific guidance on big data. ICO’s comments and commitments show that there is a lot to look out for in the coming months!