Since 2006, 28 January has marked the anniversary of the first international law in the field of data protection – who knew?
A lot has happened since then. Data protection and privacy is now a rapidly expanding area of law of ever-increasing importance. As we approach the second anniversary of the GDPR coming into force, we review current developments and look ahead at what to expect in 2020.
Our special Data Privacy Day newsletter covers the following topics:
- Accountability – sounds good, but what does it actually mean?
- International transfers and Brexit
- What’s cooking with cookies?
- Whatever happened to the ePrivacy Regulation?
- The growing culture of Data Subject Access Requests (DSARs)
- Adtech – under regulator scrutiny
- Artificial Intelligence (“AI”) and data protection
- Data security
- Class action compensation claims
Meanwhile, please make a diary note of our annual Data Protection Update seminar, which will be held on 14 May 2020.
Please do contact us if you have any questions or if our data protection team can assist you in any way.
1. Accountability – sounds good, but what does it actually mean?
The GDPR sets out six principles relating to processing of personal data. These include ‘lawfulness, fairness and transparency’, ‘purpose limitation’ and ‘data minimisation’. But then the GDPR adds another principle – that the controller “shall be responsible for, and be able to demonstrate compliance with” these six principles. This is referred to as the “accountability” principle. The ICO has said that “Accountability encapsulates everything the GDPR is about”. But what does it actually mean in practice?
Accountability is about putting data protection at the heart of your organisation. It means that you must consider data protection and privacy issues upfront when you are planning any new initiative. It includes things like:
- implementing data protection policies;
- recording your processing;
- taking a data protection by design and by default approach;
- having written contracts in place with processors;
- implementing appropriate data security measures;
- recording and, where necessary, reporting data breaches;
- appointing a data protection officer;
- establishing processes for handling data subject rights requests; and
- carrying out data protection impact assessments where needed.
Towards the end of 2019 the ICO consulted on the idea of developing a toolkit to help organisations comply with their accountability obligations. The objective is to provide down-to-earth, practical guidance on implementing privacy management programmes, based on an understanding of the technical challenges and other barriers (such as securing commitment to data protection from top management).
The ICO is planning to conduct a workshop on the toolkit in early February 2020. Following this, it expects to pilot the toolkit later in the year. It is hoped that this will help organisations, whose resources are already over-stretched, to achieve a good and practical level of compliance.
2. International transfers and Brexit
International organisations with a UK presence are likely to face further dilemmas in relation to their compliance with the rules concerning international data transfers in 2020, especially now we know that Brexit is set to occur on 31 January 2020.
Whilst the data transfer rules will remain unchanged during the transitional period, which runs until 11pm on 31 December 2020, what happens after that remains to be seen. What we do know is that the UK will become a “third country” for the purposes of the EU GDPR from that date. This has the potential to cause significant disruption.
The most positive outcome would be for the EU Commission to issue an “adequacy” decision before the end of the transitional period. This would allow data to continue to flow freely between the UK and the European Economic Area (“EEA”). However, reaching an “adequacy” decision is often a lengthy procedure and it is perhaps wishful thinking to believe that the EU Commission will take a short-cut and make such a decision in time.
If an adequacy decision has not been made by the end of the transitional period, then organisations in the EEA which are transferring personal data to the UK will need to ensure that they have in place an “appropriate safeguard” for the data. In the majority of cases, the most appropriate lawful mechanism for transfers will be for the parties to enter into the appropriate EU approved “standard contractual clauses” (“SCCs”).
There are currently two sets of SCCs which have been approved by the EU Commission – these regulate transfers from:
a) an EEA controller to a non-EEA controller; and
b) an EEA controller to a non-EEA processor (“C2P SCCs”) (see more on the validity of these below).
One legal grey area concerns transfers from an EEA processor to a UK controller. There are no SCCs which would regulate such transfers, and there will often be no other suitable lawful mechanism available, meaning EEA organisations could be faced with a choice between violating the GDPR and stopping transfers to the UK. The UK government expects (or perhaps hopes) that the European Data Protection Board would issue guidance on this in the event of a no-deal Brexit.
On a more positive note, it appears the C2P SCCs will survive the legal challenge currently being brought against them in the European Court of Justice (ECJ) in the case of Data Protection Commissioner v. Facebook Ireland Limited (often referred to as “Schrems II”). The Advocate General Henrik Saugmandsgaard Øe issued his opinion in Schrems II at the beginning of December 2019, recommending that the court uphold the validity of the C2P SCCs.
Although this is not binding and the ECJ will have the final say in the matter, the opinion of the Advocate General is followed in around 80% of ECJ cases. It is, therefore, widely expected that the C2P SCCs will remain intact following the court’s judgment. Although imperfect, and in need of updating, the SCCs will, for many businesses, continue for the time being to be the glue that holds international data transfers together.
3. What’s cooking with cookies?
Following the ICO’s updated guidance on the use of cookies, published in July 2019, it is likely that the ICO will start taking enforcement action against organisations which do not follow the rules, and this could lead to fines. Businesses which are not yet compliant should therefore take steps to ensure compliance now.
At a high level, the following are the main rules when using cookies on websites:
a) User consent must be obtained (except in relation to “strictly necessary cookies”)
The ICO has confirmed that the standard of consent for using cookies is the same high standard as under the GDPR, even for cookies which do not involve the processing of personal data. This means that implied or inferred consent can no longer be relied on for cookies: a clear affirmative act is needed, and pre-ticked boxes or inactivity do not constitute consent.
Websites which set non-essential cookies without specifically requiring users to consent to them (for example, by stating that continued use of the site implies consent) are, therefore, not compliant. This means that all non-essential cookies should be switched off by default and should only be set if and when the user consents.
“Strictly necessary cookies”, which do not require consent, are those which are essential to provide a user with the service they have requested or to comply with applicable law. Analytics cookies and advertising cookies do not fall within this exemption.
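By way of illustration, the default-off rule can be reduced to a simple check: only strictly necessary cookies are set unconditionally, and everything else is gated on a recorded, affirmative opt-in. This is a minimal sketch only; the cookie names, category labels and the shape of the consent record are our own assumptions, not taken from the ICO guidance.

```python
# Cookies that are essential to deliver the service the user requested
# may be set without consent; everything else is off by default.
STRICTLY_NECESSARY = "strictly_necessary"

# Hypothetical site cookie inventory: name -> category.
SITE_COOKIES = {
    "session_id": STRICTLY_NECESSARY,
    "csrf_token": STRICTLY_NECESSARY,
    "analytics_id": "analytics",
    "ad_tracker": "advertising",
}

def cookies_allowed(consented_categories):
    """Return the set of cookies that may be set for this user.

    `consented_categories` should contain only categories the user has
    positively opted into via a clear affirmative act; an empty set
    (no action taken, pre-ticked boxes ignored) permits strictly
    necessary cookies only.
    """
    return {
        name
        for name, category in SITE_COOKIES.items()
        if category == STRICTLY_NECESSARY or category in consented_categories
    }
```

A user who takes no action gets only `session_id` and `csrf_token`; `ad_tracker` is only set once the user has opted into the advertising category.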
b) Provide clear and transparent information to users concerning the cookies you use
The ICO Guidance emphasises the need to provide users with transparent information about cookies. The information must be in accordance with the higher standards of transparency as required by the GDPR; it must be presented in a “concise, transparent, intelligible and easily accessible form, using clear and plain language”.
In relation to cookies, this means that online retailers need to review and update their cookies policies to ensure that these are drafted in a sufficiently clear and easily accessible manner for a normal user to be able to understand how the different types of cookies are being used on the website. Failure to provide clear information will breach the transparency requirement, and will also undermine any “consent” if the consent cannot be said to be sufficiently informed.
Highlighting the importance of transparency and consent, in January 2019, the French data protection regulator imposed a fine of €50 million on Google for lack of transparency, inadequate information and lack of valid consent regarding ads personalization on mobile devices.
4. Whatever happened to the ePrivacy Regulation?
Originally intended to coincide with the GDPR, the introduction of the ePrivacy Regulation has been highly contentious and has met with considerable delay. Towards the end of 2019, the latest draft was rejected by the Council of the European Union, leading to further delays in its adoption.
Among other things, the new rules would ban cookie walls (where a website requires users to accept cookies as a condition of accessing its content).
The proposal will also continue the ban on unsolicited electronic communications by email, SMS and automated calling machines. However, it is not yet known whether this will extend to B2B communications or simply apply to B2C marketing, as at present.
The draft Regulation also introduces more stringent penalties for non-compliance, bringing the sanctions regime and the remedies available broadly into line with the GDPR.
It is uncertain what the final form of the Regulation will be. However, given the latest delay, Brexit has now intervened, so the Regulation will not be directly applicable in the UK. Despite that, it is likely that the UK will adopt the new rules as and when they are introduced. While the UK may in principle be able to make its own decision on this following Brexit, a failure to implement the new Regulation could stand in the way of the adequacy decision the UK needs in order to allow the free flow of data to and from the EEA. In addition, the proposed extra-territorial scope of the new Regulation (like that of the GDPR) means that it will apply directly to UK businesses targeting the EEA. Who said that after Brexit the UK will take back control of its laws?!
Meanwhile, the ICO has also published a draft direct marketing code of practice for consultation. The consultation closes on 4 March 2020 and the ICO expects to finalise it in 2020. The ICO plans to produce additional practical tools such as checklists to go alongside the code.
Some key points include:
a) The two lawful bases most likely to be applicable to direct marketing are consent and legitimate interests. However, where PECR applies and requires consent, then in practice consent should also be your lawful basis under the GDPR.
b) It is important to keep personal data accurate and up to date. It should not be kept for longer than is necessary. It is harder to rely on consent as a genuine indication of wishes as time passes.
c) If you are considering buying or renting direct marketing lists, you must ensure you have completed appropriate due diligence.
d) Profiling and enrichment activities must be done in a way that is fair, lawful and transparent.
e) If you are using new technologies for marketing and online advertising, it is highly likely that you will be required to conduct a data protection impact assessment (DPIA).
f) If someone objects, you must stop processing their data for direct marketing purposes. You should add their details to your suppression list so that you can screen any new marketing lists against it.
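The suppression-list point above is, in practice, a simple set-membership check. As a rough sketch (the field names and normalisation rule are our own assumptions; real lists may need matching on more than email address):

```python
def normalise(email):
    """Normalise an email address so trivial differences in case or
    whitespace do not defeat the screen."""
    return email.strip().lower()

def screen(marketing_list, suppression_list):
    """Return only the contacts who have not objected.

    `suppression_list` holds the addresses of people who have objected
    to direct marketing; any new or purchased list is screened against
    it before use.
    """
    suppressed = {normalise(e) for e in suppression_list}
    return [e for e in marketing_list if normalise(e) not in suppressed]
```

For example, `screen(["A@x.com", "b@y.com"], ["a@x.com"])` drops the first contact despite the difference in case, returning only `["b@y.com"]`.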
Once the draft ePrivacy Regulation is finalised and the UK’s position on Brexit is clear, the ICO has indicated that it will update the direct marketing code to take into account the ePrivacy Regulation.
5. The growing culture of Data Subject Access Requests (DSARs)
The GDPR gives data subjects the right to access the personal data which a controller holds in relation to them. Although this may sound fairly innocuous, dealing with DSARs in practice continues to be a source of much frustration for controllers, particularly in the field of employment where DSARs are often used by disgruntled employees as part of a wider litigation strategy.
Meanwhile, the ICO’s Annual Report 2018-19 (published in July 2019) shows that subject access requests generate by far the most complaints to the regulator (at 38%). We expect the use of DSARs will continue to be prevalent in 2020. Businesses which do not yet have processes in place for dealing with such requests should develop procedures and protocols to be followed when requests are received. To this end, the ICO published updated draft guidance in relation to the right of access towards the end of 2019. Some key points for controllers to note are as follows:
a) Procedure for submitting requests – there is no particular procedure data subjects must follow when submitting a DSAR. Individuals do not need to designate their request as being a DSAR for it to be treated as such. Furthermore, individuals can submit DSARs through whatever channel they prefer (including verbally), meaning that it’s important that relevant staff are trained in recognising such requests.
b) Receiving DSARs from third parties – it is common for third parties, such as law firms, to submit DSARs on behalf of others. In such circumstances, controllers are entitled to (and should) ask the third party for proof of its authorisation to act on behalf of the data subject. The onus is on the third party to provide this, for example through a letter of authorisation or a general power of attorney.
c) Time for responding to DSARs – normally you must comply with a DSAR without undue delay and at the latest within one month of receipt of the request. You can extend the time to respond by a further two months if the request is “complex” or you have received a number of requests from the same individual. Some organisations claim the extra time on the basis that the request is complex because it involves a large volume of information. The ICO guidance indicates that, while this may add to the complexity of a request, a request is not complex solely because the individual has requested a large amount of information.
The ICO guidance gives the following as examples of factors that may, in some circumstances, add to the complexity of a request. However, you need to be able to demonstrate why the request is complex in the particular circumstances:
- Technical difficulties in retrieving the information – for example if data is electronically archived.
- Applying an exemption that involves large volumes of particularly sensitive information.
- Any specialist work involved in redacting information or communicating it in an intelligible form.
One key area where the ICO has changed its position is in relation to circumstances where a controller needs to raise clarifications in relation to the DSAR. Whilst previously the ICO had taken the view that the statutory timeframe for responding to a DSAR would not commence until the controller received a response to any clarifications raised by it, this is no longer the case in the updated guidance. The ICO now takes the position that the timeframe for responding commences from the date the DSAR is received, irrespective of whether any clarifications are raised by the controller or whether the data subject has replied.
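The deadline arithmetic above can be sketched in a few lines: the clock runs from receipt (clarification requests no longer pause it, under the updated draft guidance), the standard period is one calendar month, and a further two months is available only for genuinely complex or repeated requests. This is an illustrative sketch only, assuming the common calendar-month convention that where the following month has no corresponding date the deadline falls on the last day of that month; it is not a substitute for the guidance itself.

```python
import calendar
from datetime import date

def add_months(start, months):
    """Date falling `months` calendar months after `start`.

    If the target month has no corresponding day (e.g. one month after
    31 January), fall back to the last day of that month.
    """
    month_index = start.month - 1 + months
    year = start.year + month_index // 12
    month = month_index % 12 + 1
    day = min(start.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def dsar_deadline(received, complex_request=False):
    """Latest date for responding to a DSAR received on `received`.

    One month from receipt as standard; three months in total where the
    request is genuinely complex or the individual has made multiple
    requests. The clock starts at receipt regardless of any
    clarifications raised.
    """
    return add_months(received, 3 if complex_request else 1)
```

So a straightforward request received on 31 January 2020 falls due on 29 February 2020 (2020 being a leap year), while a complex request received on 15 March 2020 falls due on 15 June 2020.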
d) Being ready for DSARs – the ICO guidance expresses little sympathy for controllers who aren’t able to process DSARs efficiently, stating that DSARs have been a feature of the law since the 1980s and that therefore organisations should have systems in place to deal with them. From our experience, many organisations do not currently have systems in place to deal with DSARs, and particular difficulties are faced with unstructured data such as emails. While there are a growing number of third-party solutions which claim to assist, organisations are often forced to expend significant time and expense in dealing with DSARs.
e) Charging for DSARs – the guidance provides further detail as to what is meant by the “administrative costs” which can be charged by controllers where an individual submits excessive or manifestly unfounded DSARs. Printing, photocopying and postage would fall within the meaning of administrative costs. Charging for employee time spent dealing with such requests – which can be significant – would not.
6. Adtech – under regulator scrutiny
The ICO has been investigating the adtech and real time bidding (RTB) industry over the past year. This is a huge industry and, from a compliance viewpoint, it is particularly complex due to the challenges of providing meaningful information and obtaining valid consent from users.
The ICO is concerned that the creation and sharing of personal data profiles about people, on such a large scale, is disproportionate, intrusive and unfair, particularly when people are often unaware it is happening. The key issues are:
- identifying a lawful basis for the processing of personal data in RTB, as the scenarios where legitimate interests could apply are limited, and methods of obtaining consent are often insufficient;
- the privacy notices provided to individuals lack clarity and do not give them full visibility of what happens to their data;
- in many cases there is a reliance on contractual agreements to protect how bid request data is shared, secured and deleted. This does not seem appropriate given the type of personal data sharing and the number of intermediaries involved.
Industry bodies such as the IAB have been engaging with these issues, looking for practicable solutions, for some time. As a recent sign of the seriousness with which this is being taken in some quarters, Google recently proposed changes to its Chrome browser, including phasing out support for third-party cookies within the next two years.
However, in a recent blog, the ICO has expressed frustration that many organisations involved in RTB appear to have their heads firmly in the sand.
The ICO has made it clear that those in the adtech chain cannot rely on “legitimate interests” as the lawful basis for the processing of personal data in RTB. Furthermore, they have said that the Data Protection Impact Assessments they have seen have been “generally immature, lack appropriate detail, and do not follow the ICO’s recommended steps to assess the risk to the rights and freedoms of the individual”. The ICO has indicated that they anticipate it may be necessary to take formal regulatory action in such cases. We could, therefore, see such actions in 2020.
The most effective way for organisations to avoid the need for regulatory action is to engage with the process for industry reform, and to encourage their supply chain to do the same. The ICO warns that those who have ignored the window of opportunity to engage and transform must prepare for the ICO to utilise its wider powers.
7. Artificial Intelligence (“AI”) and data protection
In the past few years, we have seen an increasing number of organisations developing or using AI solutions. Although the business case for the use of AI is compelling, tensions can arise where its use is at odds with data protection laws.
These tensions between AI and data protection include the following:
- Transparency – the GDPR requires you to provide individuals with notice setting out how you are using their personal data. Where there is an element of automated decision-making which results in legal effects or otherwise has a significant effect on an individual (as there often is with AI), the controller is required to provide affected individuals with “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject”. Given the complexities with AI and the fact that some types of AI can develop in an unsupervised environment, without human intervention, it can sometimes be difficult to meet these requirements.
- Purpose limitation, data minimisation and storage limitation – the GDPR requires that processing of personal data is carried out for specific purposes, no more personal data than is adequate to achieve those purposes is processed and that personal data is only processed for as long as necessary to achieve those purposes. There is often tension between these principles and AI, since the development of an AI system can often result in data being used for unexpected purposes, and often requires vast amounts of data to be inputted into the system in order for it to meaningfully detect patterns and trends.
In respect of the transparency issue, the ICO has developed draft guidance along with the Alan Turing Institute (the UK’s national institute for data science and artificial intelligence) dealing with explaining AI. The guidance provides detailed information on the different ways in which businesses can seek to explain the processing they undertake using AI to the individuals concerned and seeks to address some of the concerns businesses may have in providing such explanations.
In addition to the above, the ICO is also working on finalising its AI auditing framework which will address the following specific issues:
- Accountability – which will discuss the measures that an organisation must have in place to be compliant with data protection law.
- AI-specific risk areas – which will discuss the key risk areas the ICO has identified in relation to the use of AI in the field of data protection.
As the use of AI becomes more widespread, it is hoped that the ICO’s guidance will help businesses better understand and comply with their data protection obligations, whilst still allowing them to develop AI systems which can benefit organisations and individuals alike.
8. Data security
The ‘integrity and confidentiality’ principle of the GDPR – also known as the security principle – requires that you have appropriate security measures in place to protect the personal data you hold. In terms of data security, the central obligation under the GDPR is “taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, … [to] implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk”.
The GDPR is not prescriptive as to what this means and there is no “one size fits all” solution – the GDPR takes a risk-based approach. It says that these measures may include pseudonymisation and encryption of personal data, and implementing a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.
Pseudonymised data (for example, replacing names with a number) remains subject to the GDPR, but pseudonymisation is a good technique for securing the data, for example, when sharing it with others. On the other hand, the GDPR makes clear that data protection laws do not apply to anonymised information (information which does not relate to an identifiable person). The GDPR does not go into any detail on how to anonymise data, and organisations often refer to personal data as having been ‘anonymised’ when, in fact, this is not the case. This presents a risk that you disregard the terms of the GDPR in the mistaken belief that you are not processing personal data. The ICO issued a code on anonymisation under the old Data Protection Act, and in 2020 we can expect an update to this code.
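To make the distinction concrete, one common pseudonymisation technique is to replace names with stable keyed tokens before sharing a dataset. A minimal sketch, assuming a keyed hash (HMAC) approach; the key name and record fields are illustrative only. Crucially, the key must be held separately from the shared dataset: anyone holding both can re-link the records, which is exactly why pseudonymised data remains personal data under the GDPR.

```python
import hashlib
import hmac

# Assumption for illustration: in practice this key would be generated
# randomly and stored securely, separately from the shared dataset.
SECRET_KEY = b"keep-this-key-away-from-the-shared-data"

def pseudonymise(name):
    """Replace a name with a stable token.

    The same input always yields the same token, so records can still be
    linked and analysed, but the token alone does not reveal the name.
    """
    digest = hmac.new(SECRET_KEY, name.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

records = [
    {"name": "Alice Example", "order_total": 42.50},
    {"name": "Bob Example", "order_total": 17.00},
]

# The dataset as it might be shared: names replaced by tokens.
shared = [
    {"person": pseudonymise(r["name"]), "order_total": r["order_total"]}
    for r in records
]
```

Note that truncating the digest, as here, trades a little collision resistance for shorter tokens; the design choice that matters is the separately held key, without which the tokens cannot be reversed or regenerated.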
Encryption is a key tool for data security. As this is an established, widely-deployed technology, failing to encrypt data in transit or at rest risks being in breach of the security principle and could lead to fines if the data is compromised.
On the other hand, in the event of a data breach where the data had been effectively encrypted, there would be no requirement to notify data subjects, since the data would be “unintelligible” and would therefore pose no risk to them.
However, the biggest causes of data breaches are relatively unsophisticated issues such as data being sent to the wrong recipient and email users falling for a phishing attack. While there are effective technologies that can help prevent this sort of error, employee awareness and training programmes will go a long way to protect against them, and they are an important part of the “accountability” principle (see Accountability – sounds good, but what does it actually mean? above).
Due to the timing of data incidents and the related ICO investigations, many monetary penalties in 2019 were issued under the previous legislation, the Data Protection Act 1998, and not under the GDPR. The maximum financial penalty under the former law is £500,000, and the ICO has shown itself willing to issue the maximum fine; for example, in January 2020, fining DSG Retail Limited (which trades as Currys PC World and Dixons Travel) £500,000 after a ‘point of sale’ computer system was compromised as a result of a cyber-attack, affecting at least 14 million people. Earlier, in December 2019, the ICO fined a London-based pharmacy £275,000 for failing to ensure the security of special category data. Doorstep Dispensaree Ltd, which supplies medicines to customers and care homes, had left approximately 500,000 documents in unlocked containers at the back of its premises in Edgware.
However, mega fines under the GDPR are beginning to come through. The outcome of the ICO’s statement of intention to fine Marriott International Inc £99,200,396 for a cyber incident affecting approximately 339 million guest records globally, is still awaited. As is the outcome of its statement of intention to fine British Airways (BA) £183.39 million for a cyber incident which affected approximately 500,000 BA customers. According to reports, the deadline by which to reply to the notices of intention has been extended to 31 March 2020 for both companies.
We expect to see more eye-watering regulatory action of this kind in 2020.
Meanwhile, an important point of housekeeping: companies should ensure that they register with the ICO and pay their data protection fee (unless exempt), as the ICO has launched a campaign to contact organisations to remind them about payment. The ICO issued 340 monetary penalty notices for non-payment of the data protection fee between 1 July and 30 September 2019.
9. Class action compensation claims
The GDPR provides supervisory authorities with the power to issue huge administrative fines (and we have seen the ICO demonstrate its intent to levy such fines). It also provides individuals with the right to seek compensation from controllers and processors which fail to comply with its provisions. This is set to provide fertile ground for claimants, and we expect the number of claims for data protection violations to increase significantly over the course of 2020.
Of particular interest is the rising number of class actions being brought for data protection-related infringements.
The Court of Appeal’s decision in Lloyd v Google LLC (concerning Google’s alleged collection of browsing data from iPhone users) was significant, since it allowed the case to be brought on behalf of all iPhone users affected by Google’s conduct over the relevant period on an opt-out basis. The Court of Appeal found this to be acceptable since all members of the class had the same “interest” (i.e. they had all suffered the same alleged wrong). This could have broad ramifications in the area of data protection, since violations will often impact a large number of individuals rather than being one-off events affecting specific individuals (e.g. where an organisation unlawfully sends marketing communications to its entire mailing list).
Many commentators have therefore suggested that the decision by the Court of Appeal in Lloyd v Google LLC could result in the floodgates opening for class action claims in relation to data protection violations. To a certain extent, this has already materialised, with a number of data protection class actions currently being fought out in the UK courts. Organisations which have suffered security incidents would appear to be at particular risk, with each of Morrisons, Equifax and British Airways currently litigating class actions in the aftermath of high-profile data breaches.
While the amounts awarded to individuals may be modest, in the event of a class action involving a large number of claimants, the potential total damages could dwarf the fines that could be imposed by the regulator.