The contact tracing App being developed by NHSX is being promoted as a key tool to enable the lockdown to be eased, by automating the process of identifying people who have recently been in close proximity with someone with symptoms of Covid-19.
The success of the App is dependent to a large extent on a significant proportion of the population downloading and using it. While the App has some utility even if only 20% of the population download it, contact tracing will only be effective if a significant percentage (estimated to be around 60%) of the population participates.
Whether or not people will take up the App is, in turn, critically dependent on the level of trust people have that the system will operate as advertised, and on whether and how legitimate concerns as to the privacy and security of the data will be addressed.
This article explains how the App works, the privacy concerns it raises and how data protection law applies, before considering whether a dedicated watchdog or specific legislation is needed and whether employers can require their staff to use the App.
The App uses low-power Bluetooth on smartphones to communicate with other nearby devices that also have the App installed. It records the estimated distance between devices and the duration of each contact, and each device issues randomised identifying numbers to the devices it encounters. This proximity log is then stored on the device.
If, soon afterwards, a user develops symptoms of the virus, they can update their status on the App. The proximity log is then uploaded to the central system, which works out which other devices need to be alerted that they have been in proximity with someone who now has symptoms, so that the users of those devices can self-isolate.
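To illustrate the broad shape of the data involved, the sketch below shows what a proximity log entry and a rotating identifier might look like. It is a simplified illustration only: the field names, data types and structure are assumptions made for the purposes of this article and do not reflect the published NHSX design.

```python
# Illustrative sketch only: the field names, structure and identifier format
# are assumptions, not the published NHSX design.
import secrets
from dataclasses import dataclass
from datetime import datetime

def new_rotating_id() -> str:
    """Generate a fresh randomised identifier to broadcast to nearby devices."""
    return secrets.token_hex(16)

@dataclass
class ContactEvent:
    """One entry in the proximity log kept locally on the handset."""
    observed_id: str              # randomised ID received from the other device
    first_seen: datetime          # when the contact began
    duration_seconds: int         # how long the devices remained in range
    estimated_distance_m: float   # inferred from Bluetooth signal strength

# If the user later reports symptoms, the locally held log is uploaded so the
# central system can work out which rotating IDs (and hence devices) to alert.
proximity_log: list[ContactEvent] = []
```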
Any government sponsored technology that can track and trace the population instinctively raises privacy concerns.
First, although the data is anonymised and does not contain any personal identifiers, it will track everyone a user comes into contact with. Data concerning one’s daily personal interactions and the people one associates with can be highly sensitive and not something one would wish to share with the state.
Then there is “feature creep”. While the technology is being introduced with the best intentions and in the interests of public health, once it has been widely implemented and as time goes on there will be a temptation to “enhance” it and use it for broader purposes. For example, if the App starts to record specific location data (and not only proximity data), this will be a serious privacy concern, as location data can itself reveal highly sensitive personal information (e.g. meetings at other people’s homes, or attendance at political events, health clinics or places of worship). There may also be a temptation to share the data with other government departments or the police for other purposes, such as detecting crime, or for tax or immigration purposes.
And the government and NHS do not have a great track record in respect of data security, so how secure will the data collected by the App be? There must be a risk that it could be hacked by criminals or a rogue state-sponsored hacker.
The fact that NHSX has – in contrast with many other governments (such as Ireland, Germany and Switzerland) and unlike the Google / Apple initiative – apparently opted to implement a centralised system, where data is held by the government rather than only locally on the device, heightens these concerns.
Application of Data Protection laws
Data protection laws apply to “personal data” relating to an identified or identifiable person. In the case of the App, personal data is used on a no-names basis, with the user being given a random rotating ID. The specific device ID is not used, although the make and model of the device is captured. The GDPR specifically refers to an “online identifier” as being personal data. However, while pseudonymised data is regulated as personal data, truly anonymised data is not.
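To illustrate why such data is pseudonymous rather than truly anonymous, consider that in a centralised design the rotating IDs can typically be linked back to a particular app installation by whoever holds the relevant key or lookup table. The sketch below assumes a simple key-derived scheme purely for illustration; it is not the NHSX scheme.

```python
# Purely illustrative pseudonymisation sketch; not the NHSX scheme.
import hmac, hashlib

def rotating_id(installation_key: bytes, period: int) -> str:
    """Derive the broadcast ID for a given time period from a per-installation key."""
    return hmac.new(installation_key, str(period).encode(), hashlib.sha256).hexdigest()[:32]

# The broadcast ID looks random to other devices, but an operator holding
# installation_key can re-derive it and link contacts back to this installation,
# which is why such data remains "personal data" under the GDPR.
```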
Although the precise way the App works is yet to be finalised and published, we must assume that the use of the App for track and trace will involve personal data. As such, it will be regulated by the GDPR, as it will be possible to identify and distinguish some individuals (or devices) from others and to apply different treatment accordingly. Data protection laws do not stand in the way of such technologies, but such technologies must be built and implemented in compliance with data protection laws.
How to address the privacy concerns
While most people will, in the present circumstances, accept some degree of compromise on their privacy in the interests of their own, and the nation’s, health, this has to be proportionate, with the App being as minimally privacy-invasive as possible. To ensure widespread adoption of the App, it will be essential that privacy concerns are comprehensively addressed. There are a number of steps that must be taken.
First, NHSX should reconsider the centralised data approach and consider switching to a localised data solution. As the ICO commented, a purely localised system without a centralised dataset must inherently be more secure. It would also have the benefit of achieving greater interoperability with localised solutions being implemented by other countries; in particular, it is important to have interoperability on the island of Ireland.
NHSX counters this, however, by saying that there are public health benefits in having access to the aggregated data for analytics and research, so as to learn more about the virus. A centralised system may also help limit malicious self-reporting (which could be done to try to force someone into self-isolation).
While a centralised system can be made to work, it will require much greater effort on data security if public confidence is to be won. There is a trade-off between functionality and public confidence: the more of one you seek, the less of the other you may get. And public confidence is critical for widespread adoption, and ultimately for the success, of the App.
There have been reports in the past few days of NHSX investigating the feasibility of transitioning the App to Apple and Google’s technology, and this could indicate a change of heart and a shift towards a localised data approach.
Second, transparency. Provision of transparent information regarding how a person’s data is to be used is a central requirement under the GDPR. This requires that information be provided in a concise, transparent, intelligible and easily accessible form, using clear and plain language.
Given that the App is to be used by the general population, the privacy notice will need to be carefully and skilfully drafted so that it is accessible to all, whether young, old or with reading difficulties. It is not yet known what the age requirement for the App will be, but particular care will be needed for information addressed to children.
We also need to know who will be the “controller” of this data and with whom it may be shared and for what purpose. Will the controller be the NHS, or will it be the Government?
Transparency will also be well served by making public the NHSX Data Protection Impact Assessment. Under the GDPR, a DPIA – a form of risk assessment – is required whenever a new technology is used that is likely to result in a high risk to the rights and freedoms of individuals. The GDPR says a DPIA is specifically required where the technology involves a systematic and extensive evaluation of personal aspects relating to individuals based on automated processing, on which decisions are based that significantly affect the person; or where there is large-scale processing of special categories of data such as health data; or where there is systematic monitoring of a publicly accessible area on a large scale. Arguably, the App ticks all of these boxes, and the DPIA will be a critical document.
The DPIA must contain a systematic description of the processing operations and the purposes for which the data will be used, an assessment of the necessity and proportionality of the processing in relation to these purposes, an assessment of the risks to the rights and freedoms of individuals and the measures to be taken to address these risks, including safeguards and security measures to ensure the security of the data.
NHSX must share this DPIA as soon as possible with the ICO (as contemplated by Art 36 GDPR) for consultation. While not a legal requirement, it should also be made public for wider consultation. Unless the government so requires, the DPIA does not need to be approved by the ICO as such; however, NHSX should consider and implement as appropriate any advice and recommendations that the ICO, as the independent privacy watchdog, may put forward.
Finally, the working of the App should be open to audit and review by independent experts, not as a one-off, but on an ongoing basis.
Under data protection laws, processing of personal data is lawful only if there is a “lawful basis” for the processing. The GDPR sets out six possibilities; the main options for the App will be user “consent” or “performance of a task in the public interest”. Health data requires an additional condition to be met, which could be satisfied by “explicit consent” or on public health grounds.
It is not yet known which of these lawful bases will be applied. While the App is entirely voluntary to use, it may be that consent is not the best option as it can be difficult to establish that a valid consent has been obtained. However, consent may be required under the GDPR on the basis that the App involves “automated decision making”.
As the App accesses data on the device, it could be that consent is required under the Privacy and Electronic Communications Regulations (PECR). If consent were required under PECR, it would also be necessary to use consent as the lawful basis under the GDPR. Consent will not be required under PECR if the exemption applies, i.e. where access to the data is “strictly necessary for the provision of” the service requested by the user. If, however, the App is to access any data that is not “strictly necessary”, then consent would be required by law.
While the App may or may not rely on “consent” as the lawful basis, it is important for public trust that its use is truly voluntary. A person is free to download it, and delete it, as they wish. They are free to choose whether to update their health status or not. And – if warned that they have been in proximity with an infected person – they are free to self-isolate or not as they choose.
One of the central principles of the GDPR is ‘data minimisation’: the data collected must be limited to what is necessary in relation to the purposes for which it is collected. It is essential, therefore, to identify and articulate the purpose and then test whether each item of data collected is necessary for it.
For example, the App requires proximity data, but it does not require location data. (Confusingly, the App requires Android users to turn on “location services” in their settings; this is apparently because Bluetooth scanning only works on Android phones when that setting is enabled, and it does not result in location data being collected.) If there is the potential with a centralised system to add additional data elements, such as location data, that could breach this central principle of the GDPR.
It has been suggested that users of the App will not need to add their name or other identifiers, but will be required to enter the first half of their postcode. This alone will not ordinarily be sufficient to identify a person, but may serve a purpose in enabling NHSX to spot clusters of infection.
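As a purely illustrative way of putting the data minimisation test into practice, one might compare the fields the App proposes to collect against those actually needed for its stated purposes. The field names below are assumptions made for illustration only, not a statement of what NHSX will collect.

```python
# Illustrative data-minimisation check; the field lists are assumptions.
REQUIRED_FIELDS = {
    "observed_id", "first_seen", "duration_seconds", "estimated_distance_m",
    "postcode_district",  # first half of the postcode, for spotting clusters
}

def excess_fields(proposed_fields: set[str]) -> set[str]:
    """Return any proposed fields that go beyond the stated purposes."""
    return proposed_fields - REQUIRED_FIELDS

# Adding GPS coordinates, for example, would fail the test:
print(excess_fields(REQUIRED_FIELDS | {"gps_latitude", "gps_longitude"}))
# -> {'gps_latitude', 'gps_longitude'}
```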
Under the GDPR, data can only be collected for specified, explicit and legitimate purposes and must not be further processed in a manner that is incompatible with those purposes. The GDPR allows further processing for scientific research or statistical purposes in addition to the initial purposes. This is an important legal constraint on feature creep, but is it enough to give people confidence that their data will not be used for other purposes?
A further principle is that data must not be kept for longer than is necessary for the purposes for which it is processed. A key issue is what happens to all the data after the Covid-19 crisis has subsided and it is no longer necessary to track and trace. The data should then be securely destroyed or completely anonymised, but what guarantee is there that this will happen? The data retention period must be set out in the privacy notice issued with the App; it will need to reflect this principle, and we will have to have confidence that NHSX will honour it.
It is a fundamental requirement of data protection that appropriate technical and organisational measures are taken to ensure a level of data security appropriate to the risks. This will require state-of-the-art encryption of the data at rest and in transit. Following the GDPR principle of data protection “by design and by default”, data security and compliance with the other principles must be designed into the way the App is built and used.
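As a minimal sketch only (using the third-party Python cryptography package, and not representing the App’s actual security architecture), encrypting the proximity log before it leaves the device might look like the following, with transport security (TLS) layered on top for data in transit.

```python
# Minimal illustration of encrypting data at rest before upload; not the
# App's actual design. Requires the third-party "cryptography" package.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, managed by a secure key store, not generated ad hoc
cipher = Fernet(key)

log_entries = [{"observed_id": "ab12cd34", "duration_seconds": 900}]
ciphertext = cipher.encrypt(json.dumps(log_entries).encode())

# The ciphertext, not the plain log, is what would be stored on the device and
# sent (over TLS) to the central system if the user reports symptoms.
assert json.loads(cipher.decrypt(ciphertext)) == log_entries
```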
While data security is never 100% guaranteed, the public will need to be satisfied through the provision of transparent information that rigorous safeguards are in place.
Do we need a specific NHSX App watchdog?
While the ICO is the regulator for compliance with data protection laws, there are separate watchdogs for specific areas, for example biometrics and communications monitoring. However, given the speed at which the App needs to be rolled out if it is to be effective, and given that the ICO is well established and respected as the regulator for data matters under the GDPR and the Data Protection Act 2018, with powers to audit, investigate complaints and issue substantial fines, the ICO is the appropriate regulator and an additional regulatory regime should not be needed.
Is specific legislation needed?
Some have suggested that specific legislation is needed to enshrine the necessary safeguards in law. Again, given timing imperatives, and given the flexible and well-developed framework we already have in the GDPR and the Data Protection Act 2018, this may be a “nice to have” but should not be necessary.
Clearly, contact tracing could be highly beneficial to employers, since it could reduce the need to carry out manual contact tracing in the event an employee falls ill with coronavirus. So, can an employer make downloading the App compulsory?
The answer will depend to some extent on the lawful basis relied on for the processing of personal data through the App. If the lawful basis is “consent”, then compelling employees to download and use the App will invalidate any apparent consent, since it will not have been freely given. If the lawful basis is “public interest”, then employers will need to decide whether to compel, or alternatively strongly recommend, their employees to download and use the App. If they seek to compel, and an employee refuses, it is hard to see how the employee could fairly be subjected to any detriment, other than as required for health and safety.
We all have a strong interest in the App being rolled out, gaining maximum levels of public adoption and making a valuable contribution to fighting the virus. For this it will be necessary for the public to have a high level of trust in the App and its privacy safeguards. Good data protection will be an essential ingredient to achieving this trust.
Contact us
If you have any questions about these issues in relation to your own organisation, please contact a member of the team or speak with your usual Fox Williams contact.