What follows is an edited version of the
debate in the House of Lords of the Second Reading of the Data Protection
Bill, held on 10 October. Colleagues who prefer not to read the entire
(46,709-word) transcript of the five-hour debate will get an impression of the key
interventions in this (16,000-word) summary:
My Lords, I am delighted to be moving the Second
Reading today.
New technologies have started innumerable
economic revolutions, and the pace of change continues to accelerate. Data is
not just a resource for better marketing, better service and delivery. Data is
used to build products themselves. It has become a cliché that data is the new
oil.
In our manifesto at the general election we
committed to provide people with the ability to require major social media
platforms to delete information held about them, especially when that
information related to their childhood. The new right to be forgotten will
allow children to enjoy their childhood without having every personal event,
achievement, failure, antic or prank that they posted online digitally
recorded for evermore. Of course, as new rights like this are created, the
Bill will ensure that they cannot be taken too far. It will ensure that
libraries can continue to archive material, that journalists can continue to
enjoy the freedoms that we cherish in this country, and that the criminal
justice system can continue to keep us safe.
The new right to data portability—also a
manifesto commitment—should bring significant economic benefits. This will
allow individuals to transfer data from one place to another. When a consumer
wants to move to a new energy supplier, they should be able to take their usage
history with them rather than guess and pay over the odds. When we do the
weekly supermarket shop online, we should be able to move our shopping list
electronically. In the digital world that we are building, these are not just
nice-to-haves; they are the changes that will drive innovation and quality, and
keep our economy competitive.
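[Editorial aside: to make the portability idea concrete, the GDPR frames this as a right to receive one's data in a structured, commonly used, machine-readable format. A minimal sketch of what a ported energy-usage history might look like follows; every field name and value is invented for illustration.]

```python
import json

# A hypothetical, minimal example of the structured, machine-readable
# export that the portability right contemplates. All field names and
# values are invented for illustration.
usage_history = {
    "customer_ref": "example-0001",
    "tariff": "standard-variable",
    "monthly_usage_kwh": [
        {"month": "2017-01", "kwh": 412},
        {"month": "2017-02", "kwh": 376},
    ],
}

# Saved in a common format, the history can be taken to a rival supplier,
# which can then quote against real consumption rather than a guess.
with open("usage_history.json", "w") as f:
    json.dump(usage_history, f, indent=2)
```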
The Bill will amend our law to bring us these
new rights and will support businesses and others through the changes. We want
businesses to ensure that their customers and future customers have consented
to having their personal data processed, but we also need to ensure that the
enormous potential for new data rights and freedoms does not open us up to new
threats. Banks must still be allowed to process data to prevent fraud;
regulators must still be allowed to process data to investigate malpractice and
corruption; sports governing bodies must be allowed to process data to keep the
cheats out; and journalists must still be able to investigate scandal and
malpractice. The Bill, borrowing heavily from the Data Protection Act that has
served us so well, will ensure that essential data processing can continue.
Noble Lords will be familiar with the role of
the Information Commissioner, which is to uphold information rights in the
public interest, promoting openness by public bodies and data privacy for
individuals. The Bill provides for her to continue to provide independent
oversight, supervising our systems of data protection, but we are also
significantly enhancing her powers. Where the Information Commissioner gives
notices to data controllers, she can now secure compliance, with the power to
issue substantial administrative penalties of up to 4% of global turnover. Where
she finds criminality, she can prosecute.
I congratulate the Bill team on the excellence
of the paperwork that we have received—I am sure everybody has read it, word
for word, all the way through; it is worth it. They are obviously ahead early
in the “Bill team of the year” stakes, a prize which they won easily last time
on the Digital Economy Bill, and they are building on that.
This is a tricky Bill to get hold of, first
because of its sheer size. It is a bulky package, and it is not even
complete: we are told to expect a large number of amendments, still being
processed and not yet available, which may—who knows?—change it substantially.
Even without that, it has 300 paragraphs and 18 schedules, one of which
helpfully signposts the way that the Government intend to make changes to the
Bill so that the GDPR becomes domestic law when we leave the EU, even though
the amendments to make that happen will actually be made by secondary
legislation. This is “Hamlet” without the prince.
The GDPR itself, which runs to 99 paragraphs—or
articles, as it calls them—and which will be the new data-processing law that
comes into force in May 2018 whether or not we in Parliament have agreed it, is
not actually printed in the Bill. That therefore raises the concern that—post
Brexit, courtesy of another, separate Bill, probably by secondary legislation—the
regulations will become UK law without ever having been scrutinised by either
House of Parliament. I wonder if other noble Lords share my feeling that this
is a bad precedent and, if so, what we might do about it. I suspect that this
decision might have made sense were we to stay in the EU but we are going to
leave, so there is a gap in our procedures here. That is compounded by the fact
that this is a Lords starter Bill that comes to us without the benefit of
consideration in the other place, and particularly without the usual
evidence-taking sessions that ensure that a Bill meets the needs of those
affected by it.
I have a suggestion: could the authorities look
carefully at the Bill and at the GDPR in its printed form and arrange for the
relevant committee to bring forward either a report or simply testimony about what the
GDPR contains, how it is reflected in the Bill and how it works? It would help
the House to do the job that we ought to be doing of scrutinising this
legislation.
In his opening remarks, the Minister said all
the right things about the Government’s commitment to unhindered and
uninterrupted flows of data post Brexit, but the Bill comprehensively fails to
set out how they plan to deliver that outcome. Worse, it may contain measures
in Parts 3 and 4 that make it impossible to achieve the “adequacy” agreement,
which is the only card that they have left to play post Brexit. You could not
make it up.
Some 43% of EU tech companies are based in the
UK and 75% of the UK’s data transfers are with EU member states. Even if the
Bill successfully aligns UK law with the EU data protection framework as at 25
May 2018, that does not mean that the Bill makes proper provision for the
future. On the UK’s exit from the EU, the UK will need to satisfy the European
Commission that our legislative framework ensures an “adequate level of
protection”, but achieving a positive adequacy decision for the UK is not as
uncontentious as the Government think.
On more concrete questions about the rights of data
subjects, we have a number of issues to pursue, although today I shall
concentrate on only three: children and the “age of consent”, the rights of
data subjects in relation to third-party use of their data, and the proper
representation of data subjects. I shall end with some thoughts on the Leveson
report and its implications for this Bill.
The Bill proposes to set the age at which
children can consent to the processing of their data through “information
society services” which include websites and social media platforms at 13
years. That is a surprising decision and no credible evidence has been adduced
to support it. Understandably, there is much concern about this low age limit,
particularly as the general data protection regulation gives discretion in a
range up to 16 years of age. Last month, the Children’s Commissioner for
England said:
“The social media giants have … not done enough
to make children aware of what they are signing up to when they install an app
or open an account”.
These are often the first contracts a child
signs in their life, yet,
“terms and conditions are impenetrable, even to
most adults”.
I think we can all say “Hear, hear” to that. The
commissioner also said:
“Children have absolutely no idea that they are
giving away the right to privacy or the ownership of their data or the material
they post online”.
Setting an age limit of 13, or even 16, would
almost certainly be illegal under the UN Convention on the Rights of the Child,
to which the UK is a signatory. Perhaps the Government could respond on that
point.
The Children’s Society argues that if companies
continue to rely on their current practices—whereby they allow only over-13s to
have an account but have no age verification process to check that children who
are consenting are the age they state themselves to be—then there will continue
to be widespread breaches of both the companies’ own rules and this new Data
Protection Act. In the Bill, it is unclear how breaches will be handled by the
Information Commissioner and what penalties will be put in place for those
companies failing to verify age properly.
There is also no consideration in the Bill about
capacity, rather than simply age, or protection for vulnerable children.
Although there are arguments for setting the age limit higher—or indeed
lower—there is surely a need both for proper evidence to be gathered and for a
minimum requirement for companies to have robust age verification systems and
other safeguards in place before any such legislation is passed. We will pursue
that. There is also the question of the overlap this derogation has with the
right to be forgotten, which the Minister mentioned. That right kicks in only
at age 18; we need to probe why that is the case and how that will work in
practice.
Concern about the increasing use of algorithms
and automatic data processing needs to be addressed, perhaps requiring
recording, testing and some level of disclosure about the use of algorithms and
data analysis, particularly when algorithms might affect employment or are
used in a public policy context. Related to that is the question of the
restriction on data subjects’ rights in relation to processing data contained
in documents relating to criminal investigations. Here, we agree with the Information
Commissioner that the provision, as drafted, restricts not just access rights
but the right to rectification, the right to erasure and the restriction of
processing. We would welcome greater clarification on the policy intent behind this
as we go into Committee.
We welcome the Government’s proposal for an
offence of knowingly or recklessly re-identifying de-identified personal data
without the data controller’s consent. The rapid evolution of technology and
growth in the digital economy have led to a vast increase in the availability
and value of data. There is a clear need for robust safeguards against misuse
in this area.
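[Editorial aside: for readers unfamiliar with the terminology, "de-identified" data is typically produced by pseudonymisation: replacing direct identifiers with tokens that only the data controller can link back to a person. A minimal sketch using a keyed hash; the key, field names and values are all invented for illustration.]

```python
import hashlib
import hmac

# A sketch of one common de-identification technique: pseudonymisation by
# keyed hashing. The secret key, field names and values are invented.
SECRET_KEY = b"held-by-the-data-controller-only"

def pseudonymise(identifier: str) -> str:
    # Without the key the token is hard to reverse; with it, the controller
    # can still link records for the same person consistently.
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"patient_id": "943-476-5919", "diagnosis": "asthma"}
de_identified = {"pid": pseudonymise(record["patient_id"]),
                 "diagnosis": record["diagnosis"]}
print(de_identified)
```

[The proposed offence is aimed at anyone who, without the controller's consent, knowingly or recklessly reverses processing of this kind.]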
On representation, we welcome the provision in
article 80(1) of the GDPR which gives greater ability for civil society and
other representative bodies to act on behalf of citizens and mirrors consumer
rights in goods and services. However, article 80(2) contains a provision that
the Government have chosen not to implement, under which consumer groups that
operate in the privacy field can act on behalf of data subjects without a
particular complainant. We think that this super-complainant system would help
to protect anonymity and create a stronger enforcement framework. We know we
are supported in that belief by the Information Commissioner.
The wider question here is perhaps whether data
subjects in general, particularly vulnerable ones, have sufficient support in
relation to the power of media companies that want to access and use their data.
Does any of us know what really happens to our data? The Information
Commissioner’s Office already has a huge area of work to cover and may struggle
to discharge all its new responsibilities. Having a better system for dealing with
complaints submitted by civil society bodies may be a good first step, but I
wonder whether we might think harder about how this will be organised—perhaps
modelled on the Caldicott data guidelines.
I give notice that we will probe whether the
Government intend to implement amendments previously made to Section 55 of the
Data Protection Act by virtue of Section 77 of the Criminal Justice and
Immigration Act 2008, which would allow terms of imprisonment of up to two
years to be imposed for offences of unlawfully obtaining or disclosing personal
data. As the Information Commissioner has previously noted, this has much wider
application than just to the press, because there is an increasing number of
cases of blagging and unauthorised use of personal data which must be stopped.
The Government have set themselves a very tight
timetable to pass this Bill into law before the end of April 2018. We will
support the main principles of the Bill, but, as indicated above, many areas
need to be scrutinised in depth before we can agree to them.
It is clear that the Brexit decision and
timetable will cast a long shadow as we debate the Bill. The Information
Commissioner, Elizabeth Denham, has already warned that data adequacy status
with the EU will be difficult to achieve within the Government’s Brexit
timetable and a major obstacle has been erected by the Government themselves.
The European withdrawal Bill makes it clear that the EU Charter of Fundamental
Rights will not become part of UK law as part of the replication process, yet
Article 8 of the charter relating to personal data underpins the GDPR. How then
will we secure adequacy without adhering to the charter?
While referring to the Information Commissioner,
I put on record our view that the Information Commissioner’s Office must
continue to be adequately funded and staffed during this period of great
uncertainty. The biggest change since our debates on the Data Protection Act
1998, or even the early stages of the GDPR, which I was involved in as a
Minister at the MoJ from 2010 to 2013, is that the threat to civil liberties
and personal freedoms now comes not only from agencies of the state but from
corporate power as well.
We have become accustomed to the idea that some
financial institutions are too big to fail. Are we approaching a situation
where these global tech giants are too big to regulate? We have to devise
legislation and have the political courage to bring the global tech giants within
the compass of the rule of law.
These modern tech giants operate in a world
where the sense of privacy which was almost part of the DNA of my own and my
parents’ generation is ignored with gay abandon by a generation quite willing
to trade their privacy for the benefits, material and social, that the new
technology provides.
The perennial elephant in the room in discussing a
Bill such as this is how we get the balance right between protecting the
freedoms and civil liberties that underpin our functioning liberal democracy
and protecting that democracy from the various threats to our safety and
well-being. The sophisticated use of new technologies by terrorist groups and
organised crime means that we have to make a sober assessment of exactly what
powers our police and security services need to combat the terrorist attack and
disrupt the drug or people trafficker or the money launderer. The fact that
those threats are often overlapping and interconnected makes granting powers
and achieving appropriate checks and balances ever more difficult.
On the issue of crime fighting, one point was
made with particular vigour by Thomson Reuters. With offerings such as
World-Check, it plays a key role in Europe and globally in helping many private
sector firms and public authorities identify potential risks in their supply
chains, customers and business relationships. It made clear that it will need
a number of clarifications in the Bill if it is to continue to provide its
important services, and we will probe those concerns
and the concerns of others in the private sector in Committee.
There is no doubt that the greater transparency
and availability of data provided by government has contributed to citizens’
better understanding of and access to government information and services, but
public concerns remain about the use of data in certain sectors. For example,
although there are clear benefits to medical research from giving researchers
access to anonymised medical data, it remains a matter of concern to the
public, the media and the profession itself.
I do not believe that sprinkling Bills with
Henry VIII clauses is an answer to the challenge of future-proofing. Perhaps
there is a case for expanding the remit of the National Data Guardian to act as
an early warning system on wider data abuse—or that of the Information Commissioner
or our own Select Committee—but the need is clear. I fear that without some
permanent mechanism in place, we will be for ever running up the down escalator,
trying to match legal protections to technical capacity. But that is no excuse
for not trying to improve the Bill before us.
My Lords, as chairman of the EU Home Affairs
Sub-Committee, I will speak mainly about the EU Committee’s report on the EU
data protection package, which we are debating alongside the Second Reading of
the Data Protection Bill.
In their recent Brexit position paper, The
Exchange and Protection of Personal Data—A Future Partnership Paper, the
Government said that they wanted to maintain free and uninterrupted data flows
with the EU after we leave; and in proposing a new security and criminal
justice treaty between the UK and the EU in her recent Florence speech, the Prime
Minister laid out her ambition for a model underpinned by, among other things,
high standards of data protection. Our report supports this objective: free and
uninterrupted data flows matter to us all. But the committee was struck by the
absence of clear and concrete proposals for how the Government plan to deliver
that objective. The stakes are high, not least because the introduction of
greater friction in data transfers could present a real barrier to future
trade. It is hard to overstate the importance of cross-border data flows to the
UK economy. Getting on for half of all large EU digital companies are based in
the UK, and three-quarters of the UK’s cross-border data flows are with EU
countries. What is more, any impediments to data flows following our withdrawal
from the EU could seriously hinder police and security co-operation, and that
means that lives, not just money, are at stake.
In our report, we considered four elements of
the EU’s data protection package: the general data protection regulation—the
GDPR—which the Data Protection Bill seeks to transpose into UK law; the police
and criminal justice directive; the EU-US privacy shield; and the EU-US
umbrella agreement. Both the regulation and the directive will enter into force
in May 2018, while we are still a member of the EU. The agreements with the US
are already in force, but will cease to apply to the UK after our withdrawal.
Our report considers the Government’s policy options both short and long term.
The committee wanted first to look at possible
data protection arrangements once the UK becomes a third country outside the
EU, and we heard evidence on two broad options. The first option is for the UK
Government to secure a so-called adequacy decision from the European Commission
which would certify that the UK offered a standard of protection that was
“essentially equivalent” to EU data protection standards. To date, the
Commission has adopted 12 such decisions. The second option would be for
individual data controllers and processors to adopt their own safeguards using
tools such as standard contractual clauses and binding corporate rules. Our
report comes to a clear conclusion that this second option would be less
effective. The tools available to individual data controllers, including small
businesses, are bureaucratic and would be vulnerable to legal challenges. We
therefore agree with the Information Commissioner that the Government should
seek an adequacy decision for the UK as a whole. This should offer certainty
for businesses, particularly SMEs. It would also follow the approach taken by
Switzerland, which has secured an adequacy decision from the EU. I am therefore
pleased that the Government’s position paper also calls for a future
relationship that builds on the adequacy model.
But there is a fly in this particular ointment.
The general data protection regulation only provides for adequacy decisions for
third countries, not countries leaving the EU. Decisions also follow a lengthy
procedure, so the chances of having an adequacy decision in place by March 2019
are small. So to avoid a cliff edge, we will need transitional arrangements.
The Government’s position paper acknowledges this but lacks detail. I hope that
in responding to this debate the Minister will update us on the Government’s
thinking on transition and perhaps provide some more of that detail. In
particular, I hope that as a Home Office Minister she can comment on the risks
facing law enforcement. One of the most striking findings in our inquiry was
that as a third country the UK could find itself held to higher standards of
data protection than as a member state. This will be the case both when the
European Commission considers an adequacy decision and when the UK’s data
retention and surveillance regime is tested before the Court of Justice, at
which point we will no longer be able to rely on the national security
exemption enjoyed by member states under the EU treaties. The United States has
fallen foul of EU data protection law in the past, and it is not impossible that
the United Kingdom will do the same when it is no longer a member state.
On a related theme, the committee also
considered whether the UK’s data protection regime would continue to be
influenced by EU legislation after withdrawal. What we found was that the
general data protection regulation will continue to apply to transfers of
personal data from the EU to the UK, significantly affecting UK businesses that
handle EU data. If we obtain an adequacy decision, the rulings of the new
European Data Protection Board and the Court of Justice will have an effect,
albeit indirectly, by altering the standards that the UK will need to maintain
an adequate level of protection. This means that there will be no clean break.
We will also continue to be affected by EU rules on the onward transfer of
personal data to third countries. This could be a particular problem in the
field of security, whereby our approach to sharing personal data with, say, the
United States could put any adequacy decision at risk. In summary, it seems
likely that EU and UK data protection practices will need to remain aligned long
after we leave the EU.
The Bill that we are debating today reflects a
comprehensive EU data protection regime which has been heavily influenced over
the years by the United Kingdom. Withdrawal from the EU means that we stand to
lose the institutional platform from which we have exercised that influence.
The committee’s report therefore concludes that the Government must aim to
retain the UK’s influence wherever possible, starting by securing a continuing
role for the Information Commissioner’s Office on the European Data Protection
Board. I am glad that the Government’s data protection position paper spells
out our aim to do just that, but in the longer term, the Government will also
need to find a way to work in partnership with the EU to influence the
development of data protection standards at both the EU and the global level.
The continued success of our commercial and security relations with the EU will
depend on that.
Although I also welcome the rights and
protections for children that the Bill offers, not least the right to be
forgotten, there is one very important point of detail where reconsideration is
urgently needed, namely the age of consent for children to give their personal
information away online in exchange for products and services without a parent
or guardian needing to give their permission. The proposals in Clause 8, as we
have already heard, set this age of consent at 13. However, a recent YouGov
survey of the public commissioned by the BCS, the Chartered Institute for IT,
shows very little support for this. Indeed, a whopping majority of 81% thought
the age should be set at either 16 or 18. The Bill’s Explanatory Notes state
that the Government have chosen this age—the youngest possible allowed under
the incoming GDPR rules—because it is,
“in line with the minimum age set as a matter of
contract by some of the most popular information society services which
currently offer services to children (e.g. Facebook, Whatsapp, Instagram)”.
In other words, a de facto standard age of
consent for children providing their personal information online has emerged,
and that age has been set by the very companies that profit from providing
these services to children. It might be that 13 is an appropriate age for
consent by children to give their information away online, but surely that
should be decided in other ways and with much greater reference to the public,
and I do not think this has happened. It is certainly at odds with the results
of this recent survey.
Moreover, Growing Up with the Internet,
the recently published report of the Select Committee on Communications, on
which I am privileged to serve, examined the different ways in which children
use the internet through the different stages of childhood. We received lots of
evidence that lumping together all young people between the ages of 13 and 18
was really not helpful, and that much more research was needed. To bow to the
commercial interests of Facebook and others therefore feels at the very least
premature, and the example of its usefulness given in the Explanatory
Notes—that this would somehow ease access to,
“educational websites and research resources”,
so that children could “complete their
homework”—seems somewhat naïve, particularly in the light of other conclusions and recommendations
from the Growing Up with the Internet report, not least that digital
literacy, alongside reading, writing and arithmetic, should be considered a
“fourth R”; that the Government should establish the post of a children’s
digital champion at the centre of government; that children must be treated
online with the same rights, respect and care that has been established through
regulation offline; and that all too often commercial considerations seem to be
put first. So 13 might be the right age but it might not, and at the very
least, further consultation with the public and with parents is needed.
As the UK leaves the EU, it will be essential—I
use the word “essential”—for the UK to be able to demonstrate adequacy. I hope
the Government will assure us on that point and produce the necessary
regulatory framework to enable it to happen. Adequacy does not mean that the UK
should simply cut and paste all EU legal provisions where reliance on national
law and derogations are real options in front of us. There are some where we
should be availing ourselves of them. Nor do we need to make privacy
safeguards—which are very important—so demanding that they become
self-defeating, standing in the way of benefiting patients, in the case of
medicine, and the community more generally.
The Government have made it clear that they want
the Bill to support research, which is extraordinarily welcome. I hope that
when she replies, the Minister will be able to say something about how the
Government will approach the changes that will be needed to deal with research
issues in the UK. The Bill classes universities as public bodies, and
universities lie at the core of the research community. It is fair enough for
universities to be classed as public bodies—that is what they are—but the
legislation then denies them the right to invoke public interest, or even legitimate
interest, as a basis for their research, and thus obliges them to seek explicit
consent when using data at every stage of processing. This becomes very onerous
if you are doing a long study. That may on the face of it seem reasonable but,
in practice, it can do real harm. The whole point of research is that often, at
the outset, it cannot be known with certainty where it may lead or whether further
processing or trials may be necessary. You can get a situation in which an
unexpected and unplanned-for line of research opens up and could yield real
dividends. That is especially true of interventional research. If, as a result
of wanting to take it to a further stage, the data processing demands that
there should be another round of explicit consent, you get into a situation
whereby universities—unlike some of the public bodies in government, which do
not have to follow this procedure—have to go round again to all those who
offered their personal data in the first place. Seeking the consent of holders
of the data anew may simply not be possible, especially in long-term research
projects. People move house or become incapable; they also die.
Even if those problems can be overcome—and I
think they are real—there is a question of proportionality. Why make consent so
onerous that it makes research too difficult in practice and too costly to
engage in? There needs to be greater proportionality on this issue and greater
alignment between the various bodies that use data in this way, and there needs
to be some alternative to consent as the basis for engaging in some kinds of
research. Numerous governance mechanisms are available, not least ethics
committees, which are a key component of modern research and could provide the
necessary safeguards against abuse. I recognise that there need to be
safeguards, but I suggest that we should use some imagination in how they could
be brought about.
I am involved with an organisation called
Unique, which deals with rare genetic disorders, for which datasets, to be useful,
have to be gathered globally. The number of people with those afflictions is so
tiny in any given population that you have to go across the globe to connect
useful datasets, which means in turn that you come up against some of the
provisions that govern transnational transmission of data. However, the rarity
of such individual disorders also makes every patient’s data precious to other
affected individuals, because it is potentially a very tight community. No
other organisation is dealing with that affliction in that way, and Unique can
give support and advice to otherwise lonely parents and their equally isolated
medics, who turn to Unique for information about similar cases. There is a
network there.
By insisting on onerous consent regimes, we are
in danger of disabling such organisations from continuing their pioneering
work. In Unique, it is not uncommon for parents who have not been in touch for
a long time suddenly to turn to it with a request for help. Try telling
families, many of whom are not in the UK but are in third countries, who are
coping with the daily stress of caring for a disabled child or adult, that they
must be sure to keep up online with the stringent requirements of UK data
legislation and that failing to do so will mean that they run the severe risk
of no longer being able to get the kind of individualised attention and support
that they seek from the very organisations set up to help them. The problem is
that the law will lay down the need for the regular reconsultation and
re-consent of individuals in very precise ways, and that such individuals might
not reply, not understanding the potential hazards involved in failing to do
so. One might say that data anonymisation might solve the problem. It solves
some problems, but it creates new ones in an organisation set up for certain
purposes where the idea is that one fellow sufferer can help another. So piling
difficulties on small organisations—there are other difficulties that I have
not even mentioned—might lead ultimately to an unwanted outcome, which will be
a reduction in effectiveness.
I would like the Government to think about the
possibility that they should allow for the creation of governance and
accountability regimes that will fit special circumstances—and I am sure that
we will come across others as we go through this legislation. The existence of
the Information Commissioner should not result just in the law being enforced
effectively and well; it should also provide an opportunity for creativity under her
auspices and the ability to create variations on governance regimes where they
are needed.
I am rather concerned about the clarity of this
very substantial Bill. It is explained that the format is chosen to provide
continuity with the Data Protection Act 1998, but whether or not as a result of
this innocent, no doubt valuable, choice, it seems to me that some confusion is
thereby created.
First, there is the fact that the GDPR is the
elephant in the room—unseen and yet the main show in town. You could call it
Macavity the cat. The noble Lord, Lord Stevenson, dubbed the Bill Hamlet
without the Prince. Traces exist without the GDPR being visible. Is the
consequent cross-referencing to an absent document the best that can be done? I
realise that there are constraints while we are in the EU, but it detracts from
the aims of simplicity and coherence. Apparently, things are predicted to be
simpler post Brexit, at least in this regard, when the GDPR will be
incorporated into domestic law under the withdrawal Bill in a “single domestic
legal basis”, according to the Explanatory Memorandum. Does that mean that this
Bill—by then it will be an Act—will be amended to incorporate the regulation?
It seems odd to have more clarity post Brexit than pre-Brexit. It would no
doubt be totally unfair to suggest any smoke-and-mirrors exercise to confuse
the fact of the centrality of EU law now and in the future.
Secondly, we seem to have some verbal gymnastics
regarding what “apply” means. The departmental briefing says that the Bill will
apply GDPR standards, but then we have the so-called “applied GDPR” scheme,
which is an extension of the regulation in Part 2, Chapter 3. Can the
Minister elaborate on precisely what activities Part 2, Chapter 3 covers?
The Bill says that manual unstructured files come within that category. I do
not know how “structured” and “unstructured” are defined, but what other data
processing activities or sectors are outside the scope of EU law and the
regulation, and are they significant enough to justify putting them in a
different part?
I will highlight, rather at random, some other
examples which need reflection. We may need seriously to look at the lack of
definition of “substantial public interest” as a basis for processing sensitive
data, or even of public interest. I think the noble Lord, Lord Stevenson,
mentioned the failure to take up the option under Article 80(2) of
the regulation to confer on non-profit organisations the right to take action
pursuing infringements with the regulator or court. This omission is rather
surprising given that a similar right exists for NGOs, for instance, for breach
of other consumer rights, including financial rights. Perhaps the Minister
could explain that omission.
There is also concern that the safeguards for
profiling and other forms of automated decision-making in the Bill are not
strong enough to reflect the provisions of Article 22 of the GDPR. There is no
mention of “similar effects” to a legal decision, which is the wording in the
regulation, or of remedies such as the right of complaint or judicial redress.
Very significant is the power for the Government
under Clause 15 to confer exemptions from the GDPR by regulation rather than
put them in primary legislation. That will need to be examined very carefully,
not only for domestic reasons but also because it could undermine significantly
an adequacy assessment in the future.
Clause 7 refers to alternatives to consent. The
noble Baroness, Lady Neville-Jones, referred briefly to the problems that
arise. For many uses of personal data, explicit consent is absolutely the right
legal basis for processing that data, and it is positive that, with the GDPR,
data subjects’ rights have been strengthened. Medical research will usually
rely on a person providing informed consent for ethical reasons, but it is
essential that there are alternatives to consent as a legal basis. That is
because GDPR-compliant explicit consent sets a high bar for information
provision that it may not always be feasible to meet. In many research
resources, such as biobanks—I hope that my noble friend Lady Manningham-Buller
will refer to that as the chairman of the Wellcome Trust, which is responsible
for initiating the UK Biobank—the participants give consent for their
pseudonymised data to be used.
In some studies it is not possible to seek
consent, either because a very large sample size is needed to generate a robust
result, and that would be practically difficult to obtain, or because seeking
consent would introduce bias. The use of personal health data without specific
explicit consent is sometimes essential for research for the health of the
population. If researchers could not process medical records for research
without specific explicit patient consent, they could not run cancer
registries, which are extremely important in recording all cases of cancer; they
could not monitor the hazards of medical procedures, such as the recently
discovered implications of CT scans for long-term disease development; they
could not assess the unexpected side-effects of routinely prescribed medicines;
and they could not identify sufficiently large numbers of people with a
particular disease to invite them to take part in trials for the treatment of
that disease. The example I would give is the recruitment of 20,000 suitable
people for the Heart Protection Study on statins, which has helped transform
medical practice throughout the world. I am sure that many noble Lords use
statins. This began with the identification of 400,000 patients with a hospital
record of arterial disease—information that, under a regime of explicit consent,
could not have been accessed without their permission. There are good examples of how this provision would
cause a problem as it is enunciated in Clause 7.
We have a well-established, robust system of
governance and oversight for non-consensual medical research in the UK; for
example, through the Health Research Authority, a confidentiality advisory
group, advising on Section 251 approvals to override the common law duty of
confidentiality. Patient groups actively advocated for research exemptions
during the passage of the GDPR—for example, through the Data Saves Lives
campaign. I hope that, in Committee, we might get an opportunity to explore
this further to see whether we can somehow modify the Bill to make this
possible.
I come now to the public interest issues in the
same clause. I understand that the Government intend the functions listed in
Clause 7 not to be exhaustive, and to allow, for example, research conducted by
universities or NHS trusts to use the public interest legal basis. Again, the
noble Baroness, Lady Neville-Jones, briefly touched on that. It would provide
much-needed clarity and assurance to the research community, particularly to
those in the universities, if this could be made explicit in the Bill. A huge
amount of research will rely on public interest as a legal basis. The
Government have recognised the value of making better use of data for research,
and the recent life sciences industrial strategy confirms the tremendous
potential benefits for patients and the public if we can unlock the value of
data held by public authorities and promote its use in the public interest.
There is currently a highly risk-averse culture
in data protection, driven in part because people are unclear about the rules
and what they can or cannot do with data for their purposes—hence I referred to
the need for better governance of the data. This is why the public interest
legal basis matters so much for research. The DP Bill is an opportunity to set
out very clearly what the legitimate basis for processing personal data can be.
Setting out a clear public interest function for research will give researchers
confidence to know when they are operating within the law. If necessary, any
specification of research in Clause 7 could be qualified by safeguards to
ensure that the legal basis is used only when appropriate.
This is a welcome and necessary Bill. It is not
perfect, but I leap to its defence in at least one respect—namely, the absence
of the text of the GDPR itself from the Bill. On the Government’s website,
there is a truly helpful document, the Keeling schedule, which sets out how the
GDPR intersects with the text of this Bill. After noble Lords have read it a
few times, it comes close to being comprehensible.
The Commission has estimated that this [GDPR] would
lead to savings of around €2.3 billion a year for businesses. But while the
rules might make things simpler for businesses in that respect, it is possible
that they will also make it easier for citizens to demand to know what
information is held on them in paper form as well as in digital form. In fact,
that is one of the main purposes of the Bill. So we might find that businesses
have more rather than less to do. I wonder whether that has been costed. It is
a good thing that citizens should find out what information people hold on
them, but we should not pretend that the exercise will be free of cost to
businesses. The Federation of Small Businesses estimates an additional cost of
£75,000 per year for small businesses, and obviously much more for larger ones.
The Bill contains a bespoke regime for the
processing of personal data by the police, prosecutors and other criminal
justice agencies for law enforcement purposes. The aim of this, which is
laudable, is to,
“ensure that there is a single domestic and
trans-national regime for the processing of personal data for law enforcement
purposes across the whole of the law enforcement sector”,
but what is the law enforcement sector? To what
extent do banks, for example, fall into the law enforcement sector? They have
obligations under the anti-money laundering rules to pull suspicions together
and to share those across borders—not just across European borders but
globally. How are those obligations tied in with the GDPR obligations in the
Bill? Businesses, especially banks, will need to understand the interplay
between the GDPR regulations, the anti-money laundering regulations and all of
the others. The Government would not, I know, want to create the smallest risk
that by obeying one set of laws you disobey another.
That sort of legal understanding and pulling
things together will take time. It will take money and training for all
organisations. There is a real concern that too many organisations are simply
hoping for the best and thinking that they will muddle through if they behave
sensibly. But that is not behaving sensibly. They need to start now if they
have not started already. The Federation of Small Businesses says:
“For almost all smaller firms, the scope of the
changes have not even registered on their radar. They simply aren’t aware of
what they will need to do”.
Yet it goes on to say that,
“full guidance for businesses will not be
available until next year, potentially as late as spring. The regulator cannot
issue their guidance until the European Data Protection Board issue theirs”,
so there is a lot of work to be done.
My final point echoes one raised by the noble
Lord, Lord McNally, relating to the issue of the re-identification of personal
data which has been de-identified, as set out in Clause 162. The clause makes
it a crime to work out to whom the data refers. The very fact that this
clause exists tells us something: namely, that whatever you do online creates
some sort of risk. If you think that your data has been anonymised, according
to the computational privacy group at Imperial College, you will be wrong. It
says:
“We have currently no reason to believe that an
efficient enough, yet general, anonymization method will ever exist for
high-dimensional data, as all the evidence so far points to the contrary”.
If that is right, and I believe it is, then
de-identification does not really exist. And if that is right, what is it in
terms of re-identification that we are criminalising under this clause? In a
sense, it is an oxymoron which I think needs very careful consideration. The
group at Imperial College goes on to suggest that making re-identification a
criminal offence would make things worse because those working to anonymise
data will feel that they do not have to do a particularly good job. After all,
re-identifying it would be a criminal offence, so no one will do it.
Unfortunately, in my experience that is not entirely the way the world works.
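[Editorial aside: the Imperial College claim is easiest to see with a toy linkage attack. Even with names removed, a handful of quasi-identifiers—postcode, birth year, sex—may in combination be unique to one person, and linking against an auxiliary source that carries names re-identifies the record. A sketch with invented data:]

```python
# A toy linkage attack on "de-identified" records: names are removed, but
# the remaining quasi-identifiers are unique in combination. All data here
# is invented for illustration.
health_records = [
    {"postcode": "SW1A 1AA", "birth_year": 1954, "sex": "F", "condition": "diabetes"},
    {"postcode": "EC1Y 8SY", "birth_year": 1987, "sex": "M", "condition": "asthma"},
]

# An auxiliary, public source that carries names alongside the same
# quasi-identifiers (an electoral roll, say).
public_register = [
    {"name": "A. Example", "postcode": "SW1A 1AA", "birth_year": 1954, "sex": "F"},
]

# Join on the quasi-identifiers: a unique match re-identifies the record.
for person in public_register:
    matches = [r for r in health_records
               if (r["postcode"], r["birth_year"], r["sex"]) ==
                  (person["postcode"], person["birth_year"], person["sex"])]
    if len(matches) == 1:
        print(person["name"], "->", matches[0]["condition"])
```

[The more columns a dataset has, the more likely each combination of values is to be unique—which is the group's reason for doubting that any general anonymisation method for high-dimensional data will ever exist.]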
I hope that the Minister will set out some
clarification of the intentions of the Bill in relation to the consent of
children. Clause 8(b) includes an exemption for “preventive or counselling
services”. Does that mean that a child could give their consent to these
websites before the age of 13 or not at all? What is defined as a “preventive
or counselling service”?
Clause 187 gives further criteria for the
consent of children, but only for children in Scotland, where a child’s capacity to
exercise consent must be taken into account, with the expectation that
a child aged 12 or over is,
“presumed to be of sufficient age and maturity
to have such an understanding”.
The Explanatory Notes to the Bill state that
this clause must be read with Clause 8, which provides that the age limit is
13. Is Clause 187 intended to say that the age of digital consent cannot go
below 13, which is the position of Article 8(1) of the GDPR, or that there
might be circumstances when a child who is 13 cannot consent for genuine
reasons? Either of these scenarios seems to give rise to confusion for children,
parents and the websites that children access.
After all the detailed discussions about age
verification that we had earlier in the year, there is an argument for age
verification to apply to Clause 8. How will websites that require a child to
verify that they are 13 years old ensure that the child is the age that they
say they are without some requirement for the site to prove the age of the
child? This is surely a meaningless provision. I hope that when the Minister comes
to reply, he will set out the Government’s position on this matter and explain
what penalties a website which breaches this age requirement will face.
There is much that is good in the Bill, but I do
not believe that it is yet the best that it can be.
I must start with a confession. Despite the kind
references today to my career and supposed expertise, I found this Bill
incredibly hard to read and even harder to understand. I fear that we will not
do enough to stop the notion, referred to by the noble Lord, Lord McNally, that
we are sleepwalking into a dystopian future if we do not work hard to simplify
the Bill and make it accessible to more people, the people to whom I feel sure
the Government must want to give power in this updated legislation. Let us
ensure that the Bill is a step forward for individual power in the rapidly
changing landscape in which we sit, a power that people understand and,
importantly, use. Let us make it an indicator to the world that the UK balances
the importance of tech start-ups, innovation, foreign investment and big
businesses with consumer and citizen rights.
The Government should be commended for getting
ahead of movements that are growing all over the world to free our data from
the tech giants of our age. As data becomes one of our most valuable
resources—as we have heard, the new oil—individuals have begun to want a stake
in determining for themselves when, how and to what extent information about
them is held and communicated to others. So I welcome the clear data
frameworks, which are important not only for the best digital economy but for
the best digital society.
I agree with much that has been said today but
want to make three specific points on the Bill. First, from any perspective,
the GDPR is difficult to comprehend, comprising sweeping regulations with 99
articles and 173 recitals. The Bill contains some wonderful provisions, of
which my favourite is:
“Chapter 2 of this Part applies for the purposes
of the applied GDPR as it applies for the purposes of the GDPR … In this
Chapter, “the applied Chapter 2” means Chapter 2 of this Part as applied by
this Chapter”.
Giving people rights is meaningful only if they
know that they have them, what they mean, how to exercise them, what
infringement looks like and how to seek redress for it. There are questions
about the practical workability of a lot of these rights. For example, on the
right to portability, how would the average person know what to do with their
ported data? How would they get it? Where would they keep it? There was a funny
example in a newspaper recently where a journalist asked Facebook to send them
all the data that it had collected over the previous eight years and received a
printed copy of 800 pages of data—extremely useful, as I think you will agree.
What about your right to erase your social media history? I should declare my
interest as a director of Twitter at this point. How can you remove content
featuring you that you did not post and in which people may have mentioned you?
What happens as the complexity of the algorithm becomes so sophisticated that
it is hard to separate out your data? How does the immense amount of machine
learning deployed already affect your rights, let alone in the future?
Awareness among the public about the GDPR is
very low—the Open Data Institute has done a lot of work on this which is soon
to be published. It is very unlikely that ordinary people understand this
legislation. They will have no understanding of how their rights affect them. A
lot of education work needs to be done.
For businesses, too, the learning curve is
steep, especially for foreign investors in European companies. Some are betting
that the sheer scope of the GDPR means that the European regulators will
struggle to enforce it. When the GDPR came up at a recent industry start-up
event, one industry source said that none of the people to whom they had spoken
could confidently say that they had a plan. Every online publisher and
advertiser should ensure that they do, but none of them is taking steps to
prepare.
So much has been done by this Government on
building a strong digital economy that it is important to ensure that small and
start-up businesses do not feel overwhelmed by the changes. What substantial
help could be planned and what education offered? What help is there with
compliance? By way of example, under Clause 13, companies have 21 days to respond
when an individual contests a decision based solely on automated processing, but
what does this mean for a small AI start-up which may
be using anonymised intelligence data to build a new transport or health app?
What do they have to think about to make good legal decisions? As my noble
friend Lord Jay so brilliantly argued, how can we ensure post-Brexit
legislative certainty for them in building global successful businesses?
This brings me to my second question: why has
the right of civil groups to take action on behalf of individuals been removed
from the UK context for the GDPR? Instead, the Bill places a huge onus on
individuals, who may lack the know-how and the ability to fight for their
rights. As has been mentioned, article 80(1) of the GDPR allows for
representative bodies—for example, consumer groups—to bring complaints at the
initiation of data subjects, but the option in article 80(2) for such bodies to
act without a particular complainant has not been taken up. This omission is worrying, given how stretched
the ICO’s resources are and the impact this could have on its support for the
public. Granting rights over data to individuals is meaningless if individuals
lack the understanding to exercise those rights and there is no infrastructure
within civic society to help them exercise those rights. There does not seem to
be any good reason why the UK has chosen not to take up the option in EU law to
allow consumer privacy groups to lodge independent data protection complaints
as they can currently do under consumer rights laws.
Resourcing the ICO—the subject of Part 5 of the Bill—is
essential and is my third main area of interest. The ICO has considerable
responsibilities and duties under the Bill towards both business and
individuals: upholding rights, investigating reactively, informing and
educating to improve standards, educating people and consumer groups, and
maintaining international relationships. I feel exhausted thinking about it.
The ICO’s workload is vast and increasing. It lacks sufficient resources
currently. In March 2017, the Information Commissioner asked Parliament if it
could recruit 200 more staff but the salaries it offers are significantly below
those offered by the private sector for roles requiring extremely high levels
of skills and experience. These staff are going to become ever more important
and more difficult to recruit in the future.
The ICO currently funds its data protection work
by charging fees to data controllers. It receives ring-fenced funding for its
freedom of information request work from the Government. This income can
increase only as the number of data controllers increases: it is not in line
with the volume or complexity of the work, and certainly not with that in the Bill.
Perhaps it is time for another method of funding, such as statutory funding.
Finally, I would like briefly to add my thoughts
on how the Bill affects children. As many noble Lords have said, the YouGov
poll does indeed say that 80% of the public support raising the age to
18—currently it is 13, as proposed by the Government. However, there are many
other surveys, particularly one by the Children’s Society, which show that 80%
of 13 year-olds currently have a social media account and 80% of children under
13 have lied about their age in order to establish one. This is the
realpolitik of how our children actually use the internet. I
respectfully disagree with the noble Baroness, Lady Howe, and others in the
Chamber: I feel strongly that it is wrong to place policing at the heart of how
we deal with relationships between children and the internet. We need to take a
systems-based approach. I have seen my godchildren set up fake accounts and
whizz around the internet at a speed I find alarming. We have to deal on their
terms. We have to help educators, parents and people supporting children, not
use the long arm of the law.
Like other noble Lords, I am concerned about
public trust and confidence in the system. At the moment there is a need for
guidance on preparation for the new regime. I visited a charity last week and
asked about the availability and accessibility of advice. The immediate, almost
knee-jerk response was, “It’s pretty dire”—followed by comments that most of
what is available is about fundraising and that there is a particular lack of
advice on how to deal with data relating to children. The comment was made,
too, that the legislation is tougher on charities than on the private sector. I
have not pinned down whether that is the case, but I do not disbelieve it. The
Federation of Small Businesses has made similar points about support for small
businesses.
Part of our job is to ensure that the Bill is as
clear as possible. I was interested that the report of the committee of the
noble Lord, Lord Jay, referred to “white space” and language. It quoted the
Information Commissioner, who noted trigger terms such as “high-risk”, “large
scale” and “systematic”. Her evidence was that until the new European Data
Protection Board and the courts start interpreting the terms,
“it is not clear what the GDPR will look like in
practice”.
I found that some of the language of the Bill
raised questions in my mind. For instance—I am not asking for a response now;
we can do this by way of an amendment later—the term “legitimate” is used in a
couple of clauses. Is that wider than “legal”? What is the difference between
“necessary” and “strictly necessary”? I do not think that I have ever come
across “strictly necessary” in legislation. There are also judgment calls
implicit in many of the provisions, including the “appropriate” level of
security and processing that is “unwarranted”.
Finally, I return to the committee report, which
has not had as much attention as the Bill. That is a shame, but I am sure we
will come back to it as source material. I noted the observation that, post
Brexit, there is a risk that, in the Information Commissioner’s words, the UK
could find itself,
“outside, pressing our faces on the glass …
without influence”,
and yet having,
“adopted fulsomely the GDPR”.
That image could be applied more widely.
The public interest terminology should be
extended so that we can look at safeguards beyond consent and make
sure that it remains possible to do clinical trials and interventional work. Why is
that the case? It is because health data offers the most exciting opportunities
to do things which we have only recently become able to do: understand the
causes of disease in detail across populations and stand a much better chance
of getting to diagnosis early. We could deal with many things if we could only diagnose
them far earlier and develop treatments for them—indeed, prevent some of them
ever materialising. Health data also helps us to measure the efficacy of
treatment. We all know of plenty of treatments that over years have proved to
be useless, or unexpected ones that have proved to be outstanding. Looking at
big-scale data helps us to do that. That data helps in precision medicine,
which we are all moving towards having, where the drugs we receive are for us,
not our neighbour, although we apparently both have the same illness. Health
data can also help with safety, as you can collect the side-effects that
people suffer from particular drugs. It helps us evaluate policy and, of
course, should help the NHS in planning.
I know that the Government want to support
scientists to process data with confidence and safety. The industrial strategy
comments that data should be “appropriately accessed by researchers”.
“Appropriate” is a hopeless word; we do not know what it means, but still. The
document also states that access for researchers to,
“currently available national datasets should be
accelerated by streamlining legal and ethical approvals”.
We are not there yet.
I want to say a word about public support. The
Wellcome Trust commissioned an Ipsos MORI poll last year before the Caldicott
review to assess public support for the collection of data. In many cases,
there is significant public support for that provided it is anonymised—although
I know there are questions about that—but what people are fussed about is that
their data is sold on for commercial purposes, that it is used for marketing
or, worst of all, that it is used to affect their insurance policies and life
insurance. Therefore, we need to give reassurance on that. However, it has
certainly been the case in our experience, and that of many universities, that
you can recruit many people for trials and studies if they believe that their
data will help others with similar diseases or indeed themselves.
I agreed with the noble Lord, Lord McNally, and
his worries about standing up to the tech giants. They are not our friends.
They are big, powerful companies that are not citizens of this country. They
pay as little tax here as possible, and several of them actively help tax
evaders so that they can make more profit out of the transactions
involved. They control what we see on the internet through algorithms and
extract vast quantities of data and know more about us than we know ourselves.
In the interests of democracy we really must stand up to them and say, “No, we
are the people who matter. It is great you are doing well, but we are the
people who matter”. Bills like this are part of that, and it is important that
we stand up for ourselves and our citizens.
My noble friend Lord Arbuthnot referred to a
Keeling schedule. It would be wonderful to receive it. For some reason I cannot
find it by email. It is not among the documents listed on the Parliament
website, it cannot be found by Googling, and it does not come up on GOV.UK. One
way or another, I think the simplest thing to ask is: please can we put it on
the parliamentary website in the list of documents related to the Bill? I know
that it exists, but I just cannot find it. It would be nice if it appeared on
the departmental website too.
It seems to me that bits are missing in a number
of areas. Where are Articles 3, 27, 22(2)(b) and 35(4) to 35(6)? Where is
Article 80(2), as the noble Baroness, Lady Lane-Fox, mentioned? That is an
absolutely crucial article. Why has it gone missing? How exactly is recital 71
implemented? I cannot see how the protections for children in that recital are
picked up in the Bill. There are a lot of things that Keeling schedules are
important for. In a detailed Bill like this, they help us to understand how the
underlying European legislation will be reflected, which will be crucial for
the acceptance of this Bill by the European Union—I pick up the point made by
the noble Lord, Lord Stevenson—and what bits are missing.
And what has been added? Where does paragraph 8
of Schedule 11 come from? It is a very large, loose power. Where are its edges?
What is an example of that? I would be very grateful if my noble friend could
drop me a note on that before we reach Committee. What is an arguable point
under that provision? Where are the limits of our economic interest so far as
its influence on this Bill is concerned?
Paragraph 4 of Schedule 10 is another place that
worries me. We all make our personal data public, but a lot of the time we do
it in a particular context. If I take a photograph with my
parliamentary-supplied iPhone, and I have granted an app on that phone the
power to look at my photographs for some purpose, I have made that photograph
and all its metadata public. That is not what I
intended; I made it public for a particular purpose in a particular
context—that of social media. A lot of people use things like dating websites.
They do not put information on there which is intended to be totally public.
Therefore, the wording of paragraph 4 of Schedule 10 seems to be far too wide
in the context of the way people use the internet. Principle 2 of the Data
Protection Act covers this. It gives us protection against the use of
information for purposes for which it clearly has not been released. There does
not appear to be any equivalent in the Bill—although I have not picked up the
Keeling schedule, so perhaps it is there. However, I would like to know where
it is.
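To make the point about metadata concrete, here is a minimal sketch using the
Pillow imaging library; the file name is hypothetical, and which tags appear
depends on the device, but a typical phone photograph carries the camera model,
a timestamp and, in a sub-directory of the EXIF block, the GPS coordinates of
where it was taken:

```python
# A minimal, illustrative sketch using the Pillow imaging library; the
# file name is hypothetical and the tags present depend on the device.
from PIL import Image, ExifTags

img = Image.open("photo_from_phone.jpg")   # hypothetical photograph
exif = img.getexif()                       # EXIF metadata embedded by the camera

for tag_id, value in exif.items():
    print(ExifTags.TAGS.get(tag_id, tag_id), value)   # e.g. Make, Model, DateTime

# GPS coordinates sit in a sub-directory of the EXIF block (tag 0x8825),
# so a photo shared for one purpose can quietly reveal where it was taken.
gps = exif.get_ifd(0x8825)
for tag_id, value in gps.items():
    print(ExifTags.GPSTAGS.get(tag_id, tag_id), value)
```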
On other little bits and pieces, I would like to
see the public policy documents under Clause 33(4) and Clause 33(5) made
public; at the moment they are not. How is age verification supposed to work?
Does it involve the release of data by parents to prove that the child is the
necessary age to permit the child access, and if so, what happens to that data?
Paragraph 23 of Schedule 2 addresses exam scripts. Why are these suddenly
being made irretrievable? What are the Government up to here?
Paragraph 4 of Schedule 2, on immigration, takes away rights immigrants have at
the moment under the Data Protection Act. Why? What is going on?
There are lots of bits and pieces which I hope
we can pick up in Committee. I look forward to going through the Bill with a
very fine-toothed comb—it is an important piece of legislation.
In order to support archiving activities, it is
essential that this legislation provide a strong and robust legal basis to
support public and private organisations which are undertaking archiving in the
public interest. As I understand it, this new legislation confirms the
exemptions currently available in the UK Data Protection Act 1998: safeguarding
data processing necessary for archiving purposes in the public interest and
archiving for scientific, historical and statistical purposes. This is welcome,
but there may perhaps be issues around definitions of who and what is covered
by the phrase “archiving in the public interest”. I look forward to further
discussion and, hopefully, further reassurances on whether the work of public
archiving institutions such as our libraries and museums is adequately
safeguarded in the Bill.
The new Bill seeks to replicate the approach of
the Data Protection Act 1998, whereby there have been well-established
exemptions to safeguard national security. It is obviously vital that the
intelligence services be able to continue to operate effectively at home and
with our European and other partners, and I look forward to our further
discussion during the passage of the Bill on whether this draft legislation
gives the intelligence services the safeguards they require to operate
effectively.
This Bill attempts to help us tackle some big
moral and ethical dilemmas, and we as parliamentarians have a real struggle to
be sufficiently informed in a rapidly changing and innovative environment. I welcome
the certainty that the Bill gives us in implementing the GDPR in this country
in a form that anticipates Brexit and the need to continue to comply with EU
data law regardless of membership of the EU in the future.
But ultimately I believe that the GDPR is an
answer to the past. It is a long-overdue response to past and current data
practice, but it is a long way from what the Information Commissioner’s
briefing describes as,
“one of the final pieces of much needed data
protection reform”.
I am grateful to Nicholas Oliver, the founder of
people.io, and to Gi Fernando from Freeformers for helping my thinking on these
very difficult issues.
The Bill addresses issues of consent, erasure
and portability to help protect us as citizens. I shall start with consent. A
tougher consent regime is important but how do we make it informed? Even if 13
is the right age for consent, how do we inform that consent with young people,
with parents, with adults generally, with vulnerable people and with small
businesses which have to comply with this law? Which education campaigns will
cut through in a nation where 11 million of us are already digitally excluded
and where digital exclusion does not exclude significant amounts of personal
data being held about you? And what is the extent of that consent?
As an early adopter of Facebook 10 years ago, I
would have blindly agreed to its terms and conditions that required its users
to grant it,
“a non-exclusive, transferable, sub-licensable,
royalty-free, worldwide license to use any IP content”
that I posted on the site. It effectively required me
to give it the right to use my family photos and videos for marketing purposes
and to resell them to anybody. Thanks to this Bill, it will be easier for me to
ask it to delete that personal data and it will make it easier for me to take
it away and put it goodness knows where else with whatever level of security I
deem fit, if I can trust it. That is welcome, although I still quite like
Facebook, so I will not do it just yet.
But what about the artificial intelligence
generated from that data? If, in an outrageous conflation of issues around
fake news and election-fixing by a foreign power to enable a reality TV star
with a narcissistic personality disorder to occupy the most powerful executive
office in the free world, I take against Facebook, can I withdraw consent for
my data to be used to inform artificial intelligences that Facebook can go on
to use for profit and for whatever ethical use they see fit? No, I cannot.
What if, say, Google DeepMind got hold of NHS
data and its algorithms were used with bias? What if Google gets away with
breaking data protection as part of its innovation and maybe starts its own
ethics group, marking its own ethics homework? Where is my consent and where do
I get a share of the revenue generated by Google selling the intelligence
derived in part from my data? And if it sells that AI to a health company which
sells a resulting product back to the NHS, how do I ensure that the patients
are advantaged because their data was at the source of the product?
No consent regime can anticipate future use or
the generation of intelligent products by aggregating my data with that of
others. The new reality is that consent in its current form is dead. Users can
no longer reasonably comprehend the risk associated with data sharing, and so
cannot reasonably be asked to give consent.
Thanks to AI, in the future we will also have to
resolve the paradox of consent. If AI determines that you have heart disease by
facial recognition or by reading your pulse, it starts to make inferences
outside the context of consent. The AI knows something about you, but how can
you give consent for it to tell you when you do not know what it knows? Here,
we will probably need to find an intermediary to represent the interests of the
individual, not the state or wider society. If the AI determines that you are
in love with someone based on text messages, does the AI have the right to tell
you or your partner? What if the AI is linked to your virtual assistant—to Siri
or Google Now—and your partner asks Siri whether you are in love with someone
else? What is the consent regime around that? Clause 13, which deals with a
“significant decision”, may help with that, but machine learning means that
some of these technologies are effectively a black box where the creators
themselves do not even know the potential outcomes.
Could the Minister tell me how the right to be
forgotten works with the blockchain? These decentralised encrypted trust
networks are attractive to those who do not trust big databases for privacy
reasons. By design, data is stored in a billion different tokens and synced
across countless devices. That data is immutable. Blockchain is heavily used in
fintech, and London is a centre for fintech. But the erasure of blockchain data
is impossible. How does that work in this Bill?
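For readers unfamiliar with the mechanics, a toy sketch of a hash chain, the
structure underlying a blockchain, shows why erasure is resisted by design:
each block commits to the hash of its predecessor, so deleting or rewriting
one record invalidates every later block on every synced copy. The records
here are invented:

```python
# A toy hash chain, illustrative only: "erasing" one record breaks the
# verification of every block that follows it.
import hashlib
import json

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain, prev = [], "0" * 64
for record in ["alice->bob 5", "bob->carol 2", "carol->alice 1"]:
    block = {"data": record, "prev": prev}
    chain.append(block)
    prev = block_hash(block)

def chain_is_valid(chain):
    return all(chain[i + 1]["prev"] == block_hash(chain[i])
               for i in range(len(chain) - 1))

print(chain_is_valid(chain))   # True
chain[1]["data"] = ""          # attempt to "erase" the middle record
print(chain_is_valid(chain))   # False: the tampering is detectable everywhere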
There is more to be said about portability, law
enforcement and the intelligence services, but thinking about this Bill makes
my head hurt. Let me close on a final thought. The use of data to fuel our
economy is critical. The technology and artificial intelligence that it
generates have huge power to enhance us as humans and to do good. That is the utopia we
must pursue. Doing nothing heralds a dystopian outcome, but the pace of change
is too fast for us legislators, and too complex for most of us to fathom. We
therefore need to devise a catch-all for automated or intelligent decision-making
by future data systems. Ethical and moral clauses could and should, I argue, be
forced into terms of use and privacy policies. That is the only feasible way
to ensure that the intelligence resulting from the use of one’s data is not
subsequently used against us as individuals or society as a whole. This needs
urgent consideration by the Minister.
If being GDPR compliant requires a hard age
limit, how do we intend to verify the age of the child in any meaningful way
without, perversely, collecting more data from children than we do from adults?
Given that the age of consent is to vary from country to country—16 in the
Netherlands, Germany and Hungary; 14 in Austria—data controllers will also need
to know the location of a child so that the right rules can be applied.
Arguably, that creates more risk for children, but definitely it will create
more data.
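A minimal sketch, using the consent ages quoted above and a wholly
hypothetical service, illustrates the paradox: merely to apply the right rule,
a controller must first collect a child's date of birth and location.

```python
# Illustrative only: the consent ages are those quoted in the debate
# (the GDPR default is 16 where a member state has not derogated); the
# service and its data flow are hypothetical.
from datetime import date

CONSENT_AGE = {"GB": 13, "NL": 16, "DE": 16, "HU": 16, "AT": 14}

def needs_parental_consent(birth_date: date, country: str) -> bool:
    # To answer this single yes/no question, the controller must first
    # collect two extra items of a child's personal data: a date of
    # birth and a location.
    today = date.today()
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day))
    return age < CONSENT_AGE.get(country, 16)

print(needs_parental_consent(date(2005, 6, 1), "AT"))
```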
In all of this we must acknowledge a child’s
right to access the digital world knowledgeably, creatively and fearlessly.
Excluding children is not the answer, but providing a digital environment fit
for them to flourish in must be. There is not enough in this Bill to
fundamentally realign young people’s relationship with tech companies when it
comes to their data.
I very much agreed with those who said that the
regulation must certainly apply to the big boys in the computer and digital
world. I shuddered when the noble Baroness, Lady Lane-Fox, quoted from that
wholly incomprehensible Brussels jargon from the regulations.
I received last week a letter as chair of
Marlesford Parish Council. We have seven members and only 230 people live in
Marlesford. Our precept is only £1,000 a year. A letter from the National
Association of Local Councils warned me that the GDPR will impose,
“a legal obligation to appoint a Data
Protection Officer … this appointment may not be as straightforward as you may
be assuming, as while it may be possible to appoint an existing member of
staff”—
we have no staff, just a part-time parish clerk
who is basically a volunteer. It continues:
“They must by requirement of regulations possess
‘expert knowledge of data protection law and practices’”.
I am afraid that will not be found in most small
villages in the country, so I hope that one result of this Bill will be to
introduce an element of proportionality in how it is to apply, otherwise the
noble Baroness, Lady Lane-Fox, who was so right to draw our attention to the
threat of incomprehensibility, will be right and we will all lose the plot.
Despite acknowledging that the Bill fleshes out
the regulation to make it member-state applicable, like the noble Lord, Lord
Stevenson, I worry about a Bill of 218 pages and an explanatory note of 112
pages, plus a departmental pack of 247 pages to deal with it all. That all adds
to the complexity. I admit that the GDPR conceals its highly challenging
requirements in wording of beguiling simplicity under the flag of private rights,
but it is no wonder that the European Parliament did not want its handiwork
contextualised by inclusion in what we have before us. It is not a particularly
encouraging start to bringing 40 years of EU legislation into domestic law.
In what I felt was an inspirational
contribution, the noble Baroness, Lady Lane-Fox—I am sorry she is not in her
place—referred to the tortuous use of language in parts of the Bill. I agree
with her—parts of it are gobbledygook that deny transparency to ordinary
mortals.
I shall touch on three concerns. According to
the Federation of Small Businesses, the measures represent a significant step
up in the scope of data protection obligations. High-risk undertakings could
face additional costs of £75,000 a year from the GDPR. The MoJ carried out an
impact assessment in 2012 which estimated the cost at £260 million in 2018-19
and £310 million by 2025-26; that is no doubt an underestimate, since it did
not take account of the changes made by the European Parliament. I am not even sure
if that covers charities or public organisations or others who have expressed
concerns to me about the costs and the duties imposed. Then there are the costs
of the various provisions in the Bill, many levelling up data protection
measures outside the scope of the GDPR. It is less confusing, I accept, but
also more costly to all concerned.
The truth is that overregulation is a plague
that hits productivity. Small businesses are suffering already from a
combination of measures that are justified individually—pension auto-enrolment,
business rates and the living wage—but together can threaten viability at a
time of Brexit uncertainty. We must do all we can to come to an honest estimate
of the costs and minimise the burden of the new measures in this legislation.
Also, I know that CACI, one of our leading
market analysis companies working for top brands such as John Lewis and
Vodafone, thinks that the provisions in the Bill are needlessly gold-plated.
Imperial College has contacted me about the criminalisation of the
re-identification of anonymised data, which it thinks will needlessly make more
difficult the vital security work that it and others do.
The noble Lord, Lord Patel, and the noble
Baroness, Lady Manningham-Buller, were concerned about being able to contact
people at risk where scientific advance made new treatments available—a
provision that surely should be covered by the research exemption.
The second issue is complication. It is a long
and complicated Bill. We need good guidance for business on its duties—old and
new, GDPR and Data Protection Bill—in a simple new form and made available in
the best modern way: online. I suggest that—unlike the current ICO site—it
should be written by a journalist who is an expert in social media. The
Minister might also consider the merits of online training and testing in the
new rules. I should probably declare an interest: we used it in 2011 at Tesco
for the Bribery Act and at the IPO for a simple explanation of compliance with
intellectual property legislation.
The third issue is scrutiny. I am afraid that,
as is usual with modern legislation, there are wide enabling powers in the Bill
that will allow much burdensome and contentious subordinate detail to be
introduced without much scrutiny. The British Medical Association is very
concerned about this in relation to patient confidentiality. Clause 15,
according to the excellent Library Note, would allow the amendment or repeal of
derogations in the Bill by an affirmative resolution SI, thereby shifting
control over the legal basis for processing personal data from Parliament to
the Executive. Since the overall approach to the Bill is consensual, this is
the moment to take a stand on the issue of powers and take time to provide for
better scrutiny and to limit the delegated powers in the Bill. Such a model
could be useful elsewhere—not least in the Brexit process.
I will make a few rather sceptical remarks about
the long-term viability of data protection approaches to protecting privacy.
They have, of course, worked, or people have made great efforts to make them
work; but the context in which they worked, at least up to a point, has
become more difficult, and they are now less likely to work. The definition of
personal data used in data protection approaches, and retained here, is data
relating to a living individual who is identified, or can be identified, from
the data. It is that modal idea of who can be identified that has caused
persistent problems. Twenty years ago it was pretty reasonable to assume that
identification could be prevented provided one could prevent either inadvertent
or malicious disclosure, so the focus was on wrongful disclosure. However,
today identification is much more often by inference and it is very difficult
to see how inference is to be regulated.
The first time each of us read a detective
story, he or she enjoyed the business of looking at the clues and suddenly
realising, “Ah, I know whodunnit”. That inference is the way in which persons
can be identified from data and, let us admit it, not merely from data that are
within the control of some data controller. Data protection is after all in the
end a system for regulating data controllers, combined with a requirement that
institutions of a certain size have a data protection officer, so there is a lot that
is outside it. However, if we are to protect privacy, there is, of course,
reason to think about what is not within the control of any data controller.
Today, vast amounts of data are outwith the control of any data controller:
they are open data. Open data, as has been shown—a proof of concept from
several years ago—can be fully anonymised and yet a process of inference can
lead to the identification of persons. This is something we will have to
consider in the future in thinking about privacy.
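A toy example, with invented data, illustrates identification by inference:
joining a fully "anonymised" open dataset with public auxiliary data on a
couple of quasi-identifiers can single out one person without any wrongful
disclosure by any data controller.

```python
# An invented toy example of re-identification by inference: an
# "anonymised" dataset, joined with public auxiliary data on
# quasi-identifiers, narrows a record down to a single person.
anonymised_health_data = [            # names removed before publication
    {"postcode": "IP13", "birth_year": 1950, "condition": "diabetes"},
    {"postcode": "IP13", "birth_year": 1987, "condition": "asthma"},
]
public_register = [                   # e.g. an electoral roll or social profile
    {"name": "A. Example", "postcode": "IP13", "birth_year": 1950},
]

for person in public_register:
    matches = [row for row in anonymised_health_data
               if row["postcode"] == person["postcode"]
               and row["birth_year"] == person["birth_year"]]
    if len(matches) == 1:             # a unique match is an identification
        print(person["name"], "->", matches[0]["condition"])
```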
Moreover, throughout the period of data protection,
one of the central requirements for the acceptable use of otherwise personal
data has been that consent should be sought, yet the concepts of consent used
in this area are deeply divisive and various. In commercial contexts, consent
requirements are usually interpreted in fairly trivial ways. When we all
download new software, we are asked to accept terms and conditions. This is
called an end-user licence agreement. You tick and you click and you have
consented to 45 pages of quite complicated prose that you did not bother to
read and probably would not have understood if you had maintained attention for
45 pages. It does not much matter, because we have rather good consumer
protection legislation, but there is this fiction of consent. However, at the
other end of the spectrum, and in particular in a medical context, we have
quite serious concepts of consent. For example, to name one medical document,
the Helsinki Declaration of the World Medical Association contains the
delicious thought that the researcher must ensure that the research participant
has understood—then there is a whole list of things they have to understand,
which includes the financial arrangements for the research. This is a fiction
of consent of a completely different sort.
We should be aware that, deep down in this
legislation, there is no level playing field at all. There are sectoral regimes
with entirely different understandings of consent. We have, in effect, a
plurality of regimes for privacy protection. Could we do otherwise or do
better? Legislation that built on the principle of confidentiality, which is a
principle that relates to the transfer of data from one party to another, might
be more effective in the long run. It would of course have to be a revised
account of confidentiality that was not tied to particular conceptions of
professional or commercial confidentiality. We have to go ahead with this
legislation now, but it may not be where we can stay for the long run.
This has been an interesting, and for me at
times a rather confusing, debate on the issues associated with the Bill. The
Bill is complex, but I understand that it is necessarily complex. For example,
under European law the GDPR may not be reproduced in domestic
legislation. The incorporation of the GDPR into British law is happening under
the repeal Bill, not under this legislation. Therefore, the elephant and the
prints are in the other place rather than here.
We on these Benches welcome the Bill. It
provides the technical underpinnings that will allow the GDPR to operate in the
UK both before and after Brexit, together with the permitted derogations from
the GDPR available to all EU member states. For that reason it is an enabling
piece of legislation, together with the GDPR, which is absolutely necessary to
allow the UK to continue to exchange data, whether it is done by businesses for
commercial purposes or by law enforcement or for other reasons, once we are
considered to be a third-party nation rather than a member of the European
Union.
The enforcement regime and the Information
Commissioner are covered in Part 5. Because we will repeal the Data Protection
Act 1998, we need to restate the role of the Information Commissioner as
the person who will enforce the new framework, and we will need to explore
concerns that we have in each part of the Bill as we go through Committee. However, generally
speaking, we welcome the Bill and its provisions.
Of course, what the Government, very sensibly,
are trying to do but do not want to admit, is to ensure that the UK complies
with EU laws and regulations—in this case in relation to data protection—so
that it can continue to exchange data with the EU both before and after Brexit.
All this government hype about no longer being subject to EU law after Brexit
is merely the difference between being subject to EU law because we are
a member of the EU and being subject to EU law because, if we are not, we
will not be able to trade freely with the EU or exchange crime prevention and
detection intelligence, and counterterrorism intelligence, with the EU. That is
the only difference.
For most aspects of data exchange, compliance
with the GDPR is required. The GDPR is directly applicable, so it cannot simply
be transposed into this Bill. That, coupled with the derogations and with the
application of GDPR standards to aspects of data processing not covered by the
GDPR, makes this part of the Bill complex—and, as I suggest, probably
necessarily so.
As my noble friend Lady Ludford and the noble
Baroness, Lady Jay of Paddington, also mentioned, the various provisions that
allow Ministers to alter the application of the GDPR by regulation need much
further scrutiny, albeit that Ministers’ hands are likely
to be tied by the requirement to comply with changing EU law after Brexit—de
facto even if not de jure. Could it be—perhaps the Minister can help us
here—that the purpose of these powers, put into secondary legislation, is to
enable the UK to keep pace with changes in EU law after Brexit?
As other noble Lords have said, we have concerns
about the creation of a criminal offence of re-identification of individuals.
As the noble Lord, Lord Arbuthnot of Edrom, said, criminalising
re-identification could allow businesses to relax the methods that they use to
try to anonymise data on the basis that people will not try to re-identify
individuals because it is a criminal offence.
Despite what is contained in this Bill, we have
serious concerns that there are likely to be delays to being granted data
adequacy status by the European Commission when we leave the EU. That means
that there would not be a seamless continuation of data exchange with the EU 27
after Brexit. We also have serious concerns, as does the Information
Commissioner, that there are likely to be objections to being granted data
adequacy status because of the bulk collection of data allowed for under the
Investigatory Powers Act, as the noble Lord, Lord Stevenson of Balmacara, said
in his opening remarks.
As the noble Baroness, Lady Lane-Fox, mentioned,
it is essential that the Information Commissioner is provided with adequate
resources. My understanding is that there has been a considerable loss of
staff in recent times, not least because commercial organisations want to
recruit knowledgeable staff to help them with the implementation of GDPR, plus
the 1% cap on public sector pay has diminished the number of people working for
the Information Commissioner. It is absolutely essential that she has the
resources she needs, bearing in mind the additional responsibilities that will
be placed upon her.
A number of noble Lords, including the noble
Lord, Lord Kennedy, the noble Baroness, Lady Lane-Fox, and my noble friend Lady
Neville-Rolfe, asked whether the Bill was too complex. It was suggested that
data controllers would struggle to understand the obligations placed on them
and data subjects to understand and access their rights. As the noble Lord,
Lord Paddick, said, the Bill is necessarily so, because it provides a complete
data protection framework for all personal data. Most data controllers will
need to understand only the scheme for general data, allowing them to focus
just on Part 2. As now, the Information Commissioner will continue to provide
guidance tailored to data controllers and data subjects to help them understand
the obligations placed on them and exercise their rights respectively. Indeed,
she has already published a number of relevant guidance documents,
including—the noble Lord, Lord Kennedy, will be interested to know this—a guide
called Preparing for the General Data Protection Regulation (GDPR): 12 Steps
to Take Now. It sounds like my type of publication.
Other noble Lords rightly questioned what they
saw as unnecessary costs on businesses. My noble friends Lord Arbuthnot and
Lady Neville-Rolfe and the noble Lord, Lord Kennedy, expressed concern that the
Bill would impose a new layer of unnecessary regulation on businesses—for
example, in requiring them to respond to subject access requests. Businesses
are currently required to adhere to the Data Protection Act, which makes
similar provision. The step up to the new standards should not be a disproportionate
burden. Indeed, embracing good cybersecurity and data protection practices will
help businesses to win new customers both in the UK and abroad.
A number of noble Lords, including the noble
Lord, Lord Jay, asked how the Government would ensure that businesses and
criminal justice agencies could continue, uninterrupted, to share data with
other member states following the UK’s exit from the EU. The Government
published a “future partnership” paper on data protection in August setting out
the UK’s position on how to ensure the continued protection and exchange of
personal data between the UK and the EU. That drew on the recommendations of
the very helpful and timely report of the European Union Committee, to which
the noble Lord referred. For example, as set out in the position paper, the
Government believe that it would be in our shared interest to agree early to
recognise each other’s data protection frameworks as the basis for continued
flow of data between the EU and the UK from the point of exit until such time
as new and more permanent arrangements came into force. While the final
arrangements governing data flows are a matter for the negotiations—I regret
that I cannot give a fuller update at this time—I hope that the paper goes some
way towards assuring noble Lords of the importance that the Government attach
to this issue.
Several noble Lords, including the noble Lord,
Lord Paddick, in welcoming the Bill asked whether the Information Commissioner
would have the resource she needs to help businesses and others prepare for the
GDPR and LED and to ensure that the new legislation is properly enforced,
especially once compulsory notification has ended. The Government are committed
to ensuring that the Information Commissioner is adequately resourced to fulfil
both her current functions under the Data Protection Act 1998 and her new ones.
Noble Lords will note that the Bill replicates relevant provisions of the
Digital Economy Act 2017, which ensures that the Information Commissioner’s
functions in relation to data protection continue to be funded through charges
on data controllers. An initial proposal on what those charges might look like
is currently being consulted upon. The resulting regulations will rightly be
subject to parliamentary scrutiny in due course.
The noble Baroness, Lady Ludford, and the noble
Lord, Lord Paddick, I think it was, asked about the Government choosing not to
exercise the derogation in article 80 of the GDPR to allow not-for-profit
organisations to take action on behalf of data subjects without their consent.
This is a very important point. It is important to note that not-for-profit
organisations will be able to take action on behalf of data subjects where the
individuals concerned have mandated them to do so. This is an important new
right for data subjects and should not be underestimated.
The noble Baroness, Lady Manningham-Buller, the
noble Lords, Lord Kennedy and Lord Patel, and my noble friend Lady
Neville-Jones all expressed concern about the effect that safeguards provided
in the Bill might have on certain types of long-term medical research, such as
clinical trials and interventional research. My noble friend pointed out that
such research can lead to measures or decisions being taken about individuals
but it might not be possible to seek their consent in every case. The noble
Lord, Lord Patel, raised a number of related issues, including the extent of
Clause 7. I assure noble Lords that the Government recognise the importance of
these issues. I would be very happy to meet noble Lords and noble Baronesses to
discuss them further.
My noble friend Lord Arbuthnot and others
questioned the breadth of delegated powers provided for in Clause 15, which
allows the Secretary of State to use regulations to permit organisations to
process personal data in a wider range of circumstances where needed to comply
with a legal obligation, to perform a task in the public interest or in the
exercise of official authority. Given how quickly technology evolves and the
use of data can change, there may be occasions when it is necessary to act
relatively quickly to provide organisations with a legal basis for a particular
processing operation. The Government believe that the use of regulations,
rightly subject to the affirmative procedure, is entirely appropriate to
achieve that. But we will of course consider very carefully any recommendations
made on this or any other regulation-making power in the Bill by the Delegated
Powers and Regulatory Reform Committee, and I look forward to seeing its report
in due course.
I look forward to exploring all the issues that
we have discussed as we move to the next stage. As the Information Commissioner
said in her briefing paper, it is vital that the Bill reaches the statute book,
and I look forward to working with noble Lords to achieve that as
expeditiously as possible. Noble Lords will rightly want to probe the detailed
provisions in the Bill and subject them to proper scrutiny, as noble Lords
always do, but I am pleased that we can approach this task on the basis of a
shared vision; namely, that of a world-leading Data Protection Bill that is
good for business, good for the law enforcement community and good for the
citizen. I commend the Bill to the House.
Bill read a second time and committed to a Committee of the Whole House.