Wednesday, 30 March 2011
Disguising personal data – the new reality
Today’s “Privacy & Data Anonymisation” seminar, sponsored by the ICO at the offices of the Wellcome Trust in Central London, will cause a few headaches for regulators and data controllers alike. Transparency currently occupies a central place in the Government’s agenda. It’s seen as the way to ensure that public bodies are more open, democratic and accountable. That seems to be a good thing. Ahead we see the rise of the transparency activist, whose calls for action will be just as strident as those of the privacy activist. The two are not, necessarily, combatants occupying opposite ends of a continuum, however. The trick, if we can all pull it off, is to ensure that the benefits of transparency are not significantly offset by detriments to personal privacy.
The seminar, chaired first by Christopher Graham and subsequently by David Smith, gave a range of distinguished experts an opportunity to generally agree with each other about the challenges ahead. There was tacit acceptance that, in this ever more closely connected world, the concept of true anonymity is redundant. It is a fiction. And more people will come to accept this as a statement of fact over the next few years.
The assessment has to move on to working out whether data which could benefit society as a whole should be withheld just because there is a “small” risk that individuals can be identified. From the floor, Caspar Bowden was given the opportunity to ask some of the more difficult questions. What happens when it goes wrong and those who are identified object? What collective redress mechanisms should be in place to deter the maliciously minded?
The discussion turned to issues of confidentiality – and an acceptance that life is so complicated that, for most practical purposes, legitimising data processing by obtaining an individual's consent is not really going to be an option. I suspect that the “legitimate interests of the data controller” condition will become even more widely relied upon. The difficulties of turning a fair processing statement into language that an ordinary person would actually want to read, let alone understand, are overwhelming. But, particularly in the medical research context, while it may be impractical to seek consent for such processing, it is inexcusable for the information not to remain confidential. We heard a few examples of the very significant benefits that could be obtained when information about people with very rare and peculiar medical conditions was shared – to the benefit of the treating doctors and, obviously, of the patients themselves.
What was clear was that the way forward was not one where it would be appropriate to rely on an answer which depended wholly on either technology or regulation. Technology won’t give us the whole answer, as it merely fuels an arms race between the statisticians and the data miners. Regulation won’t give us the answer either, as laws, once created, will be years behind the curve – and will have political needs to address, rather than practical ones. After all, who really thinks that the revised privacy Directive is going to answer all our privacy problems? (Let alone the problems that will emerge in the near future.)
Actually, the way forward really revolves around behaviours – or more importantly, trust. We will naturally give the Government the initial benefit of the doubt, and I’m sure we will all be delighted and amazed at the manner in which various public sector data sets can be rehashed to provide us with information which is of real value to us personally.
But – and this is a big but – it will only take a couple of high-profile mishaps before we, the great unwashed, lose confidence in the ability of public bodies to mask the stuff that really causes us personal embarrassment, at which time we will refuse to supply the very information which, when mashed up, caught us on the hop.
The situation is complicated, but it is not hopeless. There is great potential for greater transparency of public information, and we have to be careful that small gains in privacy are not offset by huge losses in the utility of that information. There must be combinations of information available which, when mashed up, won’t lead to personal ruin; these we should find. If we believe in transparency, we will have to learn to live with a bit of risk. Let’s face it: it’s people, not data, that really need protecting.
Turning the concept on its head, one speaker wondered what we are willing to give up for privacy. Would we prefer to join a social network which we paid to use, or would we be happy with a search engine which was a bit flaky around the edges?
In conclusion, it seems that anonymity belongs to the framework of the old style of data security. Norms are changing, and people are slowly realising this. Trust will replace anonymity as the foundation of data security. And this trust will be placed in the judgment of the people who will have access to varying quantities of information which could well relate to us. If our trust is broken, as a result of the failure of those people (or institutions) to separate us from information which, if linked, could ruin our established reputations, the result will be a collapse in public confidence which will quickly result in the demise of the Government’s transparency agenda – which does not appear to be a very good outcome for anyone.
There ought to be a very useful ICO report of the proceedings available soon, so I won’t steal any more of its thunder.