Friday 30 July 2010
Chris Williams from The Register made some interesting points in his recent article on the extent to which internet service providers can protect their users from innocently accessing a website containing code, or malware, that might later cause those users some damage.
Is it right that internet service providers should engage in this activity in the first place? After all, the purpose appears pretty benign – which is to safeguard users from attacks which may be caused by third parties with wholly malicious intent.
But is it right that internet service providers should engage in this activity in a manner that does not immediately appear transparent? If all users were aware of the work that the internet service provider was doing, then would it still be appropriate for these protective measures to be applied, even in circumstances when the user was not able to object to such (protective) activity?
And what should happen to internet service providers who were not so transparent?
This got me wondering whether there were other circumstances in which an internet service provider might be able to “monitor” a user’s communications in an opaque manner.
For me, the problem lies in coming to terms with what internet service providers are allowed to monitor, in terms of the traffic records that a user creates, and when a line must be drawn which separates the traffic element of the record from the content element – which is obviously much more sensitive, or “personal” to the individual user.
Parliament has historically applied the rough and ready definition of IP “traffic” information as being the information that comes before the first "/" of a web address. Accordingly, by a process of deduction, IP “content” information has to be any - and all - information that comes after that first “/”. So, if I were to access Chris Williams’s article, Parliament would presumably hold that the ISP is allowed to retain logs relating to the traffic information (in this case http://www.theregister.co.uk/), but the ISP would need to try a lot harder to explain why it should retain logs relating to the content information (in this case http://www.theregister.co.uk/2010/07/26/talktalk_stalkstalk/).
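That rough and ready rule can be sketched in a few lines of Python (an illustrative helper of my own devising, not any statutory definition):

```python
from urllib.parse import urlsplit

def split_traffic_from_content(url):
    """Split a URL into its 'traffic' element (everything up to the first
    '/' after the domain) and its 'content' element (everything after it)."""
    parts = urlsplit(url)
    # Traffic: scheme and host only - roughly what an ISP may log
    traffic = f"{parts.scheme}://{parts.netloc}/"
    # Content: the path (and any query string) - the sensitive part
    content = parts.path.lstrip("/")
    if parts.query:
        content += "?" + parts.query
    return traffic, content

traffic, content = split_traffic_from_content(
    "http://www.theregister.co.uk/2010/07/26/talktalk_stalkstalk/")
print(traffic)   # http://www.theregister.co.uk/
print(content)   # 2010/07/26/talktalk_stalkstalk/
```

The point the sketch makes is that the boundary is purely syntactic: the same log entry flips from “traffic” to “content” depending on which side of a single slash it sits.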
When is it appropriate for an internet service provider to retain the content logs?
And for how long is it appropriate for the provider to retain these logs?
And what use should they be permitted to make of them?
And how might we, as a community, develop a consensus on such a divisive issue?
These questions are not new. Over half a century ago, Dylan Thomas wrote vividly about this issue in his seminal radio play “Under Milk Wood”, which covers the events of a single day in Llareggub, a fictional Welsh village. One of the characters, Willy Nilly the postman, dreams of delivering the post in his sleep, and physically knocks upon his wife as if knocking upon a door. In the morning they open the post together and read the town's news, so that he can relay it around the village.
Is it really appropriate for an internet service provider to open the post and read all of the private correspondence, just like Mr Willy Nilly did? Or have our standards moved on from when that play was first broadcast in 1953?
Answers on a post card, please, to the usual address.
Tuesday 27 July 2010
July’s edition of Baker & McKenzie’s privacy newsletter arrived in my inbox a few days ago. It’s one of those publications that are “must reads”, if only to reassure myself that nothing much has happened recently that I wasn’t already aware of.
I read their lead article and became very irritated with myself. Why? Not because the subject matter wasn’t important, but because I was having a real struggle persuading myself that I actually wanted to be interested in it.
And what was this worthy (but oh, so dull) subject? None other than that old favourite, the methods with which data controllers legitimise transborder data flows.
The article referred to the fact that the European Commission had recently updated its controller-to-processor model clauses for the transfer of personal data from controllers to processors based in countries outside the EEA that are not recognised as offering an adequate level of data protection. (The EEA comprises all EU countries plus Iceland, Liechtenstein and Norway.)
The article explained that “the main change to the model clauses is to allow sub-processing, a practice which is extremely common particularly in IT and outsourcing industries, to take account of the business trend and practice towards more globalised processing activity ... The main change to the controller to processor model clauses is the inclusion of new provisions to allow non-EEA processors to sub-contract their processing activities to sub-processors, provided certain conditions are fulfilled.”
The trouble was that, much as I tried to fake an interest, I really couldn’t do it. This is one aspect of data protection law that is so technical and so boring that I defy anyone to be seriously worried about this stuff. And I’m sure that no-one I try to explain the concept to really gets it either.
And then I remembered that I was in good company. The authors of the Kantor report, about which I blogged on 23rd July, also had a pretty dim view of some of the more arcane rules that the EU had set around these matters. Their views on “applicable law” and how a data controller can determine what the applicable law actually is, make pretty explosive reading. Section V(3) explains that “All data processing, including the processing of personal data, is becoming increasingly internationalised. This is inherent in activities on the Internet, and will be all the more so in an era of “cloud computing”. The actors involved in such processing are also becoming increasingly diversified and split between countries, with often not-easy-to-distinguish tasks and responsibilities. This will cause increasing conflicts of law, also within the EU/EEA, because of the ambiguity and different implementation of the “applicable law” rules in the Directive."
The report went on to argue that “Specifically, under the main Directive, within the EU/EEA, Member States must apply their national data protection law to a processing operation if “the processing is carried out in the context of the activities of an establishment of the controller on the territory of the Member State”; but “when the same controller is established on the territory of several Member States, he must take the necessary measures to ensure that each of these establishments complies with the obligations laid down by the national law applicable.” (Article 4(1)(a) of the main Directive)..... The rules in Article 4(1)(a) are quite simply utterly confused and impossible to apply in the new global-technical environment. Not surprisingly, the rules are applied differently in the Member States, leading to conflicts of law (which are only not too serious in practice because the competing and conflicting laws on paper are often not enforced in practice).”
So what was required? “Better, clearer and unambiguous rules are desperately needed on applicable law."
Hear, Hear. If the rules were less ambiguous then I might be more interested in trying to apply them properly.
But right now, I’m happy to echo the sentiments of Rhett Butler, in his last words to Scarlett O’Hara. Should European Commissioner Viviane Reding ask me for my views on the extent to which the EU model clauses have been effective in facilitating excellent data protection standards when regulating transborder data flows, I’ll just turn to her and say:
“Frankly, my dear, I don’t give a damn.”
Friday 23 July 2010
I was working in Chester earlier this week (don’t ask why) and found that, much to my delight, my hotel was, very conveniently, sited right next to one of Chester’s more notorious public houses. It was the Chester Hangman (pictured), where the wrong ‘uns obviously went to meet their maker once the courts had dispensed whatever justice was available at that time. This set me thinking about something I had recently read about the wisdom of giving more powers to data protection regulators as they are currently established.
And who had written this stuff? None other than the respected LRDP Kantor Ltd, in association with the Centre for Public Reform. I hadn’t heard of them either – but I had – as we all have – heard of the core experts, highly respected priests of the data protection universe: Douwe Korff, Professor of International Law at London Metropolitan University, and Ian Brown, Senior Research Fellow at the Oxford Internet Institute. They were assisted by a panel of special experts and advisors, including Peter Blume, Professor of Legal Informatics at the University of Copenhagen; Ross Anderson, Professor of Security Engineering at the University of Cambridge; Caspar Bowden, Chief Privacy Adviser for Microsoft UK; and Paul Whitehouse, former Chief Constable of Sussex Police.
These are not people whose views ought to be dismissed with a mere shrug of the shoulders. And, having read their final report to the European Commission (strangely, on the very issues that the Commission is now consulting about significant changes to the Data Protection Directive), I am betting a photograph of a £50 note that, unless significant evidence is adduced to the contrary, the vast majority of the recommendations will end up as EU data protection law in the not too distant future.
I really mean that. Given the speed with which the EU appears to wish to publish its proposals for a revised Data Protection Directive, I am expecting that these recommendations will form the basis for that new Directive.
One of the many interesting points the report made was to challenge the traditionally held view that the current crop of data protection regulators need more powers. To quote (selectively) from paragraphs 104 – 108 of the report:
DPAs have great insight and knowledge, and provide helpful guidance on the law - but they are not effective in terms of enforcement: “Policing” of data protection compliance by DPAs is generally weak and ineffective. To quote the conclusions from a major report for the EU Fundamental Rights Agency, drawn up in parallel with the present report:
"Shortcomings are identifiable in the lack of independence, adequate resources and sufficient powers of some Data Protection Authorities. Compliance with data protection legislation in the praxis of several Member States also raises concerns. Legislative reforms are needed also in the field of sanctions and compensation to ensure a higher degree of enforcement of the relevant legislation and protection of the victims of personal data violations."
We ... note that weak enforcement in many countries was already noted in a much earlier study, and does not appear to have improved much.
[But] we feel that too often, DPAs are brought in too late: they are asked to give a view on systems that are already largely “cast in stone”, especially in the public sector. This can even apply to soi-disant “prior checks”, if those are only carried out once the system has already been finally designed (with major cost implications). A second problem is that a number of DPAs are still lacking in core technical competence: there are still too many lawyers, and not enough system- and computer specialists in the authorities.
The key point can be found in paragraph 106: There is ... a more fundamental question about the - in our view, to some extent incompatible - functions of the DPAs. They are advisers and guides. They are also interpreters of the law - and sometimes even quasi-legislators. They are supposed to be advocates on behalf of data subjects. And they are supposed to be law-enforcers. We feel that this is too much to ask of any single body. One danger is that as regulators, they become “captives” of those they regulate, industry and government agencies in particular. That phenomenon is far from limited to data protection authorities: it has been observed in many modern regulatory bodies. But it too serves to underline the tensions between the different functions of these authorities.
We feel that ... consideration should be given to separating the “soft” advisory and guidance functions of the authorities from the “hard” role of law enforcement, with the latter placed basically in the hands of the courts (also acting in cases brought by individuals) and (in respect of more serious or general breaches) the prosecuting authorities. Of course, DPAs, as experts on the issues, could still always be asked to advise the court; they could even be given a right to submit their opinions ex officio and to have rights of appearance ex officio in any case raising data protection issues. In any case, to the extent that data protection issues are placed in the hands of the courts (or special tribunals, as in the UK), there should be equal access to them for data subjects and controllers.
So here we have it. Perhaps it’s best not for us to expect a single regulatory creature to perform so many functions. Perhaps it would be better if there were an organisation with a “confessional”, and a separate organisation with a “courthouse”, rather than a single body trying to operate both at the same time.
Has anyone got a cleaver sharp enough to hack the newly built office in Wilmslow in half then?
The final report, submitted by LRDP Kantor in association with the Centre for Public Reform, was published on 20 January 2010. It’s available somewhere on the internet – try searching for “a comparative study on different approaches to new privacy challenges, in particular in the light of technological developments, under contract number JLS/2008/C4/011 – 30-CE-0219363/00-28.”
Saturday 17 July 2010
Woops – it appears that the Article 29 Working Party have opined on another “unlawful” activity – but I’m not at all convinced that the British cops will be doing much about it.
This time, I’m referring to an opinion, freshly minted from their latest meeting, which was held in Brussels earlier in the week. The data retention practices of communication service providers are at fault, as there appears to be widespread non-compliance with some of the provisions of an EU Directive which obliges them to retain telecom and internet traffic records for various periods, so that the records can be passed to law enforcement agencies for the purposes set out in the Directive.
The 21 page opinion is helpfully accompanied by a 45 page annex, which sets out the responses to the questions that were posed by the report’s authors.
Actually, I could probably have saved them the effort. Had they asked me for my opinion, I might have been able to set the issue of compliance with the relevant statutes in its rightful context.
Before I spill the beans, let me first just summarise the main findings of the enquiry, by plagiarising (and editing) the high level findings:
Service providers were found to retain and hand over data in ways contrary to the provisions of the directive. The provisions of the data retention directive are not respected and the lack of available sensible statistics hinders the assessment of whether the directive has achieved its objectives. The European Data Protection Authorities therefore call on the European Commission to take into account the findings of the report when taking the decision on whether or not to amend or repeal the Directive.
The joint inquiry focused on security measures and prevention of abuse, compliance with storage limit obligations and the types of retained information. It showed that the directive has not been implemented in a harmonized way. Significant discrepancies were found between the member states, especially regarding the retention periods, which vary from six months to up to ten years – far exceeding the permitted maximum of 24 months.
Another important finding is that more data are being retained than is allowed. The data retention directive provides a limited list of data to be retained, all relating to traffic data.
The retention of data relating to the content of communication is explicitly prohibited. However, it appears from the inquiry that some of these data are nevertheless retained. As to internet traffic data, several service providers were found to retain URLs of websites, headers of e-mail messages, and the recipients of e-mail messages in “CC” mode at the destination mail server. Regarding phone traffic data, it was established that not only is the location of the caller retained at the start of the call, but that their location is monitored continuously.
Member states have scarcely provided statistics on the use of data retained under the Directive, which limits the possibilities to verify the usefulness of data retention.
The report [makes] several recommendations for changes to be made to the directive:
• Increased harmonization, more secure data transmission and standardized handover procedures.
• No additional data retention obligations for providers to be imposed by national laws.
• A reduction of the maximum retention period to a single, shorter term.
• Reconsideration by the Commission of the overall security of traffic data.
• Clarification of the concept of “serious crime” at member state level.
• [Improved] disclosure protocols, so that all the relevant stakeholders receive the list of the entities authorized to access the data.
Ok, let’s cut to the chase. It should not come as a surprise to anyone to realise that the Directive has been implemented in different ways in different Member States. After all, that’s a hallmark of virtually every piece of data protection legislation that has ever come out of the EU.
But it also should not really come as much of a surprise to anyone to realise that, in all honesty, no-one in the room actually understood the meaning of the final text of the Communications Data Retention Directive [2006/24/EC] when the politicians were asked to vote to adopt it. I wasn’t in the room at the time, but I do know a few of the flies who were on the wall. The dirty deeds, so to speak, were done during the dying days of the UK’s Presidency of the Council of Ministers. In those days, each Member State got to chair the relevant committees for six months, and they all tried hard to let history mark their chairmanship with a few pieces of noteworthy legislation.
From what I remember, however, there wasn’t much noteworthy legislation adopted during “that” British Presidency. Tony Blair was the Prime Minister, and it fell to Home Secretary Charles Clarke to be responsible for this initiative. I remember time running out, with the end of the Presidency just a couple of weeks away, and Home Office officials wondering whether they should really recommend to Charles Clarke that the Council of Ministers be asked to approve a "semi-finished" measure that made some sort of sense for the retention of telephone and SMS records, but used technical terms that could not easily be applied to internet records. In a masterstroke, some Commission official then took it upon himself to rewrite the text quickly, in such a way that basically no-one could understand how it would apply to internet records, and it was this text that was presented to European Home Office Ministers with a request that they approve it.
Which they did.
Which is why we've ended up with a piece of legislation which is incredibly hard to comprehend and which, as far as its treatment of IP records is concerned, does not appear to have been warmly welcomed by investigators as being fit for purpose.
So, if the Article 29 Working Party want to get their teeth into anyone, they could start with the people who created the words that no-one could really understand. And they could ask themselves why politicians should be allowed to pass laws that neither they nor any of the so-called experts in the room can properly comprehend.
Hey ho – we could be hearing from the Working Party soon on possible tweaks to the main Data Protection Directive – again an instrument that contains some provisions that very few experts can properly comprehend.
Note for the brave – the Working Party’s report can be found on the internet by Googling its snappy title of “Report 01/2010 on the second joint enforcement action: Compliance at national level of Telecom Providers and ISPs with the obligations required from national traffic data retention legislation on the legal basis of articles 6 and 9 of the e-Privacy Directive 2002/58/EC and the Data Retention Directive 2006/24/EC amending the e-Privacy Directive.”
Failing that, point your browser to http://ec.europa.eu/justice_home/fsj/privacy/news/docs/pr_14_07_10_en.pdf
Friday 16 July 2010
Nick Clegg set out the aims of this current liberal parliament in a speech at the think tank Demos today. No, I wasn’t there to hear it myself. (I was invited, but the invitation arrived too late for me to rearrange my plans and hoof it down to Tooley Street. Entirely my mistake. I really ought to take a closer interest in my emails.)
But Nick was kind enough to send me a copy of his text when he wrote to me this evening. Very nice of him. I was interested in his views on the extent to which regulation plays a role in governing the people. After all, we are all promised a great Reform Bill later this year, where perhaps a bonfire of regulations will go up in smoke. What was he going to say about freedom, or of safeguarding information rights?
Not much today – but then again it would probably have been stupid for him to have wasted the opportunity to assure the public on the biggest issues of the day – such as economics, education and political reform. What did he end up saying about regulation – bearing in mind the interest the data protection community has in understanding what level of regulation is going to be foisted upon the citizens of the EU given the faults inherent in current data protection law?
Here is (some of) what he had to say:
A liberal cannot hold a simple ‘for’ or ‘against’ view of regulation. It is clear that in many areas, we have not had enough regulation in the last decade – the banks and the housing market being the most obvious examples. On the other hand, we have seen far too much regulation for small businesses, and too much micro-management in the day-to-day lives of ordinary people. A liberal cannot say that a state is too big - but we can certainly say the state has become too big for its boots. Labour over-regulated in some areas, but under-regulated in others.
I did notice one carefully crafted comment – which surely was not meant to criticize the politics of a person I (thought I) heard proclaiming himself a convert to Libertarianism a little while ago, and who wants to be selected as the Liberal Democrat candidate for the London Mayoral elections: Mr Lembit Opik!
Nick first referred to remarks made by the Nobel prize winner Amartya Sen: “Responsible adults must be in charge of their own well-being; it is for them to decide how to use their capabilities. But the capabilities that a person does actually have depends on the nature of social arrangements, which can be crucial for individual freedoms. And there the state and the society cannot escape responsibility.” Nick continued: “I agree. This is a vital element of the liberal approach, as opposed to libertarians, or neo-liberals if you prefer. Libertarians believe that simply clearing away obstacles will set people free. Liberals understand that for a person to have power over their life, they need capabilities too.”
In terms of data protection regulation, what does this mean?
It probably means that data controllers should not expect the regulatory state to be rolled back to such an extent that they can trample over the legitimate needs of individuals. Instead, individuals need to be provided with a greater degree of digital awareness, in order that they can flex their own muscles and make their own, meaningful, choices about where they should leave their digital footprints, and about who should be able to follow their trails.
Yep, I’m all in favour of that.
Nick ended his speech by hoping that by 2015:
• power will have been radically redistributed towards people
• our civil liberties will have been restored
• our broken political system will be repaired
• our economy will be balanced, green and growing
If the coalition Government succeeds, by 2015 Britain will be a more liberal nation, a nation of stronger citizens living in a fairer society. I am under no illusions about the scale of this ambition. But I am also in no doubt that we can achieve it.
Heavy, aspirational stuff. But all power to him - and to his mate Dave, of course - if the Government manages to pull it off.
Thursday 15 July 2010
Another full day of data protecting today - which began with an interesting “Data Protection Roundtable” event, sponsored by Sophos and hosted at the offices of Field Fisher Waterhouse in London. The idea was that a glittering array of data protection experts would be on hand to discuss the results of a newly commissioned survey on current data protection laws with some of the finest journalists in the land.
It was fun – and quite thought provoking. Look out for the articles mentioning this Sophos survey which ought to be appearing quite soon in Computer Business Review, SC Magazine, Computer Weekly, IT Pro, Silicon, GT News, Computing and Privacy Laws & Business.
The discussion didn’t just end up as a platform for promoting the services that Sophos can supply to its clients. Instead, people were able to focus on other issues that were on the horizon – such as the issue of breach notification rules, and whether “breach notification” was actually the right tool to address many of the concerns which surrounded the extent to which people’s personal information is currently secured. Has the world been right to copy the breach notification concept, the brainchild of Arnold Schwarzenegger’s administration in California? Perhaps there is actually a better way of terminating the problem. Who will be the first to find Stewart Room’s and Eduardo Ustaran’s thoughts on this issue, as expressed so eloquently at today’s session?
In any event, how are breach notification rules likely to affect the current behaviour of the largest and most reputable British data controllers?
And, when will the Information Commissioner start exercising his new powers to fine those who are responsible for the most serious of data blunders?
Why am I asking so many questions?
I’m not really sure - but the answers might be available, if the journalists have faithfully reported the discussions, somewhere on the internet, quite soon!
Wednesday 14 July 2010
I left the Design Council’s offices in Covent Garden today with this tag line ringing in my ears. Had I been to a Sainsbury’s launch? No – it was an event which marked the publication of the (upbeat) Information Commissioner’s Annual Report for 2009/10.
I expect that the press releases will repeat the messages that Christopher Graham wanted to emphasise today, so anyone who’s that interested can get that spin from the usual sources. What I was keener to appreciate was the tone of the Commissioner’s report, and to get a better sense of the role that the Commissioner’s staff see themselves playing in the forthcoming year.
The impression I had when stumbling out into the rain in Covent Garden was that the Commissioner’s staff were determined to proclaim that they were regulators who meant business, and that organisations which, in the past, may have ignored the data protection rules should not be surprised if they faced painful enforcement action in the future.
On data protection (as well as freedom of information) issues, the ICO was at the heart of the Government’s transparency agenda. Of course there was a need to balance freedom of information with legitimate data protection rights, but it was wholly wrong to assume that “data protection” would provide the great duck-out of the 21st Century, as perhaps “health & safety” legislation provided the great duck-out of the 20th Century.
What they do is really where it’s at – in terms of the current political debates on accountability, privacy, freedom and transparency.
So, as a data controller, what should I really concern myself with in 2010?
Well, it’s clear that technological advances are going to raise fears among some individuals that data controllers are all set to exploit personal information in ways that are ever more devious, which could lead to a collapse in confidence in some institutions. But a remedy is at hand. The more transparent a data controller is about their intentions, the easier it will be to set these (probably irrational) fears aside.
An attitude of “we know what’s best for you” is unlikely to wash in this current decade, and I suspect that many data controllers who have taken a pretty cursory view of their customers’ expectations will face a real wake-up call when their customers leave them in droves for competitors who are keener to respond to legitimate privacy concerns. The more transparent an organisation is, the less it may feel obliged to rely on “consent” as a condition for using personal information.
I have always disliked relying on “consent” as a condition for processing, as it means that I have to build systems which take account of the fact that individuals may exercise their rights to “object”. For me, the best condition for processing personal information is that it is in the legitimate interests of the individual concerned, and that my actions do not harm or affect the legitimate interests of that individual.
Also, a new data protection principle - that of accountability - ought to emerge. What this means is that data controllers should be expected to be more accountable for the way they use personal information, and that individuals (rather than just the regulator) should be given legal powers to challenge data controllers when they fail to do what they had previously announced they would do.
I think that this means that the Commissioner’s Office would welcome a pretty fundamental review of current data protection legislation, rather than a quick tweak around its edges. So, if the European Commission really plans to publish proposals for a revised Directive by this winter, then let’s hope that it doesn’t take the easier option of just offering some minor proposals to trim some of the most impractical provisions. What we all really need is a Directive that is fit for the (digital) world as it currently is, and as it is likely to be during the next decade, rather than just how the world was some 20 years ago.
I also think this means that the Commissioner’s Office would welcome fundamental changes to the delivery mechanisms specified in the Directive. While the main Data Protection Principles seem fine (once you also add the accountability principle), surely some EU Data Protection authorities are deluding themselves if they seriously think that, for example, they should exercise powers, and are equipped, to control (or approve) all the transborder information flows that emanate from their jurisdictions. Perhaps some of these regulators really do believe that they have the powers that even King Canute lacked. Can some of these European regulators really turn back (or give prior approval to) all the information tides that ebb and flow from their shores (or borders)?
I think not, but I suspect that some European regulators might think otherwise.
And what about the freedom and security debate? What have I sensed from the conversations I’ve had today?
Interestingly, I did hear one comment from someone in the Commissioner’s Office later today, who was keen that the Commissioner should remain wary of the Home Office’s views on the best ways to combat terrorism. Proportionality and necessity were important principles, but it should not be taken as read that what was proportional in an analogue world would also be proportional in a digital world.
I sensed from that aside that his Office would look very carefully at any proposals that emerge from the Home Office’s Interception Modernisation Programme. Indeed, one of those officials mentioned that very programme by name. Evidently, Home Office proposals to combat terrorism should not end up acting as a recruiting sergeant for terrorism.
Salus populi est suprema lex?
Not necessarily so, Cicero. The health, safety or welfare of the public is not necessarily the supreme law.
Let’s have a bit of proportionality and necessity thrown into the mix too, just for good measure.
Monday 12 July 2010
Yesterday reminded me why it’s dangerous to allow people the opportunity to claim a right to remain completely anonymous. Should people have the unfettered right to control “their” personal information to the extent that they can demand that it be deleted from all systems?
I don’t think so.
I was reminded why it was wrong to give people the right to remain anonymous when I attended a concert given by the Royal Marines at the Deal Memorial Bandstand yesterday afternoon. It was a concert to commemorate the 21st anniversary of the murder of 11 Royal Marines musicians in the town by the IRA. A 15 lb bomb, activated by a timer switch, detonated at 8.27am on 22 September 1989.
No one has ever been arrested or convicted for this offence.
This year’s memorial concert brought together some 10,000 people, including the relatives of some of those who had been murdered, and others whose physical and emotional scars are still evident. To see the strength of character of many of the survivors is a truly humbling experience.
When you are involved in an incident like that, you really appreciate that concepts of data protection have to include an element of protection for people – to prevent their lives from being ruined by criminals who try to operate under a cloak of anonymity. If we are to live in a civil society, we really do need to ensure that, when appropriate, those whose job it is to keep us safe do have the tools to do that job.
Deciding where, and just how, to draw the line between investigation and mass surveillance is extremely difficult. But it has to be done. I’m all for libertarianism and self-expression, but I’m also for an element of protection. The internet ought not to be like the wild west. We do need to equip some trusted sheriffs with the tools to go after the bad guys.
Not to investigate dog fouling, but certainly to investigate people whose values are such that they set out to cause serious harm to people whose lives and lifestyles don’t accord with theirs.
As I examine the changes the European Commission proposes to make to the current EU data protection landscape, I will do what I can to ensure that there remains an element of protection for the many, as well as a healthy respect for the legitimate rights of the individual.
Saturday 10 July 2010
I don’t review mobile phones very often. This is because I don’t change my phone very often. In the past I’ve considered a mobile phone to be a device simply for making calls, sending texts, and responding to work emails. I haven’t previously been too interested in picking up anything else that’s available on the internet. Life has previously seemed too short, and I’ve thought that what I really wanted to do was get home and fire up a laptop. At home, the screen was so much larger, and I could surf the net for “free”, subject of course to any fair-use policy set by my internet service provider.
But the scales were lifted from my eyes when the very latest Samsung Galaxy S handset popped into my lap just a couple of days ago. I’ve seen the light (and the data speeds) - and I’m hooked.
But what about privacy? To what extent have I traded my ability to roam on the net, or to use location based services for social networking or business purposes, with the data behemoth that is Google? Who’s got the better deal?
Knowing what I do about the ability of companies to collect information about their users (regardless of whether they can identify those users and put a name to the fingers keying the device at any particular time), I am still convinced that I have the better part of the deal. The world is at my fingertips, and if I want to confuse an eagle-eyed tracker all I have to do is change the equipment I am currently using - not that prohibitively expensive a move, any more. If you can afford the iPad, I’m sure you can afford to change your mobile devices (and SIM cards) reasonably frequently, if you are seriously interested in covering your tracks.
But my knowledge of the ways mobile devices currently connect with the internet gives me sufficient confidence that hardly anyone really knows what I’ve been up to anyway. They may know how much data the device has consumed over a given period, but I think they would be hard pushed to provide me with the granularity of information that my (fixed) internet service provider probably has about anyone who uses my account.
What did impress me was the fact that Samsung have realised that privacy is an issue for a significant proportion of their potential customer base, so they have tried hard to make life pretty transparent. They even explain how privacy settings can be managed in their (printed) operating manual. Instructions are given for:
• Use my location: set the device to use your current location for Google services.
• Back up my settings: back up and restore your device’s settings to the Google server.
• Factory data reset: reset your settings to the factory default values.
What I do like is the icon which reminds me that my location is being tracked for various purposes; if I want to remain private (or conserve the battery), I can turn that feature off.
Also, the device has a handy feature to lock SIM cards, and a nifty way of securing access to the handset itself.
There’s probably lots more to learn about the way the phone could be used to protect my privacy – and doubtless I’ll stumble across these in the next few days (or weeks).
First impressions are that it’s a great piece of kit – and it will be flying off the shelves when the word gets around as to just how good it is.
I’m now off to listen to internet radio on the phone - picking the signal up from my WiFi connection indoors. All I’ll need to do is turn it on, tune it in and then hope the signal doesn’t drop out.
If, in the words of Timothy Leary, the PC was the LSD of the 1990s, then surely smart phones are the LSD of the current era.
So I’m quite proud to think of myself as an advocate of the (mobile) cyberdelic counterculture.
Friday 9 July 2010
I’ve come up with a stunning way to delay worthy projects which huge corporations want to embark on - and all in the cause of Data Protection. Oh, to be given the opportunity to work for some of the EU regulators. But I’m not an EU regulator, and I am very happy where I am just right now, thank you.
What on earth do I mean?
Well, I was lucky enough to attend a presentation earlier this week given by the Chief Privacy Official of the Graduate Management Admission Council. That’s the American educational testing organisation, which delivers the majority of its on-line tests to people who live in one of 111 countries outside the USA. Given the doors that are opened to a student with good GMAC scores, how does the Council ensure the integrity of its applicant process, to deter people from hiring others to take the exams for them?
To cut a (very) long story short, the answer seemed to lie in inviting candidates to undergo a biometric test to indicate whether the candidate was also known to the GMAC under a different name. And, most importantly, the answer lay in developing a test which could not be used by others for other purposes. For example, if a fingerprint were taken, it was possible that the print might later be matched against other fingerprint records for, say, law enforcement purposes. Whoops. Big “no no” from the privacy wonks.
But could a no-trace, no-touch test be created which gave both applicants and examiners a sufficient degree of confidence that it would only be used for GMAC authentication? Yes, it could. It’s all to do with photographing hands. I won’t give anything away in case the really keen ones start to cheat by turning up with spare hands.
Ok. So if a data controller has the technology, and a valid reason to carry out the biometric testing, how long might it take the regulators to approve the concept?
If I were you, I might think carefully about rolling the concept out in every EU state simultaneously.
The GMAC took about six months to prepare its case carefully before formally seeking the blessing of the French Data Protection Authority. The CNIL had to approve the application as it involved the processing of biometric data, and some of the personal information was going to be transferred out of the European Union (to America - evidently a pariah state, in the minds of some people). The CNIL received the request in October 2008 and authorised it in June 2009.
That’s right. To take a photograph of a handprint from a (consenting) applicant, simply to see if its characteristics were different from those of all the other (consenting) applicants, GMAC undertook a process that lasted 14 months.
I’m not sure if I should laugh aloud or cry with exasperation.
But 14 months does seem an awfully long time to wait for the much longed for “Oui”.
I wonder if the wait for the French regulator’s decision might have been significantly shorter if the GMAC had first persuaded other EU Member States to introduce the test in their countries, to demonstrate that even though the concept looked a bit challenging in theory, it actually worked really well in practice.
Monday 5 July 2010
Call me old fashioned, but I always thought that one of the main purposes of the European Data Protection Directive was to facilitate the free flow of personal information within the borders of the European Union (and the EEA). Others have obviously thought that a principal purpose was to make it pretty hard to allow personal information out of the European Union (and the EEA) unless it was pretty clear that it would be protected to at least as careful an extent as if it were to remain within the European Union (or the EEA).
What becomes ever clearer to the legal wonks, as we try to understand how to pull the social and legal levers in each EU Member State, is that it can often be quite difficult to get the stuff out of one member state and into the servers of another. Or, when an institution has the misfortune to suffer a data breach, how differently it may be treated by the regulators, depending on the country in which the institution is located.
This issue was briefly discussed this afternoon by a group of experts at the Privacy Laws & Business Data Protection Conference, which is taking place at St John’s College in Cambridge. The audience were treated to a fascinating explanation of the different ways that the same breach would be likely to be considered by regulators in Canada, Denmark, France, Germany, and the UK. And the issues were remarkably similar to those expressed by Peter Fleischer who, earlier in the day, had spoken about the different ways that the Google Street View service had been reviewed by various Data Protection Agencies within the EU.
Google’s Street View was first launched in the USA in 2007. The first imagery in Europe was launched the following year, and the imagery is currently available in 21 countries on 5 continents. It is very popular with users - over 100 million images are viewed each day, and a large number of business applications use the images, too. Can 100 million daily viewers be wrong? Well, I wouldn’t have thought so - but of course it’s not for me to say. Just why the service is not available in all of the EU countries yet is one of life’s many mysteries. I gather it’s not necessarily much to do with the collection of the images; I suspect that many of them were collected some time ago. I sense the problem may be more to do with the fact that not all of the EU’s Data Protection Commissioners have completely satisfied themselves that the Street View service meets the cultural expectations of the citizens of that particular state. Why I can use it in the UK, France, Holland, Belgium, and Austria - but not yet Germany - remains a mystery. Will the German citizens rise up and demand views of their (and their neighbours’) streets any time soon? We’ll see.
This afternoon’s discussion about the different remedies available to regulators on data breaches brought a wide smile to my face. When life gets as complicated as this, I get a job for life. I don’t think there’s any special reason why the rules need to be so complicated (or so nuanced), but complicated life must be. Until reality sets in, anyway - the reality that soon the data controllers will start rising up and demanding simplicity and a greater sense of uniformity, once we’ve blown our legal budgets seeking advice on just why we have to be ever so slightly different in the countries where we use the same processing systems. But until that fateful day I’ll carry on marvelling at the way every regulator tries just a little bit too hard to be just a little bit different from the rest of their regulatory friends.