The Guardian view on data protection: a vital check on power

Source: the Guardian, 7 August 2017. Original article: https://www.theguardian.com/commentisfree/2017/aug/07/the-guardian-view-on-data-protection-a-vital-check-on-power
Data is knowledge and knowledge is power. That is why data protection matters in a democracy. The most recent government paper, a statement of intent, is not the detailed legislation that will be needed to harmonise British law with the EU’s General Data Protection Regulation (GDPR), which comes into force next spring, but it gives a clear view of what the government is trying to achieve. The overwhelming aim is to remain in step with the EU. So much of a modern economy depends on the frictionless movement of vast quantities of data across national borders that it is vital to harmonise with EU policy even if we can now no longer help to shape it.
There are three different interests in data and privacy which have to be brought into balance: the individual, the companies which hold and process our data, and the state, which mediates between the two. There is a fundamental asymmetry between the individual and the other two, in that the personal data of any particular customer is worth in isolation very little to anyone else, but the aggregation and refinement of data gives it a huge new value. It should be the aim of policy to ensure that no one is disadvantaged by having their data processed in this way.
Anonymity is not the shield it might appear. Someone who knows everything about you but your name is in possession of information far more valuable, and potentially dangerous, than someone who knows your name and nothing else. Names can be trivial to discover, given other facts. One of the central premises of the information economy is that the collection and analysis of gigantic quantities of anonymised data produces general patterns which enable accurate prediction about anonymised individuals. The correlations that emerge from vast quantities of data hold good even when tiny samples are examined. This is the insight at the heart of “machine learning”, one of the most promising fields of artificial intelligence.
The GDPR, and hence the statement of intent, takes aim at this in two ways. The first is the right of any individual to know what data is held on them, and in some circumstances to demand its deletion. This is obviously a help against teenage indiscretions, but it is not, nor should it be, a general panacea. There is a genuine public interest in knowing things about public figures that they would rather conceal. The partial exemption of journalism from data protection rules is a welcome part of this statement of intent. In any case, the right to be forgotten isn’t absolute. It is really a right to remove facts from search engines, not from the web itself. It does nothing to diminish the powers of inference from known facts which are harmless in themselves to unknown dangerous truths that we would rather conceal.
The second is the right of appeal to a human being against decisions which have been taken by an algorithm. This is something very different from requiring that the workings of the algorithm in question be explained. That would almost certainly be impossible: some forms of artificial intelligence now reach conclusions by a process that even their programmers cannot debug. But computers don’t operate themselves. They are programmed and maintained by human actors who must be held responsible for their actions, and that is what the stipulation about algorithmic decisions amounts to.
But there is in the end a limit to what state action can accomplish. We must all learn to negotiate a world where machines watch almost everything we do, or read, or write, and soon what we say in our homes, and harvest data out of all of it. Anonymity is not the answer; in fact it can conflict with rights over personal data, since these rights can only be properly exercised by someone known to be entitled to them. But caution, discretion, and strong encryption are available to everyone. Make use of them.
Data protection
Opinion
Data and computer security
Artificial intelligence (AI)
Computing
Consciousness
GDPR
editorials