This article is from the source 'guardian' and was first published or seen on . It last changed over 40 days ago and won't be checked again for changes.
You can find the current article at its original source at https://www.theguardian.com/global/commentisfree/2018/feb/08/the-guardian-view-on-internet-privacy-its-the-psychology-stupid
The article has changed 3 times. There is an RSS feed of changes available.
The Guardian view on internet privacy: it’s the psychology, stupid
Privacy is necessary for human society to function. The problem is not that the information exists but that it reaches the wrong people. Information on the internet could bring great benefits to society, and to individuals, when huge datasets can be refined to yield information otherwise unavailable. But once the information is gathered, a precautionary principle has to apply. It is too much of a stretch to agree with John Perry Barlow, the internet rights pioneer who died this week, when he quipped that “relying on the government to protect your privacy is like asking a peeping tom to install your window blinds”; but it does not help when it appears that everything the public sector does with the huge datasets it has will be overseen by the minister for fun.
Governments need to keep our trust; but technology erodes privacy in two ways. The first is simply smartphones. Most Britons – 70% – now carry around with them devices which record and report their location, their friends and their interests all the time. The second is the ease with which two (or more) datasets can be combined to bring out secrets that are apparent in neither set on its own, and to identify individuals from data that appears to be entirely anonymised. By the beginning of this century researchers had established that nearly 90% of the US population could be uniquely identified simply by combining their gender, their date of birth and their postal code. All kinds of things can be reliably inferred from freely available data: four likes on Facebook are usually enough to reveal a person’s sexual orientation.
Underlying such problems is human psychology. No one forces anybody to reveal their preferences on Facebook: the like button is genuinely popular. The latest spectacular breach of privacy came when the exercise app Strava published a global map of the 3 trillion data points its users had uploaded, which turned out to reveal the location of hitherto secret US military bases around the world. But the chance to boast about where you have been and how fast you were moving is exactly what makes Strava popular. Psychology, as much as technology, made this a massive security breach. The users gave enthusiastic consent, but it was fantastically ill-informed. Then again, how could anyone give informed consent when not even the firms that collect the data can know how it will be used?
The protection of private data from malevolent hackers is a technical arms race one cannot leave. But the protection of privacies from inadvertent disclosure is primarily a social or psychological problem. The solution cannot just be one of informed consent from the data providers, because in most situations no one has the information necessary to give their consent. What’s needed instead is a change of attitude among those who harvest and process the data. They need constantly to ask themselves – or to be asked by society – how this information could be used for harm, and how to prevent that from happening.
Data protection
Opinion
Internet safety
Internet
Smartphones
Mobile phones
Privacy
comment