How your phone and fitness band could end up giving evidence against you

http://www.theguardian.com/world/2015/feb/24/privacy-criminal-terrorism-suspects-i-cloud-phone-fitness-band-laptop


A criminal suspect can’t be forced to divulge their phone passcode, a US circuit court judge ruled in October 2014. Yet law enforcement officials can compel a suspect to provide a fingerprint – which they can then use to unlock the phone and obtain data which may prove the case against them.

In an ongoing Canadian civil case, activity data from a Fitbit fitness band is being used to determine the truthfulness of an accident victim’s claim that she is less active now than before the accident.

And in another civil case, where a plaintiff argued that his injuries meant he was no longer able to operate a computer for lengthy periods of time, a court ruled that the defendants had a right to access metadata from his hard drive that showed how often the claimant had used his PC.

Keeping in mind David Cameron’s suggestion in January that there should be no such thing as private messaging, how much of this is reasonable? How do we strike a balance between the privacy of the individual and the state’s interest in justice being served?

It might be reasonably argued that the degree of intrusion should be proportional to the seriousness of the accusation. But this principle can easily take us into very grey territory.

Suppose the police and intelligence services are investigating a terrorist attack – a tube bombing. Ten people died: it’s clearly a very serious crime. The authorities know that the bomb was placed on the station platform sometime between 7.13am, when CCTV footage shows the bag definitely wasn’t there, and 7.23am, when the explosion occurred. Is it reasonable to pull the Oyster data from 7am to 7.23am, to identify all the people who entered the station between those times and cross-reference with police and security services files to search for anyone known or suspected to have terrorist links?

What if they do that and draw a blank? They will now want to know more about all those people who entered the tube station between 7am and 7.23am. More than 250 people per minute enter a busy station during rush hour, so over those 23 minutes that’s 5,750 suspects. They’re pretty sure from the CCTV footage that the suspect is male, so they narrow it down to 2,875 people. And that’s all there is to go on so far. One of those men is our bomber; the other 2,874 are innocent.

Is it reasonable to get a blanket court order to examine the ISP and mobile phone records of all 2,875 people? With that many people, all the authorities are going to do is run a simple search of the metadata – the who-contacted-who part – and see if any of them have been in contact with any known or suspected terrorists. They’re not spying on your sexts to your girlfriend or emails from your credit card company querying a missed payment, they’re just looking at who you might have been in touch with.
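In software terms, a metadata screen of this kind is little more than a set intersection: for each person, does their set of contacts overlap with a watchlist? A minimal sketch, with all names and numbers invented for illustration:

```python
# Hypothetical sketch of the metadata screen described above: no message
# content is read, only the who-contacted-who graph. All identifiers and
# records here are invented.

watchlist = {"+44700900001", "+44700900002"}  # numbers linked to known/suspected terrorists

# person -> set of numbers they have called, texted or emailed (metadata only)
call_records = {
    "person_a": {"+44700123456", "+44700654321"},
    "person_b": {"+44700900001", "+44700777777"},  # one watchlist contact
    "person_c": {"+44700111222"},
}

flagged = {
    person
    for person, contacts in call_records.items()
    if contacts & watchlist  # non-empty intersection = contact with watchlist
}

print(flagged)  # {'person_b'}
```

The point the sketch makes is how cheap this is to run at scale: checking 2,875 people’s records this way is trivial, which is precisely why the question of whether it is *permissible* matters more than whether it is feasible.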

No matches. But the explosive used in this attack was found to have been stolen from a demolition company in Leeds one week before the attack. A court order to run a search of the 2,875 suspects’ email records for train bookings to or from Leeds during that week is readied, and their car registration numbers are obtained, to see whether any of them were logged on any ANPR systems on the M1 during that time. That’s all. No other email content will be looked at, nor any other details of their driving history; just those two straightforward searches. Fair enough?

The suspects are narrowed down to 47 people whose cars were spotted at least once on the M1 at some point between London and Leeds during that week. There is nothing else to go on, so the authorities now need to take a deeper dive into the online lives of those 47 people.
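Each stage of this narrowing is, computationally, just another intersection: the pool of suspects is whittled down by intersecting it with the set of people matching each new piece of evidence. A sketch with invented stand-in data, using the article’s numbers:

```python
# Illustrative sketch (invented data) of the successive narrowing the
# article describes. People are represented by arbitrary integer IDs.

entered_station = set(range(5750))           # everyone entering 7am-7.23am
male = set(range(2875))                       # CCTV suggests a male suspect
seen_on_m1 = set(range(47)) | {5000, 5200}    # ANPR hits, some outside the pool

suspects = entered_station & male & seen_on_m1
print(len(suspects))  # 47
```

Nothing about the logic changes if the final pool is 47, 200 or 500 people; what changes is only how many innocent lives get the “deeper dive”.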

What could that involve? Most of us leave a pretty comprehensive digital footprint these days. Your fitness band or sleep-tracking app logs the time that you woke up. Your ISP logs show which websites you visited, even which stories you read on Guardian.com over breakfast.

Phone GPS and wi-fi logs can enable your movements to be tracked to within tens of metres: your route to the tube station can easily be mapped. Oyster data logs the details of the subsequent tube journey: stations, dates, times.

Your email records are a goldmine. There’s the obvious stuff – who you were in contact with when, and what was said – but there’s so much more than that to be gleaned.

Ever had a password reminder emailed to you for iCloud or Google? Deleted the mail but failed to empty your trash can? Not an issue if you switched on two-factor authentication, but if you didn’t, the authorities now have remote access to the content of your phone. The entire content. Your phone does regular, automatic backups to Apple or Google servers, and with the right software, anyone can download and access them.

Your contacts. Your calendar. Your photos. Your notes. And more.

Collating the addresses of your contacts with your Oyster data tells us who you’ve been visiting, and how often. The authorities would soon know more about those 47 people than almost any of their friends.

What if they had been left not with 47 suspects but 200? 500? Where do we draw the line?

What if, instead of an actual bombing, it was an aborted attempt at the same, but without hard-and-fast proof – how does that change the equation of what is and isn’t acceptable?

These will always be difficult judgement calls, but while the individual decisions may need to be made in secret, that does not mean that the principles governing those decisions should themselves be secret or – worse – left to the whim of individual judges in individual cases.

It may not be possible to formulate hard-and-fast rules covering every eventuality, but there is every reason to set out clear and transparent guidelines within which decisions can be made – and no reason why the debate to determine these guidelines should not take place in public and in parliament.