Mobile phones contain almost everything there is to know about their owners. Banking apps show their spending. Photos reveal their narcissism. Dating apps make their sexuality plain. Health apps record their vital signs. Music apps reveal embarrassing trivialities, like Air Supply being their second most played artist.
Location tracking shows where they went yesterday, allowing people to predict where they are likely to go tomorrow. Facebook reveals their friends and lovers, their likes and distastes, their hopes and nightmares.
Unlike anything else we own, mobiles are master keys to our lives. As such, they are incredibly alluring to both the good - like police investigating crime - and the bad, like identity thieves.
The more we use our mobiles, the greater the risk they pose to our privacy, and the more useful they are to criminals, hackers, and police.
Enter Tim Cook, the immensely powerful chief executive of Apple, now in a stand-off with the US government over unlocking one criminal's iPhone. It is a fight we should all hope he wins, not just those of us who are among his vast number of customers, and not just those with "something to hide".
Cook is a rare warrior, and a self-interested one, in the lopsided fight between people retaining some semblance of control over their private information, and the continual invasion of privacy by governments too quick to erode it by crying terrorism. This privacy protector has the means to take on the US government.
One of the San Bernardino terrorists, Syed Rizwan Farook, had an iPhone owned by his employer, the San Bernardino County Department of Health. The FBI has it, and the Department has allowed the Bureau to read it.
Except neither the Bureau nor the Department knows the phone passcode, and trying too many times will destroy its contents. The FBI already has a copy of the phone's data from Apple's cloud storage, but the few weeks before the attacks are missing. Had the password not been reset after the shootings, while the phone was in government custody, it could have backed up again - removing the entire basis for the fight.
So now the FBI wants Apple to create software to break the iPhone's security, allowing agents to read the whole thing - and, according to the US government, only that one terrorist's phone. As Apple knows, that is baloney. What the FBI wants now does not limit what may be done later.
"The government suggests this tool could only be used once, on one phone. But that's simply not true," Apple said in a statement last week. "Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes."
As we have seen many times before, authorities facing a problem in one investigation want a fix that may undermine the security and privacy of millions of innocent others. In working out why two people murdered 14 others, the US government is willing to risk the digital safety and privacy of a decent proportion of the world's mobile phone users.
In Apple's words: "In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone's physical possession."
US government lawyers responded on Friday that Apple's refusal to comply "appears to be based on its concern for its business model and public brand marketing strategy".
True, customer trust is essential to Apple's business model, and Cook's stance likely helps Apple's business, despite Donald Trump's call for a boycott. But citizens' privacy should also be respected by their governments, sacrificed only when demonstrably required, and only to the least extent possible.
Too often privacy is derided as a concern of the criminal class - the idea that people with nothing to hide have nothing to fear - rather than recognised as the digital equivalent of pulling your curtains. This attitude is how Australia has ended up with legislation that forces mobile companies to keep our phone and internet logs for two years for police to read at will, without a warrant.
But at least for now the government can't easily peer into our phones themselves (unless you're arrested, in which case police can seize, read and use everything in your phone, no matter how irrelevant to their case - unlike in the US, where they need a warrant).
The US case against Apple risks changing that. And if a democratic government can force Apple to unlock its phones for an investigation it thinks is important, what is there to stop authoritarian regimes doing the same thing?