There has been a lot of conversation about the following statement from Prime Minister Cameron:
“do we want to allow a means of communication between two people which even in extremis with a signed warrant from the home secretary personally that we cannot read? …My answer to that question is no, we must not. The first duty of any government is to keep our country and our people safe.”
I think it’s an error to make this a discussion about technology. What’s open for debate here is the definition of “safe”. A lifetime driving ban for people caught speeding would likely make the country more “safe” than, for example, the recording of all digital metadata for communications in the UK.
When people get bogged down in specific discussions, for example the technical nitty-gritty of digital encryption, what we’re not doing is discussing the trade-off between safety (public order and public safety), freedom (the right to the rule of law, liberty and privacy), and the overall economic and social well-being of the state [cf. Demos article].
It’s apparently acceptable for ~1.7K people to die on British roads each year without calls to change the law. That’s the accepted level of risk. The question is what level of risk we, as citizens, are willing to tolerate for the freedoms we want (e.g. private communication).
The three major problems I see in addressing this are (1) getting useful estimates of the actual risk of terrorism, in terms of both likelihood and effect, (2) those who do have this data being unwilling to share it, and (3) the lack of an open and honest national debate about it.
If there’s a 1 in 10 million chance of a dirty bomb in central London, is that enough of a risk to record everyone’s digital metadata or to leave backdoors in encrypted systems? What about 1 in 1,000?
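To make that comparison concrete, here is a back-of-envelope sketch using the article’s own numbers (~1.7K road deaths, and the hypothetical 1 in 10 million and 1 in 1,000 attack odds). The UK population figure of roughly 65 million is my assumption, not from the article:

```python
# Back-of-envelope annual risk comparison.
# Assumption (not from the article): UK population of roughly 65 million.
uk_population = 65_000_000
road_deaths_per_year = 1_700  # figure cited in the article

road_risk = road_deaths_per_year / uk_population
print(f"Annual road-death risk: about 1 in {round(1 / road_risk):,}")

# The article's hypothetical dirty-bomb odds, for comparison.
for odds in (10_000_000, 1_000):
    attack_risk = 1 / odds
    relation = "smaller" if attack_risk < road_risk else "larger"
    print(f"Hypothetical attack odds of 1 in {odds:,} are {relation} "
          f"than the annual road-death risk")
```

On these assumptions the accepted road risk works out to roughly 1 in 38,000 per person per year, which sits between the two hypothetical attack probabilities; that is exactly the kind of ordering an honest debate about acceptable risk would need to establish.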
That’s the kind of conversation, about levels of acceptable risk, that no-one seems to be having.