There is a big problem. It's called encryption. And the people in San Bernardino were communicating with people who the FBI had been watching. But because their phone was encrypted, because the intelligence officials could not see who they were talking to, it was lost.
If we try to prohibit encryption or discourage it or make it more difficult to use, we're going to suffer the consequences that will be far reaching and very difficult to reverse, and we seem to have realized that in the wake of the September 11th attacks. To the extent there is any reason to be hopeful, perhaps that's where we'll end up here.
We basically have only two real tried and true techniques that can help counter this. One of them is to make systems as simple as we can, and there are limits to that because we can only simplify things so much. The other is the use of encryption.
Those who are experts in the fields of surveillance, privacy, and technology say that there need to be two tracks: a policy track and a technology track. The technology track is encryption. It works and if you want privacy, then you should use it.
I don't own encryption; Apple doesn't own encryption. Encryption, as you know, is everywhere. In fact, some encryption is funded by our government.
So end-to-end encryption keeps things encrypted, and that means that law enforcement, without a warrant, cannot read that information.
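The property described above — a relay that stores only ciphertext it cannot read — can be illustrated with a toy sketch. This is not a real end-to-end protocol; it uses a one-time pad built from Python's `secrets` module as a stand-in for a proper cipher, and the "relay" is just a variable:

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # One-time pad XOR: a toy stand-in for a real cipher such as AES-GCM.
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

# Sender and receiver share a key; the relay (the service provider) never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))

ciphertext = encrypt(key, message)   # what the sender transmits
relayed = ciphertext                 # all the provider can store or hand over
recovered = decrypt(key, relayed)    # only a key holder can read it

assert recovered == message
assert relayed != message
```

The point of the sketch: a warrant served on the relay yields only `relayed`, which is unreadable without the key held at the endpoints.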
As far as Paris goes, we don't know for sure yet how these guys communicate among themselves and how they communicated back to the ISIS leadership in Iraq and Syria, but I'm fairly confident we're going to learn they used these encrypted communication applications that have commercial encryption and are extremely difficult for companies to break - and which the companies have made the decision not to produce a key for.
It seems that 'national security' is the root password to the Constitution. As with any dishonest superuser, the best countermeasure is strong encryption.
The reality is that if you - let's say you just pulled encryption. Let's ban it. Let's you and I ban it tomorrow. And so we sit in Congress and we say, thou shalt not have encryption. What happens then? Well, I would argue that the bad guys will use encryption from non-American companies, because they're pretty smart.
Let's put it this way. The United States government has assembled a massive investigation team into me personally, into my work with the journalists, and they still have no idea what documents were provided to the journalist, what they have, what they don't have, because encryption works.
If, technologically, it is possible to make an impenetrable device or system where the encryption is so strong that there's no key - there's no door at all - then how do we apprehend the child pornographer? How do we solve or disrupt a terrorist plot?
So, in 1993, in what was probably the first salvo of the first Crypto War, there was concern coming from the National Security Agency and the FBI that encryption would soon be incorporated into lots of communications devices, and that that would cause wiretaps to go dark. There was not that much commercial use of encryption at that point. Encryption, particularly for communications traffic, was mostly something done by the government.
Now, with a warrant, they can always go to the information service provider and attempt to get that information. But even then, they may not be able to because the party selling the encryption services may be a third party and may not even know who the parties are that are communicating.
The new iPhone encryption does not stop them from accessing copies of your pictures or whatever that are uploaded to, for example, Apple's cloud service, which are still legally accessible because those are not encrypted. It only protects what's physically on the phone.
If you go to a coffee shop or the airport and you're using open wireless, I would use a VPN service that you can subscribe to for 10 bucks a month. Everything is encrypted in an encryption tunnel, so a hacker cannot tamper with your connection.
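The "cannot tamper" property of a tunnel comes from authentication as much as from secrecy. A minimal standard-library sketch of tamper detection using an HMAC-SHA256 tag (real VPNs negotiate keys and use authenticated ciphers; the key and packet here are purely illustrative):

```python
import hashlib
import hmac

def seal(key: bytes, data: bytes) -> bytes:
    # Append an HMAC-SHA256 tag so the receiver can detect any modification.
    return data + hmac.new(key, data, hashlib.sha256).digest()

def open_sealed(key: bytes, packet: bytes) -> bytes:
    data, tag = packet[:-32], packet[-32:]
    expected = hmac.new(key, data, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("packet was tampered with")
    return data

key = b"shared-tunnel-key"      # illustrative; real tunnels derive keys per session
packet = seal(key, b"GET /mail")
assert open_sealed(key, packet) == b"GET /mail"

tampered = b"X" + packet[1:]    # an attacker on open Wi-Fi flips a byte
detected = False
try:
    open_sealed(key, tampered)
except ValueError:
    detected = True
assert detected
```

Note the use of `hmac.compare_digest` rather than `==`, which avoids leaking the tag through timing differences.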
The digital age is for me in many ways about temporal wounding. It's really messed up our ontological clocks. In the digital economy, everything is archived, catalogued, readily available, and yet nothing really endures. The links are digital encryptions that can't and won't be located, that will have to be reassembled over time. It won't be exactly what it was. There will be some slightly altered version. So the book is both an immaterial and material artifact.
The people working in my field also are quite skeptical of our ability to do this. It ultimately boils down to the problem of building complex systems that are reliable and that work, and that problem has long predated the problem of access to encryption keys.
There are programs such as the NSA paying RSA $10 million to use an insecure encryption standard by default in their products. That's making us more vulnerable not just to the snooping of our domestic agencies, but also foreign agencies.
We've already seen shifts happening in some of the big companies - Google, Apple - that now understand how vulnerable their customer data is, and that if it's vulnerable, then their business is, too, and so you see a beefing up of encryption technologies. At the same time, no programs have been dismantled at the governmental level, despite international pressure.
When the September 11th attacks happened, only about a year later, the crypto community was holding its breath because here was a time when we just had an absolutely horrific terrorist attack on U.S. soil, and if the NSA and the FBI were unhappy with anything, Congress was ready to pass any law they wanted. The PATRIOT Act got pushed through very, very quickly with bipartisan support and very, very little debate, yet it didn't include anything about encryption.
[Eric] Goldman [a professor at Santa Clara University School of Law] says back in the 1990s, courts began to confront the question of whether software code is a form of speech. Goldman says the answer to that question came in a case called Bernstein v. U.S. Department of Justice. Student Daniel Bernstein created encryption software called Snuffle and wanted to put it on the Internet. The government tried to prevent him, using a law meant to stop the export of firearms and munitions. Goldman says the student argued his code was a form of speech.
My favorite method of encryption is chunking revolutionary documents inside a mess of JPEG or MP3 code and emailing it off as an "image" or a "song." But besides functionality, code also possesses literary value. If we frame that code and read it through the lens of literary criticism, we will find that the past hundred years of modernist and postmodernist writing have demonstrated the artistic value of similar seemingly arbitrary arrangements of letters.
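The method described above — burying data inside media files that still open normally — can be sketched in its crudest form: appending a payload after a JPEG's end-of-image marker, which decoders ignore. The cover bytes here are a hypothetical minimal stand-in for a real photo, and the scheme assumes the payload itself contains no end-of-image marker:

```python
JPEG_SOI = b"\xff\xd8"  # start-of-image marker
JPEG_EOI = b"\xff\xd9"  # end-of-image marker; viewers stop decoding here

def embed(cover_jpeg: bytes, payload: bytes) -> bytes:
    # Append the payload after the EOI marker; the image still renders normally.
    if not cover_jpeg.startswith(JPEG_SOI) or JPEG_EOI not in cover_jpeg:
        raise ValueError("not a JPEG stream")
    return cover_jpeg + payload

def extract(stego_jpeg: bytes) -> bytes:
    # Everything after the last EOI marker is the hidden payload
    # (assumes the payload contains no 0xFFD9 byte pair).
    return stego_jpeg[stego_jpeg.rindex(JPEG_EOI) + len(JPEG_EOI):]

# Hypothetical minimal cover; a real file would carry image data in between.
cover = JPEG_SOI + b"\x00" * 16 + JPEG_EOI
stego = embed(cover, b"hidden document")
assert extract(stego) == b"hidden document"
```

This trailing-data trick is easy to spot with a hex editor, which is why the quote wraps the payload in "a mess of JPEG or MP3 code" rather than leaving it bare at the end.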
There's been a certain amount of opportunism in the wake of the Paris attacks in 2015, when there was almost a reflexive assumption that, "Oh, if only we didn't have strong encryption out there, these attacks could have been prevented." But, as more evidence has come out - and we don't know all the facts yet - we're seeing very little to support the idea that the Paris attackers were making any kind of use of encryption.
The coalition [against ISIS] needs the tools. And the tools involve encryption, where we cannot hear what they're even planning. And when we see red flags - a father, a mother, a neighbor who says we have got a problem here - then we have to give law enforcement the ability to listen so they can disrupt these terrorist attacks before they occur.
[Bill] Binney designed ThinThread, an NSA program that used encryption to try to make mass surveillance less objectionable. It would still have been unlawful and unconstitutional.