Wednesday, December 23, 2015

Another Example of Why Governments Should Exit the Encryption Debate (The Juniper Debacle)

With the recent revelation of the Juniper backdoor vulnerability, the question arises: why should we “let” governments put purposeful backdoors into our products? Apple has been very vocal about why it won’t be bullied into allowing this kind of behavior, and about how the privacy of its users’ data is paramount. With the recent terror attacks in Paris and elsewhere in the world, governments everywhere, with the U.S. and U.K. being the loudest, are attempting to use fear to push their agendas. This isn’t news to anyone. We know they’re looking to create backdoors into our encryption, and it’s for that very reason that we have the Juniper scandal today.

In a recent article, WIRED explains that the backdoor was made possible by the DUAL_EC_DRBG algorithm, a random number generator widely believed to have been deliberately designed by the NSA so that traffic protected with it could be decrypted surreptitiously. Cryptographers suspected as much while the algorithm was under review, but it was nonetheless pushed into a NIST standard as one of the recommended algorithms at the time. It’s been reported that this was part of the NSA’s operation BULLRUN, a program created to break the encryption of monitoring targets, with a yearly budget of nearly $250 million. Even more concerning, the NSA purportedly paid RSA $10 million to include the algorithm in its products. RSA has since said it was unaware of the backdoor at the time, but the arrangement remains highly suspicious.
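To make the DUAL_EC_DRBG problem concrete, here is a deliberately simplified toy analogue, not the real algorithm. Real Dual_EC uses elliptic-curve points and truncated outputs; this sketch uses exponentiation modulo a prime, and every constant and value in it (the prime, the generator `Q`, the trapdoor exponent `e`, the seed) is hypothetical. The structure is the same, though: two public constants are related by a secret value, and whoever knows that secret can turn one visible output into the generator’s next internal state, predicting all future “random” numbers.

```python
# Toy analogue of a trapdoored random number generator (hypothetical values).
# Public constants Q and P are baked into the "standard". P was derived as
# P = Q^e mod p, and only the backdoor's author knows the exponent e.

p = 2**61 - 1        # a Mersenne prime; fine for a toy demo
Q = 5                # first public constant
e = 123456789        # the secret trapdoor (known only to whoever designed P)
P = pow(Q, e, p)     # second public constant, published without revealing e

def drbg_step(state):
    """One round: emit an output block and advance the internal state."""
    output = pow(Q, state, p)       # the value the generator reveals
    next_state = pow(P, state, p)   # the hidden next state
    return output, next_state

# An honest user seeds the generator and draws two outputs.
state = 42424242
out1, state = drbg_step(state)
out2, state = drbg_step(state)

# An attacker who sees only out1 but knows e can recover the next state:
#   out1^e = (Q^s)^e = (Q^e)^s = P^s  =  the user's next internal state.
recovered_state = pow(out1, e, p)
predicted_out2, _ = drbg_step(recovered_state)
assert predicted_out2 == out2   # the attacker predicts all future output
```

The point of the sketch is that nothing about the generator looks broken from the outside: without `e`, recovering the state is a hard discrete-log-style problem, which is exactly why a backdoor like this can sit inside a published standard for years.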

That said, governments have already been accessing our systems, either in cooperation with technology vendors or by illegally circumventing vendors’ technology to gather the data they want to collect. So why should we trust them to be more responsible if we allow them to put holes into the products we use every day? What have they done in the past to earn that respect and trust? They don’t have our confidence that they’ll play within the rules, so what makes them think we’d be willing to be taken by the hand and walked down a path we’ll eventually regret? The problems they’re creating (look at Stuxnet and DUAL_EC_DRBG) discredit them from being taken seriously. It’s also overreaching to invoke the terrorist attacks in Paris, where the attackers didn’t use encrypted channels for their communications, or the attacks in San Bernardino, where the terrorists made public Facebook announcements alerting people to their actions. Both sets of communications were in cleartext, and neither attack was stopped. This might be somewhat far-fetched of me, but if you want all of our encrypted information, start by stopping the things that happen in the clear.

What many of these governments aren’t considering is that they’re making your device less secure and more vulnerable to eventual attack by someone else. I understand they want a separate key that would allow only them to access the data when needed, which is still scary. But just like Dr. Ian Malcolm said in Jurassic Park, “Life, uh… finds a way,” and it’s possible that the vulnerability you created for yourself will be abused by others: that the hole will be used to spy on you, or that even more malicious actors will use a similar method to abuse the access that was blown open to “protect” people. I can’t see any concrete reasons or past examples that dramatically tilt the argument in the government’s favor and against our privacy. The latest backdoor issue to come to light with Juniper, made possible by the NSA creating a hole that shouldn’t have been there to begin with, is yet another example of why governments should remove themselves from this debate completely. They don’t have a track record of being responsible with this type of access, and we shouldn’t give it to them.
