Encryption Nonsense

It’s hard to improve on Charlie Savage’s story and the many good discussions of the civil liberties implications of the Obama administration’s desire to mandate back doors to encrypted messaging systems like BlackBerry and Skype. I just want to point out that, in addition to the violation of privacy, it’s also bad business and bad security.

It’s bad business because it opens the doors to companies that aren’t governed by US law to create competing solutions and sell them in places where US law doesn’t apply. BlackBerry may buckle under and allow a back door in order to remain a player in the US market, but some other player could well create a smartphone messaging system that doesn’t have a back door and sell it in the parts of the world that don’t give a shit about US law. And other companies may create smartphone software (apps) that runs on top of your iPhone’s or BlackBerry’s phone or messaging apps to encrypt voice and text traffic, but those companies will be headquartered (and employ engineers) somewhere beyond Eric Holder’s reach.

It’s bad security because a back door is an opening that can be breached by hackers as well as law enforcement, and the existence of a back door makes the system that has one an immediate target of hackers. RIM, the maker of BlackBerry, may not care about your civil rights, but they sure as hell don’t want to be the target of a hack that leverages a back door that they put in to satisfy the US, UAE and India.
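
To make that concrete, here is a toy sketch (deliberately fake crypto, and not any vendor’s actual design) of what a mandated back door amounts to: a second, universal key that ships with every device and can unwrap every session key.

```python
import os

def xor_stream(key: bytes, data: bytes) -> bytes:
    """Toy XOR 'cipher' -- for illustration only, not real cryptography."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The mandated back door: one escrow key baked into every device.
ESCROW_KEY = b"mandated-backdoor-key"

def send_message(plaintext: bytes):
    session_key = os.urandom(16)
    ciphertext = xor_stream(session_key, plaintext)
    # The "lawful access" field: the session key wrapped for the escrow key.
    escrow_blob = xor_stream(ESCROW_KEY, session_key)
    return ciphertext, escrow_blob

# Anyone who obtains ESCROW_KEY -- an agency, a hacker, a disgruntled
# insider -- recovers every session key, and thus every message ever sent.
ct, blob = send_message(b"meet at noon")
session_key = xor_stream(ESCROW_KEY, blob)
print(xor_stream(session_key, ct))  # b'meet at noon'
```

The point is structural: the escrow key is a single secret whose compromise breaks all traffic, past and future, which is why a system carrying one becomes such an attractive target.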

A back door to publicly-available encryption has been the wet dream of law enforcement for decades. The last attempt, the Clipper chip, died a quick and inglorious death, and I can only hope that the current idiocy is soon tucked away in a grave, and buried deep.

18 replies
  1. BR says:

    It’s also an attempt to return to the old days where encryption routines and the like were treated as munitions. (Back in the early 90s, this was the case, which is why when PGP was first released it was a big deal. Also, to get the source code of PGP to Europe, IIRC they printed the entire thing, took it on a flight to Europe, and then scanned it back in so that they wouldn’t run afoul of the stupid “encryption software is munitions” export rules.)

  2. thrashbluegrass says:

    In fact, this is what you had happening during the years when cryptography was listed as a munition and export was more tightly controlled: all the crypto firms were in Canada.

    And, given that a large number of companies in the field (e.g., Ericsson, Nokia) are headquartered in Europe, where the nerds are even more vocal about not taking up “tainted” technology, I’m pretty sure that the only effect of it would be locking US firms out of major markets.

  3. c u n d gulag says:

    If you remember the WaPo exposé on the whole national security apparatus in this country (last month?), there are already too many agencies trying to sift through way too much information.
    They can’t handle what they already have, so the solution is to give them more?

  4. Jinchi says:

    It’s bad security because a back door is an opening that can be breached by hackers as well as law enforcement

    It’s amazing that security hawks never realize that these things work both ways. On 9/11, America’s only success happened when passengers on flight 93 used their cell phones to learn what was happening on other flights and decided to retake control of the plane.

    Bush’s first reaction?

    The government needs the ability to shut off all cell phone service during an attack.

  5. meh says:

    thank you. I learned something I did not know because of your post.

  6. Jeff says:

    I think that part of this is the cop’s natural reaction (he wants something to do his job for him): ooh, look, a back door to encryption! Now I won’t have to do any of that boring investigating stuff.
    In a way this could backfire on the Administration big time: it could make security worse, not better, by inundating the agencies with trivial information, which would create a sort of “security by obscurity” situation.
    Smarter by far would be to work on systems analysis and old-fashioned foot work, which is how the British have been breaking up terrorist cells for a while.

  7. demimondian says:

    The huge hole in the whole thing is the line about “freeware applications”. Packet-switched video conferencing over the internet is based on a publicly available standard (H.323), which uses the ITU H.235 standards to ensure end-to-end privacy and security. The ISP has no way to intercept these calls.

    As usual, the FBI, like most of the supporters of the Confederate Party, is busy trying to bring back the past — a past which never existed.

  8. Dennis SGMM says:

    Many of the reasons why this is a catastrophically dumb idea were aired in an earlier thread. Between then and now I came to wonder how many other catastrophically dumb (But less publicized) ideas have been put into practice in the name of security. I also wonder how many “experts” have been hired in the rush to put butts in chairs.

    It’s difficult to escape the feeling that an immense supply of money has attached itself to an immense amount of stupid and the miserable and ineffective results are all papered over in the name of national security.

  9. Jeff says:

    @Dennis SGMM: you really don’t want to know the answer to that question–
    you would never sleep peacefully again.
    The only saving grace to that is that our enemies usually are stupider than we are– it’s the rare occasion they’re not that causes the catastrophe.

  10.

    All we need to do is unearth some Japanese war gold in the Philippines and Cryptonomicon becomes a documentary.

  11. rageahol says:

    This is all you really need to know about government-mandated backdoors.

  12. RareSanity says:

    As a person who writes software for devices that include high levels of (FIPS-certified) encryption, this is the dumbest thing I have heard from the government in a long time.

    Any backdoor, on any application, on any platform, is a huge security risk.

    Besides, the security with encryption doesn’t lie in the algorithms. All of the algorithms are in the public domain. Security is all about key management.

    Backdoors would have to, in some way, disclose key values. Unless NIST changes its rules, any such backdoor would immediately invalidate any current FIPS certifications obtained by manufacturers. The certification process costs companies hundreds of thousands of dollars per device. And any Federal agency purchasing devices with encryption, or any agency using Federal dollars to do so, is required to buy FIPS-certified devices.

    I don’t think this will happen.

    Then again, 8 years ago I thought, “There’s no way they (Congress) will just let Bush invade Iraq. He’s obviously lying about WMD.”
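
The keys-versus-algorithms point above can be sketched in a few lines of Python (standard library only; HMAC-SHA-256 here stands in for any public, well-studied algorithm):

```python
import hashlib
import hmac
import os

# The algorithm (HMAC-SHA-256) is completely public;
# the key is the only secret in the system.
key = os.urandom(32)

def tag(message: bytes) -> bytes:
    return hmac.new(key, message, hashlib.sha256).digest()

t = tag(b"transfer $100")

# With the right key, verification succeeds.
assert hmac.compare_digest(t, tag(b"transfer $100"))

# An attacker running the very same public algorithm with a
# guessed key gets a different tag, so forgery fails.
forged = hmac.new(os.urandom(32), b"transfer $100", hashlib.sha256).digest()
assert not hmac.compare_digest(t, forged)
```

A back door that discloses key values breaks exactly this property: once the key leaks, the public algorithm protects nothing.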

  13. burnspbesq says:

    The legal profession is going to have the mother of all conniptions about this. All of us have ethical obligations to preserve client confidences. There is no fucking way I’m breaching those just because the FBI is too fucking lazy to get warrants. Lawyers will go to jail over this; it strikes at the heart of who we are and what we do.

  14. NCSteve Temporarily Defecting from TPM says:

    Back when the 1990s version of this push happened under Clinton and then just abruptly got dropped when it ran into resistance, my suspicious, spy-novel-poisoned mind strongly suspected it meant we could already hack all the impossible-to-hack encryption software and wanted to lure the people we wanted to spy on into using it.

    I pretty much suspect the exact same thing about this.

  15. mclaren says:

    This is a non-issue: PGP public-key encryption is already available for e-mail and the like, and the Android platform, if it doesn’t have PGP encryption built into IM and SMS and VOIP and everything else now, soon will.

    The whole thing is like watching that pathetic Pentagon effort to censor that book by buying up all the copies. Of course Wikileaks had a copy, so it was all pointless.

  16. MikeJ says:

    Letting other people determine what software runs on your hardware is always a bad idea.

    Sadly the turtlenecked keep pressing the idea that it will keep you safe.

  17. Tonal Crow says:

    @mclaren: A proposal to criminalize an important personal activity is NEVER a “non-issue”.

    When it is illegal to use secret-key encryption, its use becomes probable cause to search and seize. What individual wants to lose her computer/cell phone/iPad — and all the information on it? What businessperson can afford to lose such a device?

    Further, anything discovered during a search — even if unrelated to the search’s purpose — is game for prosecution. Fudging on your taxes a little? Emailing others about smoking pot? Got photos of Catherine the Great-style sex parties? It’s all open to investigation and possible prosecution.

    And how will they know you’re using forbidden encryption in the first place? Easy: the warrantless spying dragnet Bush created.

    It all fits together, and that’s no coincidence.

    This proposal will die only if we fight it. Hard.

    I suggest beginning with a fat contribution to the ACLU, and following up with some strong LTEs.

  18. sneezy says:

    @Dennis SGMM:

    I came to wonder how many other catastrophically dumb (But less publicized) ideas have been put into practice in the name of security

    “Most of them” seems like a pretty good bet.

Comments are closed.