
28 April 2012

Cryptographers vs. Software Engineers


Date: Sat, 28 Apr 2012 16:05:16 +1000
From: ianG <iang[at]iang.org>
To: cryptography[at]randombit.net
Subject: Re: [cryptography] “On the limits of the use cases for authenticated encryption”

On 26/04/12 04:47 AM, Zooko Wilcox-O'Hearn wrote:

https://plus.google.com/108313527900507320366/posts/cMng6kChAAW

*“On the limits of the use cases for authenticated encryption”*

*What is authenticated encryption?*

“Authenticated Encryption” is an abstraction that is getting a lot of attention among cryptographers and crypto programmers nowadays. Authenticated Encryption is just like normal (symmetric) encryption, in that it prevents anyone who doesn't know the key from learning anything [*] about the text. The "authenticated" part is that it /also/ prevents anyone who doesn't know the key from undetectably altering the text. (If someone who doesn't know the key does alter the text, then the recipient will cleanly reject it as corrupted rather than accepting the altered text.)

It is a classic mistake for engineers using crypto to confuse encryption with authentication. If you're trying to find weaknesses in someone's crypto protocol, one of the first things to check is whether the designers of the protocol assumed that by encrypting some data they were preventing that data from being undetectably modified. Encryption doesn't accomplish that, so if they made that mistake, you can attack the system by modifying the ciphertext. Depending on the details of their system, this could lead to a full break of the system, such that you can violate the security properties that they had intended to provide to their users.
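
As a concrete illustration of that malleability (a minimal sketch, not any particular system's code, using the third-party Python "cryptography" package with invented key and message values): under an unauthenticated mode such as AES-CTR, an attacker who flips a ciphertext bit gets a correspondingly flipped plaintext bit, and decryption raises no error at all.

    # Minimal sketch: unauthenticated AES-CTR is malleable.
    # Requires the third-party "cryptography" package (pip install cryptography).
    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(32)        # example key
    nonce = os.urandom(16)      # example initial counter block
    plaintext = b"PAY $0100 TO ALICE"

    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    ciphertext = bytearray(enc.update(plaintext) + enc.finalize())

    # The attacker, knowing nothing of the key, flips bits in transit:
    ciphertext[5] ^= ord('0') ^ ord('9')   # turn the '0' at offset 5 into '9'

    dec = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
    print(dec.update(bytes(ciphertext)) + dec.finalize())
    # b'PAY $9100 TO ALICE' -- decrypts cleanly; nothing detects the change.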

Since this is such a common mistake, with such potentially bad consequences, and because fixing it is not that easy (especially due to timing and exception-oracle attacks against authentication schemes), cryptographers have studied how to efficiently and securely integrate both encryption and authentication into one package. The resulting schemes are called “Authenticated Encryption” schemes.
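
A small aside on the timing and exception-oracle point: verifying a MAC with an ordinary byte-string comparison can leak, through timing, how far the comparison got before the first mismatch. A minimal sketch of the usual mitigation, using only Python's standard library (the key and message values are invented for the example):

    import hmac, hashlib, os

    key = os.urandom(32)                     # example shared secret
    msg = b"example message"
    tag = hmac.new(key, msg, hashlib.sha256).digest()

    def verify(key, msg, tag):
        expected = hmac.new(key, msg, hashlib.sha256).digest()
        # compare_digest takes time independent of where the first
        # mismatch occurs, unlike a plain `expected == tag` check.
        return hmac.compare_digest(expected, tag)

    print(verify(key, msg, tag))             # True
    print(verify(key, msg, b"\x00" * 32))    # False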

In the years since cryptographers developed some good authenticated encryption schemes, they've started thinking of them as a "drop-in replacement" for normal old unauthenticated encryption schemes, and started suggesting that everyone should use authenticated encryption schemes instead of unauthenticated encryption schemes in all cases. There was a recent move among cryptographers, spearheaded by the estimable Daniel J. Bernstein, to collectively focus on developing new improved authenticated encryption schemes. This would be a sort of community-wide collaboration, now that the community-wide collaboration on secure hash functions—the SHA-3 contest—is coming to an end.

Several modern cryptography libraries, including “Keyczar” and Daniel J. Bernstein's “nacl”, try to make it easy for the programmer to use an authenticated encryption mode and some of them make it difficult or impossible to use an unauthenticated encryption mode.
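
For concreteness, here is roughly what that looks like to the programmer, sketched with PyNaCl (a Python binding to nacl); the message and the bit-flip are invented for the example. The point is that encryption and authentication arrive as a single operation, and a tampered ciphertext is rejected outright rather than decrypted to something altered:

    # Requires the third-party PyNaCl package (pip install pynacl).
    import nacl.secret, nacl.utils
    from nacl.exceptions import CryptoError

    key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)   # 32 random bytes
    box = nacl.secret.SecretBox(key)

    ciphertext = box.encrypt(b"attack at dawn")   # nonce, tag and ciphertext in one blob
    print(box.decrypt(ciphertext))                # b'attack at dawn'

    tampered = bytearray(ciphertext)
    tampered[-1] ^= 0x01                          # flip one bit in transit
    try:
        box.decrypt(bytes(tampered))
    except CryptoError:
        print("rejected as corrupted")            # altered text is never returned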

I don't think that's clearly a good idea. This notion of authenticated encryption is far too recent to be dogmatically promoted into crypto policy. Also, I don't think we yet understand quite what those modes do when they are shoehorned in alongside the current compromises.

When Brian Warner and I presented Tahoe-LAFS at the RSA Conference in 2010, I was surprised and delighted when an audience member who approached me afterward turned out to be Prof. Phil Rogaway, renowned cryptographer and author of a very efficient authenticated encryption scheme ("OCB mode"). He said something nice about our presentation and then asked why we didn't use an authenticated encryption mode. Shortly before that conversation he had published a very stimulating paper named “Practice-Oriented Provable Security and the Social Construction of Cryptography”, but I didn't read it until years later. In that fascinating and wide-ranging paper he opines, among many other ideas, that authenticated encryption is one of “the most useful abstraction boundaries”.

Hmmm. I think there may be some limits to absorbing the opinion of a renowned cryptographer without some thought.

Talking generally, we are still blind-sided by the collapse of the MD5/SHA1 reputation at the hands of new thinking. In all my reading and thinking about hashes over 15 years, it was only a week ago that I unravelled the difference between a perfect message digest concept and the Merkle-Damgård construction. Doh!

Let me put some more foundation on this complaint. In my reading of the new-gen authenticated encryption modes, I have experienced a strong feeling that it is less about cryptography and more about software engineering. Yet the cryptographers seem to be doing that work without realising that crypto modes are as much software engineering as crypto, and more so the more complicated they become.

You sell yourself short - as a renowned software engineer, who crosses over to crypto as and when needed, you have actually more command of the tools needed. And you've worked out what the answer is in this case - the cryptographer's push for AE mode is simply the creation of a more perfect hammer, when our real worries are about the building, not the nail.

So, here's what I wish I had been quick-witted enough to say to him when we met in 2010: authenticated encryption can't satisfy any of my use cases!

And elsewhere:

I remember this discussion on the zfs-crypto list! It led me to a very general crypto engineering question. It goes like this: suppose you want to ensure the integrity of a chunk of data. There are at least two ways to do this (excluding public key digital signatures):

1. the secret-oriented way: you make a MAC tag of the chunk (or equivalently you use Authenticated Encryption on it) using a secret key known to the good guy(s) and unknown to the attacker(s).

2. the verifier-oriented way: you make a secure hash of the chunk, and make the resulting hash value known to the good guy(s) in an authenticated way.
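
A minimal sketch of the two choices in Python's standard library, with invented example values. Option 1 needs a secret key withheld from the attacker; option 2 needs no secret at all, only an authenticated way to hand the hash value to whoever will verify.

    import hmac, hashlib, os

    chunk = b"some chunk of data"

    # 1. Secret-oriented: a MAC tag under a key known only to the good guys.
    secret_key = os.urandom(32)
    tag = hmac.new(secret_key, chunk, hashlib.sha256).digest()
    ok = hmac.compare_digest(tag, hmac.new(secret_key, chunk, hashlib.sha256).digest())

    # 2. Verifier-oriented: a plain secure hash, distributed to verifiers over
    #    an authenticated channel. Anyone holding the correct hash can check.
    published_hash = hashlib.sha256(chunk).hexdigest()
    ok = hashlib.sha256(chunk).hexdigest() == published_hash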

Right. And there are serious differences. Choice 1. is tactical - fast and ephemeral. Choices 2 and 0 are more time-robust and more sustainable. (you forgot to put a number on public key :)

Another way of saying this is that the authentication of the delivered packet can be done at a higher layer (a software engineering concept...) depending on the business requirements (another software engineering concept). The very different characteristics lend themselves to different problem-spaces.

...

*Can't be implemented with authenticated encryption!*

...

As far as I can tell, authenticated encryption cannot be used to implement these properties.

From some pov, a lot of people assume encryption, and then add to that ... climbing bottom-up in search of some nirvana in security.

Yet, your description of your application didn't even evidence the need for encryption. Choosing AE over E doesn't make sense until you have established what the E is about...

This skepticism is a good thing: a good test of a well-designed system is to operate it in plain text and still show that it works. This has subtleties away from simplistic Alice-Mallory-Bob contexts, and those subtleties are again the territory of software engineers.

Our designs are tested by breaking components and seeing if they still work in the face of local failure. Yet cryptographers laud counter mode over CBC, stripping out our software-oriented defences of redundancy and making us vulnerable again.
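
A companion to the CTR sketch above, again with the third-party "cryptography" package and invented values, shows what that redundancy defence amounts to: the same single-byte tamper against AES-CBC garbles an entire 16-byte block of the recovered plaintext, so naive tampering is at least likely to be noticed, whereas CTR passes the attacker's edit through cleanly. (Neither is a substitute for real authentication.)

    # Same attack as the CTR sketch, but against AES-CBC.
    # Requires the third-party "cryptography" package.
    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key, iv = os.urandom(32), os.urandom(16)
    plaintext = b"PAY $0100 TO ALICE, REF 20120428"   # 32 bytes: two full AES blocks

    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    ct = bytearray(enc.update(plaintext) + enc.finalize())
    ct[5] ^= ord('0') ^ ord('9')                      # flip the same byte as before

    dec = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
    print(dec.update(bytes(ct)) + dec.finalize())
    # The first 16 bytes come out as pseudo-random garbage and byte 5 of the
    # second block is flipped: the damage is obvious, unlike in CTR mode.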

*What does this imply for other users of cryptography?*

I'm not sure if the Tahoe-LAFS design is sort of the odd duck, and all the rest of the world should go ahead and upgrade from unauthenticated encryption to authenticated encryption, or if this mismatch is a warning sign. Maybe authenticated encryption isn't the most useful abstraction boundary after all.

The latter. I can think of several applications where authenticated encryption adds far less than thought, and carries the danger that people will over-state its value, consequently making bad decisions elsewhere. Payments, contracts. OTR :)

Maybe we should have a conversation about which abstractions benefit our users. I think it helps to work “top-down”, from use cases (e.g. one-to-one chat, group chat, file-sharing, web hosting, live file-editing collaboration, streaming video, voice, etc.) to desired semantics, and then to the security properties of protocols. So far I think the enthusiasm for authenticated encryption has been somewhat “bottom-up”—after we all witnessed the repeated mistake of relying on encryption for authentication, we invented a way to prevent that, and then started thinking that we should apply the solution to all uses.

Yes. I wondered about that too. I think it is just another case of "I have a hammer, let's assert that the problem is lack of better nails."

If we look at that old saw of SSL and secure browsing, and the gmail attacks and so forth, that is a failure of authentication right there. AE won't address the failures we have, but AE will make its way into SSL. SSL will never be fixed to clear out the real authentication issues (like making client certs compulsory so that passwords are no longer an issue).

The hammer is being assumed and is being used, even as they steal the building away from us before our very eyes.

[*] Except they get to learn the length of it, depending on your padding. And of course they get to learn from where and to where it was transmitted, and when, depending on how your comms work. That's called "traffic analysis" and it is often the most valuable information to the attacker anyway.

Yes. Let's talk about traffic analysis - how are we going to mitigate traffic analysis? That's much more interesting.

iang

_______________________________________________

cryptography mailing list
cryptography[at]randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Date: Sat, 28 Apr 2012 16:31:10 +1000
From: "James A. Donald" <jamesd[at]echeque.com>
To: cryptography[at]randombit.net
Subject: Re: [cryptography]

On 2012-04-28 4:05 PM, ianG wrote:

the cryptographer's push for AE mode is simply the creation of a more perfect hammer, when our real worries are about the building, not the nail.

Well said. Cryptographers have a habit of building a fortress with three entirely impregnable walls and one picket fence with a permanently open gate in it.

Yes. Let's talk about traffic analysis - how are we going to mitigate traffic analysis? That's much more interesting.

Assume everything is encrypted. Then stuff that is not time-urgent (documents, and whatever replaces email) will usually go to some central server farm and then out again on demand. If everyone sends in their edits and messages encrypted, so that only the server sees the addressees, then traffic analysis tells you that Ann and Carol are using server X, but not that Ann is using server X to communicate with Carol.

Interactive voice and video calls must necessarily go by the most direct possible path from one network address to another, thus traffic analysis will reveal that one network address is communicating with another network address. This seems unavoidable.