11 March 1999


Date: Tue, 9 Mar 1999 17:07:27 GMT
From: Charles Lindsey <chl@clw.cs.man.ac.uk>
To: UKcrypto@maillist.ox.ac.uk
Subject: Comments on the DTI document


The DTI proposals are clearly a considerable improvement on anything we
have seen before. More importantly, it is clear that their understanding
of cryptographic issues has advanced considerably since those earlier
proposals (I think
this list can take some of the credit for their education), to the
extent that they now seem to be asking the right questions, and even
giving many of the correct answers.

Nevertheless, the details still need to be examined carefully, and there
are some suspicious examples of double-speak which might leave room for
things to change in the wrong direction come the legislation. So here
are my own comments, some of which may form part of my eventual
submission to them.


1. Key Generation.
------------------

This is, I think, the biggest flaw in the present document. In #38,
under "Illustrative Examples Of Cryptography Services":

	Key Generation: Key generation is a critical part of the process
	whereby the key pair (both public and private) is generated for
	the issue of a certificate.

So they clearly recognise the criticality of key generation, yet they
still imagine that it will be the normal situation for CAs and other
TSPs to be generating keys! The mind boggles! And yet they truly seem to
believe this.  Under Annex A(II):

	Generation of Key Pair: A provider will need to submit details
	of how an asymmetric key-pair is generated and, if appropriate,
	delivered to the client. If the key-pair is to be
	client-generated then details of the process used will need to be
	supplied.
....
	An applicant must provide details of the mechanism it uses to
	ensure that the private signature key is only known to the
	client on issue.

And this is part of an attempt to ensure that CAs follow some sort of
'best current practice'? See also #44.

Let us be clear what 'best practice' is.

Key Pairs (especially for signature) MUST be generated by the client,
preferably on his own equipment (or on board his recently-purchased
smart card), and ideally on his own laptop in the middle of a large
field away from all possibility of Tempest surveillance. Whilst one
MIGHT allow him to use a program provided by the CA for the purpose, the
algorithm used MUST be in the Public Domain, and he MUST have the choice
of using a program obtained from elsewhere. Indeed, the normal situation
should be that he generates his key BEFORE bringing it to the CA.
Moreover, he might submit the same key to more than one CA, and when the
certificate expires he might take it to a different CA for renewal
(perhaps for a smaller fee). Moreover, it MUST be impossible for any Bad
Guy that sees the public key to be able to tell by whom or with what
program it was generated (for if you know what program generated it, you
perhaps know what Trojan Horses were embedded in that program).
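
To make the point concrete, here is a minimal sketch of what
client-side generation looks like (Python, using the modern
`cryptography' package purely as an illustration of the principle;
the library and the parameter choices are my own assumptions, not
anything the DTI document prescribes):

    # Illustrative only: client-side key generation with a published,
    # publicly scrutinised algorithm (RSA), drawing randomness from
    # the operating system's own CSPRNG.
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import rsa

    # The pair is generated on the client's own machine; the CA never
    # sees this object.
    private_key = rsa.generate_private_key(public_exponent=65537,
                                           key_size=2048)

    # Only the public half is ever submitted to a CA (or to several
    # CAs, or to a different CA at renewal time).
    public_pem = private_key.public_key().public_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    print(public_pem.decode())

The point is that nothing in that transaction requires the CA to
touch, or even to know anything about, the private half.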

If a CA should offer to generate a key and deliver it to you, then that
MUST be grounds for immediate withdrawal of that CA's licence.

There is the related problem of ensuring that the algorithm used for
generating the key is above suspicion. This comes in two parts: using an
adequate means of generating random numbers, and ensuring that it
contains no Trojans. Annex A(II) seems to recognise this:

	Technical Assurance: A provider will be expected to provide
	evidence, if appropriate, that the systems used for generating
	key pairs and storing its own private key has been independently
	assessed (e.g.  through the issue of an ITSEC or CC Certificate)
	in terms of security assurance.

(well, maybe that is aimed at the CA generating its own keys, which
certainly must be beyond reproach). And then, further on:

	User signature generation products: Although not a condition of
	the licence, a client of a licensed Certification Authority will
	be required to use an "approved" signature generation product to
	take advantage of full legal recognition (see page 12). Licensed
	Certification Authorities will be required to supply information
	on products which have been "approved".

That "approved" is worrying. Who is to give "approval". Footnote 13 on
p12 makes vague reference to "discussions on the EU Electronic
Signatures Directive". This needs clarification. To my taste, the best
form of approval is that given by public scrutiny of published
algorithms, but that is not much help in ensuring that a smart card has
no Trojans within it. But, IMHO, I would rather have a signature
generated independently by a less-than-perfect program than one
generated on my behalf by a CA using the pig in his poke.

I have made a big issue of this. It IS a big issue. The present document
is not "technology neutral" in this regard. Indeed, it positively
fosters a regime where the Government could actually control
cryptography by misusing the licensing and approval mechanisms. I am
sure this is not the Government's intent, but that is not enough. It
should be seen not to be the Government's intent.


2. Legal recognition of signatures.
-----------------------------------

I only spotted this one when composing the previous section. Look at
#20.

	The intention of this is to give parties relying on an
	electronic signature, backed by a certificate from a licensed
	Certification Authority and using an "approved" signature
	creation device, a high degree of confidence that the signature
	is what it claims to be.

So far so good:

	Such a degree of legal recognition will also apply where a
	"qualified certificate" is used to back up an electronic
	signature created by an "approved" signature creation device.

And there's the "gotcha"! A qualified certificate (from an unlicensed
CA, perhaps) is OK, if the Court believes it, but it still must be
signed by 'an "approved" signature creation device'. Actually, this
looks like a problem within the EU Electronic Signatures Directive,
rather than the DTI document.

Who cares how the document was signed? The important question is whether
it was verified by "an approved signature verification device". There is
no way that a document could be forged so as to fool such a verifier,
whether by accident or by using an unapproved signature device. The
only way to produce a forgery is to use too short a key, or a key
_generated_ by an unapproved device (complete with Trojan), or to have
possession of the private key.
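
To see why, consider what a verifier actually does. A sketch (again
Python with the `cryptography' package, assumed purely for
illustration):

    # The verifier needs only the public key, the message and the
    # signature; HOW the signature was produced is irrelevant to it.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    message = b"I agree to pay 100 pounds."
    signature = key.sign(message, padding.PKCS1v15(), hashes.SHA256())

    try:
        # Raises InvalidSignature unless the mathematics works out
        # for exactly this message and this public key.
        key.public_key().verify(signature, message,
                                padding.PKCS1v15(), hashes.SHA256())
        print("signature verifies")
    except InvalidSignature:
        print("rejected")

No property of the signing software enters into the mathematics; only
possession of the private key (or a weak or Trojaned key) can make
the verification succeed.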



3. Double-Speak.
----------------

Nice bit in #36:

	Businesses are increasingly recognising the importance of being
	able to recover critical data, which staff may have encrypted,
	or the text of messages ... Providers of confidentiality
	services are, therefore, encouraged to make the recovery of keys
	(or other information protecting the secrecy of the information)
	possible through suitable storage arrangements ...

Followed by #37:

	The widespread deployment of such technologies would also help
	law enforcement, by allowing law enforcement agencies to recover
	encryption keys under strictly regulated procedures.

Talk about putting the cart before the horse :-) .

But the worst is in Footnote 17:

	An alternative to the use of Trusted Third Parties (TTPs) and
	key escrow is the use of encryption products which support key
	recovery, also known as key encapsulation, (confusingly key
	recovery can be used as a generic term to cover both key storage
	or "key escrow", and key encapsulation; we only use it in the
	narrower sense in this document).  Such commercial encryption
	products, which are already being used in the US, can
	incorporate the public key of an agent (usually a company) known
	as a Key Recovery Agent (KRA). This allows the user of such
	products to recover their (stored or communicated) data by
	approaching the KRA with an encrypted portion of the message.
	Lawful access to the keys (which are likely to be different for
	each message) can also be granted if a written authorisation is
	served on the KRA. In both cases, namely access by the user or
	by law enforcement, the KRA neither holds the user's private
	keys, nor has access to the plaintext of their data.

As they say (and confirm in the glossary definition) 'confusingly key
recovery can be used as a generic term to cover both key storage or "key
escrow", and key encapsulation"'. And who is doing the confusing? Why
the DTI of course, by taking a term with an admittedly ambivalent
meaning and redefining it to suit their own convenience. The proper term
for what they describe is "plaintext recovery" or, to use the term most
often applied, "GAK". But, of course, they couldn't say that, could
they?

And even what they are trying to say is wrong. If the Company wants to
recover its encrypted files after an employee has left, why does it need
the extra key to be held by an external KRA? Why not hold it internally?
There is a feature in PGP 5.5 to do this job (much maligned by those who
wish to misunderstand it) but it does not involve any external KRA. Does
anyone know of any products that actually operate in the manner
described?
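
For what it is worth, the mechanism is easy enough to sketch without
any external KRA at all (Python with the `cryptography' package; the
hybrid scheme and all the names below are my own illustration, not
PGP's actual wire format):

    # A per-message session key encrypts the data; that session key
    # is then wrapped to BOTH the recipient and a key the company
    # holds internally. No outside agency is needed.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    recipient = rsa.generate_private_key(public_exponent=65537,
                                         key_size=2048)
    company = rsa.generate_private_key(public_exponent=65537,
                                       key_size=2048)

    session_key, nonce = os.urandom(32), os.urandom(12)
    body = AESGCM(session_key).encrypt(nonce, b"quarterly figures ...",
                                       None)
    for_recipient = recipient.public_key().encrypt(session_key, oaep)
    for_company = company.public_key().encrypt(session_key, oaep)

    # After the employee leaves, the company recovers its own data
    # with the key it already holds; nobody else is involved.
    recovered = company.decrypt(for_company, oaep)
    plaintext = AESGCM(recovered).decrypt(nonce, body, None)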


4. Licensing.
-------------

The decision to use OFTEL as the licensing agency does not fill me with
overwhelming confidence :-( . But who else could they have chosen? The
Data Protection Registrar, perhaps? There really is no existing
Government Body with the necessary expertise (well, there is, but we
don't talk about that one, do we?).

So the real question is "How is OFTEL to be educated in the necessary
skills and knowledge to fulfil its task"? I am not aware that they
currently employ any professional cryptographers, but clearly they will
need to do so.

There seems to be some intention to involve industry. That is probably
good, but it needs to be spelled out in more detail. Essentially, I
think what we want is a Licensing Board, composed of Government and
Industry representatives - a Quango that is seen not to be totally under
Government control. It must also be made clear that its duty is towards
the users of cryptography, not towards those who supply it. In other
words, it should issue licences and approve products purely on their
technical merit, and without regard to where they came from (as we know,
the best cryptographic products actually arise from cottage industries).
The acid test would be whether it would be possible for GPG to obtain
approval as a signing tool.

In fact, a specific exemption from approval fees for products to be put
in the Public Domain would be no bad thing.


5. Export controls.
-------------------

Look at #47. They let the horns show through their cap for an instant
:-( .


6. Law enforcement.
-------------------

Well, their examples are bogus (or rather inapplicable), but there is
indeed a problem to be faced. The idea seems to be that anyone who holds
a key can be required, by warrant, to provide assistance to the LEAs.
Fair enough; I think that is accepted, but it is the details that have
to be looked at. Look at #58:

	This means providing a power for lawful access to encryption
	keys.

But what they actually want is access to plaintext. The encryption key
is just one way of obtaining it. What they need to distinguish here is
access to
	a) Private keys
	b) Session keys
	c) Plaintext
Each of which may be appropriate in some circumstances. But the total
lack of all mention of "session keys" is a notable omission from the
document. Look at #63:

	The Government believes it is necessary to establish a new power
	to allow the police to require disclosure of encryption keys ...

But what they should actually want is whichever of the three is
appropriate to the circumstance. Look at #69:

	The written notice will specify the keys or material required to
	be disclosed. It will be for an authorised officer to decide,
	given the circumstances of a particular case, whether to order
	the production of specified material in a comprehensible form
	(e.g. the plaintext of a document) or order disclosure of the
	relevant confidentiality key.

And there it is! It is the "officer" who decides! That is not good
enough. To require delivery of a Private Key, from a possibly innocent
party, is going to do untold damage to that party (or to those on behalf
of whom that key is held) because it can enable access to communications
way beyond those covered by the Warrant. Yes, there are the usual
incantations against unlawful usage of the key by the authorities, and
about destruction of the key when the investigations are complete, but
suppose you are a large, respectable, (and innocent) Bank. Suppose a
private key has been seized that covers a substantial amount of your
incoming traffic. Would you trust such assurances?  Well, would you?

No! The party on whom the warrant is served MUST be given the option of
providing the private key OR of providing session keys generated with
it. If this means that the party (the Bank) has to set up a service to
provide session keys in return for the headers of encrypted
communications within milliseconds, then so be it.

Again, the party MUST be allowed the option of providing plaintext to
the Officer. This may not be so simple, because perhaps the Officer does
not want the party to see the plaintext, and moreover the party would
have to convince the Officer that the plaintext he delivered did indeed
derive from the cryptotext provided; and that really involves disclosure
of the session key.  So I think that, in either case, it is likely to be
the session key that should normally be handed over, unless the
_party_concerned_ elects otherwise.
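
To illustrate what handing over a session key amounts to (same
illustrative hybrid scheme as before; Python with the `cryptography'
package, all names my own assumption):

    # The warrant covers one message. The Bank unwraps that message's
    # session key with its private key and discloses ONLY the session
    # key; the private key itself never leaves the building.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    bank = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # One incoming communication: fresh session key, wrapped to the
    # Bank's public key.
    session_key, nonce = os.urandom(32), os.urandom(12)
    header = bank.public_key().encrypt(session_key, oaep)
    body = AESGCM(session_key).encrypt(nonce, b"transfer 5000 ...", None)

    # Under warrant: unwrap the header and hand over this one key.
    disclosed = bank.decrypt(header, oaep)

    # The Officer can now read this message, and this message only.
    print(AESGCM(disclosed).decrypt(nonce, body, None))

Every other message wrapped to the same public key stays out of
reach, which is precisely why the session key is the proportionate
thing to disclose.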

Now look at #80:

	The proposals described above rest on the assumption that, by
	serving an appropriately authorised written notice, it will be
	possible to decrypt communications and stored data. This will
	normally be true where the notice is served on the individual in
	control of the encryption process.
	               ^^^^^^^^^^

Is this a typo, or have the DTI not YET realised that you have to serve
the warrant on the man who can _decrypt_ the communication?


7. Warrants.
------------

#88 speaks of "timeliness" of the decryption process. Elsewhere, the
document speaks of the need for the restrictions to be "proportionate".
These issues are related. If you are following a money-laundering
operation, then milliseconds are important. If terrorists are planting a
bomb, then you are concerned about minutes. To catch drug runners, you
need your information within hours, and when tracking paedophiles, it
does not really matter if you have to wait days.

So a warrant SHOULD indicate the urgency appropriate to the particular
case, and not demand faster or more immediate compliance than is
justified by the circumstances.

The regulation must also take account of the fact that, even where an
important key is escrowed with TTPs, it is likely that it will have been
split into, say, four parts such that any three of them can be used to
reconstruct the key, and the four parts deposited with four different
TTPs. This is "current best practice" and so must be regarded as normal
(remember, these proposals are "technology neutral"). Indeed, a TSP who
advises any practice other than this in the case of an important key of
a large corporation deserves to have his licence revoked, anyway.
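
The arithmetic behind such a 3-out-of-4 split is Shamir's secret
sharing; here is a toy sketch (Python, my own illustration; a real
TTP would of course use a vetted implementation):

    # Split a secret into n shares so that any k reconstruct it and
    # fewer than k reveal nothing: points on a random polynomial of
    # degree k-1 over a prime field, with the secret as the constant
    # term.
    import secrets

    PRIME = 2**127 - 1  # a prime comfortably larger than the secret

    def split(secret, n=4, k=3):
        coeffs = [secret] + [secrets.randbelow(PRIME)
                             for _ in range(k - 1)]
        return [(x, sum(c * pow(x, i, PRIME)
                        for i, c in enumerate(coeffs)) % PRIME)
                for x in range(1, n + 1)]

    def reconstruct(shares):
        # Lagrange interpolation at x = 0 recovers the constant term.
        total = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * -xj % PRIME
                    den = den * (xi - xj) % PRIME
            total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
        return total

    master_key = secrets.randbelow(PRIME)  # the corporation's key
    shares = split(master_key)             # one share per TTP
    assert reconstruct(shares[:3]) == master_key  # any three suffice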

It should be possible for the four TTPs involved to be able to set up
means for producing session keys in a timely manner when served with
proper warrants, but do not expect them to be able to set up such a
facility in a short time, certainly not at 2 a.m. on a Sunday morning
when at least two Directors of each TTP involved would have to be called
to authorise the setup (any TTP who permitted operational staff to act
upon such warrants would not be worthy of licensing in any case).

I note that several places (e.g. #67 and Annex A(III)) explicitly use
the word "written" in connection with warrants. This is good (assuming
it means what it says). But there is no mention of how quickly such
warrants would have to be executed (recall the "instantaneous" execution
of electronic warrants envisaged by the previous proposals).

I believe that this should be no faster than that required under the
present IOCA, or no faster than 12 hours, whichever is the greater, the
reason being that there MUST be time to go before a Judge in Chambers to
attempt to show that the warrant is unlawful.

And finally, please can someone explain to me how the LEAs are supposed
to know who (or which TTP(s)) holds a particular private key that they
might happen to be interested in.


8. Alternatives to escrow.
--------------------------

In #87 we find:

	A proportion of electronic commerce activity will occur between
	individuals and corporate organisations. Where this involves
	serious criminal activity, law enforcement agencies may need
	access to communications between serious criminals and corporate
	organisations (e.g. internet shopping, purchasing airline
	tickets etc.). However, law enforcement agencies would be
	                                                 ^^^^^
	content to seek the co-operation of the legitimate corporate
	organisation where this is necessary.

Well, I would rather see "should" in place of that "would". The
corporate organisations are likely to be the innocent parties in this.
Moreover, they are going to have to be relied upon not to "tip off".

However, there is another possibility available in the case where the
corporate organisation is a Bank, which will be the common case. There
is already a power under some Act (I don't know which, but it is
certainly not PACE or the IOCA) for details of bank accounts and
transactions within them to be inspected by LEAs, under suitable
warrants. This power could easily be extended to the provision of
instant electronic access to all transactions in designated accounts,
under an appropriate warrant.

This would actually provide the LEAs with _more_ than they would get by
decrypting specific communications, because they would automatically get
to see ALL transactions to that account (like being able to listen to
both sides of a conversation) without the necessity of having to tap any
communications, or even of having to work out which communications would
need to be tapped.

And it would be better from the Banks' point of view (albeit expensive
to set up) because it would save them the embarrassment of having their
valuable private keys disclosed.


9. Content of certificates.
---------------------------

This proposal is supposed to be technologically neutral. Therefore it
cannot dictate any particular format for certificates, etc. That is a
job for industry and the standardization bodies.

Now we have, in Annex A(II):

	* a unique certificate number;

Would a hash of the certificate suffice?
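
(For illustration: a hash over the certificate's encoded bytes is
unique for all practical purposes and needs no central numbering
authority; the bytes below are a stand-in, not a real certificate.)

    # Sketch: use the digest of the encoded certificate as its
    # "unique number", much as PGP uses key fingerprints.
    import hashlib

    der_bytes = b"...the certificate's encoded bytes, stand-in only..."
    print(hashlib.sha1(der_bytes).hexdigest())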

	* an unambiguous statement that the certificate must not be used
	to validate a key being used to secure the confidentiality of
	information;

I have no objection to that in principle (there are obvious ways of
getting around it, so it is really just a political statement). But it
might not be technologically possible within a given encryption system.
Therefore, it needs the addition of:

	"or alternatively, the key certified must be self-evidently a
	signature-only key."

That would then be suitable for Open-PGP, where that information is
carried in the key rather than in the signature. Note also that, if
asked to sign an Open-PGP key that included both a (primary) signature
key and a (secondary) encryption key - this is the normal PGP case -
then the CA could return a certificate computed over only the primary
portion of the key.


10. Liability.
--------------

#44 raises the question of the liability of CAs if they should
inadvertently grant certificates to the Bad Guys. The issues seem to be
well covered, so I would just wish to emphasise that the primary
liability of the CA is to the _third_party_ who relies on the
certificate, even in the case where he has paid no consideration for it.
Since this is a departure from the normal law of contract, it will
indeed require special legislation.


11. Spamming.
-------------

The section on spam (#28) is an interesting (though important) red
herring.  May I suggest that we submit comments upon that section
separately from whatever comments we submit regarding encryption.


12. And finally.
----------------

Observe Footnote 3:

	For the purposes of this document, providers of cryptography
	services are described as Trust Service Providers.  A TSP may
	provide one, or more, cryptography services including acting as:
	a Certification Authority (see footnote 4), a Trusted Third
	Party (providing confidentiality services) or a Key Recovery
	Agent (see footnote 17).

Observe that "one or more". That is the big concession that has been
made in the last few months. Watch it carefully to make sure that it
remains.


Charles H. Lindsey ---------At Home, doing my own thing------------------------
Email:     chl@clw.cs.man.ac.uk  Web:   http://www.cs.man.ac.uk/~chl
Voice/Fax: +44 161 437 4506      Snail: 5 Clerewood Ave, CHEADLE, SK8 3JU, U.K.
PGP: 2C15F1A9     Fingerprint: 73 6D C2 51 93 A0 01 E7  65 E8 64 7E 14 A4 AB A5


To: ukcrypto@maillist.ox.ac.uk
Subject: PKI and law enforcement
Date: Wed, 10 Mar 1999 11:10:46 +0000
From: Ross Anderson <Ross.Anderson@cl.cam.ac.uk>

So the policy is supposed to build consumer confidence in e-commerce
and help law enforcement. Two comments:

(1) It certainly offers a big carrot to commerce - particularly to
banks and merchants - in that it will limit their liability. At
present, if I use a credit card to buy something on the net and get
ripped off - whether the goods are shoddy or extra transactions get
billed, it doesn't matter - then I have a claim against the card
issuer under the Consumer Credit Act, and the issuer can recover
costs and damages from the merchant.

Under the proposed arrangements, if approved digital signatures are
used to make such transactions, then signatures made with my key will
be presumed valid even if I didn't make them, and the bank has its
liability limited. So bankers win and consumers lose. Stand by for a
dirty great rush to deploy SET in place of SSL :-)

(2) I fear the proposals will actually harm law enforcement. Secure
communication is often easier if at least one of the principals has a
certificate. At present only banks and merchants have them; so they
can communicate securely with individuals using SSL, but for
individuals to talk to each other they need to fiddle around with PGP
or whatever, which is too much hassle for most. (I am ignoring
embedded apps where the customer keys can't sign arbitrary content.)

However, if the DTI proposals succeed, we will end up with a public
key infrastructure of certified keys which are held by users and
which can sign arbitrary content. This is not actually needed for
e-commerce; we could happily muddle on using credit cards and SSL.
But the DTI will have removed the main brake on deploying secure user
to user communications. Sure, someone will have to actually write the
`unlicensed' software that bootstraps off the PKI, but there will be
no shortage of willing coders!

The intelligence community now considers Clipper to have been a bad
mistake - had the NSA not panicked at the AT&T crypto phone and
persuaded them to incorporate it, the 3-DES version of the phone
would have sold a few hundred units and then been abandoned. Crypto
would still be an obscure subject of interest to a couple of hundred
mathematicians. I predict that PKIs will be the same, and if I were
Director GCHQ I would be doing everything in my power to see to it
that only responsible companies got public key certificates.

The spooks' underlying problem is that they have invested enormous
amounts of time and effort persuading the pols that `something must
be done' about cryptology. However, all the practical proposals they
have come up with just look set to make the problem worse.

Ross
Date: Wed, 10 Mar 1999 15:24 +0000 (GMT Standard Time)
From: hcorn@cix.co.uk (Peter Sommer)
Subject: duncan campbell@csrc/lse: March 16
To: ukcrypto@maillist.ox.ac.uk

Computer Security Research Centre Public Security Colloquia

16 March 1999
Tuesdays, LSE Clement Building Room D602, 1700-1900hrs

Global information surveillance: Intelligence and law enforcement
planning and capabilities

Duncan Campbell

Duncan Campbell will report on and discuss his current work for the
European Parliament on such systems as Echelon and such proposed
legislation / mutual assistance arrangements as Enfopol and the US
Communications Assistance to Law Enforcement Act. On the back of
e-commerce, the world's largest intelligence-gathering and law
enforcement agencies have increased their demands for access to
real-time surveillance facilities. What is possible now, what might
be possible in the future, what justification can be made, and what
controls on this activity should we be seeking?

Members of the public may attend free of charge but must pre-register
by e-mail to csrc@lse.ac.uk. As this is work in progress, the meeting
will take place under Chatham House Rules; there will be a free
exchange of ideas but the meeting cannot for the moment be reported
in the press.

The LSE Clement Building is on the Aldwych, London WC2, between the
Law Courts and the south end of Kingsway; there is a large hanging
white sign on the outside.

Future Programme details: http://csrc.lse.ac.uk/Colloquia/colloquia1.htm
Enquiries: 0171 955 6197 (voice-mail service)

This year's colloquium series is made possible through the kind
assistance of Deutsche Bank

|->   Peter Sommer   --------------------------------------------->|
|->   hcorn@cix.co.uk   P.M.Sommer@lse.ac.uk  -------------------->|
|->   Academic URL:  http://csrc.lse.ac.uk/Sommer/sommer.htm  ---->|
|->   Commercial URL:  http://www.virtualcity.co.uk  ------------->|

[See also Duncan Campbell's Echelon reporting:
http://jya.com/echelon-dc.htm]