29 April 1998


Date: Wed, 29 Apr 1998 05:55:09 +0200 (MET DST)
From: Anonymous <nobody@REPLAY.COM>
Subject: Anonymity, FC '98, and escrowed paper shredders
To: cypherpunks@toad.com

Avi Rubin writes:

>    http://www.itd.nrl.navy.mil/ITD/5540/ieee/cipher/

Most interesting from a cypherpunk point of view.

"Matt Blaze expressed an interesting analogy in describing a paper
shredder that created a digital copy of a document and sent it off to
a central database. When a document was accidentally shredded, the user
could contact the database and have a copy faxed."

Mentions of anonymity include an anonymous poster to cypherpunks, an
anonymizer inside the FBI's intrusion detection program, an AAAS proposal
to use remailers to defend human rights, toll collection and mass
surveillance in New Jersey, trustees and anonymity revocation in
payment schemes, policy issues for anonymizing services, and discussion
of onion routing at a spookish security conference.

Finally, good manners or not, here's Paul Syverson's must-read item on
FC '98 for those of us who couldn't go:

----------------------------------------------------------------------

The second annual Financial Cryptography Conference (FC98) was held in
Anguilla in the British West Indies on February 23--26, 1998.  The
conference was a rousing success. Attendance was up, with over 100
participants from business, academia, and government with interests
in cryptology, computer security, and/or the financial industries.  A
governing body for the conference, the International Financial
Cryptography Association, was introduced and held its first meeting,
electing a board consisting of Vince Cate, Bob Hettinga, Ray
Hirschfeld, Lucky Green, and Ron Rivest.

The presentations were interesting and well attended, no mean feat
considering the Caribbean diversions that surrounded the participants.
The quality was probably best summed up by David Chaum who remarked on
the last day, ``I can't remember the last time I sat through an entire
session much less a whole conference, but I came to every paper here.''

The following description will focus on the official program. This
means that it will deal almost entirely with presentations by
cryptology and computer security researchers. Unlike last year, there
were no papers presented by members of the financial community or
policy experts. Those contributions occurred entirely in presentations
and panels that were not part of the official program.  This was
unfortunate. Given the available distractions, these unofficial
sessions were much less well attended. The ones I did attend were very
instructive in understanding the financial side of financial
cryptography. Had they been part of the official program, there might
have been even more of a dialogue between the two sides that give the
conference its name. This is not to say that interaction was minimal;
far from it. But the official dialogue was a bit one-sided. (A much more
off-program description of the conference can be found at
http://www.live.co.uk/ftvfr398.htm )

The conference opened with welcoming remarks from the chairs and from
Victor Banks, the finance minister of Anguilla. He noted that Anguilla
was well suited as the site of the conference, observing that it may
have more web pages per capita than anywhere else in the world. He also
noted that revolutions, particularly bloodless revolutions, do well in
Anguilla. And, as with Anguilla's own revolution in the late 1960s, he held
high hopes for the revolution in electronic commerce, at the forefront
of which one can find this conference.

The first session began with a paper on ``Micropayments via Efficient
Coin-Flipping'' by Richard Lipton and Rafail Ostrovsky.  The goal is
to minimize the number of rounds, the number of bits sent, hardware
requirements, fraud, and computational requirements.  In this
scheme a coin-flip protocol is performed on the links of preprocessed
hash chains formed independently at the vendor and the customer. Coin
flips derived from the chain values only infrequently indicate
a payment. The bank participates only when a payment is required. This
is somewhat similar to Rivest's ``Electronic Lottery Tickets as
Micropayments'', which was presented at last year's rump session and was
published in the final proceedings (which are now available from
Springer). However, as Ostrovsky later explained at the rump session,
the two are not the same. One difference is that, roughly speaking,
Rivest's scheme backloads the winning result onto the lottery protocol,
while the Lipton-Ostrovsky scheme frontloads the winning result.
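
To give a rough feel for the hash-chain-plus-coin-flip idea, here is a
toy sketch in Python (an illustration of the general shape only, not
the actual Lipton-Ostrovsky protocol; the payment odds, chain length,
and combining function are all made up):

  import hashlib

  PAY_ODDS = 100  # on average 1 transaction in 100 becomes a real payment

  def make_chain(seed: bytes, length: int) -> list:
      # Precompute a hash chain.  In a real scheme the tip would be
      # committed to up front so revealed links can be verified.
      chain = [hashlib.sha256(seed).digest()]
      for _ in range(length - 1):
          chain.append(hashlib.sha256(chain[-1]).digest())
      return chain

  def coin_flip(customer_link: bytes, vendor_link: bytes) -> bool:
      # Combine the two revealed links into an unpredictable coin flip.
      # True means "this transaction is a payment; involve the bank."
      digest = hashlib.sha256(customer_link + vendor_link).digest()
      return int.from_bytes(digest, "big") % PAY_ODDS == 0

  # Both sides preprocess chains, then walk them link by link.
  customer = make_chain(b"customer secret", 1000)
  vendor = make_chain(b"vendor secret", 1000)
  payments = sum(coin_flip(c, v) for c, v in zip(customer, vendor))
  print(payments, "bank-settled payments out of 1000 micro-transactions")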

The next paper was ``X-Cash: Executable Digital Cash'' by Markus
Jakobsson and Ari Juels. The basic idea is to have applets carrying
cash that they can spend under appropriate conditions. The contribution
of the paper was to show how to do this in such a way that the applet
cannot easily be pickpocketed by an attacker or hostile host.

The first session ended with ``Distributed Trustees and Revocability:
A Framework for Internet Payment'' by David M'Raihi and David
Pointcheval.  One goal is to relax constraints on the usual trust model
and reduce the trust assumptions of previous work. One may adopt different
approaches to the use of trustees: a trustee in every transaction, a
trustee just at account opening, or a trustee only in
anonymity revocation. The paper combines the last two of these. It is
based on the use of smartcards with user pseudonyms. The paper also
makes use of a threshold approach to anonymity revocation so that
honest users get assurance of privacy against a small number of
compromised trustees.
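
The threshold idea can be illustrated with plain Shamir secret sharing
(my own toy sketch, not the M'Raihi-Pointcheval construction): the
anonymity-revocation key is split among n trustees so that any k of
them can reconstruct it, while fewer than k learn nothing.

  import random

  PRIME = 2**127 - 1  # a Mersenne prime, large enough for a toy secret

  def make_shares(secret: int, k: int, n: int):
      # Split `secret` so that any k of the n shares reconstruct it.
      coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
      def poly(x):
          return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
      return [(x, poly(x)) for x in range(1, n + 1)]

  def reconstruct(shares):
      # Lagrange interpolation at x = 0 recovers the secret.
      secret = 0
      for i, (xi, yi) in enumerate(shares):
          num, den = 1, 1
          for j, (xj, _) in enumerate(shares):
              if i != j:
                  num = num * (-xj) % PRIME
                  den = den * (xi - xj) % PRIME
          secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
      return secret

  shares = make_shares(123456789, k=3, n=5)   # 3-of-5 trustees
  assert reconstruct(shares[:3]) == 123456789
  assert reconstruct(shares[2:]) == 123456789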

David Maher presented ``A Platform for Privately Defined Currencies,
Loyalty Credits, and Play Money''. This was also a smartcard scheme.
But, the idea is to have a fairly generic smartcard on which a number
of different private currencies could easily be maintained.  He
sketched a number of potential applications: vendor loyalty points,
corporate scrip, and monetary values for virtual environments like MUDs
and interactive games. The idea is to have the currencies be easily
defined and implemented as well as fungible with more ordinary
currencies. It seems like a very interesting idea, although some in the
audience questioned whether vendors would want to be bothered with the
infrastructure overhead.

``Assessment of Threats for Smart Card Based Electronic Cash''
was the next paper, by Kazuo J. Ezawa and Gregory Napiorkowski.

It prompted lots of detailed questions. As was noted by Ron Rivest
during questions, the threat model was someone trying to get money
out of Mondex by counterfeiting cards rather than, say,
a competitor trying to undermine confidence in the Mondex
system. This was acknowledged as the focus of the work.

The last paper of the day was ``Using a High-Performance, Programmable
Secure Coprocessor'' by Sean W. Smith, Elaine R. Palmer, and Steve Weingart.
The talk nicely outlined all the problems in developing, building,
deploying, and updating (the software on) secure coprocessors.

Gene Tsudik kicked off the Tuesday program talking about ``Secure Group
Barter: Multi-Party Fair Exchange with Semi-Trusted Neutral Parties'',
which he wrote with Matt Franklin.  The Franklin-Tsudik approach uses
unbalanced verifiable secret sharing to increase efficiency. They
reduce all types of multiparty exchange to single unit cyclic exchange.
In the multiparty case, principals will get what they want, but they
may not know from whom they get it.  The cyclic order is hidden by the
semi-trusted neutral party (STNP), which does not necessarily know
the size of the group.

The next paper was ``A Payment Scheme Using Vouchers'' by Ernest Foo
and  Colin Boyd.  The voucher approach uses the same payment principals
as other approaches: the customer, the bank, and the merchant.  The
main difference is that it reverses the usual payment cycle.

-bank and merchant create a voucher
-merchant sends the voucher to the customer (including the encrypted goods)
-customer sends the voucher with cash to the bank
-bank evaluates the voucher
-bank informs the merchant
-bank releases the voucher to the customer

Vouchers are made only when the merchant wants to offer a new product. Then
they sit on the ftp site and wait for customers.  Efficiency was
claimed over, e.g., NetBill and iKP. Also, there is no online processing
by the merchant.  Like NetBill, the goods are part of the protocol; not just
the cash is sent. One can have customer anonymity via anonymous ftp, but
not anonymity from the bank.  A detailed comparison was given of the
number of messages, symmetric encryptions, the location of computation,
signatures, etc.  It was noted that this scheme is not as efficient as
some of the micropayment schemes. Also, it goes against the usual
network thinking by placing load at the bank. But, it requires less
work by the merchant. A question was raised about static vs. dynamic
products; this scheme only allows static (predetermined) products.

The next paper was ``A Formal Specification of Requirements for Payment
Transactions in the SET Protocol'' by  Catherine Meadows and Paul
Syverson.  SET is the proposed industry standard for credit card
transactions on the Internet. This paper gave an overview of the
payment part of SET.  Requirements were given in NPATRL (the NRL
Protocol Analyzer Temporal Requirements Language) for analysis using
the NRL Protocol Analyzer.  Modifications and additions to NPATRL
needed to formalize requirements for SET were also described.

Markus Jakobsson presented a position paper written with Moti Yung
entitled ``On Assurance Structures for WWW Commerce''.  The motivating
question was, ``What is left to do to facilitate trade over the
Internet?'' The current environment was claimed to be characterized by
lawlessness, changing identities, and gang wars, where one must be
careful carrying cash, and there are no road signs. Basically, they
compared the World Wide Web with the wild and wooly west.  (Within this
western theme Markus described the good, the bad, and the ugly of
what is on the Web.) The main components of the needed infrastructure are
the access structure, for people to find the goods and services they
need, and the trust structure, to facilitate trust between customers and
merchants. Also needed are protections in other contexts. Anonymity,
freedom from profiling, prevention of access to information, and
[forced access to information], i.e., direct marketing, were all raised:
basically, the need for both individual and institutional rights.
Finally, they noted the need for a means of maintaining the structure
of assurances. They also considered the economic, legal, and other
impediments to providing these needs.

The next program element was a panel discussion on the Mechanics and
Meaning of Certificate Revocation, moderated by Barb Fox (BF).  Other
panelists were Joan Feigenbaum (JF), Paul Kocher (PK), Michael Myers (MM),
and Ron Rivest (RR).

BF began by characterizing revocation as the undoing of a persistent
signed statement.  The reasons could be either key compromise or some
sort of relationship binding failure, either a key to an identity or an
identity to a CA (certification authority).

The questions given to the panel were:  Can X.509 work? What are the
alternatives to CRLs? And, what about revocation across PKIs?  Other
questions were: Who owns a certificate?  Who pays for revocation? What
is the relationship between revocation and trust management? Finally,
should we wait for legal mechanisms?

MM noted that we can't solve all the problems today, but major
corporations want to use this today to manage their risk. There are
also nonrepudiation and other issues besides risk management. He noted
that a CRL can be good for many needs even if it is just a blacklist,
and CRLs are well positioned in architectures today. But, on the other
side, he noted their large size and the inability of the basic approach to
handle timeliness effectively. The alternative of short-lived
certificates takes advantage of existing mechanisms and is easy to
deploy within an enterprise. But they don't scale well; it must be
decided for how long they are valid. Thus, it is somewhat a case of
just moving the bandwidth elsewhere. He also mentioned pros and cons of
on-line and off-line approaches.

PK claimed that revocation is needed to make public key crypto automatic.
Solutions must consider security, scalability, performance,
memory (smartcards), bandwidth, auditability, practicality wrt
what is currently available, secure manageability, and simplicity
(e.g., should use standard crypto).

CRLs fail at least wrt reliability, scale, performance, memory, bandwidth,
and  practicality (applications don't know where to get CRLs from).
Valicert's approach is to use Certificate Revocation Trees and he
claimed that these meet all the requirements.
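
A certificate revocation tree is, roughly, a hash tree over the
revocation data, so a short authentication path proves a certificate's
status without shipping the whole list. Here is a toy Merkle
membership proof by way of illustration (not Valicert's actual data
structure; a real CRT is organized so that non-revocation can be
proved too):

  import hashlib

  def h(data: bytes) -> bytes:
      return hashlib.sha256(data).digest()

  def build_tree(leaves):
      # Return every level of a Merkle tree over the given leaves.
      levels = [[h(leaf) for leaf in leaves]]
      while len(levels[-1]) > 1:
          prev = levels[-1]
          if len(prev) % 2:                  # duplicate last node if odd
              prev = prev + [prev[-1]]
          levels.append([h(prev[i] + prev[i + 1])
                         for i in range(0, len(prev), 2)])
      return levels

  def prove(levels, index):
      # Authentication path (sibling hashes) for the leaf at `index`.
      path = []
      for level in levels[:-1]:
          if len(level) % 2:
              level = level + [level[-1]]
          sibling = index ^ 1
          path.append((level[sibling], sibling < index))
          index //= 2
      return path

  def verify(leaf, path, root):
      node = h(leaf)
      for sibling, sibling_is_left in path:
          node = h(sibling + node) if sibling_is_left else h(node + sibling)
      return node == root

  revoked = [b"serial:1042", b"serial:2718", b"serial:3141", b"serial:9999"]
  levels = build_tree(revoked)
  root = levels[-1][0]            # signed and published by the responder
  proof = prove(levels, 2)
  print(verify(b"serial:3141", proof, root))   # True: revoked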

RR gave his position as one favoring no certificate revocation.
Certificates support a signed message/request.  Freshness matters to the
acceptor (more than to the CA), so freshness requirements must be set by
the acceptor, not the CA.  Corollary: periodically issued CRLs are
wrong.  E.g., a badge checker wants at most day-old badge information,
but CRLs come out once a week.

He then gave the SDSI model, in which the signer must get the freshness
evidence, not an overworked server.  And, the simplest freshness check is
a (more) recently issued certificate.  He noted that key compromise is
different.  Who controls a key's good/compromised bit?  He noted that
the PGP suicide note is no good in the case where a key is
deliberately shared.  He proposed a network of suicide bureaus with
which you register when obtaining a public key. Suicide notes can be
sent to any suicide bureau, from which they will quickly be disseminated
to all.  This means that you can obtain a health certificate from the
bureau with which you registered, saying that you indeed are registered
and that no evidence of problems with your key has been received.  He ended
with a bit of advice from the grammar and style classic by Strunk and
White: always go positive when you can.
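
A minimal sketch of the acceptor-sets-freshness idea (my illustration,
with made-up field names and no signatures): the bureau issues a dated
health certificate, and the acceptor, not the CA, decides how stale a
certificate it is willing to tolerate.

  import time

  MAX_AGE = 24 * 3600   # the acceptor's own policy: at most a day old

  def health_certificate(bureau, key_id):
      # Issued by the bureau you registered with: "no suicide note for
      # this key has reached us as of now."  (Signature omitted here.)
      if key_id in bureau["dead_keys"]:
          return None
      return {"key_id": key_id, "issued_at": time.time(),
              "bureau": bureau["name"]}

  def acceptor_accepts(cert, now=None):
      # The acceptor, not the CA, decides how fresh is fresh enough.
      if cert is None:
          return False
      now = time.time() if now is None else now
      return now - cert["issued_at"] <= MAX_AGE

  bureau = {"name": "bureau-1", "dead_keys": set()}
  cert = health_certificate(bureau, "alice-key")
  print(acceptor_accepts(cert))                                  # True
  print(acceptor_accepts(cert, now=time.time() + 3 * 24 * 3600)) # too stale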

JF said that she agreed with everything Ron said, especially ``put it in
positive terms''. She noted that the cost of infrastructure maintenance
is crucial. Fast cross-PKI checks will be expensive, but the cost probably
can be minimized.

After the basic positions were given, the panelists all generally agreed on
things ;>). For example, Matt Blaze (one of JF's co-creators of
PolicyMaker) asked, ``Is it worth it to build this whole infrastructure
to have certificate revocation?'' MM responded that there isn't much
infrastructure difference between revocation and validation. To which
JF responded, ``No. There's a big difference.''

David Aucsmith pointed out that devices (not people) often carry
keys, and they can't make suicide decisions. For them,
CRLs are important. This was one question to which
I didn't hear a good answer, although something akin to
Rivest's suicide bureaus might also be able to handle this.
Presumably, if evidence of compromise has arisen somewhere, then
the device will not be able to obtain a certificate of health
when needed. Its inability to function should then ultimately
attract the attention of a human, who can then decide to obtain
a new key for the device.

Someone else raised the point that CRLs are a mechanism for managing changing
trust, but why should we think that this one mechanism can handle all
the trust management available from public keys?  If there is evidence
that my key was compromised two weeks ago, I can incorporate that in a
CRL, but how could you do this with the positive approach?  It can't go
back in time like a CRL can.  Ron Rivest said that this was a tough
problem and he didn't know the answer. But he added, "that's what
juries are for."

After dinner Tuesday night was the first meeting of the
International Financial Cryptography Association (IFCA).
As mentioned above, a governing board was elected. The other
main topic of business was where to hold future conferences.

After much animated discussion it was decided that the conference
would stay in Anguilla for at least the near term.

Following this, there was a rump session.

John Kelsey described cryptanalysis of the SPEED cipher (work done
with Wagner, Hall, and Schneier). The SPEED cipher was introduced at
FC97 by Yuliang Zheng. He observed that the interesting part was the
cryptanalysis that fails: the obvious differential attack doesn't
work.  Instead they use a related-key attack.

Ian Grigg announced NISI Advanced Encryption Standard support. They will
do the Java implementation of any algorithm that anyone wants, because
NIST wants three implementations for standards, including one in Java.
They're the middle men. They need volunteers to do it.

Stephan Overbeek described the N-count value analyzer.
It is based on one-way chaining in smart cards.  Value is in the
number of chain links revealed (reversed).  The claimed main difference
is that the one-way chain is specific to a terminal rather than to the user.
It was claimed to be fast and good for micropayments.
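
The generic one-way-chain payment idea (value equals the number of
links revealed) can be sketched as follows; this is an illustration of
the general technique, not the N-count system itself:

  import hashlib

  def H(x: bytes) -> bytes:
      return hashlib.sha256(x).digest()

  # Setup: build a chain from a secret seed and commit to the tip,
  # tip = H^N(seed), which is given (signed) to the terminal up front.
  N = 100
  seed = b"card secret"
  chain = [seed]
  for _ in range(N):
      chain.append(H(chain[-1]))
  tip = chain[-1]

  def verify_payment(tip: bytes, link: bytes, k: int) -> bool:
      # Revealing the k-th preimage of the tip is worth k units: the
      # terminal checks it by hashing k times back to the committed tip.
      x = link
      for _ in range(k):
          x = H(x)
      return x == tip

  # Spend 7 units by revealing the link 7 steps before the tip.
  print(verify_payment(tip, chain[N - 7], 7))   # True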

Cathy Meadows gave a quick overview of the NRL Protocol Analyzer,
an interactive Prolog-based tool for analyzing cryptographic protocols.
It examines a protocol by starting in a final state and searching
backwards to see if it is possible to reach an insecure initial state.
It is thus like a model checker. But unlike a model checker, it sometimes
analyzes infinite state spaces, which it does by facilitating the proving
of lemmas (like a theorem prover) that allow pruning of infinite chunks
off the search space.

Alain Mayer described policy issues for running an anonymizing service.
He raised three general problems that might arise, not necessarily
specific to Lucent's LPWA.

-your service is used for a(n attempted) break-in at another site.
-somebody posts threats or insults on a message board via your service.
-a site asks you to block access from your service to the site.

I noted that all three of these had actually occurred with our Onion
Routing prototype, and that at the time we were struggling with general
policy solutions to these problems. (We have since formulated a policy,
which is posted on our Web site. LPWA has also posted a policy statement
at http://lpwa.com:8000/policy.html )

Rafi Ostrovsky explained why Rivest's lottery scheme is not the same as
the Lipton-Ostrovsky scheme given on Monday. The difference has been
described above in the synopsis of his Monday presentation.

Paul Syverson presented Weakly Secret Bit Commitment. I gave an example
of an exchange protocol with no trusted third party where the
principals are not forced to be fair but rather where their incentive
to proceed outweighs their incentive to cheat.
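
The general notion of a weakly secret commitment can be illustrated
with a deliberately small nonce space: the commitment hides the value
for a while, but a determined receiver can force it open with a
moderate, predictable amount of work and no trusted third party. (This
is my own toy illustration, not the construction from the talk.)

  import hashlib, itertools, os

  NONCE_BITS = 24   # small enough that forcing the commitment is feasible

  def commit(value: bytes):
      nonce = os.urandom(NONCE_BITS // 8)
      return hashlib.sha256(value + nonce).digest(), nonce

  def force_open(commitment: bytes, candidates):
      # Forcibly open by searching the (deliberately small) nonce space.
      for value in candidates:
          for n in itertools.product(range(256), repeat=NONCE_BITS // 8):
              if hashlib.sha256(value + bytes(n)).digest() == commitment:
                  return value
      return None

  c, nonce = commit(b"heads")
  # Normal opening: the committer reveals (b"heads", nonce).
  # Forced opening (slow but bounded, roughly 2^24 hashes per candidate):
  # print(force_open(c, [b"heads", b"tails"]))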

Jon Ziegler described the Java Ring, which is Java running on a Dallas
Semiconductor iButton. Amongst other nifty features, it does garbage
collection, so you can delete applets when they're done.

David Goldschlag presented Security Models for content.  This was an
overview of the Divx approach to, e.g., ``renting'' movies, in which
the rental period starts when the movie is first played rather than
when it is obtained and there is no need to return the DVD. To allow
you to `re-rent' the disc the DVD player has a dialup connection to a
backend system.  The DVD player logs the disc serial number of played
discs and reports the log periodically to the backend (offline).  If
you prevent the player from calling in for a long time it will lock up.
Questions were raised about privacy.  David responded that the
customer profile is better protected than at conventional video rental
chains, where the cashier has your profile rather than an
access-protected billing service.

Stuart Stubblebine presented On Revocation. This was roughly an improved
or extended version of Rivest's principles (given during the panel, cf.
above). The principles were related to his own work on recent security
and metrics of authentication.  One example: Rivest's principle that
freshness requirements must be set by the acceptor, not a CA, was
amended to: freshness requirements must be set by all entities relying
on them.

Bob Green described what it was like to be a Programmer Living in Anguilla.
This wasn't really on the topic of the conference. But, it gave a fascinating
glimpse of what it is like to work in Anguilla. Some advice and comments
gleaned from the talk: if you want to move here, bring two of everything
that can break. Officially, on paper, you can't move, so you just do it.
If you fix somebody's PC there, you now know their whole family.
And, since there are only a handful or so of families on the island,
you get to know everybody pretty quickly.

Bob Hettinga presented a market model for bearer certificates.
He suggested that we should base it on the old physical bearer bond model.
Major claim: even if you issue a bearer certificate at every exchange,
that's still cheaper than, e.g., seven years of credit card audit
trails.

Steve Schear rounded out the evening with a description of First E-Cache.

Wednesday morning began with an invited talk by David Chaum, who I
think could reasonably be called the undisputed father of financial
cryptography.  The title of his talk in the preproceedings was
``Private Signatures and E-commerce''; however, the title on his
opening slide was ``Which Flavor Will Win in the `Way-More-Digital'
World''. This brief writeup can only sketch some of the many topics
on which he touched.

There were two foci to his talk: information technology policy issues
and privacy, particularly in payments.

His policy overview covered three areas. (1) Commons issues: free
bandwidth has had a positive effect on cyberspace growth. (2) Consumer
protection: false privacy? (3) Human rights: the next wave of fundamental
human rights is informational rights.  Consumer protection and
bandwidth intersect at junk mail and push technology.  Consumer
protection and human rights intersect at the consumer platform and
interface. And, bandwidth and human rights intersect in the area of
message encryption secrecy. In the intersection of all three is access
-- interaction security (people have to be able to protect their
interests in cyberspace). He went on to describe both the problems and
the facilitating factors of establishing interaction security.

He began his discussion of privacy by noting that the consensus of the
heads of major technology companies, Greenspan, and others is that consumer
confidence in privacy protection is the major reason that e-commerce
hasn't taken off. In fact, surveys even show that people generally
expect increased privacy from e-commerce vs. current commerce.  He
then explained some of the drawbacks of e-commerce using conventional
payment mechanisms such as credit cards and explained how blind
signatures enable one-way private e-cash.  He felt it was quite
important to stress that it is one-way privacy, not anonymity, as is
often said in the media.  In other words, nobody can, without your
agreement, know where you spent your money; BUT you can always prove,
with the bank's help, who received any payment, as well as when and for
how much.
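
For readers who have not seen it, here is a toy RSA rendering of the
blind-signature mechanics (insecure parameters, no padding, no
denominations or double-spending detection; just enough to show that
the bank signs without seeing the coin):

  import math, random

  p, q = 1999, 2003                  # toy primes; real keys are far larger
  n = p * q
  e = 65537
  d = pow(e, -1, (p - 1) * (q - 1))  # the bank's private exponent

  m = 123456                         # stands in for the hash of a coin

  # Customer blinds the message with a random factor r.
  while True:
      r = random.randrange(2, n)
      if math.gcd(r, n) == 1:
          break
  blinded = (m * pow(r, e, n)) % n

  # Bank signs the blinded message without ever seeing m.
  blind_sig = pow(blinded, d, n)

  # Customer unblinds; the result is an ordinary signature on m.
  sig = (blind_sig * pow(r, -1, n)) % n
  assert pow(sig, e, n) == m % n     # verifies, yet the bank never saw m
  print("signature verifies")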

His conclusion was that there were forces moving us in two directions:
flavor #1, an all-traceable, nonrepudiable, more-centralized world, and
flavor #2, an expanding, decentralized, informational-rights world
(the good one).  He didn't say definitively which way things would go,
but he felt that work such as that done by the attendees of this conference
would help push in the right direction.

A fascinating claim that he made during questions, but on which he did
not have time to elaborate, was that, with the various cryptographic and
other mechanisms he had described in his talk, the possibility exists
to virtually eliminate organized crime.

The conference continued with ``Group Blind Digital Signatures:  A
Scalable Solution to Electronic Cash'' by Anna Lysyanskaya and Zulfikar
Ramzan, who split the presentation duties.  Their model is of a central
bank with smaller banks that users choose. The goal is to make the
identity of the user and of the user's bank anonymous to the
vendor and the vendor's bank (only the central bank can find out the
issuing bank of a piece of e-cash). And, no bank (even the central bank)
can issue cash in another bank's name. The scheme is online, hence somewhat
expensive. But it can be made offline if we compromise a degree of
user anonymity.

Before the next session Ian Goldberg announced that he had a 100-byte program
to turn an export version of Netscape into one with all the strong
crypto and announced a contest to write a smaller one. He also extended
the contest to write a similar program for Internet Explorer.

The next talk was ``Curbing Junk E-Mail via Secure Classification'' by
Eran Gabber, Markus Jakobsson, Yossi Matias, and Alain Mayer (the last
of whom gave the talk).  He noted that spamming is currently easy:
it's easy to get lots of addresses and to send to them, and it's
hard to distinguish spam from other mail. There are tools available,
but their solution was claimed to have advantages over each of them.
The gist of their solution is to have extended email addresses:
basically, you have a core address plus extensions for use with
multiple groups of users. A handshake to the core address just gets
extensions. This deters spammers and adds functionality.  Also, you can
later revoke an extension (by filtering all messages with that
extension), so a spammer buying the address from another spammer won't
get any value since the extension is revoked.  This approach is claimed
to provide transparency of extensions to actual users, robustness
(flexibility about how much automation is used), backwards compatibility
with sendmail, etc., and interoperability with the rest of the world.
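
One simple way to get the flavor of extended addresses (my own sketch;
the authors' actual derivation and address format differ) is to derive
a per-group extension from a secret and the core address, and to revoke
an extension by filtering it:

  import hmac, hashlib

  SECRET = b"user secret for deriving extensions"
  CORE = "alice@example.com"            # hypothetical core address
  revoked = set()

  def extension_for(group: str) -> str:
      # Derive a per-group extension of the core address.
      tag = hmac.new(SECRET, (CORE + ":" + group).encode(),
                     hashlib.sha256).hexdigest()[:8]
      user, host = CORE.split("@")
      return user + "+" + tag + "@" + host

  def accept(to_address: str) -> bool:
      # Filter: drop mail to the bare core address or revoked extensions.
      if to_address == CORE:
          return False   # the core address only handshakes, never delivers
      return to_address not in revoked

  friends = extension_for("friends")
  some_list = extension_for("some-list")
  revoked.add(some_list)  # the list sold the address: revoke that extension

  print(accept(friends), accept(some_list), accept(CORE))  # True False False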

Next up was ``Publicly Verifiable Lotteries:  Applications of Delaying
Functions'' by David Goldschlag and Stuart Stubblebine.  Regular
lotteries require trusting the auditors, and determining the winner is
not repeatable since it relies on a random element. The goal here is to
find a fair, closed, and publicly verifiable lottery in which not
even the lottery agent is trusted.  The basic idea is to make the
winning-number calculation slow and require at least one random entry.
Besides the obvious application of running a lottery, other applications
include distributed random numbers (with a low overhead of communication).
It was also shown how to use delaying functions in the exchange protocol
I described in the rump session.
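
A toy version of the delaying-function idea (an illustration only; the
paper's delaying functions and lottery protocol are more carefully
constructed): the winner is derived from the pooled entries through a
deliberately slow, sequential computation that anyone can redo to check
the result, but that nobody can grind through quickly enough to bias.

  import hashlib

  DELAY_ITERATIONS = 1_000_000   # tuned so the draw takes noticeable time

  def delaying_function(data: bytes) -> bytes:
      # Inherently sequential: each step needs the previous step's output.
      x = data
      for _ in range(DELAY_ITERATIONS):
          x = hashlib.sha256(x).digest()
      return x

  def draw_winner(entries):
      # Anyone can redo this computation to check the published winner.
      pooled = hashlib.sha256(b"".join(sorted(entries))).digest()
      seed = delaying_function(pooled)
      return int.from_bytes(seed, "big") % len(entries)

  tickets = [b"ticket-anna", b"ticket-bob", b"ticket-carol"]
  print("winning index:", draw_winner(tickets))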

The next paper was ``Security of Digital Watermarks'' by Lesley R.
Matheson, Stephen G. Mitchell, Talal G. Shamoon, Robert E. Tarjan, and
Francis X. Zane.  This was a very nice survey of existing watermarking
technologies.  Their stated goal is to have invisible and robust
watermarking: only the key holder can find it, and it can't be removed
without destroying the data.  The focus was on perceptual content
(video, etc.) rather than representational content (programming text,
etc.). It was noted that it may be important to have layers of marking
for, e.g., private and public watermarks.

After lunch came ``Security in the Java Electronic Commerce Framework''
by Surya Koneru and Ted Goldstein. The talk was given by John Ziegler.
The talk contrasted commerce with EDI. Commerce is not about absolute
trust.  In fact, spontaneous commerce requires zero trust in the
principals; all trust is in the payment token. The opposite extreme is
EDI, where trust is in the long-term relationship, and the payment
token can be just about anything.  Their offerings are Java Commerce
Beans and the Java Commerce Client (a wallet).  The Java Commerce Client
anchors the client side of the transaction and handles client delivery,
installation, update, and cooperation with a trusted and familiar
interface.  Java Commerce Beans provide a structure for creating
customer relationships:  operations, instruments, protocols, services,
etc.

Next up was ``Beyond Identity: Warranty-Based Digital Signature
Transactions'' by Yair Frankel, David Kravitz, Charles Montgomery, and
Moti Yung.  A standard CA architecture assures static properties,
liability with respect to contract enforcement, nonrepudiation of
signers, etc.  The main concept of a warranty is that it addresses the
need to further validate current contextual information beyond
identity. A warranty-granting transaction system is dynamic: providing
warranties on a per-transaction basis, accounting for user history, and
providing user-specified access to control parameters.

The next presentation was ``Compliance Checking in the PolicyMaker
Trust Management System'' by Matt Blaze, Joan Feigenbaum, and Martin
Strauss.  The motivating problem for this presentation was: even if
wary customer Alice has convinced herself that Bob of small company
Bobsoft signed a program, so what?  She wants to know if Bob complies
with her policy for buying software. The topic of this paper is: what
do we mean by proof of compliance?  The compliance-checking approach works
by incremental proofs using supplied credentials (authorizations). For
example, Cred1 is run and it says Bankofficer1 will approve if he sees
evidence of freshness. Cred2 is run and says fresh; Cred1 is run again
and says approved.  A yes answer means that in some finite sequence of
runs of the credentials there is an acceptance record that says the
policy is satisfied.  But this is undecidable!  (Various restrictions
can get this down to NP-hard or NP-complete.) Nonetheless, this has
been implemented and runs in applications.  Applications noted as
described elsewhere include signed email, PICS labels, and license
management.  Note that since policies must be monotonic, you can't
directly do certificate-revocation-type things.
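
The incremental flavor of compliance checking can be sketched roughly
as follows (a bounded toy, not PolicyMaker's actual assertion language
or engine; the Cred1/Cred2 names mirror the example above):

  def check_compliance(policy, credentials, max_rounds=10):
      # Run the credentials repeatedly over a growing set of assertions
      # until the policy accepts or nothing new is produced.  Bounding
      # the rounds sidesteps the undecidability of the general problem.
      assertions = set()
      for _ in range(max_rounds):
          new = set()
          for cred in credentials:
              new |= cred(assertions)
          if policy(assertions | new):
              return True
          if new <= assertions:       # fixed point: nothing new to learn
              return False
          assertions |= new
      return False

  def cred1(assertions):
      # Bank officer approves once freshness evidence has been produced.
      if "fresh" in assertions:
          return {"bankofficer1: approved"}
      return {"bankofficer1: wants freshness"}

  def cred2(assertions):
      # Supplies the freshness evidence.
      return {"fresh"}

  policy = lambda assertions: "bankofficer1: approved" in assertions
  print(check_compliance(policy, [cred1, cred2]))   # True after two rounds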

Next was ``An Efficient Fair Off-Line Electronic Cash System with
Extensions to Checks and Wallets with Observers'' by Aymeric de Solages
and Jacques Traore.  This paper is the most recent in a chain of
papers making various improvements on Brands's CRYPTO 93 paper of
similar name.  The present contribution is to improve the efficiency of
the payment protocol.

The final paper of the official program was ``An Efficient Untraceable
Electronic Money System Based on Partially Blind Signatures of the
Discrete Logarithm Problem'' by Shingo Miyazaki and Kouichi Sakurai.
Those who stayed until this last paper were rewarded with an
interesting talk that began with a presentation of nondigital (hence
exportable) origami ninja weapons.  The basic idea is that the signer
signs a blind part (user ID and coin number) and a clear part (validity
and amount of money).  Partially blind signature makes the system more
efficient because bill amounts need not be tied to signing key, i.e.,
you don't need a separate key for $10 bills, $20 bills, etc.  The
combined embedding and engraving signature scheme is designed to cover
all the types of information needed.

Thursday was primarily occupied by an empirical investigation of
so-called  ``ecliptic curve cryptography''. That is, most of us took a
boat down to a few miles off the coast of Montserrat to observe a total
eclipse of the sun while simultaneously keeping one eye on the volcano
spewing tons of ash just to our west. The geek-o-meter registered quite
high as several preprogrammed GPS devices could be heard going off when
the boat reached the contracted observation location. (Other evidence
of geekhood, such as people spotted brandishing a laptop and a notebook
on the boat and actually doing work, is vehemently denied by this author.)

Friday after breakfast there was an unscheduled question and answer
hour with David Chaum, which I was unfortunately unable to attend.

After this there was a roundtable discussion on ``Financial
Intermediaries, Public Networks, and Financial Cryptography'' moderated
by Steve Schear. Other presenters were Paul Guthrie, A.S. von Bernhardi
(aka Black Unicorn), and Frank Trotter.

Steve Schear led off with an overview.  On a national level, the
central bank is the ultimate financial intermediary--setting interest
rates, rules for interbank loans, etc.  Below them are the commercial
banks.  These do the financial networks and management for individuals
and businesses.  Below them are the credit cards, between the banks and
the consumers.  These do risk management.  There are also a large
number of processors, like First Data and Virtual, that sit between the
bank and the merchant, as well as ATM networks like Cirrus and smaller
regional associations like Most.  Finally, there are also nonbank
financial intermediaries: brokers, check cashing services, etc.

Paul Guthrie gave a description of where things are going with card
associations, which are made up of member banks (Visa, MasterCard), and
card companies, whose customers are instead the end consumers
(American Express, Discover).  For card associations, acceptance will
imply certificates (making sure that the card is accepted at a store
will need merchant certificates, since anyone can stick a logo on a Web
page).  Cards will carry more software. There will need to be PKI
infrastructures.  There may be adoption of new payment systems. There
will also be more opportunity for new brands in cyberspace. Thus, the
meaning of brands must be made clearer. Adam Shostack asked: networks
can be more open and yet there is going to be more certification of who
is authorized to accept a card?  Answer: It's up to the member banks
which merchants they want to back. More liberal banks will run a higher
discount rate.

Frank Trotter began by observing that there are no hard currencies anymore
and discussed the roles of some of the traditional players in the new world.
He observed that state banks and regulatory agencies have increasingly
less reason for being, resulting in various turf squabbles.
Banks meanwhile are trying to defend their current franchise value.
Banks provide credit stability for the consumer.

If anybody sets up a private mint and goes bankrupt, that will kill
confidence in the market for some time.

At that point von Bernhardi brought up the story of the failure of the
EU Bank in Antigua. This was basically an offshore, online bank that
was destroyed and lost (only!) 12 million dollars.  Someone noted that
it's good this happened early, when the sacrifice was small, so that
everyone can make sure it doesn't happen again.  The important danger
is that institutional risk becomes systemic risk.  Guthrie noted that
Visa will drop banks that become a risk or will require a cash deposit
in a third-party neutral bank.  Then, von Bernhardi contrasted public vs.
private insurance (what he called ``the myth of government backing of
financial institutions'').  Trotter then pointed out that many other
industries are moving into banking.  Telecom is the biggest threat
to banking:  they have a big base and good records.  They could start to
take deposits and get the backing of the FDIC.

In beginning his own presentation, von Bernhardi stated that, ``It's
ironic I'm here...  At the far end of the tunnel, I would like to see
intermediaries diminish.'' He proceeded to give his impression as an
offshore banker.  The motivation is not to provide stability of the
international financial community but to make money.  Local offshore
governments typically take an attitude of `if you behave here, you can
stay'. The threshold of acceptable behavior is much higher than in the
US.  Regulators in the US are interested in providing global
stability.  Offshore banks MUST operate out of band, because they're
there.  To connect to the system, they have to go through the ACH
(Automated Clearing House). They have to go through VISA. But, there's
a lot more freedom.  He would like to see financial intermediaries
functioning in exceptional cases rather than ordinarily in transactions.
Consumer efficiency involves reducing the middle man.  But then how do
we broker trust?  Well, you can have TTPs in the short run.  Crypto
protocols won't do the whole job.  Offshore could use these new
technologies so that they can go through these intermediaries faster.
But, in the long run, fewer and more offline intermediaries is the way
to go.  Reputation, he noted, is a multifaceted issue. It's not just a
question of having a certificate on the wall. ``If one of my clients
walked in to Citibank with a cashier's check from us, I can guarantee
that it won't clear right away.''

Someone asked from the floor what will happen when we start to see
private minting, etc. Von Bernhardi responded that selling your frequent
flyer miles is now possible and becoming easier.  And, Trotter observed
that you build a trading system, and if enough trust is built into the
system it becomes another currency.  Someone else in the audience
observed that the very existence of these systems is evidence of
inefficiencies in the main systems. The main systems will then adapt, and
for the most part the small systems will remain small.