
From: Dan Brown <dbrown <at> certicom.com>
Subject: Re: Dual_EC_DRBG
Newsgroups: gmane.ietf.irtf.cfrg
Date: Friday 3rd January 2014 01:06:51 UTC
Happy New Year to all.

> -----Original Message-----
> From: Alyssa Rowan
> Sent: Monday, December 30, 2013 10:11 AM
> On 27/12/2013 20:43, Dan Brown wrote:
> > Repeating myself, nobody showed how the alternative P&Q could be
> > backdoored.
> Yes, they did. Even with another point, Dual_EC_DRBG still doesn't produce
> uniform random numbers (2^28 distinguisher¹, and a p=0.0011 consecutive
> bit predictor² - both of which are practical attacks).

1) I too wrote about the same bias in Appendix B of: 


At the end of that paper, I try to suggest that the bias does not have an
impact on nonces or keys. I also try to say that if indistinguishability
from uniform bit strings is needed, then more bits can be truncated from
the point, which is consistent with what the standards allow.  

This indistinguishability, under a sufficient amount of truncation, say half
the bits, seems a reasonable conjecture.
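To illustrate the truncation trade-off, here is a toy sketch of my own (with arbitrary small parameters, not any standardized curve): enumerate the valid x-coordinates of a small curve, truncate only a couple of top bits, and measure how far the output is from uniform.

```python
from collections import Counter

# Toy curve y^2 = x^3 + 2x + 3 over F_p with a small prime p (parameters
# chosen arbitrarily for illustration; not a standardized curve).
p, a, b = 1009, 2, 3

def is_valid_x(x):
    # x is the abscissa of some curve point iff x^3 + ax + b is a square mod p
    rhs = (x ** 3 + a * x + b) % p
    return rhs == 0 or pow(rhs, (p - 1) // 2, p) == 1

valid_x = [x for x in range(p) if is_valid_x(x)]  # roughly p/2 of all x

# Truncate the top 2 of the 10 bits (p < 2^10), keeping an 8-bit output
counts = Counter(x % 256 for x in valid_x)

# Statistical distance of the truncated output from uniform on 256 values
n = len(valid_x)
sd = 0.5 * sum(abs(counts.get(v, 0) / n - 1 / 256) for v in range(256))
print(f"{n} valid x-coordinates; distance from uniform ≈ {sd:.3f}")
```

Only about half of all field elements are valid x-coordinates, so light truncation leaves each output value with an uneven number of preimages; truncating more bits packs more preimages behind each output value and shrinks the distance.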

To motivate a real-world need for such indistinguishability, my eprint
cited one-time pads.  Certainly one-time pads (e.g. stream ciphers) do need
full indistinguishability, but in real cryptographic systems, an RNG is not
really meant to provide the service of a one-time pad, because, for a
one-time pad to be useful, i.e. to permit decryption, it has to be
reproducible.  Reproducibility undermines the main service provided by an
RNG, which is unpredictability, at least in the sense of forward secrecy. 
So, in retrospect, it seems that I gave a false motivation.

Perhaps, a better motivation would have been some information-theoretic
crypto scheme, such as secret sharing.  This is outside my area of
expertise.  Also, neither ANSI nor NIST standardize such info-theoretic
algorithms, so at the time I thought it irrelevant, though I now see that
to be too narrow a view.

In summary, I agree that there are feasible distinguishers, but I fail to
see how these particular distinguishers lead to practical attacks (see
further specifics below "forbid") on uses of the RNG.

2) Though this is not a backdoor in the usual narrow sense (that somebody
needs to possess a secret key to open), I am sympathetic to your concern
that this is an intentional weakness.  Maybe somebody on this list knows of
a better name for such an intentional open weakness?  

3) S&S [your ref 1] advanced the understanding of the bias by amplifying
the distinguisher using longer outputs. That kind of amplification is
fairly well known, and I should have thought about it.  

S&S also conclude that the Dual_EC_DRBG is insecure, but without giving an
applied example attack (see further specifics below "forbid"), and, IIRC,
without acknowledging that some truncation parameters seem to resist the
bias attack.
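The kind of amplification S&S use can be sketched with a toy counting distinguisher (my own model, not their construction): if each sample carries bias eps, then on the order of 1/eps^2 samples suffice to separate the source from uniform.

```python
import random

def biased_bit(eps):
    # A source that emits 1 with probability 1/2 + eps instead of 1/2
    return 1 if random.random() < 0.5 + eps else 0

def looks_biased(eps, n):
    # Counting test: under a uniform source the number of ones deviates from
    # n/2 by about sqrt(n)/2, so an excess beyond ~2 standard deviations
    # flags the bias once n is on the order of 1/eps^2.
    ones = sum(biased_bit(eps) for _ in range(n))
    return ones > n // 2 + int(n ** 0.5)  # 2-sigma threshold

print(looks_biased(0.02, 100_000))
```

With eps = 0.02 and 100,000 samples, the expected excess of ones (about 2,000) dwarfs the ~316 threshold, so the test flags the bias with overwhelming probability; longer outputs thus amplify even a small per-sample bias into a practical distinguisher.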

> So, if you use it to generate k for (DSA|ECDSA) signatures... or, heaven
> forbid, a key...

1) What does your ellipsis mean? (BTW, an elliptical implication,
which is a clever, implicit pun.)

2) Your comment on k is a very interesting research question, at least to
me.  My research on DSA|ECDSA indicates that there is a gap between the
known necessary and sufficient conditions for the k value used in DSA and
ECDSA signatures.  An impressive series of research papers, leading up to
one by Bleichenbacher, shows it necessary that k avoids certain types of
arithmetic bias.  My theoretical work (see ECC 2001, and Advances in ECC,
Ch. 2) shows that a certain set of conditions and models, including k
being chosen uniformly, may be sufficient for the security of ECDSA,
against certain types of forgery.  The gap is that it is unknown whether
certain other kinds of bias in k affect security, because they are not
covered by the known attacks or the known security proofs.  

For a contrived example, what if k were chosen such that its SHA-3 digest
ends in the bits 011?  This is a bias that anybody could test, yet it would
not seem to lead to any attack on DSA or ECDSA.  

More to the point, if k was the output of Dual_EC_DRBG, and this led to an
attack, then in my view this would be a new, and major result, surpassing
Bleichenbacher's attack by a long shot because it exploits much less bias,
and more importantly a much different type of bias.  If you are aware of
such an attack, then maybe inform the CFRG (or just remind me if it is well
known).

3) Schnorr has a very interesting result, albeit rather theoretical and in
the generic group model, that most forms of bias in an ECC key do not make
the ECDLP any easier.  It is known that some forms of bias, such as a small
key, lead to a weaker ECDLP.  So, again there is a gap.  Any further closure
of this gap would be significant.  Again, I am forced to wonder if you know
something here that I don't.  I couldn't find such an attack in S&S.

4) I am less familiar with other key types, e.g. RSA, AES, and HMAC.  I
vaguely recall that certain types of bias in RSA and RC4 secret keys
undermine the security.  I feel that this is an important area of research. 
If it could be shown that the bias from Dual_EC_DRBG undermines some
similar computationally based crypto scheme, then this would be a notable
research advance.

5)  Of course, it is always prudent to try to make all keys uniform, if
only because that maximizes the min-entropy.  (Also, it makes security
proofs easier, I think.)  That's why I formulated that TPP in my eprint
on the Dual_EC_DRBG, and used indistinguishability as the security
definition.  I think now, I would also examine unguessability specifically
(see next point).
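The min-entropy point can be shown in a few lines (my own illustration): for a fixed number of outcomes, the uniform distribution maximizes min-entropy, and any bias lowers it.

```python
import math

def min_entropy(probs):
    # H_min = -log2(max_i p_i): the entropy notion that bounds the best
    # single-guess success probability of an adversary.
    assert abs(sum(probs) - 1.0) < 1e-9
    return -math.log2(max(probs))

print(min_entropy([1/8] * 8))            # uniform over 8 outcomes: 3.0 bits
print(min_entropy([0.2] + [0.8/7] * 7))  # a biased distribution: fewer bits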

6) I've since wondered a little what the more important threat models, i.e.
security definitions, for a DRBG would be.  Indistinguishability is
certainly very nice, especially for information-theoretic crypto, but
unguessability is more important.  So, if unguessability can rely on
weaker assumptions than indistinguishability, then this would be worthwhile
to understand.  I touched upon the unguessability of Dual_EC_DRBG in my 2006
eprint, but haven't looked at the problem since 2007.  Anyway, I don't yet
know what the best security definition for a DRBG would be.  Maybe the
researchers on "randomness extractors" have something that could be
converted over to DRBG, namely entropy condensation.

> It is thoroughly unsuitable as a CSPRNG, and is obviously broken by
> design, as was widely pointed out at the time, even before we knew for
> sure NSA had a hand in it.
> > I would even want them to be clearer.
> I would want it removed from the standard entirely, as thoroughly and
> completely discredited.

1) Just wondering: do you also dispute the benefit of having an RNG whose
security basis is number-theoretic?  What about just for algorithm agility?

I still personally recommend number-theoretic RNGs for security: mainly
because I feel more confidence in ECC, and other number-theoretic
algorithms, than in symmetric crypto algorithms.  My feeling here is rather
vague, and largely based on familiarity, but also philosophically based on
ECC and RSA being something innate, whereas symmetric key seems devised. 
But I could easily understand how others have the exact opposite feeling. 
So, I generally did not expect much adoption of number-theoretic RNGs.

There are other number-theoretic algorithms.  If they have been
standardized and proven secure, then they deserve consideration too.  But I
doubt that their standards have received as much attention as the
Dual_EC_DRBG.  Anyway, I haven't studied any of these, so cannot recommend
them personally.

Have there been any advances in the security analysis of the Dual_EC_DRBG
algorithm since 2007, even though people seem motivated to discredit it? 
If not, then I would argue that is a good sign, not only for it, but also
for the other, unaffected NIST DRBGs.

I haven't really studied the HMAC_DRBG, Hash_DRBG or CTR_DRBG from NIST SP
800-90A.

2) Over the last few days, I've been trying to see things from a wider
perspective.  My position on the standard has recently changed.

Implementers (and users) may have placed more trust in the standard than I
thought, and they may have been less informed about the widely publicized
potential backdoor than I thought.  

In other words, all the publicity may have been ineffective at preventing
the alleged backdoor from being exploited, contrary to my belief.  The
standard should not have included the default P&Q.

That all said, despite the deficiency of the standard, I feel implementers
still have some responsibility to keep up with the news and research about
crypto, given that standards need to be stable documents.

In summary, my position on the standards, and the standards process, is not
the same as it was.

To clarify my past position, I said "not subverted standard", but I did not
mean to rule out a backdoor in the default P&Q, just that the publicity
would render any subversion inoperable, and that there could be other things
than standards enabling such subversion.  But now I see that I was wrong. 

My position on the algorithm Dual_EC_DRBG, with properly random,
backdoor-free, P&Q is the same as in 2007 (see also bias discussion above),
which is that it is secure.

> > Reported, yes.  But AFAIK, not published.
> Unless you know of another EC-based RNG published by NIST in 2006, all
> doubt has been removed that it is, in the NSA's words, 'enabled'.

Reports I've seen say "suggest" or "reported", which leaves doubt.  Please
send me the links to the correct reports.

Personally, I still don't know for sure if the default P&Q are backdoored,
but it would be prudent for implementers and users to assume that they are.

> Moreover, the internal NSA memo discussed by NYT might literally name
> you, given your name is on the patent³ along with Vanstone, so I would
> understand why you might have a vested interest in its publication.
> You'd have to ask NYT for that, I suppose.
> I'm going to ask you directly, Dan: Have you had any contact with the 
> Enabling Project that you know of? I mean, you even talk specifically
> about "escrow keys" in that patent. That seems directly relevant to their work.

1) No contact that I know of.  I never heard of "SIGINT Enabling Project"
until the recent news.

2) At some point, I clued into the possibility of a backdoor, and, among
other things, tried to make sure the possibility was at least publicly
known, at first quietly: with a patent, and with a comment within my March
2006 eprint.  Later, others raised much more publicity, which seemed
sufficient to me.  I had expected such publicity to cause the proposers,
X9F1 and NIST to withdraw the default P&Q from the standard.

> ___
> 1. Shoenmakers & Sidorenko [2006] <http://eprint.iacr.org/2006/190.pdf>
> 2. Gjøsteen [2005], comments to draft NIST SP 800-90A 
> 3. <http://patents.justia.com/patent/8396213>