**Subject:** Re: security/10206 - proposed solution (concept)

**To:** Bill Studenmund *<wrstuden@netbsd.org>*

**From:** Johan Wallén *<johan.wallen+lists@hut.fi>*

**List:** tech-security

**Date:** 08/18/2005 11:52:47

Bill Studenmund <wrstuden@netbsd.org> writes:

[Approximating the entropy in passphrases/passwords.]
> Is there a tool that will measure this? I'd like to measure the entropy in
> my passphrases. I realize it's an approximate measure, but none the less
> interesting.

Since entropy is a measure of the information content in a probability
distribution, you should consider the method used to generate the
passphrase and estimate the entropy of the distribution induced by the
generation method. The entropy of a single passphrase is quite
meaningless: since the passphrase is fixed, the probability that it
has that particular value is 1, and its entropy is thus 0. So no tool
can estimate the entropy of your passphrases in a meaningful way
without knowing (or making assumptions about) how you generate the
passphrases.
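To illustrate the point, entropy can only be assigned to the generation method. A minimal sketch (mine, not from the original mail), assuming the common scheme of drawing words uniformly and independently from a fixed wordlist, where each draw contributes log2(wordlist size) bits:

```python
import math

def generation_entropy_bits(wordlist_size: int, num_words: int) -> float:
    """Shannon entropy (bits) of a passphrase built by drawing num_words
    words uniformly and independently from a wordlist.  Each uniform,
    independent draw contributes log2(wordlist_size) bits."""
    return num_words * math.log2(wordlist_size)

# e.g. six words from a 7776-word (Diceware-style) list: about 77.5 bits
print(generation_entropy_bits(7776, 6))
```

The number describes the sampling procedure, not any single passphrase it produces; the same output string has entropy 0 once fixed.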

Note also that Shannon entropy is not a good measure of the quality of
passphrase/key generation methods. Consider an algorithm that with
probability 0.99 outputs an n-bit string of all zeros, and with
probability 0.01 outputs a uniformly distributed n-bit string. The
Shannon entropy of the output of this algorithm is slightly more than n/100 bits.
By letting n grow, we get n-bit strings with arbitrarily high Shannon
entropy, but still the output is all zeros with probability more than
0.99. (This is a fairly standard example.)
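This can be checked numerically. A quick sketch (my addition, not part of the post) of the Shannon entropy of exactly that distribution:

```python
import math

def skewed_entropy_bits(n: int) -> float:
    """Shannon entropy (bits) of: with prob 0.99 output the all-zeros
    n-bit string, with prob 0.01 output a uniform n-bit string."""
    q = 0.01 * 2.0 ** -n   # prob of any one specific string via the uniform branch
    p0 = 0.99 + q          # all-zeros can come from either branch
    return -p0 * math.log2(p0) - (2 ** n - 1) * q * math.log2(q)

# Each value exceeds n/100, yet the output is all zeros w.p. > 0.99:
for n in (8, 64, 256):
    print(n, skewed_entropy_bits(n))
```

As n grows, the entropy grows without bound (roughly n/100 + 0.08 bits), even though the distribution stays almost entirely concentrated on one string.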

A more meaningful measure is the min-entropy. An output distribution
has min-entropy m if the probability of the most likely output is
2^-m. That is, high min-entropy means that no particular outcome is
likely to occur.
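The min-entropy exposes the weakness that Shannon entropy hides. A sketch (mine) applying it to the skewed distribution from the example above:

```python
import math

def skewed_min_entropy_bits(n: int) -> float:
    """Min-entropy of the skewed example: -log2 of the probability of
    the most likely outcome, which is the all-zeros string."""
    p_max = 0.99 + 0.01 * 2.0 ** -n   # the 0.99 branch plus the uniform branch
    return -math.log2(p_max)

# Stays at roughly 0.0145 bits (about -log2(0.99)) no matter how large n gets:
for n in (8, 64, 256):
    print(n, skewed_min_entropy_bits(n))
```

So while the Shannon entropy of the example grows linearly in n, its min-entropy is stuck near 0.0145 bits, matching the intuition that the generator is nearly worthless.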

-- Johan