Subject: Re: Hardware RNG support for EM64T systems
To: Travis H. <solinym@gmail.com>
From: Brett Lymn <blymn@baesystems.com.au>
List: port-amd64
Date: 03/01/2006 12:37:50
On Thu, Feb 23, 2006 at 07:18:11AM -0600, Travis H. wrote:
> 
> So the Nyquist theorem gives an upper bound, not a complete
> characterization.
> 

Indeed - I was told a story about the dangers of this.  It was a
control system for an automated train system; all the modelling and
simulation showed the system was stable - the train pulled up at the
station and came to an orderly stop.  When it came to the actual
demonstration in front of the bigwigs, the train pulled into the
station, almost stopped, went back the other way, then came back... in
ever-increasing swings.  The monitoring of the control system showed
that it thought the system was nicely damped and settling as expected
when, in fact, the system was oscillating wildly.  The reason was that
the sample rate was too low and it just happened to coincide with the
natural frequency of the system, so the oscillation aliased down to
something that looked like gentle settling.
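
Just to make the aliasing effect concrete, here is a quick Python/numpy
sketch - the numbers are invented, not from the actual train system: a
10Hz oscillation monitored at 10.1 samples per second shows up as a
slow ~0.1Hz wander.

import numpy as np

f_osc = 10.0                    # true oscillation frequency (Hz) - invented
f_sample = 10.1                 # monitoring sample rate, almost equal to f_osc
t = np.arange(0, 100, 1 / f_sample)
x = np.sin(2 * np.pi * f_osc * t)

spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), d=1 / f_sample)
# The 10 Hz swing aliases down to roughly |10.0 - 10.1| = 0.1 Hz, so the
# monitor sees a slow, gentle wander instead of the real oscillation.
print("apparent frequency (Hz):", freqs[spectrum.argmax()])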

> 
> but cryptanalysis
> is full of examples of non-random structure which were too complex to
> notice right away but once detected could be used against the
> system. 
>

Yes, indeed - and by making your system more complex you make it more
difficult to tell whether there is hidden structure in there... and
there have been quite a few instances of a seemingly benign
modification totally destroying the utility of the system.

> Throwing information away can only weaken your analysis, it can't tell
> you what someone with the full set could predict. 

Yes, but I suggest that you don't have the full set either - you have
a digital approximation.  Sure, you have more bits, but in this
context is more better?  Quantisation errors will introduce artifacts
into your sampled data stream, which could result in subtle patterns.
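
To illustrate what I mean (parameters picked out of thin air), here is
a quick Python/numpy sketch: quantise a sine wave to 8 bits and look
at the error you introduced - it is a deterministic function of the
input, so its spectrum is a set of lines tied to the input tone rather
than a flat noise floor.

import numpy as np

fs, f0, n = 8000.0, 60.0, 8000          # sample rate, tone, sample count
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)

step = 2.0 / 256                        # 8-bit quantiser spanning [-1, 1)
err = np.round(x / step) * step - x     # quantisation error

# The error spectrum is concentrated in discrete lines related to the
# 60 Hz input (harmonics and their aliases) - a subtle pattern, not
# white noise.
spec = np.abs(np.fft.rfft(err))
freqs = np.fft.rfftfreq(n, d=1 / fs)
print("strongest error components (Hz):",
      np.sort(freqs[np.argsort(spec)[-5:]]))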

> example, make a random number generator by taking some long ASCII text
> file and throwing away the upper seven bits of each byte.  That may
> make it unpredictable to you, or anyone who only has access to the
> least significant bits, but not necessarily to me; I have access to
> all eight bits and can start to predict the subsequent values,
> assuming that the text file has structure (i.e. is in English). 

Well, I do agree that your example of English text is a weak one,
given the well-known lack of randomness in text, but I do understand
the intent.  Let's run with it anyway, because I can see the flip side
to this.  Let's assume that you digitise a 60Hz sine wave with a
sampling frequency that is not an integer multiple of 60Hz (this is
important), and let's say our ADC is an 8-bit device.  So, we have
8-bit samples; take all 8 bits and you have a pretty strong line at
60Hz in the frequency domain, which is no surprise.  Now, if we throw
away the upper 7 bits of the data, what do we have left?  Basically,
quantisation noise, which looks like a random sequence (the usual
model treats it as uniformly distributed over one quantisation step).
So, by throwing information away we have removed structure from the
data stream, leaving us with a random-looking number.
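
For what it's worth, here is a rough Python/numpy sketch of that
thought experiment (sample rate and length are just picked for
illustration): compare how "peaky" the spectrum of the full 8-bit
codes is against the spectrum of the least significant bit on its own.

import numpy as np

fs, f0, n = 5150.0, 60.0, 1 << 14       # fs/f0 deliberately not an integer
t = np.arange(n) / fs
codes = np.round((np.sin(2 * np.pi * f0 * t) + 1) * 127.5).astype(int)  # 0..255

def peakiness(sig):
    # Ratio of the strongest spectral line to the average level, DC removed.
    s = np.abs(np.fft.rfft(sig - np.mean(sig)))
    return s.max() / s.mean()

print("full 8-bit samples:", peakiness(codes))      # dominated by the 60 Hz line
print("LSB only          :", peakiness(codes & 1))  # far less peaky, noise-like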

> 
> Now what if those samples had been fed through something as trivial as
> a von Neumann corrector (discards identical pairs of samples)?  I am
> not so sure the signal would have been so easy to detect, but I think
> it is quite likely that it would require something more
> sophisticated.

Well, throwing samples away is called decimation - all that does is
alias the signal, shifting it to a different place in the frequency
domain; it doesn't make it go away.
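
Here is a quick Python/numpy sketch of plain uniform decimation - a
simpler case than the data-dependent discarding a von Neumann
corrector does, and with arbitrary numbers: keep every 4th sample of a
180Hz tone sampled at 1kHz and the tone does not disappear, it just
aliases to 70Hz at the new rate.

import numpy as np

fs, f0, n = 1000.0, 180.0, 4000
x = np.sin(2 * np.pi * f0 * np.arange(n) / fs)

def dominant(sig, rate):
    # Frequency of the strongest spectral line.
    s = np.abs(np.fft.rfft(sig))
    return np.fft.rfftfreq(len(sig), d=1 / rate)[s.argmax()]

print("original tone (Hz):", dominant(x, fs))               # ~180
print("every 4th sample  (Hz):", dominant(x[::4], fs / 4))  # ~70, aliased

The energy does not go away, which is my point - the line is still
there for anyone who looks in the right place.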

-- 
Brett Lymn