Subject: Re: microtime
To: None <mouse@Rodents.Montreal.QC.CA, tech-kern@netbsd.org>
From: Sean Doran <smd@ab.use.net>
List: tech-kern
Date: 08/22/2002 20:23:07
| What's more, Moore's Law is showing no signs of giving out.  It's not
| going to be all that long before networking hardware can fit a whole
| packet inside a nanosecond.

Moore's Law does not repeal the constant c, although it would
be very nice to have some sort of photon accelerator -:)

Light covers only about 30 centimetres per nanosecond in vacuum, and
roughly 20 in fibre.  Moving a data packet, start through end, a few
centimetres through an array of switching elements within a nanosecond
is a tough target.  Moving it much farther than that in the same time
depends heavily on hidden variables in QM, and Moore doesn't help
there either.
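
For concreteness, some back-of-the-envelope arithmetic (a rough
sketch; the ~2/3 velocity factor for silica fibre and the 1500-byte
packet are just illustrative assumptions):

    #include <stdio.h>

    /*
     * How long is a bit, in time and in fibre, at a given line rate,
     * and how long is a 1500-byte packet on the wire?  Assumes light
     * in fibre moves at roughly 2/3 c, i.e. about 20 cm per ns.
     */
    int
    main(void)
    {
            const double cm_per_ns = 20.0;  /* ~2e8 m/s in silica */
            const double rates_gbps[] = { 10.0, 100.0, 1000.0 };
            size_t i;

            for (i = 0; i < sizeof(rates_gbps) / sizeof(rates_gbps[0]); i++) {
                    double bit_ns = 1.0 / rates_gbps[i];
                    double pkt_ns = 1500.0 * 8.0 * bit_ns;

                    printf("%6.0f Gb/s: bit = %.4f ns (%.3f cm of fibre), "
                        "1500-byte packet = %.1f ns\n",
                        rates_gbps[i], bit_ns, bit_ns * cm_per_ns, pkt_ns);
            }
            return 0;
    }

Even at 1 Tb/s, a 1500-byte packet occupies 12 ns on the wire; fitting
a whole one inside a nanosecond asks for something north of 10 Tb/s.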

Moore's Law *may* help with the detection problem -- higher bitrates
mean that each bit is shorter, and short bits are harder to detect
than long ones.  The problem itself is not causally related to the
number of transistors on a die, though throwing transistors at the
receiver may be part of the cure.  Moreover, short bits are prone to
corruption over distance, thanks to nonlinear effects in fibre and
other transmission media and various other sources of information
decay, again not as an effect of transistors/die.

On the other hand, using ever greater computational power to
generate some sort of feedback-driven waveform shaping on the
transmit side, based on observations made on the receive side, will
help overcome some of the detect-side problems with short bits.
On the other other hand, spending computational power on sorting
traffic into multiple parallel "colours" travelling within the same
medium may neatly side-step (or at least mitigate) the short-bits
problem.  Rough sketches of both points follow.
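
A toy sketch of the transmit-side shaping idea (the 2-tap filter and
its weights are invented for illustration; a real link would derive
the taps from receive-side feedback):

    #include <stdio.h>

    /*
     * Transmit-side pre-emphasis: a 2-tap FIR that boosts symbol
     * transitions and de-emphasizes repeats, so short bits arrive
     * with more detectable edges.  Tap weights are illustrative,
     * not measured from any real system.
     */
    int
    main(void)
    {
            const double c0 = 1.25, c1 = -0.25;  /* assumed, feedback-tuned */
            const double syms[] = { 1, 1, -1, -1, 1, -1, 1, 1 };  /* NRZ +/-1 */
            double prev = 0.0;
            size_t i;

            for (i = 0; i < sizeof(syms) / sizeof(syms[0]); i++) {
                    printf("% .2f ", c0 * syms[i] + c1 * prev);
                    prev = syms[i];
            }
            printf("\n");
            return 0;
    }

And the "colours" arithmetic (again with illustrative numbers):
splitting an aggregate rate across N wavelengths stretches each
lambda's bits back out by a factor of N:

    #include <stdio.h>

    int
    main(void)
    {
            const double aggregate_gbps = 1000.0;  /* 1 Tb/s total */
            int n;

            for (n = 1; n <= 64; n *= 4)
                    printf("%2d lambdas: %6.1f Gb/s each, bit = %.4f ns\n",
                        n, aggregate_gbps / n, n / aggregate_gbps);
            return 0;
    }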

At any rate, there are a lot of harder problems than nanosecond timer
resolution to solve in extremely high-bandwidth, high-delay networks,
although it's nice to see people thinking ahead to a time when
rapacious bell-headed local monopolies can no longer constrain
last-mile networking speeds nearly as effectively as they have done to
date.  It would be *nice* if end systems were the true bottlenecks of
the Internet.

	Sean.