Subject: Re: problem with download-vulnerability-list
To: David Maxwell <david@crlf.net>
From: Greg A. Woods <woods@weird.com>
List: netbsd-users
Date: 07/28/2003 16:48:57
[ On Monday, July 28, 2003 at 14:54:59 (-0400), David Maxwell wrote: ]
> Subject: Re: problem with download-vulnerability-list
>
> e requires infrastructure, hardware, and keys.
>
> f requires infrastructure and keys.

No, not really, at least not additional infrastructure and hardware,
etc. (though I suppose it depends on what you mean by keys).  We have
precedent for using various checksums and MD5 digests and such for
other files, and I think we'd all agree that at least one of those
existing tools is sufficiently secure for verifying the integrity of
files published on the web and ftp servers and mirrors.
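
For instance, any of the digest tools already in the base system will
do; a trivial check (the file name here is just a placeholder) could
look something like:

    # compute digests of a downloaded copy with tools from the base system
    md5 pkg-vulnerabilities
    cksum pkg-vulnerabilities
    # then compare the output, by eye or with diff(1), against the digest
    # published somewhere an attacker presumably can't also reach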

If the files and their signatures are both separately and securely
published to at least two completely separate mirror roots, then users
can cross-check the integrity of a file from a site in the first mirror
sub-set by retrieving its signature from a site in the other mirror
sub-set.  Ultimately even this use of MD5 digests or whatever as
signatures is really just an optimization.  The whole file can be
separately retrieved from one or more sites in each mirror root tree
and compared bit-for-bit whenever the potential cost of using a damaged
file outweighs the cost of fetching it twice.
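
Concretely a user's cross-check could then be as simple as something
like this (the hostnames and paths are invented purely for
illustration, and I'm assuming the published digest file is just the
output of md5(1)):

    # fetch the file itself from a mirror under the first root
    ftp http://mirror-a.example.org/pub/pkg-vulnerabilities
    # fetch only its digest from a mirror under the second, independent root
    ftp http://mirror-b.example.org/pub/pkg-vulnerabilities.md5
    # the digest from root B must match the file from root A
    md5 pkg-vulnerabilities | diff - pkg-vulnerabilities.md5 && echo cross-check OK
    # or skip digests entirely and compare the whole file bit-for-bit
    ftp -o pkg-vulnerabilities.rootb http://mirror-b.example.org/pub/pkg-vulnerabilities
    cmp pkg-vulnerabilities pkg-vulnerabilities.rootb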
 
I.e. I think the infrastructure nearly exists already -- it should
logically be just a flip of a few small switches to implement, plus all
the maintainers would have to separately publish two copies of
everything (or at least of the most important files) instead of the one
copy they publish now.  That double publishing effort is really not too
difficult to automate either, provided the maintainers have separate
SSH keys (for example) for each root mirror server and they script
their scp/rsync/whatever commands.
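
The publishing side could then be a wrapper as dumb as the following
sketch (the login, key names, hostnames, and target directory are only
placeholders for whatever a maintainer really uses):

    #! /bin/sh
    # publish one file, plus its digest, to both independent mirror roots,
    # using a separate SSH identity for each root
    file="$1"
    md5 "$file" > "$file.md5"
    scp -i "$HOME/.ssh/id_root_a" "$file" "$file.md5" maint@root-a.example.org:/pub/incoming/
    scp -i "$HOME/.ssh/id_root_b" "$file" "$file.md5" maint@root-b.example.org:/pub/incoming/

A compromise of either key or either server still leaves the other root
as an independent reference for the cross-check above.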

> f' requires a clearing house (paid role)

Well, that depends on what threat profile you're considering.  Even a
clearing house won't easily be able to verify the actual integrity of a
file (i.e. protect against a malicious developer); at some point we
have to trust the developers and maintainers of the source data.  I
think what's in question is just the mechanism of how we verify that
what is retrieved from a distribution server matches what the developer
intended, without having to trust the integrity of any single mirror
site.  I don't think we need some third party, supposedly trustworthy
because it is paid well to maintain its reputation, to do the actual
signing of published files.  I also hope we don't need to have all the
developers and maintainers be bonded by some insurance company just so
we can trust their product!  :-)

I believe the suggestion I've made of simply using two separate mirror
root servers (that don't in any way trust each other) is the simplest
way of using existing infrastructure and tools to distribute all files
such that attackers have a much harder time making malicious changes to
them.  At the same time it gives us users several options for verifying
not only that the files have remained unchanged since they were first
published, but also that we've successfully downloaded the whole file
as an exact copy of the original.

That doesn't mean developers and maintainers, especially those touching
important files like the vulnerability list, don't still have to (at
least occasionally) manually verify that what they've actually
published, and what they can retrieve from a mirror site, does match
what they intended to publish....  It's like balancing your cheque-book
or comparing your credit-card receipts with the CC invoice -- the more
a mistake costs, the more often and more carefully you need to check
that mistakes don't go unnoticed.
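
For a maintainer that periodic spot-check is just the same comparison
turned around (placeholders again):

    # does what each mirror root serves still match my local master copy?
    ftp -o /tmp/check-a http://mirror-a.example.org/pub/pkg-vulnerabilities
    ftp -o /tmp/check-b http://mirror-b.example.org/pub/pkg-vulnerabilities
    cmp /tmp/check-a pkg-vulnerabilities && cmp /tmp/check-b pkg-vulnerabilities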

-- 
						Greg A. Woods

+1 416 218-0098                  VE3TCP            RoboHack <woods@robohack.ca>
Planix, Inc. <woods@planix.com>          Secrets of the Weird <woods@weird.com>