Subject: Re: SUP Updates - Sanity check
To: Michael Graff <explorer@iastate.edu>
From: davidb@melb.cpr.itg.telecom.com.au
List: current-users
Date: 12/21/1993 22:04:14
>>I would hate to have to go back to the tar file scenario.

> I've got a machine on campus which can SUP, and one at home which cannot (UUCP
> connected to campus)

> I'd be very interested in writing something which would parse the CVS output
> and/or the SUP file list directly and decide what to do about removed files,
> new files, and updated files.  Kinda a batch-sup or something of the sort.

> Has anyone else looked into this?

I set up a simple system like this for a different application a while ago.
If you can generate a file list for each end (the master and your slave)
where each line is of the form:

	path-name	ID
		(where ID is a timestamp or other such id)

then it's fairly simple to use sort/diff/awk/perl/whatever to compare
the two lists and produce a list of files to delete and a list to download.
It's also easy to check those against your list of local modifications
so that local changes aren't lost automagically.
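
For instance, something along these lines works with just sort/join/awk,
assuming both lists are sorted on the path and the path names contain no
whitespace (the list file names here are only for illustration):

	sort -o master.list master.list
	sort -o slave.list  slave.list

	# on the slave but not on the master -> delete
	join -v 2 master.list slave.list | awk '{print $1}' > delete.list

	# on the master but not on the slave, or the IDs differ -> download
	join -v 1 master.list slave.list | awk '{print $1}'  > fetch.list
	join      master.list slave.list | awk '$2 != $3 {print $1}' >> fetch.list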

I wrote a simple program that takes a set of filenames as arguments
and prints one line per file with the name and the file modification time,
and called that from find/xargs.  The find didn't descend into CVS or obj
directories and only reported files, not directories.
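
The find end of it was roughly like this ("printstamp" standing in for
that little mtime-printing program; it isn't anything standard):

	find /usr/src \( -name CVS -o -name obj \) -prune -o -type f -print \
		| xargs printstamp | sort > slave.list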

For my purposes the mod time was sufficient.  For some purposes a message
digest (e.g. MD5) is better, but that tends to chew a bit more CPU...
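
With a checksum program that prints "digest filename" pairs (md5sum-style
output; a BSD md5 formats its output differently), the same pipeline just
swaps the ID column:

	find /usr/src \( -name CVS -o -name obj \) -prune -o -type f -print \
		| xargs md5sum | awk '{print $2, $1}' | sort > slave.list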

If an ID can be agreed upon, it should be possible to generate such a
file list on sun-lamp at update time and make it available for FTP.

- David B.
