Subject: data corruption using gzip
To: None <current-users@netbsd.org>
From: Joseph Sarkes <jsarkes@tiac.net>
List: current-users
Date: 02/28/2001 23:36:58
For the last couple of weeks I've been running a tar -czf command
to save my source tree. All my backups turn out bad, with corrupt
gzip files.
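
For reference, the backup amounts to a single command along these
lines (the paths and archive name here are only illustrative):

    # roughly the backup command; paths and archive name are illustrative
    cd /usr && tar -czf /backup/src-`date +%Y%m%d`.tar.gz src pkgsrc xsrc doc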

I can make a file approximately 1GB in size containing my
doc/src/pkgsrc/xsrc trees, and tar reads it back without complaint.
I can also copy it with cp or dd and get a duplicate that cmp finds
no differences in. However, if I gzip the file and then try to read
it back (tar -tzf ...), gunzip complains about a bad gzip file after
a variable amount of time: the failure is at the same place for a
particular .gz file, but at a different place for a freshly gzipped
copy of the same input.
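
A minimal sketch of the kind of repeat test that shows this
(file names are placeholders):

    # big.tar stands in for the ~1GB archive; names are placeholders
    for i in 1 2 3; do
        gzip -c big.tar > copy$i.tar.gz
        tar -tzf copy$i.tar.gz > /dev/null || echo "copy$i.tar.gz is corrupt"
    done
    # each corrupt copy fails at its own (but repeatable) offset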

It looks like either a hardware failure or buffers somehow not being
written to disk correctly, or something along those lines.

Is there some kind of stress program I can run to localize what is
causing the error? My system is an Asus A7V Athlon board running
-current, with 256MB of non-parity memory and an 850 MHz processor.
So either there is a bad memory location, or gzip itself is failing
intermittently (not likely), or the softdep filesystem is falling
over under heavy disk use. I'll try without softdep and see what
happens, but if anyone else is having similar problems, or has
ideas, I'd like to hear about it so we can sort this out.
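
One rough way to localize it would be to take the softdep write path
out of the picture entirely: compress the same input several times
without writing the output to disk and compare checksums. A sketch
(big.tar stands in for the real archive):

    # compress the same input repeatedly without writing anything to disk;
    # if the cksum output differs from run to run, the corruption happens
    # in memory/CPU (or on the read side), not in the softdep write path
    for i in 1 2 3 4 5; do
        gzip -c big.tar | cksum
    done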
-- 
Joseph Sarkes		jsarkes@tiac.net