Date:      Fri, 4 May 2007 13:33:09 -0400
From:      David Banning <>
To:        dex <>
Subject:   Re: can't zip large files 2gb >
Message-ID:  <>
In-Reply-To: <>
References:  <> <> <>

> Try the same operation on a known working system, take that output
> file and do a diff with that and the corrupt one after a 'strings', so
> 'strings new.gz > new-text', 'strings corrupt.gz > corrupt-text',
> 'diff new-text corrupt-text'.  I'm just interested in how it's being
> corrupted and maybe the strings output will tell you something.
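The suggested comparison can be sketched as a short shell snippet. The filenames new.gz and corrupt.gz are placeholders for a known-good and a suspect archive; tiny samples are generated here so the commands run as-is:

```shell
# Stand-ins for a known-good archive and the suspect one
# (in the real case these would be the actual tarballs).
printf 'hello world\n' | gzip > new.gz
printf 'hello wOrld\n' | gzip > corrupt.gz

# Extract the printable strings from each and diff the results,
# as suggested above. diff exits nonzero when the files differ,
# which is expected here, so guard it with || true.
strings new.gz > new-text
strings corrupt.gz > corrupt-text
diff new-text corrupt-text || true
```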

I don't have a separate system, but I compared the strings output
of the tar before compression with the strings output of the tar
-after- compression and decompression - as I mentioned, the size
difference is only two bytes.

The result was that memory was exhausted when attempting a diff
of the two files, but there was around a 1 MB size difference
between the two 1.5 GB ASCII files.
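Since diff holds large chunks of both files in memory, cmp may be a workable substitute here: it streams the inputs byte by byte, so it copes with multi-gigabyte files. A minimal sketch on small stand-in files:

```shell
# Two small files differing in a single byte (stand-ins for the
# two 1.5 GB strings dumps).
printf 'abcdef' > a.txt
printf 'abXdef' > b.txt

# -l lists every differing byte: offset, old value, new value (octal).
cmp -l a.txt b.txt

# Count how many bytes differ without holding either file in memory.
cmp -l a.txt b.txt | wc -l   # prints 1 (one differing byte)
```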

> Sorry if this was specified before, but did this just start happening
> or is this the first time you've tried to gzip large files on this
> system?

This is the first time I have tried files of this size - but I get
the same problem no matter which compression utility I use; I've
tried gzip, bzip2, rzip and compress.
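The round-trip test that keeps failing can be scripted for each utility in turn. This sketch exercises only gzip, on the assumption that bzip2, rzip, and compress would slot into the same loop if installed:

```shell
# Round-trip check: compress, decompress, byte-compare against the
# original. A 'CORRUPT' line reproduces the reported failure mode.
printf 'sample payload\n' > orig
for z in gzip; do
    "$z" -c  orig   > orig.z
    "$z" -dc orig.z > roundtrip
    cmp -s orig roundtrip && echo "$z ok" || echo "$z CORRUPT"
done
```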
