Date: Fri, 4 May 2007 11:53:44 -0400
From: David Banning <david+dated+1178726029.919f5b@skytracker.ca>
To: Eduardo Morras <nec556@retena.com>
Cc: questions@freebsd.org
Subject: Re: can't zip large files 2gb
Message-ID: <20070504155343.GA13432@skytracker.ca>
In-Reply-To: <462CBDD1003FC3F4@>
References: <20070501195825.GA10269@skytracker.ca> <462CBDD1003FC3F4@>
> What version of gzip are you using? From www.gzip.org:
>
> gzip 1.2.4 may crash when an input file name is too long (over
> 1020 characters).

Not in my case.

> The buffer overflow may be exploited if gzip is
> run by a server such as an ftp server. Some ftp servers allow
> compression and decompression on the fly and are thus vulnerable.

I'm not running it via ftp.

> This <http://www.gzip.org/gzip-1.2.4b.patch>patch to gzip 1.2.4
> fixes the problem. The beta version
> <http://www.gzip.org/gzip-1.3.3.tar.gz>1.3.3 already includes a
> sufficient patch; use this version if you have to handle files
> larger than 2 GB.

I tried that version, but I still have the same problem.

> A new official version of gzip will be released
> soon.
>
> Also, check if you have compiled gzip with 64bit I/O.
> Check this gzip FAQ http://www.gzip.org/#faq10 too.
>
> HTH

I used their suggestion for decompressing:

    gunzip < file.gz > file

which actually completes (citing a CRC error), but gives a file size 2 bytes
greater than the original when unzipped.
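For anyone following the thread, here is a minimal round-trip check for the redirection style of gunzip suggested above. The file names and the 1 MB test size are arbitrary; the same commands apply to files over 2 GB, assuming gzip was built with 64-bit I/O (typically compiled with -D_FILE_OFFSET_BITS=64, per the gzip FAQ):

```shell
# Create a 1 MB file of random data as a stand-in for the real file.
dd if=/dev/urandom of=original.bin bs=1024 count=1024 2>/dev/null

# Compress to stdout, then decompress via redirection, as suggested.
gzip -c original.bin > original.bin.gz
gunzip < original.bin.gz > restored.bin

# Sizes and contents must match exactly; a file a couple of bytes
# off, or a CRC error, points at a 32-bit-limited gzip build.
ls -l original.bin restored.bin
cmp original.bin restored.bin && echo "round-trip OK"
```

If this passes on small files but fails only past the 2 GB mark, that is consistent with the 64-bit I/O problem mentioned in the FAQ rather than a corrupt archive.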