Date:      Wed, 3 Oct 2007 14:00:13 -0600
From:      Chad Perrin <perrin@apotheon.com>
To:        FreeBSD Questions <freebsd-questions@freebsd.org>
Subject:   Re: Managing very large files
Message-ID:  <20071003200013.GD45244@demeter.hydra>
In-Reply-To: <200710041458.22743.wundram@beenic.net>
References:  <4704DFF3.9040200@ibctech.ca> <200710041458.22743.wundram@beenic.net>

On Thu, Oct 04, 2007 at 02:58:22PM +0200, Heiko Wundram (Beenic) wrote:
> On Thursday 04 October 2007 14:43:31, Steve Bertrand wrote:
> > Is there any way to accomplish this, preferably with the ability to
> > incrementally name each newly created file?
> 
> man 1 split
> 
> (esp. -l)
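
For reference, an invocation along these lines would do it (the chunk
size and prefix are just illustrative):

  split -l 1000000 hugefile.txt part.

That writes part.aa, part.ab, part.ac, and so on, a million lines
apiece.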

That's probably the best option for a one-shot deal like this.  On the
other hand, Perl itself can step through a file one line at a time, so
you could read a line, operate on it, and write it out to a new file as
needed, repeating until you've worked through the whole file.
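
A minimal sketch of that approach (the input name, the output naming
scheme, and the chunk size are all invented for illustration):

  #!/usr/bin/perl
  use strict;
  use warnings;

  my $lines_per_file = 1_000_000;   # arbitrary chunk size
  my $count          = 0;
  my $file_number    = 0;
  my $out;

  open my $in, '<', 'hugefile.txt' or die "can't open input: $!";
  while (my $line = <$in>) {
      # start a new, incrementally named output file as needed
      if ($count % $lines_per_file == 0) {
          close $out if $out;
          my $name = sprintf 'chunk_%04d.txt', $file_number++;
          open $out, '>', $name or die "can't open $name: $!";
      }
      print $out $line;   # operate on the line here if you need to
      $count++;
  }
  close $out if $out;
  close $in;

Only one line is ever held in memory at a time, so the size of the
input hardly matters.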

The real problem would be reading the whole file into a variable (or
even several variables) at once, since that means holding the entire
file in memory.
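
(The slurp to avoid, in other words, is something like

  my @lines = <$in>;

which pulls every line of the file into memory in one go.)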

-- 
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]
Isaac Asimov: "Part of the inhumanity of the computer is that, once it is
completely programmed and working smoothly, it is completely honest."


