Date:      Thu, 04 Oct 2007 16:16:29 -0400
From:      Steve Bertrand <iaccounts@ibctech.ca>
To:        FreeBSD Questions <freebsd-questions@freebsd.org>
Subject:   Re: Managing very large files
Message-ID:  <47054A1D.2000701@ibctech.ca>
In-Reply-To: <20071003200013.GD45244@demeter.hydra>
References:  <4704DFF3.9040200@ibctech.ca> <200710041458.22743.wundram@beenic.net> <20071003200013.GD45244@demeter.hydra>

>> man 1 split
>>
>> (esp. -l)
> 
> That's probably the best option for a one-shot deal like this.  On the
> other hand, Perl itself provides the ability to go through a file one
> line at a time, so you could just read a line, operate, write a line (to
> a new file) as needed, over and over, until you get through the whole
> file.
> 
> The real problem would be reading the whole file into a variable (or even
> multiple variables) at once.

This is what I am afraid of. Just out of curiosity, if I did try to read
the entire file into a Perl variable all at once, would the box panic,
or, as the saying goes, 'what could possibly go wrong'?

Steve



