Date:      Thu, 22 Apr 2004 01:22:33 +1000
From:      Tim Robbins <tjr@freebsd.org>
To:        Eric Anderson <anderson@centtech.com>
Cc:        freebsd-current@freebsd.org
Subject:   Re: Directories with 2million files
Message-ID:  <20040421152233.GA23501@cat.robbins.dropbear.id.au>
In-Reply-To: <40867A5D.9010600@centtech.com>
References:  <40867A5D.9010600@centtech.com>

On Wed, Apr 21, 2004 at 08:42:53AM -0500, Eric Anderson wrote:

> First, let me say that I am impressed (but not shocked) - FreeBSD 
> quietly handled my building of a directory with 2055476 files in it.  
> I'm not sure if there is a limit to this number, but at least we know it 
> works to 2million.  I'm running 5.2.1-RELEASE.
> 
> However, several tools seem to choke on that many files - mainly ls and 
> du.  Find works just fine.  Here's what my directory looks like (from 
> the parent):
> 
> drwxr-xr-x   2 anderson  anderson  50919936 Apr 21 08:25 data
> 
> and when I cd into that directory, and do an ls:
> 
> $ ls -al | wc -l
> ls: fts_read: Cannot allocate memory
>       0

The problem here is most likely that ls is trying to store all of the
filenames in memory so that it can sort them. Try the -f option to
disable sorting. If you really do need a sorted list of filenames,
pipe the output through sort(1) instead.
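
Something along these lines should work (untested off the top of my
head; /tmp/data.sorted is just an example path):

    $ ls -f | wc -l                     # count entries without ls sorting them
    $ ls -f | sort > /tmp/data.sorted   # let sort(1) do the sorting instead

Note that -f also turns on -a, so . and .. and any other dotfiles will
show up in the output.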


Tim


