Date:      Tue, 23 Jul 1996 17:12:32 -0500 (CDT)
From:      Jason Garman <garman@phs.k12.ar.us>
To:        "G. Jin" <gj@swanlake.com>
Cc:        questions@freebsd.org
Subject:   Re: one large file or many small files
Message-ID:  <Pine.LNX.3.92.960723170933.19968A-100000@phs.k12.ar.us>
In-Reply-To: <31F5227B.652@swanlake.com>

On Tue, 23 Jul 1996, G. Jin wrote:

>
> Can anybody tell me which way is better?
>
> I have the following considerations so far:
>
> 1. Can Linux/FreeBSD support 100K files?
>
I don't see why not...

>    b. In case many individual files, I will assign his data file to
>       "f12345.dat".  With 100K files in place, will accessing a
>       certain file "f12345.dat" cause too slow a directory search
>       to find its address?
>
Why don't you use a hashed directory structure instead of stuffing 100,000
files into one directory?  Directory lookups on traditional filesystems are
a linear scan of the entries, so every open of "f12345.dat" in a
100,000-entry directory pays for searching it.  Splitting the files across
many small subdirectories, chosen by a hash of the file name, makes each
lookup much faster.

Look at any decent HTTP proxy-cache for examples of how such a hash scheme
would work.
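A minimal sketch of such a scheme (the function name, the MD5 hash, and
the two-level layout are my own choices for illustration; proxy caches
like Squid use the same basic idea):

```python
import hashlib
import os.path

def hashed_path(root, name, levels=2, width=2):
    """Map a file name to a path inside a hashed directory tree.

    A hash of the name picks the subdirectories, so the files spread
    evenly and no single directory ever holds all 100K entries.
    """
    digest = hashlib.md5(name.encode("ascii")).hexdigest()
    subdirs = [digest[i * width:(i + 1) * width] for i in range(levels)]
    return os.path.join(root, *subdirs, name)

# Yields something like /data/xx/yy/f12345.dat, where xx and yy are
# two hex digits each -- 65,536 leaf directories, so ~2 files apiece.
print(hashed_path("/data", "f12345.dat"))
```

Since the path is derived purely from the name, no index is needed: any
program can recompute where "f12345.dat" lives.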

Enjoy,

--
Jason Garman                             http://www.nesc.k12.ar.us/~garman/
Student, Eleanor Roosevelt High School                 garman@phs.k12.ar.us



