[sldev] Cache speed experiment & results...

Dale Mahalko dmahalko at gmail.com
Tue Jun 3 15:59:40 PDT 2008


The price of huge storage capacities has fallen like a stone in the
past two years. It seems crazy that I should be able to buy a 250 gig
7200 RPM SATA-3.0 hard drive for US$58 on NewEgg. I do wonder if the
drive manufacturers are looking over their shoulder and fearing the
arrival of solid state drives.

Therefore, I would be in favor of storing uncompressed cache data for
all cache types, including sound files, and without a defined storage
limit. Even a 10 gig cache seems far too small.

Just as LL raises the bar with WindLight for higher performance 3D
cards, it should also raise the bar for performance increases via huge
uncompressed caches. SL wants 10 gig for basic performance, and
optionally, 100 gig for enhanced performance? Well, sure, I suppose I
can handle that.


There are, however, filesystem limitations that may cause slowdowns, and
there may be diminishing returns at some point if only the local OS is
relied upon for huge caching storage. I would be interested to see
what performance gain is possible by having a dedicated slave MySQL
database running on the client PC when using a ridiculously huge cache
size.

Windows isn't good at dealing with directories containing tens of
thousands of files. Ever tried opening a folder containing 20,000
files? There is a delay of several seconds when doing this in
Windows. The hidden MFT and directory structures can also become
ridiculously fragmented, leading to further slowdowns.
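One common way around the giant-directory problem (my own illustration
here, not something the SL client necessarily does) is to fan cache
files out into nested subdirectories keyed off a hash of the asset ID,
so no single folder ever holds more than a few hundred entries. A
minimal Python sketch, with hypothetical names:

```python
import hashlib
from pathlib import Path

def fanout_path(cache_root: str, asset_id: str, levels: int = 2) -> Path:
    """Map an asset ID to a nested subdirectory path, e.g.
    cache/ab/cd/<asset_id>, so directories stay small.
    (Hypothetical helper for illustration only.)"""
    digest = hashlib.sha1(asset_id.encode()).hexdigest()
    # Take the first `levels` byte-pairs of the hash as directory names.
    parts = [digest[i * 2:i * 2 + 2] for i in range(levels)]
    return Path(cache_root, *parts, asset_id)

path = fanout_path("cache", "5748decc-f629-461c-9a36-a35a221fe21f")
```

With 2 levels of 256 subdirectories each, even a multi-million-file
cache averages only a handful of files per directory.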

A proper local database engine would likely deal with these huge
numbers of tiny files more efficiently than the local operating
system. An alternative would be to leave a large portion of system
memory free to cache these huge directory structures alongside the
client, say 2-4 gig of memory free, over and above what the client is
using.
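To make the database idea concrete: an embedded engine keeps all the
tiny assets inside one large file, so the filesystem never sees the
thousands of entries at all. A minimal sketch using SQLite as a
stand-in (the post proposes MySQL; the table layout and class names
here are my own assumptions, not anything in the SL client):

```python
import sqlite3

class BlobCache:
    """Toy single-file asset cache: many small blobs, one database file.
    (Illustrative stand-in for a real client-side cache database.)"""

    def __init__(self, db_path: str = ":memory:"):
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS assets "
            "(id TEXT PRIMARY KEY, data BLOB NOT NULL)")

    def put(self, asset_id: str, data: bytes) -> None:
        # Overwrite any existing copy of the asset.
        self.conn.execute(
            "INSERT OR REPLACE INTO assets VALUES (?, ?)",
            (asset_id, data))
        self.conn.commit()

    def get(self, asset_id: str):
        row = self.conn.execute(
            "SELECT data FROM assets WHERE id = ?",
            (asset_id,)).fetchone()
        return row[0] if row else None

cache = BlobCache()
cache.put("texture-001", b"\x00" * 1024)
```

Lookups then go through the engine's B-tree index instead of a
directory scan, which is exactly where it should beat the OS on
huge numbers of tiny files.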

- Scalar Tardis / Dale Mahalko


On Tue, Jun 3, 2008 at 3:01 PM, Buckaroo Mu <sldev at bitparts.org> wrote:
> This is my thought. If I want to compress the cache, I'll tell Windows to
> compress that folder - no, not seriously. I have something on the order of
> 100gb free on my main storage drive - I'd be more than willing to commit 10g
> of that to SL cache, and storing the objects as flat-out decompressed
> ready-to-use (llRawImage?) data would be the logical thing to do.

