[opensource-dev] Question about LOD debug setting

Ann Otoole missannotoole at yahoo.com
Sun Oct 3 12:48:55 PDT 2010


It simply depends on your computer and video card. I can run that setting at 4 
with no noticeable drop in frame rate. 
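
A rough sketch of what the setting actually does may help explain why it 
depends on your hardware. This is just the idea, not the viewer's real LOD 
code, and the formula and thresholds below are made up for illustration: 
RenderVolumeLODFactor scales the apparent-size estimate used to pick a level 
of detail, so larger values keep objects at higher LODs farther from the 
camera, which means more triangles for the GPU to chew on.

// Illustrative only -- not the viewer's actual LOD selection code.
// The RenderVolumeLODFactor-style multiplier biases the choice toward
// higher detail at a given distance.
#include <algorithm>

int pickLOD(float objectRadius, float distanceToCamera, float lodFactor)
{
    // Apparent size grows with radius and shrinks with distance; the
    // multiplier pushes it upward, delaying the switch to lower LODs.
    float apparentSize =
        (objectRadius / std::max(distanceToCamera, 0.1f)) * lodFactor;

    if (apparentSize > 1.0f)  return 3; // highest LOD
    if (apparentSize > 0.5f)  return 2;
    if (apparentSize > 0.25f) return 1;
    return 0;                           // lowest LOD
}

That is why "it depends": the cost of a higher factor is paid in triangles 
rendered per frame, and a stronger card simply doesn't notice the extra load.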


LL's quest to remain mired in circa-1999 graphics is admirable and noble and all, 
but it is costing SL/LL a vast amount of money. When mesh is rolled out and the 
inevitable "use all available resources" happens, I dare say SL will look a lot 
better, but it will probably lose the lower end anyway: those users will get 
tired and drop out because they can't have a decent experience, and they'll 
choose alcohol, tobacco, reefer, and pizza over a decent gaming rig.

As is said at Microsoft: "Upgrade or die." This too shall happen with SL if SL is 
to survive for the long term. If LL doesn't do it, then someone else will.

As for defaults, yes, the ultra default should be 4. LL recently changed the 
gpu_table.txt settings, and any card that defaults to ultra can more than handle 
RenderVolumeLODFactor at 4 with no noticeable impact. My GT240 on the Kirstens 
viewer, with full shadows tweaked for realism, gets a great frame rate with 
RenderVolumeLODFactor at 4. With shadows off and RenderVolumeLODFactor at 4 I 
get better than 29.97 FPS, which is cinematic quality. 
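
For anyone poking at this from viewer code rather than the Advanced > Debug 
Settings dialog, something like the following should do it. This is from memory 
and only a sketch: gSavedSettings is the viewer's global settings group, but 
double-check the header name and the getF32/setF32 calls against your tree.

// Sketch: raise RenderVolumeLODFactor from viewer code. Verify the include
// and the control-group API against the source you are actually building.
#include "llviewercontrol.h"   // declares gSavedSettings (assumption)

void bumpLODFactor()
{
    F32 current = gSavedSettings.getF32("RenderVolumeLODFactor");
    if (current < 4.f)
    {
        gSavedSettings.setF32("RenderVolumeLODFactor", 4.f);
    }
}

The same value can of course just be typed into the Debug Settings floater.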


However, I get tired of ARC pundits spreading lies, especially the ones saying 
mesh is low ARC: currently a theoretical mesh at 64 ktris/frame (64,000 
triangles per frame) will register less than 20 ARC, depending on the settings 
and scripts involved, because to ARC it is, after all, one damned prim. Oh, and 
BTW, "ARC" will have to be changed to show mesh render cost and to leave out 
the script cost. Make a separate script cost measure and a real worst-case 
ktris/frame estimate for the avatar. Then we need parcel/region render cost 
metrics available as well, and an estimated-bytes-downloaded measure for people 
on capped bandwidth plans. The entire concept of impact metrics needs to be 
revisited and done right, IMHO.
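
To make that last point concrete, here is the kind of split-out breakdown I 
mean. This is purely hypothetical -- none of these fields exist in any viewer 
today -- it just shows what per-avatar impact metrics might look like if they 
were done right, and the same record could be aggregated per parcel or region.

// Hypothetical per-avatar impact record; every field is an assumption about
// what a redesigned metric could report, not anything that exists now.
#include <cstdint>

struct AvatarImpact
{
    uint32_t renderCost;        // mesh-aware geometry/texture render weight
    uint32_t scriptCost;        // script load, kept separate from render cost
    uint32_t worstCaseKTris;    // worst-case thousands of triangles per frame
    uint64_t estDownloadBytes;  // estimated asset bytes, for capped connections
};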




________________________________
From: leliel <leliel.mirihi at gmail.com>
To: opensource-dev <opensource-dev at lists.secondlife.com>
Sent: Sun, October 3, 2010 2:54:36 PM
Subject: Re: [opensource-dev] Question about LOD debug setting

On Sun, Oct 3, 2010 at 10:37 AM, Ponzu <lee.ponzu at gmail.com> wrote:
> I picked up a notecard that says to increase RenderVolumeLODFactor to 4.  Is
> this reasonable, do you think?  And if so, why not increase the default a
> bit (currently seems to be 1.125)?

It is reasonable; the default setting is a bit low. It varies with
your graphics settings though: 0 for low, 1.125 for mid & high, and 2 for
ultra, IIRC. I find 3 a good compromise between quality and
performance.


> Unlike increasing your draw distance, this will NOT create lag for yourself

This, however, is blatantly false. If rendering everything at full
detail all the time didn't cause a drop in frame rate, then why would
we even bother with LOD?