[sldev] 1.20 wreaking havok with nvidia drivers under vista

Dave Parks davep at lindenlab.com
Thu Apr 10 19:50:58 PDT 2008


I think this is being caused by the extra impostor updates that are 
happening in 1.20.  Unstable render pipe + more passes through the pipe 
= more crashes.  Try disabling avatar impostors and see if that helps. 

It looks like some part of the render pipe is leaving a vertex data 
stream enabled for reading, and subsequent calls occasionally read off 
the end of the stream.  I have work in an internal branch (maint-render) 
that protects vertex data streams using LLVertexBuffer (removing the 
willy-nilly calls to glEnable/DisableClientState, glClientActiveTexture, and 
glDrawRangeElements).  Time will tell if this is the culprit.
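
To make the failure mode concrete, here's a rough sketch of the pattern 
(the guard class and function names below are illustrative, not the actual 
LLVertexBuffer code): one pass enables a client array and forgets to 
disable it, and a later draw that binds a smaller buffer walks the stale 
pointer past the end of the earlier pass's data. Funneling every 
enable/disable through one scoped owner is the shape of the fix:

#include <GL/gl.h>
#include <GL/glext.h>  // glDrawRangeElements is GL 1.2+; on Windows it is
                       // normally fetched through wglGetProcAddress

// Scoped owner for the client-side vertex streams used by one draw pass.
class VertexStreamGuard
{
public:
    VertexStreamGuard(const GLfloat* positions, const GLfloat* texcoords)
        : mHasTexCoords(texcoords != 0)
    {
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, positions);

        if (mHasTexCoords)
        {
            glEnableClientState(GL_TEXTURE_COORD_ARRAY);
            glTexCoordPointer(2, GL_FLOAT, 0, texcoords);
        }
    }

    ~VertexStreamGuard()
    {
        // Runs at the end of every pass, so nothing is left enabled for
        // reading by a later, unrelated draw call.
        if (mHasTexCoords)
        {
            glDisableClientState(GL_TEXTURE_COORD_ARRAY);
        }
        glDisableClientState(GL_VERTEX_ARRAY);
    }

private:
    bool mHasTexCoords;
};

void drawPass(const GLfloat* positions, const GLfloat* texcoords,
              GLuint minIndex, GLuint maxIndex,
              GLsizei indexCount, const GLushort* indices)
{
    VertexStreamGuard guard(positions, texcoords);
    glDrawRangeElements(GL_TRIANGLES, minIndex, maxIndex, indexCount,
                        GL_UNSIGNED_SHORT, indices);
}   // guard goes out of scope here: both arrays are disabled, so the next
    // pass cannot read off the end of this pass's data via a stale pointer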

Moving forward, we'll be looking at removing all GL dependencies from 
the newview project and creating an API-agnostic interface to the 
graphics card.  This is an early step in that direction, hot on the 
heels of the use of gGL.begin/gGL.end vs. glBegin/glEnd.
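
As a rough illustration of where that's headed (the RenderFacade/gRender 
names are made up for this sketch, not the viewer's actual classes), the 
idea is that call sites only ever see an API-neutral interface, and exactly 
one backend translates it into GL:

#include <GL/gl.h>

// API-neutral interface: nothing here mentions GL types or constants.
class RenderFacade
{
public:
    enum PrimitiveMode { TRIANGLES, LINES, POINTS };

    virtual ~RenderFacade() {}
    virtual void begin(PrimitiveMode mode) = 0;
    virtual void vertex3f(float x, float y, float z) = 0;
    virtual void end() = 0;
};

// The one place that is allowed to talk to GL directly.
class GLRenderFacade : public RenderFacade
{
public:
    void begin(PrimitiveMode mode)
    {
        static const GLenum sModes[] = { GL_TRIANGLES, GL_LINES, GL_POINTS };
        glBegin(sModes[mode]);
    }
    void vertex3f(float x, float y, float z) { glVertex3f(x, y, z); }
    void end()                               { glEnd(); }
};

// One shared instance, used the way gGL is used today:
//     gRender.begin(RenderFacade::TRIANGLES);
//     gRender.vertex3f(...);
//     gRender.end();
static GLRenderFacade sGLFacade;
RenderFacade& gRender = sGLFacade;

Swapping in a different backend (or a batching one, which is closer to what 
gGL actually does) then only means providing another implementation of the 
interface; nothing in newview would need to change.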


Brad Kittenbrink (Brad Linden) wrote:
> Thomas Grimshaw wrote:
>> Yes, I will of course file a Jira. Some early information which might 
>> be of help to you, though: disabling atmospheric shaders seems to help 
>> the issue immensely (it's still unstable, though).
>>
> The atmospheric shaders bit is interesting. I'd be curious to hear how 
> the behavior interacts with the Enable VBOs checkbox on the hardware 
> settings menu, and the RenderMaxVBO debug setting (from the Advanced 
> menu -> Debug Settings...).
>
> -Brad
