[opensource-dev] 2.0 Absolute Dealbreaker - script count feature request
soft at lindenlab.com
Wed Sep 29 17:56:35 PDT 2010
On Wed, Sep 29, 2010 at 4:59 PM, Kelly Linden <kelly at lindenlab.com> wrote:
> On Wed, Sep 29, 2010 at 4:47 PM, Brian McGroarty <soft at lindenlab.com> wrote:
>> On Wed, Sep 29, 2010 at 4:06 PM, Kelly Linden <kelly at lindenlab.com> wrote:
>>> * In my mind the biggest issue is that mono scripts will appear 4x worse
>>> than LSL scripts. This is really the reason I am hesitant to push a function
>>> like this through before we have the ability for mono scripts to better
>>> reflect how much memory they may use. We need more development time for any
>>> solution on that front. Right now, because a mono script could use 64k, that
>>> cap is the only number we have available to report. Maybe it would be nice to
>>> have an API to access number of scripts, number of LSL vs. Mono scripts,
>>> amount of memory used and script time used. However we rapidly get away from
>>> my desired philosophy of minimal interfaces.
>> The vast majority of scripts are tiny utility things, and are only
>> compiled as mono because that became the default a year or so back. In
>> reality, the typical script is probably using much less than 16k.
>> What about using 16k for both LSL and Mono until real Mono values and
>> controls can be added later? This is probably closer to real memory use than
>> the sum of maximums would be.
> Straying away from reporting numbers that are as "real" as we can make them
> is a slippery slope I am a bit wary of, yet all options at this point stray
> from reality in some manner.
> * 64k for Mono doesn't reflect what the script is actually using or what
> its peak was, but does report the highest potential usage.
> * Current Used doesn't reflect that the script could use much more and you
> may have just caught it in a lull.
> * Max Used still doesn't reflect what it is currently using or what it
> could potentially use.
> * 16k seems practically random and unrelated to the script in question.
> * 10k means that in aggregate over hundreds or thousands of scripts it is
> probably pretty accurate. However it doesn't tell you anything about the
> particular script in question.
> Now, if we allowed mono scripts to set their own memory cap then I think we
> have the best compromise of currently used / max used / potential used. If
> that is the best future, I think it probably makes the most sense to stick
> with reporting the cap now, even if it isn't configurable yet and isn't the
> most ideal number at this time. Then things will continue to "just work" as
> we move forward.
Using a value other than 16k before configurable limits are in place will
create pressure to adopt or abandon Mono, depending on the value chosen.
It's not random - it's the only value that defers encouraging change until
it can be done with meaningful data. Whether that's the right goal, I don't know.
Brian McGroarty | Linden Lab
Sent from my Newton MP2100 via acoustic coupler