[sldev] More crash rate statistics

Aleric Inglewood aleric.inglewood at gmail.com
Wed Aug 12 14:03:28 PDT 2009


At the risk of alienating myself, I thought I'd take up the challenge
and give my view of the problem...

There are two measures: the frame rate (F, the FPS) and the perceived
quality (Q) of each rendered frame.

We want both to be high, but the higher the quality the lower the FPS.

You could draw a 2D graph of "image quality" against FPS, which would
give you a curve.

Let F be the FPS and Q the perceived quality (for which one might be able
to construct a complex mathematical model, but in general it's unclear),
then

        F = C / f(Q)

where f is monotonic in Q (df/dQ > 0 everywhere).

Depending on the "environment" (E), hardware, avatars, region and so on,
this curve will be different, roughly:

        F = C(E) / f(Q)

where C is some function of E.
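
Just to make the model concrete, here is a tiny sketch; the shapes of
f and C below are completely made up, only their monotonic behaviour
matters:

    # Sketch of F = C(E) / f(Q).  Both f and C are placeholder
    # functions chosen only for illustration; the real ones are
    # complex and unknown.

    def f(Q):
        # Any monotonically increasing cost of quality will do here.
        return 1.0 + Q ** 2

    def C(avatar_count, draw_distance):
        # Hypothetical "environment strength": fewer avatars and a
        # shorter draw distance leave more headroom, i.e. a larger C.
        return 2000.0 / (1.0 + avatar_count + 0.05 * draw_distance)

    def fps(Q, avatar_count, draw_distance):
        return C(avatar_count, draw_distance) / f(Q)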

The viewer has no influence on the environment (E), but it does have
influence on C: better viewer code means a larger C for the same E.

Aside from that, the viewer has influence on Q: where on the curve
we are.

Each individual user will have a different point on any given
curve that he or she perceives as the best balance between
quality and FPS; in theory we allow them to choose this in the
'Graphics' tab of the preferences.

However, in theory the optimal point (per user) is ALSO a function
of the environment.

Thus, imagine the curve F_E(Q) (F with suffix E, a function of Q as
above) with one optimal point P. Now vary E, each time choosing the
optimal point P again; this gives a different curve: P(E).

It is this curve that the user should be able to choose.

Imho, even though the curve might be different for different users,
what the viewer currently does is something TOTALLY different from
the ideal curve.

For example, a user might choose the following simplistic demand:

* If the FPS is higher than 20, I don't care about FPS anymore,
  I just want the highest possible quality.

* If the FPS drops below 5, I don't care about the quality anymore,
  do whatever is needed to give me at LEAST an FPS of 5.

For every viewer variable 'k' that determines quality (for example,
the LOD as a function of distance: LOD ~ k / d, where d is the
distance) you could allow the user to choose two {k, FPS} pairs:
a minimum point and a maximum point. Then interpolate between those
as long as the FPS stays between the two limits, and really make k
subordinate to FPS when the FPS goes too low or too high.
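
A minimal sketch of that per-parameter interpolation (all names here
are hypothetical, none of them are existing viewer settings):

    def interpolate_k(measured_fps, k_min, fps_min, k_max, fps_max):
        # Map the measured FPS onto a k between the two user-chosen
        # {k, FPS} pairs; outside the interval k is clamped, i.e.
        # FPS takes priority there.
        if measured_fps <= fps_min:
            return k_min      # FPS too low: give up quality
        if measured_fps >= fps_max:
            return k_max      # plenty of FPS: maximum quality
        t = (measured_fps - fps_min) / (fps_max - fps_min)
        return k_min + t * (k_max - k_min)

    # With the k1 numbers used further below:
    #   interpolate_k(12.5, 0.5, 5, 2.0, 20)  ->  1.25

In practice this would of course have to run as a feedback loop,
because changing k in turn changes the FPS.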

Of course, you need some not-so-simple mathematical model to
determine the k-vector (the values of all the different k's) when
the FPS goes too high or too low. Essentially, though, that is the
same as solving a differential equation in which you can literally
measure the effect of small changes, and in that way find the
optimal point, weighting each k value by its average derivative
of the FPS inside the interval.
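
The "measure the effect of small changes" part could be as simple as
a finite-difference probe; a sketch, where measure_fps is a stand-in
for rendering a short while with the given k-vector and averaging the
frame rate:

    def fps_sensitivity(measure_fps, k_values, rel_step=0.05):
        # Estimate dFPS/dk for each quality parameter by nudging it
        # a few percent and re-measuring the frame rate.
        base = measure_fps(k_values)
        gradient = {}
        for name, k in k_values.items():
            probed = dict(k_values)
            probed[name] = k * (1.0 + rel_step)
            gradient[name] = (measure_fps(probed) - base) / (k * rel_step)
        return gradient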

For example...

The user chooses for parameter k1: {0.5, 5} and {2.0, 20}.
Then we say: apparently a change of a factor of 4 in k1 (from 0.5
to 2.0) is worth a change of a factor of 4 in the FPS
(from 5 to 20), and thus FPS can be weighted with k1
(the exponent is ln(20/5) / ln(2.0/0.5) = 1).

For parameter k2: {1.0, 10} and {2.0, 50}, the weight
factor is k2^(ln(50/10) / ln(2.0/1.0)) = k2^2.32.
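
That exponent is just the ratio of the two logarithmic changes the
user declared equivalent; a one-line helper (hypothetical, for
illustration):

    from math import log

    def weight_exponent(k_lo, fps_lo, k_hi, fps_hi):
        # How many "FPS octaves" one "k octave" is worth to this user.
        return log(fps_hi / fps_lo) / log(k_hi / k_lo)

    weight_exponent(0.5, 5, 2.0, 20)    # -> 1.0   (weight with k1)
    weight_exponent(1.0, 10, 2.0, 50)   # -> 2.32  (weight with k2^2.32)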

So, if in a given environment the FPS drops below 5
(just below 10, k1 would still be dictated by its own pairs and
we'd only have to change k2), we know that we want to keep
k1 / k2^2.32 constant, and thus only have a single variable
left to decrease in order to get an FPS of 5 again.
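
Assuming we can re-measure the FPS after each adjustment (measure_fps
below is again a stand-in for the real renderer), that last step could
look roughly like this:

    def restore_fps(measure_fps, k1, k2, target=5.0, exponent=2.32,
                    step=0.9, max_iters=50):
        # Scale k2 down, and k1 with it, keeping k1 / k2^exponent
        # constant, until the measured FPS is back at the target.
        ratio = k1 / k2 ** exponent       # the invariant we preserve
        for _ in range(max_iters):
            if measure_fps(k1, k2) >= target:
                break
            k2 *= step                    # decrease the free variable
            k1 = ratio * k2 ** exponent   # keep the invariant
        return k1, k2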

> Here's a challenge to sldev. Viewer performance is complex, dependent on
> bandwidth, hardware, the content on the region, other avatars, and so
> on. What do you think a) is the one variable viewer performance should
> ultimately be measured by and b) what are, say, three predictive
> variables that influence the dependent variable. E.g. if you think
> viewer fps is the one variable we should care about, what three factors
> would you consider most important that influence that value? I realize
> this is simplistic, but I'm curious to know from your experience, how
> would you quantify viewer performance.
>
> -Xan Linden

