[opensource-dev] J2C fast decoder

Francesco Rabbi sythos at gmail.com
Mon Sep 13 01:40:03 PDT 2010

> Would it be a huge problem, for example, to transfer HTTP textures
> as TGA or PNG and use one of the rather well-optimized decoder

TGA has lossless RLE compression anyway... and transport is
format-independent: you can send voice over HTTP too (as Skype does), or
anything else you want. To improve bandwidth use and rendering performance
we need a client-side routine (the viewer must ask the asset server only
for viewable textures), and the servers should send them in the right
order, from the closest object to the farthest.
Scaling/resizing should be done by the viewer after the render engine
computes the size of the surface the texture sits on. Receiving an already
downsampled image means that each movement of the avatar/camera triggers a
re-transmission of the same image, each time "bigger"; in the
medium-to-long term it is better to receive the full-resolution texture
once and let the viewer resize and render it, discarding down to the right
level based on local viewer-side settings (resolution, monitor DPI, CPU
power).
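The "discard to the right level" idea can be sketched like this, assuming (as JPEG2000 does) that each discard level halves both dimensions of the decoded image. The function name and limits are hypothetical, not the viewer's actual API:

```cpp
#include <cstddef>

// Hypothetical helper: pick a JPEG2000 discard level for a texture.
// Each discard level halves the decoded width and height, so we keep
// discarding resolution levels as long as the result still covers the
// on-screen size of the surface.
int pickDiscardLevel(int fullWidth, int fullHeight,
                     int screenWidth, int screenHeight,
                     int maxDiscard = 5)
{
    int level = 0;
    int w = fullWidth, h = fullHeight;
    // Stop once one more halving would drop below the on-screen size.
    while (level < maxDiscard && w / 2 >= screenWidth && h / 2 >= screenHeight)
    {
        w /= 2;
        h /= 2;
        ++level;
    }
    return level;
}
```

For example, a 1024x1024 texture covering a 100x100-pixel patch on screen would be decoded at discard level 3 (128x128), since one more halving (64x64) would fall below the on-screen size.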

But, as always, we are talking with only our own hardware in our hands;
all of this could be discussed much better if some statistical data about
residents' hardware were available somewhere.

If LL (better yet, the other viewer teams too) could collect anonymous
data, all the decisions about decoding, the pipeline, shadows or shaders,
and everything else could be taken more easily. An approximate list of
useful data (if a statistics collector is, or will be, enabled) to probe
every XX minutes:

- CPU MIPS/BogoMIPS
- CPU core count (logical, not physical; just the grand total)
- average load of the cores
- CPU load of the viewer executable
- number and average load of plugins (voice included)
- amount of RAM (and how much is free)
- amount of swap (and how much is used)
- brand and model of the graphics card
- graphics settings
- number of agents in the visible area
- average rendering cost of the agents in the visible area
- FPS, collected only when the viewer isn't minimized
- inbound & outbound bandwidth
- bandwidth used by the viewer
- bandwidth used by plugins, voice included
- % of packet loss
- uptime of the connection
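As a sketch only, the list above could map onto a snapshot structure like the following (all names are hypothetical, sampled on a timer every XX minutes); the small helper just shows how a derived metric would be computed from it:

```cpp
#include <cstdint>
#include <string>

// Hypothetical snapshot of the anonymous statistics listed above,
// sampled on a fixed timer (every XX minutes). All field names are
// illustrative; none of this is an actual viewer API.
struct HardwareStatsSnapshot
{
    double      cpuBogomips;
    int         logicalCores;            // grand total, not physical cores
    double      avgCoreLoad;             // 0.0 .. 1.0
    double      viewerCpuLoad;
    int         pluginCount;             // voice included
    double      avgPluginLoad;
    uint64_t    ramTotalBytes, ramFreeBytes;
    uint64_t    swapTotalBytes, swapUsedBytes;
    std::string gpuBrandModel;
    std::string graphicsSettings;
    int         visibleAgents;
    double      avgAgentRenderCost;
    double      fps;                     // sampled only when not minimized
    uint64_t    inboundBps, outboundBps; // total in/out bandwidth
    uint64_t    viewerBps, pluginBps;    // viewer and plugin shares
    double      packetLossPercent;
    double      connectionUptimeSec;
};

// Example of a derived metric a collector could report.
double ramUsedPercent(const HardwareStatsSnapshot& s)
{
    if (s.ramTotalBytes == 0) return 0.0;
    return 100.0 * double(s.ramTotalBytes - s.ramFreeBytes)
                 / double(s.ramTotalBytes);
}
```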

To be sure no data is collected twice and no personal data is used to
collect it, I suggest using the serial number of the CPU (on Linux,
/proc/cpuinfo; on Mac, the same; on Windows I don't know how) or some sort
of unique UUID based on the hardware configuration (if somebody upgrades
the RAM or CPU, a new ID should be used).
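A unique-but-anonymous ID of that kind could be sketched by hashing a description of the hardware configuration. This is only an illustration (FNV-1a over an arbitrary description string, with hypothetical names), not a proposal for the actual fingerprinting scheme:

```cpp
#include <cstdint>
#include <string>

// Hypothetical: derive a stable, anonymous machine ID by hashing a
// string describing the hardware configuration (CPU serial/model, RAM
// size, GPU, ...). Uses the 64-bit FNV-1a hash; changing any component
// (e.g. a RAM upgrade) yields a different ID, as suggested above.
uint64_t hardwareId(const std::string& hwDescription)
{
    uint64_t hash = 14695981039346656037ull;   // FNV offset basis
    for (unsigned char c : hwDescription)
    {
        hash ^= c;
        hash *= 1099511628211ull;              // FNV prime
    }
    return hash;
}
```

The same description always hashes to the same ID, so repeated samples from one machine can be deduplicated without sending anything personally identifying.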

Sent by iPhone

On 13 Sep 2010, at 07:40, Tateru Nino <tateru at taterunino.net>
wrote:

If we're using HTTP textures, is there actually any need for the JPEG 2000
format? Since the transfer time of individual textures is vastly reduced
(from the first byte to the last byte) the intermediate quality levels
supported by jpg2k would seem to be redundant. Indeed, you could argue that
transferring the textures in jpg2k format imposes a now-redundant workload
on the texture-pipeline, and that providing HTTP textures in a simpler
format that is more tractable to high-speed, low-cost decoding would avoid a
whole lot of problems.

Would it be a huge problem, for example, to transfer HTTP textures as TGA or
PNG and use one of the rather well-optimized decoder libraries for those
instead? It seems to me that it would be more efficient both on the network
and on the system - though at the expense of conversion of all the textures
at the store.

Just thinking out loud.

On 13/09/2010 1:58 PM, Sheet Spotter wrote:

Some comments on SNOW-361 (Upgrade to OpenJPEG v2) suggested that changing
to version 2 of OpenJPEG might improve performance, while other comments
suggested it might not support progressive decoding.


Is an upgrade to OpenJPEG v2 under active development?

Sheet Spotter


*From:* opensource-dev-bounces at lists.secondlife.com [
mailto:opensource-dev-bounces at lists.secondlife.com]
*On Behalf Of *Philippe (Merov) Bossut
*Sent:* September 9, 2010 10:35 PM
*To:* Nicky Fullton
*Cc:* opensource-dev at lists.secondlife.com
*Subject:* Re: [opensource-dev] J2C fast decoder

Hi Nicky,

As it happens, I've been working on instrumenting the code to add metric
gathering for image decompression as part of the Snowstorm sprint.

You may want to use my branch (
https://bitbucket.org/merov_linden/viewer-development-vwr-22761) and create
a baseline for openjpeg, then run a test for Jasper. You'll certainly have
to sort out the failing cases and just throw them out, so that we compare
only what truly gets decompressed (though, clearly, working in all cases is
pretty critical if we are considering Jasper as an alternative).

Here's what I got comparing KDU and OpenJpeg:
Metric                        KDU(B)     OJ2C(T)   Diff(T-B)  Percentage(100*T/B)
TotalBytesInDecompression    5048643     5003370      -45273       99.1
TotalBytesOutDecompression  40415336    46592896     6177560      115.29
TimeTimeDecompression           3.74       17.04        13.3
TotalBytesInDecompression    5000744     5000144        -600
TotalBytesOutDecompression  46440040    44248324    -2191716       95.28
TimeTimeDecompression           3.64       15.02       11.37

For that test, I output data every time 5MB of compressed data has been
processed. It's partial, but it shows that OpenJpeg is roughly 4 times
slower than KDU (at least the version we're currently using in the official
viewer). It would be nice to have a similar set of numbers for Jasper
before going too far down the implementation path.

I wrote a short (and still incomplete) wiki page to explain a bit how the
metric gathering system works:
- https://wiki.secondlife.com/wiki/Performance_Testers

BTW, that's something we should be using more generally for other perf
sensitive areas, especially when starting a perf improvement project.

See http://jira.secondlife.com/browse/VWR-22761 for details.

- Merov

On Fri, Sep 3, 2010 at 9:05 AM, Nicky Fullton <nickyd_sl at yahoo.com> wrote:


>> I'm testing the JasPer decoder for JPEG2000 images in an RL office (not
>> in a viewer). After a short test with OpenJPEG from EPFL, we have spent
>> the last 3 days testing JasPer (only a POC app to run some benchmarks).
>> We still have a lot of work to do, but here is a little question... has
>> anybody around here ever tried it as an alternative to OpenJPEG/KDU in a
>> viewer?

>I'm not aware of anyone publishing results for such a test, but if you
>have the time it would be interesting reading.

You might be interested in:

I made a rather quick hack to try Jasper instead of OpenJpeg for decoding.

The patch has some very rough edges. In fact, the decoding into the
LLImageRaw buffer is not correct.

I did not fix this (yet) because the results so far are not very promising.
Jasper can only decode around 20% of the JPEG2000 images; for the other 80%
it raises an error and then my code falls back to OpenJpeg.
This fallback makes the whole decoding rather slow, so it is hard to say
whether Jasper would really be any faster.
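That fallback path can be sketched as follows, with the two decoders passed in as callables, since this is only an illustration of the control flow, not the actual patch:

```cpp
#include <functional>

// Illustration of the fallback scheme described above (hypothetical
// wrappers; the real code would call Jasper and OpenJpeg directly).
// Try Jasper first; if it errors out, pay for a second full decode
// attempt with OpenJpeg.
bool decodeWithFallback(const std::function<bool()>& tryJasper,
                        const std::function<bool()>& tryOpenJpeg)
{
    if (tryJasper())
        return true;          // Jasper handled it
    return tryOpenJpeg();     // fall back: second full decode attempt
}
```

With an ~80% Jasper failure rate, most textures pay for both decode attempts, which is why the combined timing says little about how fast Jasper alone would be.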

Right now I am not sure whether it would be reasonable to invest more time
looking at Jasper. First, the code would need to be fixed upstream so that
all images can be properly decoded. As that project looks rather dead,
someone with JPEG2000 knowledge might have to step up for this.

On another note, you might like to try:

This will at least skip the step of calling OpenJpeg in
LLImageJ2COJ::getMetadata (if possible; it will do sanity checks first).
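For reference, that kind of metadata shortcut can be done by reading the codestream header directly instead of invoking a decoder. This sketch (hypothetical function name, not the actual patch) assumes a raw JPEG2000 codestream, where the SOC marker (0xFF4F) is followed by the SIZ marker segment (0xFF51) carrying the image and component dimensions, per ITU-T T.800:

```cpp
#include <cstddef>
#include <cstdint>

// Read image dimensions straight out of a raw JPEG2000 codestream,
// without calling a decoder. Layout per ITU-T T.800: SOC marker
// (0xFF4F), then the SIZ marker segment (0xFF51) with big-endian
// Xsiz/Ysiz/XOsiz/YOsiz and the component count Csiz.
// Returns false if the buffer does not start with SOC + SIZ.
bool getJ2CDimensions(const uint8_t* data, size_t len,
                      uint32_t& width, uint32_t& height, uint16_t& components)
{
    auto u16 = [&](size_t off) {
        return uint16_t(data[off] << 8 | data[off + 1]);
    };
    auto u32 = [&](size_t off) {
        return uint32_t(data[off])     << 24 | uint32_t(data[off + 1]) << 16 |
               uint32_t(data[off + 2]) <<  8 | uint32_t(data[off + 3]);
    };
    if (len < 42 || u16(0) != 0xFF4F || u16(2) != 0xFF51)
        return false;                       // not SOC + SIZ
    width      = u32(8)  - u32(16);         // Xsiz - XOsiz
    height     = u32(12) - u32(20);         // Ysiz - YOsiz
    components = u16(40);                   // Csiz
    return true;
}
```

A few extra sanity checks (segment length, component count bounds) would be needed in practice before trusting the values.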

>Some things to keep in
>mind. OpenJpeg has patches floating around on its ML against 1.3 that
>reports have claimed up to 40% speed increase in places due to
>unrolling the inner loops so finding them and testing would be good.

I did not find any of those, but then again maybe I did not look hard
enough. There is certainly some potential in OpenJpeg.
There are some loops in t1_dec_sigpass and t1_dec_refpass that can easily
be rewritten. But there is some pretty tricky stuff in t1_dec_clnpass that
would need some cleaning, and the MQ decoder (mqc_decode) burns a lot of
time. That one is especially hairy, as it has side effects on its input
parameter.

I am not sure anyone without deep knowledge of OpenJpeg (and the dedication
to recode a good part of it) would be able to improve much of it.
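To illustrate the kind of rewrite meant here (a generic example, not OpenJpeg's actual t1_dec_sigpass code), an inner loop can be unrolled by a fixed factor with a scalar remainder loop:

```cpp
#include <cstddef>
#include <cstdint>

// Generic illustration of inner-loop unrolling: process samples in
// groups of four so the compiler can schedule loads and stores more
// freely, then mop up the remainder one at a time.
void scaleSamples(int32_t* dst, const int32_t* src, size_t n, int32_t factor)
{
    size_t i = 0;
    for (; i + 4 <= n; i += 4)       // unrolled by 4
    {
        dst[i]     = src[i]     * factor;
        dst[i + 1] = src[i + 1] * factor;
        dst[i + 2] = src[i + 2] * factor;
        dst[i + 3] = src[i + 3] * factor;
    }
    for (; i < n; ++i)               // remainder
        dst[i] = src[i] * factor;
}
```

Whether this pays off in OpenJpeg depends on whether the loop bodies are as independent as this one; the reported 40% figures are from the ML patches, not from this sketch.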


Policies and (un)subscribe information available here:
Please read the policies before posting to keep unmoderated posting privileges


Tateru Nino
Contributing Editor http://massively.com/


More information about the opensource-dev mailing list