[opensource-dev] J2C fast decoder

2010-08-25 Thread Sythos
I'm testing the JasPer JPEG 2000 decoder at my RL office (not in a
viewer). After a short test with OpenJPEG from EPFL, we have spent the
last three days testing JasPer (only a proof-of-concept app to run some
benchmarks). We still have a lot of work to do, but here is a little
question: has anybody here ever tried it as an alternative to
OpenJPEG/KDU in a viewer?

ref: 
http://www.ece.uvic.ca/~mdadams/jasper/
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges


Re: [opensource-dev] J2C fast decoder

2010-08-25 Thread Robin Cornelius
On Wed, Aug 25, 2010 at 10:00 PM, Altair Sythos wrote:
> I'm testing the JasPer JPEG 2000 decoder at my RL office (not in a
> viewer). After a short test with OpenJPEG from EPFL, we have spent the
> last three days testing JasPer (only a proof-of-concept app to run some
> benchmarks). We still have a lot of work to do, but here is a little
> question: has anybody here ever tried it as an alternative to
> OpenJPEG/KDU in a viewer?

I'm not aware of anyone publishing results for such a test, but if you
have the time it would make interesting reading. Some things to keep in
mind: there are patches floating around on the OpenJPEG mailing list
against 1.3 that reportedly give up to a 40% speed increase in places by
unrolling the inner loops, so finding and testing those would be
worthwhile. Also, in SL usage (please correct me if I am wrong), when the
viewer raises the resolution (using discard levels), OpenJPEG needs to
redecode the entire image while KDU does not, and metadata extraction
with OpenJPEG is also expensive because it triggers a decode, which could
probably be fixed. Finally, we have a good mix of 3-, 4-, and more-layer
textures here in SL, so your RL office comparison may not accurately
reflect performance within SL, but it's still good stuff.
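
For context, here is a minimal sketch of how a discard level maps onto
the OpenJPEG 1.x decode path (illustrative only, written against the
public 1.x API; decode_at_discard is a made-up helper, not viewer code,
and buffer handling plus error checks are omitted). The resolution
reduction is fixed up front via cp_reduce, so asking for a higher
resolution later means running the whole decode again from the start of
the codestream:

    #include <openjpeg.h>

    /* Decode a J2C codestream held in memory, dropping `discard`
     * resolution levels. Raising the resolution afterwards requires a
     * brand-new decode of the same buffer. */
    opj_image_t *decode_at_discard(unsigned char *buf, int len, int discard)
    {
        opj_dparameters_t params;
        opj_set_default_decoder_parameters(&params);
        params.cp_reduce = discard;          /* resolution levels to skip */

        opj_dinfo_t *dinfo = opj_create_decompress(CODEC_J2K);
        opj_setup_decoder(dinfo, &params);

        opj_cio_t *cio = opj_cio_open((opj_common_ptr)dinfo, buf, len);
        opj_image_t *image = opj_decode(dinfo, cio); /* full pass each time */

        opj_cio_close(cio);
        opj_destroy_decompress(dinfo);
        return image;                        /* NULL on failure */
    }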


Best regards


Robin


Re: [opensource-dev] J2C fast decoder

2010-08-25 Thread Kadah

On 8/25/2010 2:06 PM, Robin Cornelius wrote:
> Also, in SL usage (please correct me if I am wrong), when the viewer
> raises the resolution (using discard levels), OpenJPEG needs to redecode
> the entire image while KDU does not, and metadata extraction with
> OpenJPEG is also expensive because it triggers a decode, which could
> probably be fixed.

I remember hearing that it was a patent issue, but I could be mistaken.


Re: [opensource-dev] J2C fast decoder

2010-08-25 Thread Sythos
On Wed, 25 Aug 2010 22:06:00 +0100, Robin Cornelius wrote:

> I'm not aware of anyone publishing results for such a test, but if you
> have the time it would make interesting reading. Some things to keep in
> mind: there are patches floating around on the OpenJPEG mailing list
> against 1.3 that reportedly give up to a 40% speed increase in places by
> unrolling the inner loops, so finding and testing those would be
> worthwhile.

I have only a little data, and only for X-ray medical images (400-500 MB
per image), and I'm the hardware & systems monkey, not the coding one.

As of today I can only supply decode times on a *nix platform.

JasPer-based decoder/viewer, from uncached (not previously opened, no
memory cache) to video render time:

Base code snippet: a timer + a routine reading from disk + the decoder.
Processor: Q6...@1.6GHz
Memory: 4GB DDR3 (4x1GB)
Graphics: GTX270 on PCIe 2.0 4x (single card, no SLI)
Disk: fast SATA2
gcc 4.4.5
libc6 2.11.2
Debian "stable" with GCC and development files and libs backported from
sid, kernel 2.6.32-5-amd64 compiled for 32-bit (this triggers the bug
where the SL viewer and Snowglobe detect a 64-bit system, because they
use uname incorrectly; it is really a 32-bit system with extended
registers enabled)

JasPer 1.900 (from the site, not from the Debian repos)
compiled w/o optimization: 19.31 ms
compiled w/ SSE2: 11.60 ms
compiled w/ SSSE3: 9.11 ms
compiled w/ SSE2+OpenCL (195.36.31 libs from NVIDIA, multithreaded):
4.45 ms [1]

OpenJPEG 1.3 (from Google Code)
compiled w/o optimization: 26.12 ms
compiled w/ SSE2: 15.33 ms
compiled w/ SSSE3: 13.41 ms
compiled w/ SSE2+OpenCL (195.36.31 libs from NVIDIA, multithreaded):
6.56 ms [1]

OpenJPEG 2.0 alpha (from Google Code)
compiled w/o optimization: 22.09 ms
compiled w/ SSE2: crashed (maybe our fault, still debugging)
compiled w/ SSSE3: crashed (maybe our fault, still debugging)
compiled w/ SSE2+OpenCL (195.36.31 libs from NVIDIA, multithreaded):
crashed (maybe our fault, still debugging)

I cannot supply either the code snippet or the image (it is a customer's
knee).

Too few numbers, but just to give a hint.

[1] OpenCL, in our "poor" code, creates some trouble in the OS cache and
made the system unstable after 4 hours of cyclically running the snippet,
starting a crazy dance of IRQ 27 across all cores of the machine and
overloading disk transfers with excessive swapping.
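
Since the original snippet cannot be shared, here is a minimal sketch of
the kind of harness described above (a timer + a read from disk + the
decoder), written against the public JasPer C API; the single-shot timing
and the command-line file argument are assumptions of the sketch, and
error handling is reduced to bail-outs:

    #include <stdio.h>
    #include <time.h>
    #include <jasper/jasper.h>

    int main(int argc, char **argv)
    {
        if (argc != 2 || jas_init())
            return 1;

        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);

        /* Read from disk and decode; fmt = -1 lets JasPer autodetect. */
        jas_stream_t *in = jas_stream_fopen(argv[1], "rb");
        if (!in)
            return 1;
        jas_image_t *image = jas_image_decode(in, -1, NULL);
        if (!image)
            return 1;

        clock_gettime(CLOCK_MONOTONIC, &t1);
        double ms = (t1.tv_sec - t0.tv_sec) * 1e3
                  + (t1.tv_nsec - t0.tv_nsec) / 1e6;
        printf("decode: %.2f ms (%dx%d)\n", ms,
               (int)jas_image_width(image), (int)jas_image_height(image));

        jas_image_destroy(image);
        jas_stream_close(in);
        jas_cleanup();
        return 0;
    }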



Re: [opensource-dev] J2C fast decoder

2010-09-03 Thread Nicky Fullton
Hello,

>> I'm testing the JasPer JPEG 2000 decoder at my RL office (not in a
>> viewer). After a short test with OpenJPEG from EPFL, we have spent the
>> last three days testing JasPer (only a proof-of-concept app to run some
>> benchmarks). We still have a lot of work to do, but here is a little
>> question: has anybody here ever tried it as an alternative to
>> OpenJPEG/KDU in a viewer?

>I'm not aware of anyone publishing results for such a test, but if you
>have the time it would make interesting reading.

You might be interested in:
http://bitbucket.org/NickyD/viewer-development/changeset/027bf44c5582

I made a rather quick hack to try JasPer instead of OpenJPEG for decoding
images.

The patch has some very rough edges. In fact, the decoding into the
LLImageRaw buffer is not correct.

I have not fixed this (yet) because the results so far are not very
promising. JasPer can only decode around 20% of the JPEG 2000 images; for
the other 80% it raises an error, and my code then falls back to
OpenJPEG. This fallback makes the whole decode rather slow, so it is hard
to say whether JasPer would really be any faster.

Right now I am not sure it would be reasonable to invest more time
looking at JasPer. First the code would need to be fixed upstream, so
that all images can be decoded properly. As the project looks rather
dead, someone with JPEG2000 knowledge might have to step up for this.

On another note, you might like to try:
http://bitbucket.org/NickyD/viewer-development/changeset/e4eff3e2af39

This will at least skip the step of calling OpenJPEG in
LLImageJ2COJ::getMetadata (where possible; it does sanity checks first).

>Some things to keep in
>mind: there are patches floating around on the OpenJPEG mailing list
>against 1.3 that reportedly give up to a 40% speed increase in places by
>unrolling the inner loops, so finding and testing those would be
>worthwhile.

I did not find any of those, but then again maybe I did not look hard
enough.
There is certainly some potential in OpenJPEG.
There are some loops in t1_dec_sigpass and t1_dec_refpass that can easily
be rewritten. But there is some pretty tricky stuff in t1_dec_clnpass
that would need some cleaning, and the MQ decoder (mqc_decode) burns a
lot of time. That one is especially hairy, as it has side effects on its
input parameter.
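
To make the kind of rewrite concrete, here is a generic before/after
illustration of unrolling such an inner loop. This is not OpenJPEG's
actual code; SIG, flags, out, and decode_one are hypothetical stand-ins
for the real flag tests and the per-coefficient decode:

    #define SIG 0x1

    /* Hypothetical stand-in for the per-coefficient MQ decode. */
    static int decode_one(int *mqc, int i) { return (*mqc += i); }

    /* Before: one flag test and one call per iteration. */
    static void pass_simple(const int *flags, int *out, int n, int *mqc)
    {
        for (int i = 0; i < n; i++)
            if (flags[i] & SIG)
                out[i] = decode_one(mqc, i);
    }

    /* After: four coefficients per iteration, plus a remainder loop.
     * The loop bookkeeping runs a quarter as often and the four bodies
     * can be scheduled together. */
    static void pass_unrolled(const int *flags, int *out, int n, int *mqc)
    {
        int i;
        for (i = 0; i + 4 <= n; i += 4) {
            if (flags[i + 0] & SIG) out[i + 0] = decode_one(mqc, i + 0);
            if (flags[i + 1] & SIG) out[i + 1] = decode_one(mqc, i + 1);
            if (flags[i + 2] & SIG) out[i + 2] = decode_one(mqc, i + 2);
            if (flags[i + 3] & SIG) out[i + 3] = decode_one(mqc, i + 3);
        }
        for (; i < n; i++)                    /* remainder */
            if (flags[i] & SIG) out[i] = decode_one(mqc, i);
    }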

I am not sure anyone without deep knowledge of OpenJPEG (and the
dedication to recode a good part of it) would be able to improve it much.

Cheers,
   Nicky


  



Re: [opensource-dev] J2C fast decoder

2010-09-03 Thread Tofu Linden
Altair Sythos Memo wrote:
> I'm testing the JasPer JPEG 2000 decoder at my RL office (not in a
> viewer). After a short test with OpenJPEG from EPFL, we have spent the
> last three days testing JasPer (only a proof-of-concept app to run some
> benchmarks). We still have a lot of work to do, but here is a little
> question: has anybody here ever tried it as an alternative to
> OpenJPEG/KDU in a viewer?
> 
> ref: 
> http://www.ece.uvic.ca/~mdadams/jasper/

Chiming in late - when we were looking for a happy-licensed KDU
alternative (mid-late 2007) we tested both OpenJPEG and JasPer against a
typical SL data set.

OpenJPEG was (a bit) faster and better-maintained, so we went with
that.

Lots of things (dominant system architectures, our usage patterns,
project churn) have changed since then, so re-evaluation is a great
idea - although the library's speed is only one of many axes. :)

Cheers,
-tofu


Re: [opensource-dev] J2C fast decoder

2010-09-09 Thread Philippe (Merov) Bossut
Hi Nicky,

As it happens, I've been working on instrumenting the code to add metric
gathering for image decompression as part of the Snowstorm sprint.

You may want to use my branch
(https://bitbucket.org/merov_linden/viewer-development-vwr-22761) to
create a baseline for OpenJPEG and then run a test for JasPer. You'll
certainly have to sort out the failing cases and just throw them out, so
that we compare what truly gets decompressed (though, clearly, working in
all cases is pretty critical if we look at JasPer as an alternative).

Here's what I got comparing KDU and OpenJpeg:

Label                     Metric                       KDU(B)     OJ2C(T)   Diff(T-B)  Percentage(100*T/B)
ImageCompressionTester-1
                          TotalBytesInDecompression    5048643    5003370      -45273   99.1
                          TotalBytesOutDecompression  40415336   46592896     6177560  115.29
                          TimeDecompression               3.74      17.04       13.3   455.39
ImageCompressionTester-2
                          TotalBytesInDecompression    5000744    5000144        -600   99.99
                          TotalBytesOutDecompression  46440040   44248324    -2191716   95.28
                          TimeDecompression               3.64      15.02       11.37  412.02

For that test, I output data every time 5MB of compressed data have been
processed. It's partial, but it shows that OpenJpeg is roughly 4 times
slower than KDU (at least the version we're using in the official viewer
currently). It would be nice to have a similar set of numbers for JasPer
before going too far down the implementation path.
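
The shape of that instrumentation, reduced to a sketch (this is not the
viewer's actual tester API, just the idea: accumulate bytes in/out and
decode time, and emit a report every 5MB of compressed input; all names
here are made up):

    #include <stdio.h>

    typedef struct {
        const char *label;
        double bytes_in, bytes_out, seconds;
    } decode_metrics;

    /* Call once per decoded image; prints and resets every 5MB of input. */
    static void record(decode_metrics *m, double in, double out, double secs)
    {
        m->bytes_in  += in;
        m->bytes_out += out;
        m->seconds   += secs;
        if (m->bytes_in >= 5e6) {
            printf("%s: in=%.0f out=%.0f time=%.2fs\n",
                   m->label, m->bytes_in, m->bytes_out, m->seconds);
            m->bytes_in = m->bytes_out = m->seconds = 0.0;
        }
    }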

I wrote a short (and still incomplete) wiki page to explain a bit how the
metric gathering system works:
- https://wiki.secondlife.com/wiki/Performance_Testers

BTW, that's something we should be using more generally for other
perf-sensitive areas, especially when starting a perf improvement
project.

See http://jira.secondlife.com/browse/VWR-22761 for details.

Cheers,
- Merov


Re: [opensource-dev] J2C fast decoder

2010-09-12 Thread Sheet Spotter
Some comments on SNOW-361 (Upgrade to OpenJPEG v2) suggested that changing
to version 2 of OpenJPEG might improve performance, while other comments
suggested it might not support progressive decoding. 

http://jira.secondlife.com/browse/SNOW-361

 

Is an upgrade to OpenJPEG v2 under active development?

Sheet Spotter


Re: [opensource-dev] J2C fast decoder

2010-09-12 Thread Brandon Husbands
Phoenix uses a newer OpenJPEG, statically compiled; you might want to
check its source.


Re: [opensource-dev] J2C fast decoder

2010-09-12 Thread Tateru Nino
If we're using HTTP textures, is there actually any need for the JPEG
2000 format? Since the transfer time of individual textures is vastly
reduced (from the first byte to the last byte), the intermediate quality
levels supported by jpg2k would seem to be redundant. Indeed, you could
argue that transferring the textures in jpg2k format imposes a
now-redundant workload on the texture pipeline, and that providing HTTP
textures in a simpler format that is more tractable to high-speed,
low-cost decoding would save a whole lot of problems.

Would it be a huge problem, for example, to transfer HTTP textures as
TGA or PNG and use one of the rather well-optimized decoder libraries
for those instead? It seems to me that it would be more efficient both
on the network and on the system - though at the expense of converting
all the textures at the store.

Just thinking out loud.


Re: [opensource-dev] J2C fast decoder

2010-09-12 Thread leliel
On Sun, Sep 12, 2010 at 10:40 PM, Tateru Nino  wrote:
> If we're using HTTP textures, is there actually any need for the JPEG 2000
> format?

There is the 500TB asset server full of jpeg2k images. Switching to a
new format would be a massive undertaking.

> Would it be a huge problem, for example, to transfer HTTP textures as TGA or
> PNG and use one of the rather well-optimized decoder libraries for those
> instead?

TGA is uncompressed, so it won't work. PNG could work, but its
compression ratio and file overhead aren't that much better than
jpeg2k's. The gains would therefore only be in decode time, which is the
biggest CPU drain in the viewer, so it may be worth it.


Re: [opensource-dev] J2C fast decoder

2010-09-12 Thread Dahlia Trimble
Jpeg 2000 discard levels are also used for reducing the resolution of
textures for distant objects, which reduces data download requirements.
Few other formats offer comparable compression ratios with the quality
that Jpeg 2000 offers. HTTP transfer doesn't magically make data traverse
the network faster; much of the reduced download time comes from
offloading the sim from the task of sending textures, as they can come
from another server process (or even another physical server).



Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Tateru Nino
Wouldn't we be making a network saving by omitting the discard levels
entirely? Granted, I don't have hard data on that - would the base
texture, encoded in a lighter-weight format, end up causing less data to
traverse the network for a given texture in the long run than the more
efficiently compressed j2c of the same texture including discard levels?
My gut instinct says 'probably', but I can't prove that with data.

If it *does*, then we would have the double bonus of also saving on
decoding time.


Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Dahlia Trimble
Requesting a discard level means you only get a portion of the entire
file; if you wanted the highest resolution you would download the entire
file, which includes all discard levels. Being able to request only what
you need can save a lot of network traffic. You don't really want to
download a 1024x1024 texture for a distant object that only covers a few
pixels on the screen.




Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread leliel
On Mon, Sep 13, 2010 at 12:24 AM, Tateru Nino  wrote:
> Wouldn't we be making a network saving by omitting the discard levels
> entirely? Granted, I don't have hard data on that - would the base
> texture, encoded in a lighter-weight format, end up causing less data
> to traverse the network for a given texture in the long run than the
> more efficiently compressed j2c of the same texture including discard
> levels? My gut instinct says 'probably', but I can't prove that with
> data.
>
> If it *does*, then we would have the double bonus of also saving on
> decoding time.

The problem is that discard levels save us more than just bandwidth.
When you look at an object 300m away it will only be taking up a few
dozen pixels on the screen, so the viewer will just download up to the
first or second discard level, say 32x32 or so, and that is all that
will be stored on your video card. Switching to a format that doesn't
support discard levels could give us big savings in decode time, but at
the cost of having to always download the whole file. We'd be trading
CPU time for VRAM, and the average user's machine has a lot more of the
former than the latter.
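
To put rough numbers on the VRAM side (my arithmetic, not figures from
the thread): each discard level halves both dimensions, so an
uncompressed RGBA texture shrinks by a factor of four per level:

    #include <stdio.h>

    /* Uncompressed RGBA footprint of a w x h texture after `discard`
     * halvings, ignoring mipmap chains and alignment. */
    static unsigned bytes_at_discard(unsigned w, unsigned h, unsigned discard)
    {
        return (w >> discard) * (h >> discard) * 4;
    }

    int main(void)
    {
        /* 1024x1024 full res: 4 MiB; at discard 5 (32x32): 4 KiB. */
        for (unsigned d = 0; d <= 5; d++)
            printf("discard %u: %u bytes\n", d,
                   bytes_at_discard(1024, 1024, d));
        return 0;
    }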


Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Francesco Rabbi
> Would it be a huge problem, for example, to transfer HTTP textures
> as TGA or PNG and use one of the rather well-optimized decoder

TGA has lossless RLE compression. Anyway, transport is
format-independent: you can send voice over HTTP too (like Skype), or
anything else you want. To improve bandwidth and rendering performance we
need a client-side routine (the viewer should ask the asset server only
for viewable textures), and the servers should send textures in the right
order, from the closest object to the farthest.
Scaling/resizing should be done by the viewer after the render engine
calculates the size of the surface the texture sits on. Receiving an
already-downsampled image means that each movement of the avatar/camera
triggers a retransmission of the same image, each time "bigger"; in the
medium-to-long term it is better to receive the full-res texture once and
let the viewer resize and render it, discarding down to the right level
based on local viewer-side settings (resolution, monitor DPI, CPU power).

But as always, we are talking with only our own hardware in our hands;
all of this could be discussed better if some statistics about residents'
hardware were available somewhere.


If LL (better still, the other viewer teams too) could collect anonymous
data, all decisions about decoding, pipeline, shadows or shaders, and
everything else could be taken more easily. An approximate list of useful
data to probe every XX minutes (if a statistics collector is or will be
enabled):

- CPU mips/bogomips
- CPU compute cores (not physical, just the grand total)
- average load of the cores
- CPU load of the viewer executable
- number and average load of plugins (voice included)
- amount of RAM (how much free)
- amount of swap (how much used)
- brand and model of graphics card
- graphics settings
- number of agents in the visible area
- average rendering cost of the agents in the visible area
- FPS, collected only when the viewer isn't minimized to an icon
- inbound & outbound bandwidth
- bandwidth used by the viewer
- bandwidth used by plugins, voice too
- % of packet loss
- uptime of the connection

To be sure no data is collected twice and no personal data is used to
collect it, I suggest using the serial number of the CPU (Linux:
/proc/cpuinfo; Mac: the same; Windows: I don't know how) or some sort of
unique UUID based on the hardware configuration (if somebody adds RAM or
a CPU, a new ID should be used).

-- 
Sent by iPhone


Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Argent Stonecutter
On 2010-09-13, at 00:40, Tateru Nino wrote:
> If we're using HTTP textures, is there actually any need for the JPEG 2000 
> format? Since the transfer time of individual textures is vastly reduced 
> (from the first byte to the last byte) the intermediate quality levels 
> supported by jpg2k would seem to be redundant.

I'm on a 256k DSL. I have HTTP textures enabled. I still see many intermediate 
texture levels.

Also, for large textures, switching to PNG would likely increase the size of 
the transfer, which is not good.

On the other hand, since both "old" JPG and PNG support progressive decoding, 
why not use PNG for lossless textures and JPG for lossy ones? Then you don't 
lose anything.


Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Leonel Morgado
Notice that old JPG does not support alpha channels (transparency). That
means abandoning JPEG2000 would in fact force everyone with a single
transparent pixel (even just the corners of round textures) to use
lossless PNG for those, which is not optimal, to say the least.

Inté,

Leonel




Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Ambrosia
It goes even further: JPEG2k is superior in compression ratio at a given
quality to both JPEG and PNG, lossy and lossless alike. It is actually
quite an awesome image format through and through.

The only real problem with JPEG2k is software patents. They are the
reason free decoders like OpenJPEG and JasPer are so much slower than
KDU. KDU is developed by one of the people sitting on the JPEG2K
consortium, and is stuffed with proprietary algorithms.

On Mon, Sep 13, 2010 at 14:22, Leonel Morgado wrote:
> Notice that old JPG does not support alpha channels (transparency). That
> means abandoning JPEG2000 would in fact force everyone with a single
> transparent pixel (even just the corners of round textures) to use
> lossless PNG for those, which is not optimal, to say the least.
>
> Inté,
>
> Leonel
>
>
> -Original Message-
> From: opensource-dev-boun...@lists.secondlife.com
> [mailto:opensource-dev-boun...@lists.secondlife.com] On Behalf Of Argent
> Stonecutter
> Sent: segunda-feira, 13 de Setembro de 2010 13:15
> To: Tateru Nino
> Cc: opensource-dev@lists.secondlife.com
> Subject: Re: [opensource-dev] J2C fast decoder
>
> On 2010-09-13, at 00:40, Tateru Nino wrote:
>> If we're using HTTP textures, is there actually any need for the JPEG 2000
> format? Since the transfer time of individual textures is vastly reduced
> (from the first byte to the last byte) the intermediate quality levels
> supported by jpg2k would seem to be redundant.
>
> I'm on a 256k DSL. I have HTTP textures enabled. I still see many
> intermediate texture levels.
>
> Also, for large textures, switching to PNG would likely increase the size of
> the transfer, which is not good.
>
> On the other hand, since both "old" JPG and PNG support progressive
> decoding, why not use PNG for lossless textures and JPG for lossy ones? Then
> you don't lose anything.
> ___
> Policies and (un)subscribe information available here:
> http://wiki.secondlife.com/wiki/OpenSource-Dev
> Please read the policies before posting to keep unmoderated posting
> privileges
> Nenhum vírus encontrado nessa mensagem recebida.
> Verificado por AVG - www.avgbrasil.com.br
> Versão: 9.0.851 / Banco de dados de vírus: 271.1.1/3128 - Data de
> Lançamento: 09/12/10 19:34:00
>
> ___
> Policies and (un)subscribe information available here:
> http://wiki.secondlife.com/wiki/OpenSource-Dev
> Please read the policies before posting to keep unmoderated posting privileges
>
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges


Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Philippe (Merov) Bossut
Hi,

Very interesting discussion, though it seems that folks conflate several
things when talking about "textures" in general terms: there's a load of
difference between a repetitive 64x64 texture used to tile a brick wall
and a photographic-quality 1024x1024 texture. The former could certainly
benefit from being stored and sent around as PNG (low format overhead;
lossy compression will actually make things worse on such a small image
no matter what, and lossless jpeg will end up being bigger than PNG),
while the latter benefits tremendously from the wavelet compression
provided by jpeg2000.

I won't go into the advantages of wavelet compression for photographic
images, as there is a *huge* literature on the subject with loads and
loads of data proving the point. One can argue between "normal" jpeg
(using DCT) and jpeg2000 (using wavelets), but there's absolutely no
contest between jpeg (whichever flavor) and png for photographic images
in terms of quality at high or even moderate compression ratios.

On the subject of access per resolution (aka "discard levels" in SL
parlance): it is of great interest, as some folks mentioned, when viewing
a texture from a distance. No matter what the transport protocol is,
exchanging a 32x32 image will be faster than exchanging a 1024x1024 one
(with RGBA pixel values...). The viewer is able to use partially
populated mipmaps (which are, in effect, subres pyramids themselves, as
there isn't much difference between a "discard level" and a LOD...) and,
therefore, to use partially downloaded and decompressed images. Note that
with jpeg2000, when asking for a new level, one does not download the
whole full-res 32-bits-per-pixel data but only the wavelet coefficients
for that level, which, roughly speaking, encode the difference from the
previous level. That translates into huge compression benefits in slowly
changing areas in particular.
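
As a toy illustration of that last point, consider a one-level 1-D Haar
analysis step (much simpler than the 5/3 and 9/7 wavelets jpeg2000
actually uses, but it shows the mechanism): each coarser level stores
averages s_k plus details d_k, and the finer level is reconstructed from
them:

    s_k = (x_{2k} + x_{2k+1}) / 2
    d_k = (x_{2k} - x_{2k+1}) / 2
    x_{2k} = s_k + d_k,   x_{2k+1} = s_k - d_k

In a slowly changing region x_{2k} is approximately x_{2k+1}, so d_k is
approximately 0: the refinement data for the next level is mostly
near-zero coefficients, which entropy-code very cheaply.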

One wavelet property, though, that our viewer does not take advantage of
is the spatial random access property of the jpeg2000 format. That would
allow us, for instance, to request and download only a portion of the
full-res data when needed. That is advantageous in cases where only a
portion of the whole image is mapped onto a prim, for instance. I have no
data, though, to know whether that is a frequent case in SL; it would be
interesting to know how much of a texture is truly displayed on average.
There may be an interesting vein of performance to mine there.

All that, though, needs to be backed by data. The preliminary performance
gathering toolbox I talked about is a first step in that direction.

Cheers,
- Merov

Re: [opensource-dev] J2C fast decoder

2010-09-14 Thread Argent Stonecutter
On 2010-09-13, at 07:22, Leonel Morgado wrote:
> Notice that old JPG does not support alpha channels (transparency). That
> means abandoning JPEG2000 would in fact force everyone with a single
> transparent pixel (even just the corners of round textures) to use
> lossless PNG for those, which is not optimal, to say the least.

Doh. You are absolutely right. My bad.