Requesting a discard level means you only get a portion of the entire file;
if you want the highest resolution, you download the entire file, which
includes all discard levels. Being able to request only what you need can
save a lot of network traffic: you don't really want to download a
1024x1024 texture for a distant object that only covers a few pixels on
the screen.
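
As a rough sketch (hypothetical names and cutoffs, not the viewer's actual
heuristic), a client might map an object's on-screen footprint to a discard
level like this:

    // Hypothetical sketch, not the viewer's actual logic: each discard
    // level halves each dimension, so a 1024x1024 texture at discard
    // level 3 decodes to just 128x128.
    int pickDiscardLevel(int full_dim, int pixels_on_screen, int max_discard = 5)
    {
        int level = 0;
        // Keep discarding while the next-coarser level still covers the
        // on-screen footprint.
        while (level < max_discard && (full_dim >> (level + 1)) >= pixels_on_screen)
            ++level;
        return level; // 0 = full resolution, higher = coarser
    }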



On Mon, Sep 13, 2010 at 12:24 AM, Tateru Nino <tat...@taterunino.net> wrote:

>  Wouldn't we save network traffic by omitting the discard levels
> entirely? Granted, I don't have hard data on that - would the base
> texture, encoded in a lighter-weight format, end up sending less data
> over the network for a given texture in the long run than the
> more-efficiently compressed j2c of the same texture including discard
> levels? My gut instinct says 'probably', but I can't prove that with data.
>
> If it *does* then we would have a double-bonus of also saving on decoding
> time.
>
>
> On 13/09/2010 4:38 PM, Dahlia Trimble wrote:
>
> Jpeg 2000 discard levels are also used for reducing the resolution of
> textures for distant objects which reduces data download requirements. Few
> other formats offer comparable compression ratios with the quality that Jpeg
> 2000 offers. HTTP transfer doesn't magically make data traverse the network
> faster; much of the reduced download time is due to offloading the sim from
> the task of sending textures as they can come from another server process
> (or even another physical server).
>
>
> On Sun, Sep 12, 2010 at 10:40 PM, Tateru Nino <tat...@taterunino.net> wrote:
>
>>  If we're using HTTP textures, is there actually any need for the JPEG
>> 2000 format? Since the transfer time of individual textures is vastly
>> reduced (from the first byte to the last byte), the intermediate quality
>> levels supported by jpg2k would seem to be redundant. Indeed, you could
>> argue that transferring the textures in jpg2k format imposes a now-redundant
>> workload on the texture pipeline, and that providing HTTP textures in a
>> simpler format more tractable to high-speed, low-cost decoding would
>> avoid a whole lot of problems.
>>
>> Would it be a huge problem, for example, to transfer HTTP textures as TGA
>> or PNG and use one of the rather well-optimized decoder libraries for those
>> instead? It seems to me that it would be more efficient both on the network
>> and on the system - though at the expense of converting all the textures
>> at the store.
>>
>> Just thinking out loud.
>>
>>
>> On 13/09/2010 1:58 PM, Sheet Spotter wrote:
>>
>>  Some comments on SNOW-361 (Upgrade to OpenJPEG v2) suggested that
>> changing to version 2 of OpenJPEG might improve performance, while other
>> comments suggested it might not support progressive decoding.
>>
>> http://jira.secondlife.com/browse/SNOW-361
>>
>>
>>
>> Is an upgrade to OpenJPEG v2 under active development?
>>
>>
>>
>>
>>
>> Sheet Spotter
>>
>>
>>  ------------------------------
>>
>> *From:* opensource-dev-boun...@lists.secondlife.com
>> *On Behalf Of* Philippe (Merov) Bossut
>> *Sent:* September 9, 2010 10:35 PM
>> *To:* Nicky Fullton
>> *Cc:* opensource-dev@lists.secondlife.com
>> *Subject:* Re: [opensource-dev] J2C fast decoder
>>
>>
>>
>> Hi Nicky,
>>
>> As it happens, I've been working on instrumenting the code to add metric
>> gathering for image decompression as part of the Snowstorm sprint.
>>
>> You may want to use my branch (
>> https://bitbucket.org/merov_linden/viewer-development-vwr-22761) and
>> create a baseline for OpenJpeg, then run a test for Jasper. You'll
>> certainly have to sort out the failing cases and throw them out so we
>> compare only what gets truly decompressed (though, clearly, working in
>> all cases is pretty critical if we are looking at Jasper as an
>> alternative).
>>
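>> In case it helps, a minimal harness in that spirit might look like the
>> sketch below (hypothetical; this is not the Performance_Testers API, and
>> decodeFn is a placeholder for the decoder under test):
>>
>>     #include <chrono>
>>     #include <cstddef>
>>     #include <vector>
>>
>>     struct DecodeStats { size_t bytes_in = 0, bytes_out = 0; double seconds = 0.0; };
>>
>>     // Run one decoder over a set of compressed images and accumulate
>>     // the same three metrics reported below.
>>     template <typename DecodeFn>
>>     DecodeStats benchmark(DecodeFn decodeFn,
>>                           const std::vector<std::vector<unsigned char>>& images)
>>     {
>>         DecodeStats s;
>>         for (const auto& img : images)
>>         {
>>             auto t0 = std::chrono::steady_clock::now();
>>             size_t out_bytes = decodeFn(img.data(), img.size()); // raw bytes produced
>>             auto t1 = std::chrono::steady_clock::now();
>>             s.bytes_in  += img.size();
>>             s.bytes_out += out_bytes;
>>             s.seconds   += std::chrono::duration<double>(t1 - t0).count();
>>         }
>>         return s;
>>     }
>>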
>> Here's what I got comparing KDU and OpenJpeg:
>> Label                      Metric                          KDU(B)      OJ2C(T)   Diff(T-B)   Percentage(100*T/B)
>> ImageCompressionTester-1   TotalBytesInDecompression      5048643      5003370      -45273                  99.1
>>                            TotalBytesOutDecompression    40415336     46592896     6177560                115.29
>>                            TimeTimeDecompression             3.74        17.04        13.3                455.39
>> ImageCompressionTester-2   TotalBytesInDecompression      5000744      5000144        -600                 99.99
>>                            TotalBytesOutDecompression    46440040     44248324    -2191716                 95.28
>>                            TimeTimeDecompression             3.64        15.02       11.37                412.02
>>
>> For that test, I output data every time 5MB of compressed data have been
>> processed. It's partial, but it shows that OpenJpeg is roughly 4 times
>> slower than KDU (at least the version we're currently using in the
>> official viewer). It would be nice to have a similar set of numbers for
>> Jasper before going too far down the implementation path.
>>
>> I wrote a short (and still incomplete) wiki page to explain a bit how the
>> metric gathering system works:
>> - https://wiki.secondlife.com/wiki/Performance_Testers
>>
>> BTW, that's something we should be using more generally for other
>> perf-sensitive areas, especially when starting a perf improvement project.
>>
>> See http://jira.secondlife.com/browse/VWR-22761 for details.
>>
>> Cheers,
>> - Merov
>>
>> On Fri, Sep 3, 2010 at 9:05 AM, Nicky Fullton <nickyd...@yahoo.com>
>> wrote:
>>
>> Hello,
>>
>> >> I'm testing, at my RL office (not for a viewer), the JasPer decoder for
>> >> JPG2000 images. After a short test with openjpeg2000 from EPFL, we have
>> >> spent the last 3 days testing JasPer (only a POC app to do some
>> >> benchmarks). We still have a lot of work to do, but here is a little
>> >> question: has anybody here ever tried it as an alternative to
>> >> OpenJPEG/KDU in a viewer?
>>
>> >I'm not aware of anyone publishing results for such a test, but if you
>> >have the time it would be interesting reading.
>>
>> You might be interested in:
>> http://bitbucket.org/NickyD/viewer-development/changeset/027bf44c5582
>>
>> I made a rather quick hack to try Jasper instead of OpenJpeg to decode
>> images.
>>
>> The patch has some very rough edges. In fact, the decoding into the
>> LLImageRaw buffer is not correct.
>>
>> I did not fix this (yet) because the results so far are not very
>> promising. Jasper can only decode around 20% of the JPEG 2000 images; for
>> the other 80% it raises an error and my code then falls back to OpenJpeg.
>> This fallback makes the whole decoding rather slow, so it is hard to say
>> whether Jasper would really be any faster.
>>
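>> (For clarity, the fallback pattern looks roughly like the sketch below;
>> the wrapper names are hypothetical, not the actual functions in the
>> patch:)
>>
>>     #include <cstddef>
>>
>>     struct RawImage;                                              // decoded pixel buffer
>>     bool jasperDecode(const unsigned char*, size_t, RawImage&);   // hypothetical wrapper
>>     bool openJpegDecode(const unsigned char*, size_t, RawImage&); // hypothetical wrapper
>>
>>     // Try Jasper first; on error, retry with OpenJpeg as described above.
>>     bool decodeJ2C(const unsigned char* data, size_t size, RawImage& out)
>>     {
>>         if (jasperDecode(data, size, out))
>>             return true;
>>         // Jasper rejects ~80% of codestreams in this test; the failed
>>         // attempt plus the retry is what makes the whole path slow.
>>         return openJpegDecode(data, size, out);
>>     }
>>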
>> Right now I am not sure if it would be reasonable to invest more time
>> looking at Jasper. First, the code would need to be fixed upstream so that
>> all images can be properly decoded. As the project looks rather dead,
>> someone with JPEG 2000 knowledge might have to step up for this.
>>
>> On another note, you might like to try:
>> http://bitbucket.org/NickyD/viewer-development/changeset/e4eff3e2af39
>>
>> This will at least skip the step of calling OpenJpeg in
>> LLImageJ2COJ::getMetadata (where possible; it does sanity checks first).
>>
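>> (As an aside, the image dimensions can be read straight from the
>> codestream's SIZ marker without calling into a decoder at all. A sketch,
>> with hypothetical names, following the ISO/IEC 15444-1 marker layout:)
>>
>>     #include <cstddef>
>>     #include <cstdint>
>>
>>     static uint32_t be32(const uint8_t* p)
>>     {
>>         return (uint32_t(p[0]) << 24) | (uint32_t(p[1]) << 16) |
>>                (uint32_t(p[2]) << 8)  |  uint32_t(p[3]);
>>     }
>>
>>     // Fills w/h if the buffer starts with SOC (FF4F) followed by the
>>     // SIZ segment (FF51): Lsiz(2) Rsiz(2) Xsiz(4) Ysiz(4) XOsiz(4) YOsiz(4) ...
>>     bool j2cDimensions(const uint8_t* d, size_t n, uint32_t& w, uint32_t& h)
>>     {
>>         if (n < 24 || d[0] != 0xFF || d[1] != 0x4F) return false; // SOC
>>         if (d[2] != 0xFF || d[3] != 0x51) return false;           // SIZ
>>         w = be32(d + 8)  - be32(d + 16); // Xsiz - XOsiz
>>         h = be32(d + 12) - be32(d + 20); // Ysiz - YOsiz
>>         return true;
>>     }
>>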
>> >Some things to keep in
>> >mind: OpenJpeg has patches floating around on its ML against 1.3 that
>> >are reported to give up to a 40% speed increase in places by unrolling
>> >the inner loops, so finding and testing them would be good.
>>
>> I did not find any of those, but then again maybe I did not look hard
>> enough.
>> There is certainly some potential in OpenJpeg.
>> There are some loops in t1_dec_sigpass and t1_dec_refpass that can easily
>> be rewritten. But there is some pretty tricky stuff in t1_dec_clnpass
>> that would need cleaning up, and the mqc decoder (mqc_decode) burns a lot
>> of time. That one is especially hairy, as it has side effects on its
>> input parameter.
>>
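>> (A generic illustration of the unrolling idea, not actual OpenJpeg code:)
>>
>>     void processFlag(int& f); // stand-in for the per-sample coding-pass work
>>
>>     // Process four samples per iteration, with a scalar loop for the
>>     // remainder; the t1_dec_* loops could be reshaped along these lines.
>>     void decodePassUnrolled(int* flags, int n)
>>     {
>>         int i = 0;
>>         for (; i + 4 <= n; i += 4)
>>         {
>>             processFlag(flags[i]);
>>             processFlag(flags[i + 1]);
>>             processFlag(flags[i + 2]);
>>             processFlag(flags[i + 3]);
>>         }
>>         for (; i < n; ++i) // remainder
>>             processFlag(flags[i]);
>>     }
>>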
>> I am not sure that anyone without deep knowledge of OpenJpeg (and the
>> dedication to recode a good part of it) would be able to improve it
>> much.
>>
>> Cheers,
>>   Nicky
>>
>>  --
>> Tateru Nino
>> Contributing Editor http://massively.com/
>>
>
>
>
>
> --
> Tateru Nino
> http://dwellonit.taterunino.net/
>
>
>
_______________________________________________
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges
