Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Tateru Nino
 Wouldn't we be making a network saving by omitting the discard levels 
entirely? Granted, I don't have hard data about that - would the base 
texture encoded in a lighter-weight format end up causing less data to 
traverse for a given texture in the long-run than the more-efficiently 
compressed j2c of the same texture including discard levels? My gut 
instinct says 'probably', but I can't prove that with data.


If it *does* then we would have a double-bonus of also saving on 
decoding time.


On 13/09/2010 4:38 PM, Dahlia Trimble wrote:
Jpeg 2000 discard levels are also used for reducing the resolution of 
textures for distant objects which reduces data download requirements. 
Few other formats offer comparable compression ratios with the quality 
that Jpeg 2000 offers. HTTP transfer doesn't magically make data 
traverse the network faster; much of the reduced download time is due 
to offloading the sim from the task of sending textures as they can 
come from another server process (or even another physical server).



On Sun, Sep 12, 2010 at 10:40 PM, Tateru Nino wrote:


If we're using HTTP textures, is there actually any need for the
JPEG 2000 format? Since the transfer time of individual textures
is vastly reduced (from the first byte to the last byte) the
intermediate quality levels supported by jpg2k would seem to be
redundant. Indeed, you could argue that transferring the textures
in jpg2k format imposes a now-redundant workload on the
texture-pipeline, and that providing HTTP textures in a simpler
format that is more tractable to high-speed, low-cost decoding
would save a whole lot of problems.

Would it be a huge problem, for example, to transfer HTTP textures
as TGA or PNG and use one of the rather well-optimized decoder
libraries for those instead? It seems to me that it would be more
efficient both on the network and on the system - though at the
expense of conversion of all the textures at the store.

Just thinking out loud.


On 13/09/2010 1:58 PM, Sheet Spotter wrote:


Some comments on SNOW-361 (Upgrade to OpenJPEG v2) suggested that
changing to version 2 of OpenJPEG might improve performance,
while other comments suggested it might not support progressive
decoding.

http://jira.secondlife.com/browse/SNOW-361

Is an upgrade to OpenJPEG v2 under active development?

Sheet Spotter



*From:* opensource-dev-boun...@lists.secondlife.com

[mailto:opensource-dev-boun...@lists.secondlife.com] *On Behalf
Of *Philippe (Merov) Bossut
*Sent:* September 9, 2010 10:35 PM
*To:* Nicky Fullton
*Cc:* opensource-dev@lists.secondlife.com

*Subject:* Re: [opensource-dev] J2C fast decoder

Hi Nicky,

As it happens, I've been working on instrumenting the code to add
metric gathering for image decompression as part of the Snowstorm
sprint.

You may want to use my branch
(https://bitbucket.org/merov_linden/viewer-development-vwr-22761)
and create a baseline for openjpeg then run a test for Jasper.
You'll have to sort out the failing cases certainly and just
throw them so we compare what gets truly decompressed (though,
clearly, working in all cases is pretty critical if we look at
Jasper as an alternative).

Here's what I got comparing KDU and OpenJpeg:
Label     Metric                        KDU(B)      OJ2C(T)    Diff(T-B)  Percentage(100*T/B)

ImageCompressionTester-1
  TotalBytesInDecompression          5048643      5003370       -45273       99.1
  TotalBytesOutDecompression        40415336     46592896      6177560      115.29
  TimeTimeDecompression                 3.74        17.04         13.3       455.39

ImageCompressionTester-2
  TotalBytesInDecompression          5000744      5000144         -600        99.99
  TotalBytesOutDecompression        46440040     44248324     -2191716        95.28
  TimeTimeDecompression                 3.64        15.02        11.37       412.02


For that test, I output data every time 5MB of compressed data
have been processed. It's partial but shows that OpenJpeg is
roughly 4 times slower than KDU (at least, the version we're
using in the official viewer currently). Would be nice to have a
similar set of numbers for Jasper before going too far down the
implementation path.
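
For anyone who wants to reproduce this kind of comparison outside the
viewer, here is a rough sketch of a timing harness (illustrative only:
the decoder hook and file list are placeholders, not the viewer's actual
tester code; the Diff and Percentage columns are computed the same way
as in the table above):

// Minimal sketch of a decode-timing harness. "DecodeFn" stands in for
// whichever decoder (KDU, OpenJPEG, JasPer) is being measured.
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <fstream>
#include <functional>
#include <iterator>
#include <string>
#include <vector>

struct DecodeStats { uint64_t bytes_in = 0, bytes_out = 0; double seconds = 0.0; };

// Returns the number of decoded output bytes for one compressed buffer.
using DecodeFn = std::function<size_t(const std::vector<uint8_t>&)>;

DecodeStats run_pass(const std::vector<std::string>& files, const DecodeFn& decode_fn)
{
    DecodeStats s;
    for (const std::string& name : files)
    {
        std::ifstream in(name, std::ios::binary);
        std::vector<uint8_t> data((std::istreambuf_iterator<char>(in)),
                                  std::istreambuf_iterator<char>());
        auto t0 = std::chrono::steady_clock::now();
        size_t out = decode_fn(data);          // decode with the library under test
        auto t1 = std::chrono::steady_clock::now();
        s.bytes_in  += data.size();
        s.bytes_out += out;
        s.seconds   += std::chrono::duration<double>(t1 - t0).count();
    }
    return s;
}

// b = baseline (e.g. KDU), t = test (e.g. OpenJPEG); prints Diff and Percentage.
void report(const DecodeStats& b, const DecodeStats& t)
{
    std::printf("BytesIn  %12llu %12llu %+12lld %8.2f%%\n",
                (unsigned long long)b.bytes_in, (unsigned long long)t.bytes_in,
                (long long)t.bytes_in - (long long)b.bytes_in,
                100.0 * t.bytes_in / b.bytes_in);
    std::printf("BytesOut %12llu %12llu %+12lld %8.2f%%\n",
                (unsigned long long)b.bytes_out, (unsigned long long)t.bytes_out,
                (long long)t.bytes_out - (long long)b.bytes_out,
                100.0 * t.bytes_out / b.bytes_out);
    std::printf("Time     %12.2f %12.2f %+12.2f %8.2f%%\n",
                b.seconds, t.seconds, t.seconds - b.seconds,
                100.0 * t.seconds / b.seconds);
}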

I wrote a short (and still incomplete) wiki to explain a bit how
the metric gathering system works:
- https://wiki.secondlife.com/wiki/Performance_Testers

BTW, that's something we should be using more generally for other
perf sensitive areas, especially when starting a perf improvement
project.

See http://jira.secondlife.com/brow

Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Dahlia Trimble
Requesting a discard level means you only get a portion of the entire file;
if you wanted the highest resolution you would download the entire file,
which would include all discard levels. Being able to request only what you
need can save a lot of network traffic. You don't really want to download a
1024x1024 texture for a distant object that only covers a few pixels on the
screen.
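
For illustration, partial fetches like that map naturally onto plain HTTP
byte-range requests; a minimal libcurl sketch follows (the URL and the
64 KB figure are invented for the example - the real viewer decides how
many bytes each discard level needs):

// Sketch: fetch only the first part of a JPEG2000 asset over HTTP, enough
// to decode a low discard level. The URL and byte count are placeholders.
#include <curl/curl.h>
#include <cstdio>
#include <string>

static size_t append_cb(char* ptr, size_t size, size_t nmemb, void* userdata)
{
    static_cast<std::string*>(userdata)->append(ptr, size * nmemb);
    return size * nmemb;
}

int main()
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL* curl = curl_easy_init();
    std::string body;
    curl_easy_setopt(curl, CURLOPT_URL, "http://texture-cdn.example.com/some-texture-uuid");
    curl_easy_setopt(curl, CURLOPT_RANGE, "0-65535");          // first 64 KB only
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, append_cb);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &body);
    CURLcode rc = curl_easy_perform(curl);
    if (rc == CURLE_OK)
        std::printf("got %zu bytes - enough for a low discard level\n", body.size());
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return rc == CURLE_OK ? 0 : 1;
}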



On Mon, Sep 13, 2010 at 12:24 AM, Tateru Nino  wrote:

>  Wouldn't we be making a network saving by omitting the discard levels
> entirely? Granted, I don't have hard data about that - would the base
> texture encoded in a lighter-weight format end up causing less data to
> traverse for a given texture in the long-run than the more-efficiently
> compressed j2c of the same texture including discard levels? My gut instinct
> says 'probably', but I can't prove that with data.
>
> If it *does* then we would have a double-bonus of also saving on decoding
> time.
>
>
> On 13/09/2010 4:38 PM, Dahlia Trimble wrote:
>
> Jpeg 2000 discard levels are also used for reducing the resolution of
> textures for distant objects which reduces data download requirements. Few
> other formats offer comparable compression ratios with the quality that Jpeg
> 2000 offers. HTTP transfer doesn't magically make data traverse the network
> faster; much of the reduced download time is due to offloading the sim from
> the task of sending textures as they can come from another server process
> (or even another physical server).
>
>
> On Sun, Sep 12, 2010 at 10:40 PM, Tateru Nino wrote:
>
>>  If we're using HTTP textures, is there actually any need for the JPEG
>> 2000 format? Since the transfer time of individual textures is vastly
>> reduced (from the first byte to the last byte) the intermediate quality
>> levels supported by jpg2k would seem to be redundant. Indeed, you could
>> argue that transferring the textures in jpg2k format imposes a now-redundant
>> workload on the texture-pipeline, and that providing HTTP textures in a
>> simpler format that is more tractable to high-speed, low-cost decoding would
>> save a whole lot of problems.
>>
>> Would it be a huge problem, for example, to transfer HTTP textures as TGA
>> or PNG and use one of the rather well-optimized decoder libraries for those
>> instead? It seems to me that it would be more efficient both on the network
>> and on the system - though at the expense of conversion of all the textures
>> at the store.
>>
>> Just thinking out loud.
>>
>>
>> On 13/09/2010 1:58 PM, Sheet Spotter wrote:
>>
>>  Some comments on SNOW-361 (Upgrade to OpenJPEG v2) suggested that
>> changing to version 2 of OpenJPEG might improve performance, while other
>> comments suggested it might not support progressive decoding.
>>
>> http://jira.secondlife.com/browse/SNOW-361
>>
>>
>>
>> Is an upgrade to OpenJPEG v2 under active development?
>>
>>
>>
>>
>>
>> Sheet Spotter
>>
>>
>>  --
>>
>> *From:* opensource-dev-boun...@lists.secondlife.com [
>> mailto:opensource-dev-boun...@lists.secondlife.com]
>> *On Behalf Of *Philippe (Merov) Bossut
>> *Sent:* September 9, 2010 10:35 PM
>> *To:* Nicky Fullton
>> *Cc:* opensource-dev@lists.secondlife.com
>> *Subject:* Re: [opensource-dev] J2C fast decoder
>>
>>
>>
>> Hi Nicky,
>>
>> As it happens, I've been working on instrumenting the code to add metric
>> gathering for image decompression as part of the Snowstorm sprint.
>>
>> You may want to use my branch (
>> https://bitbucket.org/merov_linden/viewer-development-vwr-22761) and
>> create a baseline for openjpeg then run a test for Jasper. You'll have to
>> sort out the failing cases certainly and just throw them so we compare what
>> gets truly decompressed (though, clearly, working in all cases is pretty
>> critical if we look at Jasper as an alternative).
>>
>> Here's what I got comparing KDU and OpenJpeg:
>> Label     Metric                        KDU(B)      OJ2C(T)    Diff(T-B)  Percentage(100*T/B)
>> ImageCompressionTester-1
>>   TotalBytesInDecompression          5048643      5003370       -45273       99.1
>>   TotalBytesOutDecompression        40415336     46592896      6177560      115.29
>>   TimeTimeDecompression                 3.74        17.04         13.3       455.39
>> ImageCompressionTester-2
>>   TotalBytesInDecompression          5000744      5000144         -600        99.99
>>   TotalBytesOutDecompression        46440040     44248324     -2191716        95.28
>>   TimeTimeDecompression                 3.64        15.02        11.37       412.02
>>
>> For that test, I output data every time 5MB of compressed data have been
>> processed. It's partial but shows that OpenJpeg is roughly 4 times slower
>> than KDU (at least, the version we're using in the official viewer
>> currently). Would be nice to have a similar set of numbers for Jasper before
>> going too far down the implementation path.
>>
>> I wrote a short (and still incomplete) wiki to explain a bit how the
>> metric gathe

Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread leliel
On Mon, Sep 13, 2010 at 12:24 AM, Tateru Nino  wrote:
> Wouldn't we be making a network saving by omitting the discard levels
> entirely? Granted, I don't have hard data about that - would the base
> texture encoded in a lighter-weight format end up causing less data to
> traverse for a given texture in the long-run than the more-efficiently
> compressed j2c of the same texture including discard levels? My gut instinct
> says 'probably', but I can't prove that with data.
>
> If it *does* then we would have a double-bonus of also saving on decoding
> time.

The problem is that discard levels save us on more than just
bandwidth. When you look at an object 300m away it will only be taking
up a few dozen pixels on the screen, so the viewer will just download
up to the first or second discard level, say 32x32 or so, and that is
all that will be stored on your video card. Switching to a format that
doesn't support discard levels could give us big savings in decode
time, but at the cost of having to always download the whole file.
We'd be trading CPU time for VRAM, and the average user's machine has a
lot more of the former than the latter.
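
To make that concrete, here is a toy sketch of picking a discard level
from on-screen coverage (the heuristic and the numbers are invented for
illustration; the viewer's real logic is more involved):

// Rough sketch: pick a discard level from how many pixels the texture
// covers on screen. Each discard level halves the width and height, so
// level d of a full_dim x full_dim texture is (full_dim >> d) square.
#include <cstdio>

int pick_discard_level(int full_dim, float pixels_on_screen, int max_discard = 5)
{
    int level = 0;
    // Keep discarding while the next-smaller level still covers the on-screen size.
    while (level < max_discard && (full_dim >> (level + 1)) >= pixels_on_screen)
        ++level;
    return level;
}

int main()
{
    // A 1024x1024 texture on an object ~300m away covering ~40 pixels:
    int level = pick_discard_level(1024, 40.0f);
    std::printf("discard level %d -> %dx%d actually fetched and kept in VRAM\n",
                level, 1024 >> level, 1024 >> level);
    return 0;
}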
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges


Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Francesco Rabbi
> Would it be a huge problem, for example, to transfer HTTP textures
> as TGA or PNG and use one of the rather well-optimized decoder

TGA has lossless RLE compression; in any case, transport is
format-independent - you can send voice over HTTP too (like Skype) or
anything else you want. To improve bandwidth and rendering performance
we need a client-side routine (the viewer must ask the asset server only
for viewable textures) and servers should send them in the right order,
from the closest object to the farthest.
Scaling/resizing should be done by the viewer after the render engine
calculates the size of the surface the texture sits on. Receiving an
already undersampled image means that each movement of the avatar/camera
triggers a re-transmission of the same image, each time "bigger"; over the
medium to long term it is better to receive the full-res texture and let
the viewer resize and render, discarding down to the right level based on
local viewer-side settings (resolution, monitor DPI, CPU power).

But as always we are talking with only our own hardware in our hands; all
of this could be discussed better if some statistical data about
residents' hardware were available somewhere.


If LL (better if others viewers team too) can collect anonymous data
all decision about decoding, pipeline, shadow or shader and all other
can taken more easy, a approx list of usefull data can be (if a
statistical collector is or will enabled) to probe each XX minutes:

- CPU mips/bogomips
- CPU compute cores (not physical, just the grand total)
- average load of the cores
- CPU load of the viewer executable
- number and average load of plugins (voice included)
- amount of RAM (and how much is free)
- amount of swap (and how much is used)
- brand and model of the graphics card
- graphics settings
- number of agents in the visible area
- average rendering cost of the agents in the visible area
- FPS, collected only when the viewer isn't minimized ("icon")
- inbound & outbound bandwidth
- bandwidth used by the viewer
- bandwidth used by plugins, voice too
- % of packet loss
- uptime of the connection

To be sure no data is collected twice and no personal data is used to
collect it, I suggest using the serial number of the CPU (Linux:
/proc/cpuinfo; Mac: the same; Windows: I don't know how) or some sort of
unique UUID based on the hardware configuration (if somebody upgrades RAM
or CPU, a new ID should be used).
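
A very rough sketch of that hardware-hash idea (Linux only; hashing
/proc/cpuinfo with a standard library hash is just an example, not a
concrete proposal for the actual fingerprint):

// Sketch: derive an anonymous, hardware-based ID by hashing /proc/cpuinfo.
// Linux-only and illustrative; a real collector would also mix in RAM size,
// GPU model, etc., and note that some /proc/cpuinfo fields (MHz, bogomips)
// can vary between boots, so this is only roughly stable.
#include <cstdio>
#include <fstream>
#include <functional>
#include <iterator>
#include <string>

int main()
{
    std::ifstream cpuinfo("/proc/cpuinfo");
    std::string text((std::istreambuf_iterator<char>(cpuinfo)),
                     std::istreambuf_iterator<char>());
    if (text.empty())
    {
        std::fprintf(stderr, "could not read /proc/cpuinfo\n");
        return 1;
    }
    size_t id = std::hash<std::string>{}(text);   // anonymous machine identifier
    std::printf("anonymous hardware id: %016zx\n", id);
    return 0;
}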

-- 
Sent by iPhone

On 13 Sep 2010, at 07:40, Tateru Nino
wrote:

If we're using HTTP textures, is there actually any need for the JPEG 2000
format? Since the transfer time of individual textures is vastly reduced
(from the first byte to the last byte) the intermediate quality levels
supported by jpg2k would seem to be redundant. Indeed, you could argue that
transferring the textures in jpg2k format imposes a now-redundant workload
on the texture-pipeline, and that providing HTTP textures in a simpler
format that is more tractable to high-speed, low-cost decoding would save a
whole lot of problems.

Would it be a huge problem, for example, to transfer HTTP textures as TGA or
PNG and use one of the rather well-optimized decoder libraries for those
instead? It seems to me that it would be more efficient both on the network
and on the system - though at the expense of conversion of all the textures
at the store.

Just thinking out loud.

On 13/09/2010 1:58 PM, Sheet Spotter wrote:

 Some comments on SNOW-361 (Upgrade to OpenJPEG v2) suggested that changing
to version 2 of OpenJPEG might improve performance, while other comments
suggested it might not support progressive decoding.

http://jira.secondlife.com/browse/SNOW-361



Is an upgrade to OpenJPEG v2 under active development?





Sheet Spotter


 --

*From:* opensource-dev-boun...@lists.secondlife.com [
mailto:opensource-dev-boun...@lists.secondlife.com]
*On Behalf Of *Philippe (Merov) Bossut
*Sent:* September 9, 2010 10:35 PM
*To:* Nicky Fullton
*Cc:* opensource-dev@lists.secondlife.com
*Subject:* Re: [opensource-dev] J2C fast decoder



Hi Nicky,

As it happens, I've been working on instrumenting the code to add metric
gathering for image decompression as part of the Snowstorm sprint.

You may want to use my branch (
https://bitbucket.org/merov_linden/viewer-development-vwr-22761) and create
a baseline for openjpeg then run a test for Jasper. You'll have to sort out
the failing cases certainly and just throw them so we compare what gets
truly decompressed (though, clearly, working in all cases is pretty critical
if we look at Jasper as an alternative).

Here's what I got comparing KDU and OpenJpeg:
Label     Metric                        KDU(B)      OJ2C(T)    Diff(T-B)  Percentage(100*T/B)
ImageCompressionTester-1
  TotalBytesInDecompression          5048643      5003370       -45273       99.1
  TotalBytesOutDecompression        40415336     46592896      6177560      115.29
  TimeTimeDecompression                 3.74        17.04         13.3       455.39
ImageCompressionTester-2
  TotalBytesInDecompression          5000744      5000144         -600        99.99
  TotalBytesOutDecompression        46440040     44248324     -2191716        95.28
 

Re: [opensource-dev] Client side scripting.

2010-09-13 Thread Argent Stonecutter
The hard part isn't coming up with an embedded scripting language, it's not 
even coming up with a secure set of bindings that don't allow for unanticipated 
side-effects or privilege escalation, it's integrating the scripting engine 
into an event loop that wasn't designed to have a scripting engine in it.

And designing a secure set of bindings that do something useful without 
security holes is already pretty damn tough.

I would suggest starting with a restricted subset of Javascript in the web 
engine, for interactive notecards, and see what it takes just to get THAT part 
secure, reliable, and robust.
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges


Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Argent Stonecutter
On 2010-09-13, at 00:40, Tateru Nino wrote:
> If we're using HTTP textures, is there actually any need for the JPEG 2000 
> format? Since the transfer time of individual textures is vastly reduced 
> (from the first byte to the last byte) the intermediate quality levels 
> supported by jpg2k would seem to be redundant.

I'm on a 256k DSL. I have HTTP textures enabled. I still see many intermediate 
texture levels.

Also, for large textures, switching to PNG would likely increase the size of 
the transfer, which is not good.

On the other hand, since both "old" JPG and PNG support progressive decoding, 
why not use PNG for lossless textures and JPG for lossy ones? Then you don't 
lose anything.
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges


Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Leonel Morgado
Notice that old JPG does not support alpha channels (transparency). That
means abandoning JPEG2000 would in fact force everyone with a single
transparent pixel (even if just corners of round textures) to use lossless
PNG for that, which is not optimal, to say the least.

Inté,

Leonel


-Original Message-
From: opensource-dev-boun...@lists.secondlife.com
[mailto:opensource-dev-boun...@lists.secondlife.com] On Behalf Of Argent
Stonecutter
Sent: Monday, 13 September 2010 13:15
To: Tateru Nino
Cc: opensource-dev@lists.secondlife.com
Subject: Re: [opensource-dev] J2C fast decoder

On 2010-09-13, at 00:40, Tateru Nino wrote:
> If we're using HTTP textures, is there actually any need for the JPEG 2000
format? Since the transfer time of individual textures is vastly reduced
(from the first byte to the last byte) the intermediate quality levels
supported by jpg2k would seem to be redundant.

I'm on a 256k DSL. I have HTTP textures enabled. I still see many
intermediate texture levels.

Also, for large textures, switching to PNG would likely increase the size of
the transfer, which is not good.

On the other hand, since both "old" JPG and PNG support progressive
decoding, why not use PNG for lossless textures and JPG for lossy ones? Then
you don't lose anything.
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting
privileges

___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges


Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Ambrosia
It goes even further. JPEG2k offers a superior compression ratio for a given
quality compared to both JPEG and PNG, whether lossy or lossless.
It is actually a quite awesome image format through and through.

The only real problem with JPEG2k is software patents. They are the
reason why free decoders like OpenJPEG and JasPer are so much slower
compared to KDU. KDU is developed by one of the people sitting on the
JPEG2K consortium, and is stuffed with proprietary algorithms.

On Mon, Sep 13, 2010 at 14:22, Leonel Morgado  wrote:
> Notice that old JPG does not support alpha channels (transparency). That
> means abandoning JPEG2000 would in fact force everyone with a single
> transparent pixel (even if just corners of round textures) to use lossless
> PNG for that, which is not optimal, to say the least.
>
> Inté,
>
> Leonel
>
>
> -Original Message-
> From: opensource-dev-boun...@lists.secondlife.com
> [mailto:opensource-dev-boun...@lists.secondlife.com] On Behalf Of Argent
> Stonecutter
> Sent: Monday, 13 September 2010 13:15
> To: Tateru Nino
> Cc: opensource-dev@lists.secondlife.com
> Subject: Re: [opensource-dev] J2C fast decoder
>
> On 2010-09-13, at 00:40, Tateru Nino wrote:
>> If we're using HTTP textures, is there actually any need for the JPEG 2000
> format? Since the transfer time of individual textures is vastly reduced
> (from the first byte to the last byte) the intermediate quality levels
> supported by jpg2k would seem to be redundant.
>
> I'm on a 256k DSL. I have HTTP textures enabled. I still see many
> intermediate texture levels.
>
> Also, for large textures, switching to PNG would likely increase the size of
> the transfer, which is not good.
>
> On the other hand, since both "old" JPG and PNG support progressive
> decoding, why not use PNG for lossless textures and JPG for lossy ones? Then
> you don't lose anything.
> ___
> Policies and (un)subscribe information available here:
> http://wiki.secondlife.com/wiki/OpenSource-Dev
> Please read the policies before posting to keep unmoderated posting
> privileges
>
> ___
> Policies and (un)subscribe information available here:
> http://wiki.secondlife.com/wiki/OpenSource-Dev
> Please read the policies before posting to keep unmoderated posting privileges
>
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges


Re: [opensource-dev] Retaining Newbies (Was: The Plan for Snowglobe)

2010-09-13 Thread Oz Linden (Scott Lawrence)

 On 2010-09-11 14:11, Dilly Dobbs wrote:

   On 9/11/2010 1:05 PM, Robert Martin wrote:

On Sat, Sep 11, 2010 at 1:52 PM, dilly dobbs   wrote:

We all seem like intelligent adults who could come to an agreement on how
to prioritize the issues that we would like to see addressed.  And yes,
I work in an Agile dev software shop, so I'm sorry about the lingo.
Opinions and ideas on this?

I would like to put my paws into this kind of thing could we get an
indoor type inworld location for this??


I think that I would be able to provide us with a meeting place on the
grid.  My wife, I am sure, would be willing to give us a meeting place on
the grid.  And I would be willing to foot the group costs as long as
they don't get out of control.



Feel free to use the Hippotropolis Theater

   http://maps.secondlife.com/secondlife/Hippotropolis/241/29/23

or there are some nice meeting spaces both indoor and outdoor in the 
nearby Open Source Park


   http://maps.secondlife.com/secondlife/Hippotropolis/37/233/22

if there are other in-world resources that would be helpful, contact me.


___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges

Re: [opensource-dev] Retaining Newbies (Was: The Plan for Snowglobe)

2010-09-13 Thread dilly dobbs
Thanks OZ

Now we need to get others to come and chat about it  and see if we can come
up with a plan.


I love deadlines. I like the whooshing sound they make as they fly by

Douglas Adams


On Mon, Sep 13, 2010 at 9:17 AM, Oz Linden (Scott Lawrence) <
o...@lindenlab.com> wrote:

>  On 2010-09-11 14:11, Dilly Dobbs wrote:
>
>   On 9/11/2010 1:05 PM, Robert Martin wrote:
>
>  On Sat, Sep 11, 2010 at 1:52 PM, dilly dobbs 
>   wrote:
>
>  We all seem like intelligent adults that could come to an agreement on how
> to add priority to the issues that we would like to see addressed.  And yes
> i work in an Agile dev software shop so im sorry about the lingo.
> Opinions, and ideas on this ?
>
>  I would like to put my paws into this kind of thing could we get an
> indoor type inworld location for this??
>
>
>  I think that i would be able to provide us with a meeting place on the
> grid.  My wife i am sure would be willing to give us a meeting place on
> the grid.  And i would be willing to foot the group costs as long as
> they don't get out of control.
>
>
>
> Feel free to use the Hippotropolis Theater
>
> http://maps.secondlife.com/secondlife/Hippotropolis/241/29/23
>
> or there are some nice meeting spaces both indoor and outdoor in the nearby
> Open Source Park
>
> http://maps.secondlife.com/secondlife/Hippotropolis/37/233/22
>
> if there are other in-world resources that would be helpful, contact me.
>
>
>
> ___
> Policies and (un)subscribe information available here:
> http://wiki.secondlife.com/wiki/OpenSource-Dev
> Please read the policies before posting to keep unmoderated posting
> privileges
>
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges

Re: [opensource-dev] Retaining Newbies (Was: The Plan for Snowglobe)

2010-09-13 Thread Robin Cornelius
On Mon, Sep 13, 2010 at 3:22 PM, dilly dobbs  wrote:
> Thanks OZ
> Now we need to get others to come and chat about it  and see if we can come
> up with a plan.

The best way is to just put out a general call for a meeting with a
set date and time. Or possibly two, to try to accommodate different
timezones. Then just be prepared to log the meeting and place the
results on a wiki somewhere. If you don't just set a date/time you
will permanently be herding cats; also be prepared to set an agenda,
and whoever hosts will need to be a strong chairperson to keep any
kind of order ;-)

So just post your topic/date/time/slurl and let those who want to
contribute turn up.

Robin
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges


Re: [opensource-dev] where is the source

2010-09-13 Thread Garmin Kawaguichi
Is this really the latest source code?

I say that after reading : 
https://bitbucket.org/lindenlab/viewer-development/src/tip/indra/lscript/lscript_library/lscript_library.cpp
where llClearPrimMedia is the last function. Where are : 
llSetLinkPrimitiveParamsFast, llGetLinkPrimitiveParams, llLinkParticleSystem, 
llSetLinkTextureAnim, llGetLinkNumberOfSides

GCI

- Original Message - 
> Source is live and right here: 
> https://bitbucket.org/lindenlab/viewer-development
> 
> Read a little for some background and pointers.  Link to source is in 
> the wiki, which is being reorganized a lot to reflect the new 
> processes.  http://wiki.secondlife.com/wiki/Project_Snowstorm
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges

Re: [opensource-dev] This cannot be right...

2010-09-13 Thread Vex Streeter




Check out

http://jira.secondlife.com/browse/VWR-22757

and

http://jira.secondlife.com/browse/VWR-18427

I don't know why 209046 would be worse than prior releases, but it
would be nice to know whether OSX builds do or don't have the same issue.

On 09/09/2010 01:07 PM, Ponzu wrote:

  When I start V2 Dev 209046, this show up in my system.log.  It goes on
producing more than 500 entries per second.
...
Sep  9 12:49:43 iMac /Applications/Second Life
Development.app/Contents/MacOS/Second Life[12501]: dnssd_clientstub
deliver_request: socketpair failed 24 (Too many open files)
Sep  9 12:49:43: --- last message repeated 499 times ---
...
  




___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges

Re: [opensource-dev] where is the source

2010-09-13 Thread Tofu Linden
LSL-related code in the viewer tree is mostly vestigial at this point -
the server code has its own private tree which diverged over a year
ago.

Garmin Kawaguichi wrote:
> Is it the last last source code?
> 
> I say that after reading : 
> https://bitbucket.org/lindenlab/viewer-development/src/tip/indra/lscript/lscript_library/lscript_library.cpp
> where llClearPrimMedia is the last function. Where are : 
> llSetLinkPrimitiveParamsFast, llGetLinkPrimitiveParams, llLinkParticleSystem, 
> llSetLinkTextureAnim, llGetLinkNumberOfSides
> 
> GCI
> 
> - Original Message - 
>> Source is live and right here: 
>> https://bitbucket.org/lindenlab/viewer-development
>>
>> Read a little for some background and pointers.  Link to source is in 
>> the wiki, which is being reorganized a lot to reflect the new 
>> processes.  http://wiki.secondlife.com/wiki/Project_Snowstorm
> 
> 
> 
> 
> ___
> Policies and (un)subscribe information available here:
> http://wiki.secondlife.com/wiki/OpenSource-Dev
> Please read the policies before posting to keep unmoderated posting privileges
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges


Re: [opensource-dev] where is the source

2010-09-13 Thread Sythos
On Mon, 13 Sep 2010 16:47:28 +0200
"Garmin Kawaguichi"  wrote:

> Is it the last last source code?
> 
> I say that after reading : 
> https://bitbucket.org/lindenlab/viewer-development/src/tip/indra/lscript/lscript_library/lscript_library.cpp
> where llClearPrimMedia is the last function. Where are :
> llSetLinkPrimitiveParamsFast, llGetLinkPrimitiveParams,
> llLinkParticleSystem, llSetLinkTextureAnim, llGetLinkNumberOfSides

Mono scripts are compiled server-side; I think all of these are inside the
internal LL branch.
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges


Re: [opensource-dev] Retaining Newbies (Was: The Plan for Snowglobe)

2010-09-13 Thread Mike Monkowski
dilly dobbs wrote:
> Now we need to get others to come and chat about it  and see if we can 
> come up with a plan. 

Have a look at VWR-10293 and don't skip the closed issues there.
Many were closed because of Linden Lab's perception that getting rid of
Orientation Islands and sending new residents to Help Islands instead had
fixed the new user experience, and that no other improvements were needed.

___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges


Re: [opensource-dev] where is the source

2010-09-13 Thread CG Linden
I believe that code is used for the syntax highlighting - and indeed,
"llGetLinkPrimitiveParams()" doesn't highlight in the viewer...
--
cg

On Mon, Sep 13, 2010 at 11:44 AM, Altair Sythos  wrote:

> On Mon, 13 Sep 2010 16:47:28 +0200
> "Garmin Kawaguichi"  wrote:
>
> > Is it the last last source code?
> >
> > I say that after reading :
> >
> https://bitbucket.org/lindenlab/viewer-development/src/tip/indra/lscript/lscript_library/lscript_library.cpp
> > where llClearPrimMedia is the last function. Where are :
> > llSetLinkPrimitiveParamsFast, llGetLinkPrimitiveParams,
> > llLinkParticleSystem, llSetLinkTextureAnim, llGetLinkNumberOfSides
>
> mono script are server side compiled, i think all these are inside
> internal LL branch
> ___
> Policies and (un)subscribe information available here:
> http://wiki.secondlife.com/wiki/OpenSource-Dev
> Please read the policies before posting to keep unmoderated posting
> privileges
>
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges

[opensource-dev] Snowstorm Daily Scrum Summary - 09/13/2010

2010-09-13 Thread Sarah (Esbee) Hutchinson
Date: Mon Sep 13, 2010

Also available here:
https://wiki.secondlife.com/wiki/Snowstorm_Daily_Scrum_Archive

*=== General Notes ===*
* Last day of Sprint 3!
* Merge Monkey of the Day: Merov

*===Team Status===*
Aimee Linden
PAST
* STORM-121 Clean-up of voice client shutdown

FUTURE
* STORM-121 Finish off.

IMPEDIMENTS
* None.

Tofu Linden
PAST
* llTextBox
* Lots of Integrations!
* Got my TeamCity staging task working

FUTURE
* llTextBox will definitely carry-over to the next sprint, lots still to do
* Some OOO

IMPEDIMENTS
* none.

Oz Linden
PAST
* Updated review and submission workflow descriptions
* filing some new VWR issues
* fixed Jira helpers around Hippotropolis
* KDU license (met w/ contracts person to get started)

FUTURE
* Work on getting a codereview system set up
* Office Hour
* Wiki updates for getting source

IMPEDIMENTS
* none

Merov Linden
PAST
* STORM-115, 116, 117, 118: wrote acceptance criteria / test plans for each
of those issues
* STORM-106: Chat translator: wrote acceptance criteria / test plan. Pulled
into viewer-development.
* VWR-22769: libPNG build: Boroondas' patch makes assumptions about
symlinks that hold on Linux but not on Mac and Windows. We need to
do this differently, if at all.

FUTURE
* STORM-104 (VWR-22761) : LLKDU upgrade. More on perf tracking while waiting
for license to clear and new version to come.
* Snowglobe backlog: go through that old list and migrate things to
Snowstorm.

IMPEDIMENTS
* Kakadu license upgrade

Q Linden
PAST
* Working on STORM-102 and STORM-103. They are not ready for prime time. Q
will tackle those. Would like to get that into the next sprint as a task.

FUTURE
* Continued work on STORM 102 and STORM 103
* VWR triage
* Prep for Sprint 4

IMPEDIMENTS
* None

Esbee Linden
PAST
 * Send Q draft of  Jira VWR workflow notes
* Set up sprints in GH
* Move Product Backlog to a "Version" in Jira to separate it from incoming
bugs
* VWR triage

FUTURE
* Schedule meeting with Orange, Gentle, and LordGregGreg.
* Send request to XD for friend permissions icons.
* Write UI customization blog post.
* Write end of Sprint 3 wrap-up blog post
* VWR triage
* Prep for Sprint 4
* Modify process for sending Viewer ideas/feature requests

IMPEDIMENTS
* None

Paul ProductEngine
PAST
Thu
*VWR-22890 (Undocked profile panels loses verb buttons after
minimize/restore)
**In progress ~ 3 hours.
Fri
* STORM-93 (Appearance > 'Wearing' tab: Add Take off / Detach function to
the gear menu)
**Pushed
*STORM-91 (Corrupted vertical scroll bar appears on 'Edit Outfit' panel if
height of 'Add More' panel was changed)
** Pushed
*STORM-89 (Undocked profile panels loses verb buttons after
minimize/restore)
**Fixed
*VWR-22330 (Remove delay so that inspector affordance appears immediately)
**Investigated and started working on. Estimate ~ 6 hours.

FUTURE
* vacation

IMPEDIMENTS
* none

Andrew ProductEngine
PAST
Thu:
* Bug VWR-22008 (People you called via Adhoc are not shown in Recent tab).
**Asked Anya question in JIRA
* Bug VWR-22888 (Empty space appears in the top of Home side panel after
redocking)
** Looked for fix for a while, found a hack, but fortunately won't need to
use it, b/c Vadim's fix for other issue fixes this as well. Reassigned to
Vadim.
* STORM-94 (Increase minimum width allowed for undocked panels)
**Fixed, ready for integration.
*STORM-97 (Back button in the undocked Landmark Info panel moves out of
panel on width resize)
**Fixed, ready for integration.
*STORM-99 (Select button inside Resident Chooser is disabled if invoke
sharing from undocked My Inventory panel)
**Started investigating. No estimate yet.
  Fri
*Bug STORM-99 (Select button inside Resident Chooser is disabled if invoke
sharing from undocked My Inventory panel)
** Found both problems causing bug, fixed, need an hour to clean up and
polish & will send for review

FUTURE
* Finish STORM-99 (Select button inside Resident Chooser is disabled if
invoke sharing from undocked My Inventory panel).
* Create ticket for resize of nearby chat appearing only on click and will
start on it.
* Will clear out tickets currently assigned to me - a few can be closed.

IMPEDIMENTS
* none

Vadim ProductEngine
PAST
 Thu
*Bug VWR-22896 aka STORM-92 (Panel state resets on dock/undock):
**Fixed.
*Bug VWR-22888 aka STORM-96 (Empty space appears in the top of Home side
panel after redocking):
**Fixed.
*Bug STORM-98 (Update art for People and Groups default/placeholder icons):
**Fixed, but asked Epic a question before commit to be safe.
*Major bug VWR-22921 aka STORM-101 (Sidebar settings should be account
specific):
**Investigated, asked Andrey, will continue tomorrow
*jira-wrangling
*Fixed Paul's build again (problem with current Linux dist)
*Code review
Fri:
* mostly administrative:
** Fixed tickets with known incorrect status
** Moved VWR tickets to STORM that we will be working on
** Discussed JIRA issues with Sue.  She fixed them super-fast
*

Re: [opensource-dev] where is the source

2010-09-13 Thread Kelly Linden
I'd love to see that syntax highlighting and hover tip code replaced, we
really shouldn't need anything in lscript in the viewer. If the syntax
highlighting / hover tips could be read from an LLSD we could provide a
capability 'lsl-syntax' which could be queried for a version or the whole
map. Then you could get accurate syntax highlighting for the sim you are on
independent of the viewer.

 - Kelly

On Mon, Sep 13, 2010 at 4:16 PM, CG Linden  wrote:

> I believe that that code is used for the syntax highlighting -  and indeed,
> " llGetLinkPrimitiveParams()" doesn't highlight in the viewer...
> --
> cg
>
>
> On Mon, Sep 13, 2010 at 11:44 AM, Altair Sythos  wrote:
>
>> On Mon, 13 Sep 2010 16:47:28 +0200
>> "Garmin Kawaguichi"  wrote:
>>
>> > Is it the last last source code?
>> >
>> > I say that after reading :
>> >
>> https://bitbucket.org/lindenlab/viewer-development/src/tip/indra/lscript/lscript_library/lscript_library.cpp
>> > where llClearPrimMedia is the last function. Where are :
>> > llSetLinkPrimitiveParamsFast, llGetLinkPrimitiveParams,
>> > llLinkParticleSystem, llSetLinkTextureAnim, llGetLinkNumberOfSides
>>
>> mono script are server side compiled, i think all these are inside
>> internal LL branch
>> ___
>> Policies and (un)subscribe information available here:
>> http://wiki.secondlife.com/wiki/OpenSource-Dev
>> Please read the policies before posting to keep unmoderated posting
>> privileges
>>
>
>
> ___
> Policies and (un)subscribe information available here:
> http://wiki.secondlife.com/wiki/OpenSource-Dev
> Please read the policies before posting to keep unmoderated posting
> privileges
>
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges

Re: [opensource-dev] where is the source

2010-09-13 Thread Rob Nelson
 As I recall, all that lscript stuff was to compile the LSL bytecode 
and give it to the server.  It shouldn't be needed anymore since the 
server compiles everything itself...


Rob

On 9/13/2010 4:32 PM, Kelly Linden wrote:
I'd love to see that syntax highlighting and hover tip code replaced, 
we really shouldn't need anything in lscript in the viewer. If the 
syntax highlighting / hover tips could be read from an LLSD we could 
provide a capability 'lsl-syntax' which could be queried for a version 
or the whole map. Then you could get accurate syntax highlighting for 
the sim you are on independent of the viewer.


 - Kelly

On Mon, Sep 13, 2010 at 4:16 PM, CG Linden wrote:


I believe that that code is used for the syntax highlighting -
and indeed, "llGetLinkPrimitiveParams()" doesn't highlight in the viewer...
--
cg


On Mon, Sep 13, 2010 at 11:44 AM, Altair Sythos <syt...@gmail.com> wrote:

On Mon, 13 Sep 2010 16:47:28 +0200
"Garmin Kawaguichi" mailto:garmin.kawagui...@magalaxie.com>> wrote:

> Is it the last last source code?
>
> I say that after reading :
>

https://bitbucket.org/lindenlab/viewer-development/src/tip/indra/lscript/lscript_library/lscript_library.cpp
> where llClearPrimMedia is the last function. Where are :
> llSetLinkPrimitiveParamsFast, llGetLinkPrimitiveParams,
> llLinkParticleSystem, llSetLinkTextureAnim,
llGetLinkNumberOfSides

mono script are server side compiled, i think all these are inside
internal LL branch
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated
posting privileges



___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated
posting privileges



___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges


___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges

Re: [opensource-dev] where is the source

2010-09-13 Thread Joshua Bell
On Mon, Sep 13, 2010 at 4:32 PM, Kelly Linden  wrote:

> I'd love to see that syntax highlighting and hover tip code replaced, we
> really shouldn't need anything in lscript in the viewer. If the syntax
> highlighting / hover tips could be read from an LLSD we could provide a
> capability 'lsl-syntax' which could be queried for a version or the whole
> map. Then you could get accurate syntax highlighting for the sim you are on
> independent of the viewer.
>

At the risk of designing this via email... oh, what the heck.

User story: "As a developer of LSL scripts, I want the LSL syntax
highlighting dictionary to be updated automagically so I don't have to
upgrade my viewer whenever new LSL functions or keywords are added."

Per-sim versioning is a nice-to-have but we could probably start simpler and
have a per-grid resource. During login, the viewer can be given the URL to
an LLSD resource it can *lazily* request. (We do this for other grid-wide
services.)

We could later choose to extend this to be a capability granted by the
simulator, which the viewer could query on each region change (when it gets
a new seed cap) to give the granularity Kelly describes above.

The server side changes to deliver a static URL to an LLSD resource from
login are minimal; if someone wants to take a stab at the client side
changes and defining a forward-looking LLSD format, I'm sure you'll find a
server-side champion.
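
Purely as a strawman for such a forward-looking format (the structure and
field names below are invented here, not an existing capability or schema),
the LLSD could look something like:

<?xml version="1.0" encoding="UTF-8"?>
<llsd>
  <map>
    <key>version</key><string>2010-09-13</string>
    <key>functions</key>
    <map>
      <key>llGetLinkNumberOfSides</key>
      <map>
        <key>return</key><string>integer</string>
        <key>arguments</key>
        <array>
          <map><key>name</key><string>link</string><key>type</key><string>integer</string></map>
        </array>
        <key>tooltip</key><string>Returns the number of sides of the specified linked prim.</string>
      </map>
    </map>
    <key>keywords</key>
    <array><string>default</string><string>state</string></array>
  </map>
</llsd>

The viewer would only need to fetch and cache that document lazily, then
drive highlighting and hover tips from the parsed map instead of the
compiled-in lscript tables.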
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges

Re: [opensource-dev] where is the source

2010-09-13 Thread Ellla McMahon
Is this related to JIRA VWR-20031 (Automatic download of syntax-table)?
Perhaps add your comments and votes : )


On 14 September 2010 00:48, Joshua Bell  wrote:

> On Mon, Sep 13, 2010 at 4:32 PM, Kelly Linden  wrote:
>
>> I'd love to see that syntax highlighting and hover tip code replaced, we
>> really shouldn't need anything in lscript in the viewer. If the syntax
>> highlighting / hover tips could be read from an LLSD we could provide a
>> capability 'lsl-syntax' which could be queried for a version or the whole
>> map. Then you could get accurate syntax highlighting for the sim you are on
>> independent of the viewer.
>>
>
> At the risk of designing this via email... oh, what the heck.
>
> User story: "As a developer of LSL scripts, I want LSL the syntax
> highlighting dictionary to be updated automagically so I don't have to
> upgrade my viewer whenever new LSL functions or keywords are added."
>
> Per-sim versioning is a nice-to-have but we could probably start simpler
> and have a per-grid resource. During login, the viewer can be given the URL
> to an LLSD resource it can *lazily* request. (We do this for other grid-wide
> services.)
>
> We could later choose to extend this to be a capability granted by the
> simulator, which the viewer could query on each region change (when it gets
> a new seed cap) to give the granularity Kelly describes above.
>
> The server side changes to deliver a static URL to an LLSD resource from
> login are minimal; if someone wants to take a stab at the client side
> changes and defining a forward-looking LLSD format, I'm sure you'll find a
> server-side champion.
>
>
> ___
> Policies and (un)subscribe information available here:
> http://wiki.secondlife.com/wiki/OpenSource-Dev
> Please read the policies before posting to keep unmoderated posting
> privileges
>
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges

Re: [opensource-dev] where is the source

2010-09-13 Thread Yoz Grahame
On 13 September 2010 16:48, Joshua Bell  wrote:

>
> The server side changes to deliver a static URL to an LLSD resource from
> login are minimal; if someone wants to take a stab at the client side
> changes and defining a forward-looking LLSD format, I'm sure you'll find a
> server-side champion.
>

Hello!
(Ghengis and I recently made a long-overdue login.cgi fix that makes this
kind of change only require login.xml changes, though we still need to merge
it.)

-- Yoz
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges

Re: [opensource-dev] where is the source

2010-09-13 Thread Dave Booth
  On 9/13/2010 19:56, Yoz Grahame wrote:
>
> On 13 September 2010 16:48, Joshua Bell  > wrote:
>
>
> The server side changes to deliver a static URL to an LLSD
> resource from login are minimal; if someone wants to take a stab
> at the client side changes and defining a forward-looking LLSD
> format, I'm sure you'll find a server-side champion.
>
>
> Hello!
> (Ghengis and I recently made a long-overdue login.cgi fix that makes 
> this kind of change only require login.xml changes, though we still 
> need to merge it.)
>
> -- Yoz
>
>
> ___
> Policies and (un)subscribe information available here:
> http://wiki.secondlife.com/wiki/OpenSource-Dev
> Please read the policies before posting to keep unmoderated posting privileges
Get it QA'd and merged dude :)
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges


Re: [opensource-dev] J2C fast decoder

2010-09-13 Thread Philippe (Merov) Bossut
Hi,

Very interesting discussion, though it seems that folks conflate several
things when talking about "textures" in general terms: there's a load of
difference between a repetitive 64x64 texture used to tile a brick wall and
a photographic-quality 1024x1024 texture. The former could certainly benefit
from being stored and sent around as PNG (low format overhead, lossy
compression will actually make things worse on such a small image no matter
what, and lossless jpeg will end up being bigger than PNG), while the latter
will benefit tremendously from the wavelet compression provided by jpeg2000.

I won't go into the advantage of wavelet compression for photographic images
as there is a *huge* literature on the subject with loads and loads of data
proving the point. One can argue between "normal" jpeg (using DCT) and
jpeg2000 (using wavelets) but there's absolutely no contest between jpeg
(whichever flavor) and png for photographic images in terms of quality at
high or even moderate compression ratios.

On the subject of access per resolution (aka "discard levels" in SL
parlance), it is of great interest, as some folks mentioned, when viewing a
texture from a distance. No matter what the transport protocol is, exchanging
a 32x32 will be faster than exchanging a 1024x1024 (with RGBA pixel
values...). The viewer is able to use partially populated mipmaps (which
are, in effect, subres pyramids themselves, as there isn't much difference
between "discard level" and LOD...) and, therefore, use partially downloaded
and decompressed images. Note that with jpeg2000, when asking for a new
level, one does not download the whole full-res 32-bits-per-pixel data but
the wavelet coefficients for that level, which, roughly speaking, encode
the difference with the previous level. That translates into huge
compression benefits, in slowly changing areas in particular.
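
For a sense of scale, here is a small sketch of the raw pixel budget per
discard level of a 1024x1024 RGBA texture (uncompressed sizes only; the
wavelet coefficients actually transferred per level are much smaller):

// Sketch: uncompressed size of each discard level of a 1024x1024 RGBA
// texture. Each level halves both dimensions, so each step carries 1/4 of
// the pixels of the previous one.
#include <cstdio>

int main()
{
    const int full_dim = 1024;
    const int bytes_per_pixel = 4;              // RGBA
    for (int level = 0; level <= 5; ++level)
    {
        int dim = full_dim >> level;
        long bytes = (long)dim * dim * bytes_per_pixel;
        std::printf("discard %d: %4dx%-4d = %8ld bytes uncompressed\n",
                    level, dim, dim, bytes);
    }
    return 0;
}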

One wavelet property, though, that our viewer does not take advantage of is
the spatial random access property of the jpeg2000 format. That would allow
us, for instance, to request and download only a portion of the full-res data
when needed. That is advantageous in cases where only a portion of the whole
image is mapped onto a prim, for instance. I have no data, though, on whether
that is a frequent case in SL, but it would be interesting to know how much
of a texture is truly displayed on average. There may be an interesting vein
of performance to mine there.

All that though needs to be backed by data. This preliminary performance
gathering toolbox I talked about is a first step in that direction.

Cheers,
- Merov
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges

[opensource-dev] Reminder: No Daily Scrum tomorrow (9/14) - Instead, Sprint Planning from 7-9am PT

2010-09-13 Thread Sarah (Esbee) Hutchinson
Hi all,

Just a quick reminder that we won't meet tomorrow at 6:30am PT for the Daily
Scrum. Instead we'll meet from 7-9am PT for Sprint 4 Planning.

With the big Jira switchover, I'm a bit behind organizing stories and
defects for this sprint, but will try to get us ready tonight!

See you tomorrow!

Best,
Esbee
___
Policies and (un)subscribe information available here:
http://wiki.secondlife.com/wiki/OpenSource-Dev
Please read the policies before posting to keep unmoderated posting privileges