On Thu 10 Oct 2013 10:07:53 AM PDT, Till Schneidereit wrote:
> On Thu, Oct 10, 2013 at 6:56 PM, Brian Smith <br...@briansmith.org> wrote:
>> I'm not sure. Things like this seem like really good ideas:
>> http://blogs.msdn.com/b/ie/archive/2013/09/12/using-hardware-to-decode-and-load-jpg-images-up-to-45-faster-in-internet-explorer-11.aspx
>>
>> Obviously, I am linking to somewhat of an advertisement of a
>> competitor but the idea sounds great, especially the bit about
>> significantly lower memory usage.
>
> I agree, that's certainly something worth looking into*. It might not
> necessarily mean that we can't implement decoding of some
> less-frequently used media format in JS. Maybe even with parts of it
> running in hardware**:
> https://brendaneich.com/2013/05/today-i-saw-the-future/

It seems like the optimal efficiency vs. surface-exposure vs. frequency-of-use 
tradeoff would be to do everything but the top formats (JPG, 
PNG, GIF?) in JS. Then for those top formats, try a hybrid 
implementation where all of the core bit-slinging is done with 
C/SIMD/GPU/quantum-entangled cesium ions, but all of the metadata handling 
and other bits are done in JS. I don't know how awkward that is, but 
in general it seems fine to do custom hyperoptimized code as 
long as we're aware that we have to be very, very careful about 
security vulnerabilities in it, and use a safe language for everything 
else.

I don't know how messy that would be, though.
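To make the split concrete, here's a rough sketch of what the JS side of 
such a hybrid PNG decoder might look like (function names here are 
hypothetical, and the "native" kernel is stubbed): all metadata parsing and 
validation happens in the safe language, so only already-validated integers 
and pre-sized buffers ever cross into the optimized core.

```javascript
// Sketch of the hybrid split: metadata parsing in JS, with a narrow,
// fully-validated call into a native "bit-slinging" kernel.
// parseIHDR and decodeScanlines are hypothetical names, not real Gecko APIs.

const PNG_SIGNATURE = [0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a];

function parseIHDR(bytes) {
  // Safe-language side: bounds-checked header parsing.
  for (let i = 0; i < 8; i++) {
    if (bytes[i] !== PNG_SIGNATURE[i]) throw new Error("not a PNG");
  }
  const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);
  const length = view.getUint32(8); // IHDR payload length, big-endian
  const type = String.fromCharCode(bytes[12], bytes[13], bytes[14], bytes[15]);
  if (type !== "IHDR" || length !== 13) throw new Error("bad IHDR chunk");
  const width = view.getUint32(16);
  const height = view.getUint32(20);
  const bitDepth = bytes[24];
  // Validate everything before it crosses into optimized code.
  if (width === 0 || height === 0) throw new Error("bad dimensions");
  return { width, height, bitDepth };
}

// "Native" side (stubbed here): in the real hybrid this would be the
// C/SIMD hot loop. It only ever sees validated metadata and a buffer
// sized by the caller, so the unsafe surface stays tiny.
function decodeScanlines(meta, compressed, out) {
  if (out.length < meta.width * meta.height * 4) {
    throw new Error("output buffer too small");
  }
  // ... tight pixel loop would go here ...
}
```

The point of the sketch is that the attack surface of the unsafe part 
shrinks to one loop over already-sanitized inputs, while all the fiddly, 
historically vulnerability-prone header and chunk parsing stays in JS.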

_______________________________________________
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform