Apparently Facebook has been experimenting with sending WebP images in
place of JPEG to some users... but there are some usability problems
reported in the tech press:
http://arstechnica.com/information-technology/2013/04/chicken-meets-egg-with-facebook-chrome-webp-support/
Primary problem seems to be that users save or share the WebP images and then find other applications can't open them.
I vaguely recall the first time I came across one of these newfangled PNG
images that no viewer would display. What's wrong with GIF??? That will
never stand!
(yes, I know plenty is wrong with GIF, especially in those days...)
Pixelmator on Mac already opens WebP fine; I didn't test saving, though.
On Tue, Apr 23, 2013 at 7:45 PM, Brion Vibber br...@pobox.com wrote:
Not sure what's a good solution for this, other than a really good
download/sharing UI on images...
Maybe the most obvious solution would be for Mozilla and Microsoft and
the usual bunch of image editors to start supporting the format.
Hi everyone,
Magnus very kindly implemented an idea that consists of two parts:
1. use of WebP (http://en.wikipedia.org/wiki/WebP) instead of PNG/JPEG
for thumbnails in Wikipedia articles
2. use of Data-URIs (https://en.wikipedia.org/wiki/Data_URI_scheme) to
include those thumbnails inline.
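For the curious, the data-URI part amounts to base64-encoding the thumbnail bytes into the img src attribute. A minimal sketch of the idea (the byte string below is a stand-in, not a real WebP file, and the helper name is my own):

```python
import base64

def to_data_uri(data: bytes, mime: str = "image/webp") -> str:
    # RFC 2397 data URI: "data:<mime>;base64,<encoded bytes>"
    return "data:%s;base64,%s" % (mime, base64.b64encode(data).decode("ascii"))

# "RIFF....WEBP" is the WebP container signature; these are stand-in bytes
fake_webp = b"RIFF\x1a\x00\x00\x00WEBPVP8 "
img_tag = '<img src="%s" alt="thumbnail">' % to_data_uri(fake_webp)
```

Note that base64 inflates the payload by roughly a third, which is part of why inlined pages get bigger (gzip on the HTML claws some of that back, since base64 text compresses).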
That looks like a cool idea.
I am experimenting with it on a few pages, and it seems to considerably
reduce the number of web requests (for
https://en.wikipedia.org/wiki/Vincent_van_Gogh it goes from 120 to under 40
requests).
But the pages get quite a bit bigger, obviously. Also, it introduces a caching issue for images...
On Apr 22, 2013 9:18 AM, Denny Vrandečić denny.vrande...@wikimedia.de
wrote:
But the pages get quite bigger, obviously. Also, it introduces a caching
issue for images...
That's why we have https://bugzilla.wikimedia.org/32618 :-)
Still, pretty cool.
Yeah… makes me wonder what the status of that bug is.
On Mon, Apr 22, 2013 at 3:17 PM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
That looks like a cool idea.
I am trying to experiment it on a few pages, and it seems to considerably
reduce the number of web requests (for
https://en.wikipedia.org/wiki/Vincent_van_Gogh it goes from 120 to under 40 requests).
Looks good. Makes me wish H.264 wasn't so loaded down with patents. Then
we'd maybe have an even better codec.
--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerro...@gmail.com
On Mon, Apr 22, 2013 at 9:38 AM, Magnus Manske wrote:
On Mon, Apr 22, 2013 at 6:30 AM, Mathias Schindler
mathias.schind...@gmail.com wrote:
Which one is larger (given an empty cache)? One single big file
containing the thumbnails, or one small HTML file and individual
thumbnail files (including the possible overhead in the TCP/IP
packets for each request)?
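Rough arithmetic for the question above: base64 inflates each inlined thumbnail by a factor of 4/3, while a separate fetch costs a roughly fixed amount of header overhead per request. The 500-byte per-request figure below is an illustrative assumption, not a measurement:

```python
def inline_size(thumb_bytes: int) -> float:
    # base64 expands every 3 raw bytes into 4 ASCII characters
    return thumb_bytes * 4 / 3

def separate_size(thumb_bytes: int, per_request_overhead: int = 500) -> float:
    # raw bytes plus assumed HTTP/TCP overhead for the extra request
    return thumb_bytes + per_request_overhead

# Inlining wins while the base64 penalty (thumb/3) stays below the
# per-request overhead, i.e. for thumbnails under ~1500 bytes here.
small, large = 900, 30000
assert inline_size(small) < separate_size(small)
assert inline_size(large) > separate_size(large)
```

So under these assumptions, inlining pays off only for small thumbnails; compression and connection reuse shift the break-even point in practice.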
On Mon, Apr 22, 2013 at 8:27 PM, Brion Vibber br...@pobox.com wrote:
Another possibility is to preferably load images on view/section expansion
via JavaScript, which can potentially give you a chance to query the format
compatibility in client JS and avoid any HTTP-level negotiation. (And also …)
Clever... that technique (loading a data URI of a small file and making
sure it works) should work with other formats too. I smell an avenue for
SVG support here too... ;)
-- brion
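For reference, the HTTP-level negotiation alluded to above would look something like this on the server side: WebP-capable browsers (e.g. Chrome) advertise image/webp in the Accept request header, and the server picks the variant accordingly. A sketch of the general technique, not how MediaWiki actually does it:

```python
def pick_thumb_format(accept_header: str) -> str:
    # Browsers that can decode WebP advertise it explicitly, e.g.
    # "Accept: image/webp,*/*;q=0.8"; everything can handle JPEG.
    if "image/webp" in accept_header:
        return "webp"
    return "jpeg"

assert pick_thumb_format("image/webp,*/*;q=0.8") == "webp"
```

The catch is that such responses need Vary: Accept, which multiplies cache entries per URL; that is essentially the caching issue mentioned earlier in the thread.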