Hi!
There is a way to reduce the size of the source tarballs by 1.4 MB by
optimizing all source image files. Merely running pngcrush would save
12% outright. There is another optimization that can be done, though --
for most images, using full 32-bit RGBA is a waste of space. Walls and
floors don't need an alpha channel at all, and most other images use
only 1-bit alpha and a limited color palette.
The current count:
2851 8-bit/color RGBA, non-interlaced
228 8-bit colormap, non-interlaced
76 8-bit/color RGBA, interlaced
61 8-bit/color RGB, non-interlaced
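(For reference, a tally like the one above can be produced without
external tools by reading each PNG's IHDR chunk, which is always the
first chunk after the signature. A minimal stdlib-only Python sketch --
the directory argument and the exact output wording are illustrative,
not what I actually ran:)

```python
import sys
from collections import Counter
from pathlib import Path

# PNG color types, per the PNG specification.
COLOR_TYPES = {
    0: "grayscale",
    2: "RGB",
    3: "colormap",
    4: "grayscale+alpha",
    6: "RGBA",
}

def png_kind(path):
    """Describe a PNG roughly the way `file` does, from its IHDR chunk."""
    data = Path(path).read_bytes()
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError(f"{path}: not a PNG")
    # Layout: 8-byte signature, 4-byte chunk length, b"IHDR", then
    # width and height (4 bytes each), bit depth, color type,
    # compression, filter, interlace (1 byte each).
    bit_depth, color_type = data[24], data[25]
    interlace = data[28]
    kind = COLOR_TYPES.get(color_type, f"type {color_type}")
    lace = "interlaced" if interlace else "non-interlaced"
    return f"{bit_depth}-bit {kind}, {lace}"

if __name__ == "__main__" and len(sys.argv) > 1:
    counts = Counter(png_kind(p) for p in Path(sys.argv[1]).rglob("*.png"))
    for kind, n in counts.most_common():
        print(f"{n:5} {kind}")
```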
What is needed to carry all the information losslessly:
1412 4-bit colormap, non-interlaced
827 8-bit colormap, non-interlaced
816 8-bit/color RGBA, non-interlaced
93 2-bit colormap, non-interlaced
42 8-bit/color RGB, non-interlaced
19 1-bit colormap, non-interlaced
6 8-bit/color RGBA, interlaced
1 8-bit grayscale, non-interlaced
This would reduce the disk space taken by 38%.
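(The "what is needed" figures boil down to two questions per image: is
the alpha channel actually used, and does the image fit in a 256-color
palette? Below is a stdlib-only sketch that answers both for plain
non-interlaced 8-bit RGBA PNGs -- the function names and the exact
classification buckets are illustrative, and interlaced or
already-paletted files are out of scope:)

```python
import struct
import zlib

def _paeth(a, b, c):
    # Paeth predictor from the PNG specification.
    p = a + b - c
    pa, pb, pc = abs(p - a), abs(p - b), abs(p - c)
    if pa <= pb and pa <= pc:
        return a
    return b if pb <= pc else c

def rgba_pixels(png_bytes):
    """Decode a non-interlaced 8-bit RGBA PNG into a flat bytearray."""
    assert png_bytes[:8] == b"\x89PNG\r\n\x1a\n"
    width, height = struct.unpack(">II", png_bytes[16:24])
    bit_depth, color_type, _, _, interlace = png_bytes[24:29]
    assert (bit_depth, color_type, interlace) == (8, 6, 0), \
        "only plain 8-bit RGBA handled here"
    # Concatenate the IDAT chunks and inflate them.
    idat, pos = b"", 8
    while pos < len(png_bytes):
        length = struct.unpack(">I", png_bytes[pos:pos + 4])[0]
        if png_bytes[pos + 4:pos + 8] == b"IDAT":
            idat += png_bytes[pos + 8:pos + 8 + length]
        pos += 12 + length
    raw = zlib.decompress(idat)
    # Undo per-scanline filtering (4 bytes per pixel).
    bpp, stride = 4, 4 * width
    out, prev = bytearray(), bytearray(stride)
    for y in range(height):
        off = y * (stride + 1)
        ftype = raw[off]
        line = bytearray(raw[off + 1:off + 1 + stride])
        for i in range(stride):
            a = line[i - bpp] if i >= bpp else 0
            b = prev[i]
            c = prev[i - bpp] if i >= bpp else 0
            if ftype == 1:                       # Sub
                line[i] = (line[i] + a) & 0xFF
            elif ftype == 2:                     # Up
                line[i] = (line[i] + b) & 0xFF
            elif ftype == 3:                     # Average
                line[i] = (line[i] + (a + b) // 2) & 0xFF
            elif ftype == 4:                     # Paeth
                line[i] = (line[i] + _paeth(a, b, c)) & 0xFF
        out += line
        prev = line
    return out

def minimal_form(png_bytes):
    """Say what an RGBA image actually needs to be stored losslessly."""
    px = rgba_pixels(png_bytes)
    alphas = set(px[3::4])
    colors = {bytes(px[i:i + 4]) for i in range(0, len(px), 4)}
    if len(colors) <= 256:
        return "colormap"            # a palette covers it, alpha or not
    if alphas == {255}:
        return "RGB"                 # alpha channel is all-opaque
    return "RGBA (binary alpha)" if alphas <= {0, 255} else "RGBA"
```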
However, this would apply only to your unpacked crawl/ dir and to source
tarballs, not to git (shallow checkouts aside): since the history has to
be kept, .git disk usage would actually increase, because both the old
and the new version of every image need to be stored -- you can't modify
old versions without rewriting history.
So, do we want to reduce the source at the cost of bigger .git dirs?
--
1KB // Microsoft corollary to Hanlon's razor:
// Never attribute to stupidity what can be
// adequately explained by malice.
_______________________________________________
Crawl-ref-discuss mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/crawl-ref-discuss