On Fri, 17 Apr 2020 at 21:11, Hans Hagen wrote:
> On 4/17/2020 4:37 PM, Mojca Miklavec wrote:
>
> > An interesting statistic: I used a bunch of images (the same PNG
> > images in all documents; ca. 290k in total).
>
> It can actually make a difference what kind of PNG image you use. Some
> PNG images demand a conversion (or a split of the image map, etc.) to a
> format supported by PDF. Converting the PNGs to PDF and including those
> is often faster.

Thanks for the hint. I tested it, but it hardly makes any difference.
I had to make another batch for the archive (creating a single
document with 4k+ pages), and the full process ran in 10 minutes
(compared to ca. 2.5 hours for creating the individual documents). As
a test run I completely **removed** all the images, and that only
gained some 10 or 20 seconds. So the biggest overhead still seems to
be warming up the machinery (which includes my own share of overhead
for reading in the 1.3 MB Lua table with all the data entries);
Taco's hint of using an external tool for splicing would probably
have scored best :)
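
For what it's worth, the Lua-table overhead mentioned above can be kept
to a single read per run with something like the following sketch (the
file name entries.lua and the table layout are my assumptions, not the
actual setup):

```tex
\startluacode
-- Load the (assumed) 1.3 MB data file once per run and cache it in the
-- userdata namespace, so later calls don't re-read or re-parse it.
userdata = userdata or { }
if not userdata.entries then
    userdata.entries = dofile("entries.lua")
end
\stopluacode
```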

I should add that I'm extremely happy about the resource reuse
(mostly of images). As I mentioned before, the individual documents
were 1.5 GB in total, and badly written software would have created
an equally bad cumulative PDF, while ConTeXt generates a mere 17 MB
file with 4k+ pages. It's really impressive.
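
The reuse can be seen in a minimal example (file name and dimensions
are placeholders): registering a figure once and placing it on every
page still embeds the image object only once in the resulting PDF.

```tex
% Hypothetical sketch: 4000 pages, one shared image object.
\useexternalfigure[logo][mylogo.pdf][width=2cm]
\starttext
\dorecurse{4000}{\externalfigure[logo]\page}
\stoptext
```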

Mojca
___________________________________________________________________________________
If your question is of interest to others as well, please add an entry to the 
Wiki!

maillist : ntg-context@ntg.nl / http://www.ntg.nl/mailman/listinfo/ntg-context
webpage  : http://www.pragma-ade.nl / http://context.aanhet.net
archive  : https://bitbucket.org/phg/context-mirror/commits/
wiki     : http://contextgarden.net
___________________________________________________________________________________
