On 8/19/2021 10:07 PM, Aditya Mahajan via ntg-context wrote:
On Thu, 19 Aug 2021, Hans Hagen via ntg-context wrote:

Hi,

Here are the highlights of today's update:

- somewhat more compact tuc files, not for all documents, but it can
accumulate; also less memory used then; i could bring an extreme
2000 page 5 column doc tuc file down to 5% of its size -- it was 70 MB;
for the luametatex manual it reduced the tuc by more than 30%; hard to
tell if there will be an associated performance hit, but i'm sure thomas
will complain if that's the case

I never realized that tuc files can grow so big. For big documents, would it 
make sense to simply read and write zipped tuc files?

normally they are not that large, but when you enable for instance mechanisms that need positioning they can grow large .. zipping makes for fewer bytes on disk, but the content stays large and the overhead of serialization remains
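
A rough illustration of that tradeoff (a standalone Python sketch, not ConTeXt code; the data and field names are made up, and the real tuc content is a serialized Lua table rather than JSON): compressing the serialized output shrinks what hits the disk, but the work of turning the in-memory table into text, and back, is paid either way.

import gzip
import json
import time

# fabricated stand-in for a large multipass table (e.g. many position
# entries); only meant to show relative sizes and costs
data = {
    "positions": [
        {"page": p, "x": p * 0.37, "y": p * 1.41, "tag": "item:%d" % p}
        for p in range(100000)
    ]
}

t0 = time.perf_counter()
serialized = json.dumps(data)          # serialization cost, paid with or without zipping
t1 = time.perf_counter()
compressed = gzip.compress(serialized.encode("utf-8"))
t2 = time.perf_counter()

print("plain size     : %.1f MB" % (len(serialized) / 1e6))
print("gzipped size   : %.1f MB" % (len(compressed) / 1e6))
print("serialize time : %.2f s" % (t1 - t0))
print("compress time  : %.2f s" % (t2 - t1))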

(to some extent, trying to make these things small is like compression but in a different way .. could be a nice topic for a ctx meeting)
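
A generic sketch of what "small without zipping" can mean (again Python, and explicitly not how the tuc serializer actually works): describing the field names once and storing plain value rows, instead of repeating keys in every record, already cuts the serialized size with no compression pass at all.

import json

# hypothetical per-record dicts (verbose form); layout is invented for the example
records = [{"page": p, "x": 1.0 * p, "y": 2.0 * p} for p in range(10000)]

# more compact form: state the field names once, store plain value rows
compact = {
    "fields": ["page", "x", "y"],
    "rows": [[r["page"], r["x"], r["y"]] for r in records],
}

print("verbose :", len(json.dumps(records)))   # repeated keys in every record
print("compact :", len(json.dumps(compact)))   # same information, fewer bytes, no zip pass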

Hans

-----------------------------------------------------------------
                                          Hans Hagen | PRAGMA ADE
              Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
       tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
-----------------------------------------------------------------
