Re: lilypond-book runs out of memory: large file compilation fails

2008-07-13 Thread Mats Bengtsson

Have you tried just rerunning the lilypond-book command
again after the crash? At least this used to work as a workaround
a couple of years back, since lilypond-book skips the snippets
it already processed in the first run.
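
For instance, a minimal retry loop (a sketch only: it assumes the snippets
rendered before the crash are kept and reused on the next run, and the file
names and retry limit are made up):

    import subprocess

    # Rerun lilypond-book until it exits cleanly; snippets already
    # rendered by an earlier, crashed pass do not have to be redone.
    cmd = ["lilypond-book", "--output=out", "large-document.lytex"]
    for attempt in range(10):            # arbitrary safety limit
        if subprocess.run(cmd).returncode == 0:
            break                        # finished cleanly
    else:
        raise RuntimeError("lilypond-book never completed")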

   /Mats

Valentin Villenave wrote:

> 2008/7/2 Nikita <[EMAIL PROTECTED]>:
>
>> The problem is that the lilypond-book script tries to dissect the whole file
>> in one pass and runs out of memory if the file is very large. The solution
>> that worked for me was to split the file and process it in parts. That could
>> probably be coded within lilypond-book itself. The latest version I tried,
>> 2.11.49, suffered from the same problem.
>
> Thanks, added to our bug tracker as
> http://code.google.com/p/lilypond/issues/detail?id=644
>
> Cheers,
> Valentin



--
===========================================
Mats Bengtsson
Signal Processing
Signals, Sensors and Systems
Royal Institute of Technology
SE-100 44  STOCKHOLM
Sweden
Phone: (+46) 8 790 8463
Fax:   (+46) 8 790 7260
Email: [EMAIL PROTECTED]
WWW:   http://www.s3.kth.se/~mabe
===========================================





Re: lilypond-book runs out of memory: large file compilation fails

2008-07-12 Thread Valentin Villenave
2008/7/2 Nikita <[EMAIL PROTECTED]>:

> The problem is that the lilypond-book script tries to dissect the whole file
> in one pass and runs out of memory if the file is very large. The solution
> that worked for me was to split the file and process it in parts. That could
> probably be coded within lilypond-book itself. The latest version I tried,
> 2.11.49, suffered from the same problem.

Thanks, added to our bug tracker as
http://code.google.com/p/lilypond/issues/detail?id=644


Cheers,
Valentin




Re: lilypond-book runs out of memory: large file compilation fails

2008-07-02 Thread Nikita
Nikita <[EMAIL PROTECTED]> writes:

> 
> I have a TeX document with a large number of lilypond snippets, say several
> thousand. When I run lilypond-book 2.10.25 on it I get the following error
> while dissecting:
> 
> Dissecting...Traceback (most recent call last):
>   File "/usr/bin/lilypond-book", line 1879, in <module>
> main ()
>   File "/usr/bin/lilypond-book", line 1845, in main
> chunks = do_file (file)
>   File "/usr/bin/lilypond-book", line 1723, in do_file
> chunks = find_toplevel_snippets (source, snippet_types)
>   File "/usr/bin/lilypond-book", line 1355, in find_toplevel_snippets
> m = res[type].search (s[index:endex])
> MemoryError
> 
> My OS is Ubuntu with 4G of memory. When I watch `top` during compilation I
> see the free memory going down, and in 5 minutes I get the error above. For
> small files everything works fine.
> 

Greetings,

This problem arises on 32-bit systems with extremely large files: a 32-bit
process can't address more than 4G of memory, so playing around with swap
gives no solution. It seems that on amd64 it can be solved by providing
enough swap space on disk.

The problem is that the lilypond-book script tries to dissect the whole file
in one pass and runs out of memory if the file is very large. The solution
that worked for me was to split the file and process it in parts. That could
probably be coded within lilypond-book itself. The latest version I tried,
2.11.49, suffered from the same problem.
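
A minimal sketch of that split-and-process workaround (illustrative only: it
assumes blank lines never fall inside a lilypond snippet, and the file names
and chunk size are made up):

    import subprocess

    def split_at_blank_lines(path, max_lines=2000):
        # Cut the source only at blank lines so that, under the stated
        # assumption, no lilypond block is broken across two chunks.
        chunk = []
        with open(path) as src:
            for line in src:
                chunk.append(line)
                if len(chunk) >= max_lines and line.strip() == "":
                    yield "".join(chunk)
                    chunk = []
        if chunk:
            yield "".join(chunk)

    for i, text in enumerate(split_at_blank_lines("big-document.lytex")):
        part = "part-%03d.lytex" % i
        with open(part, "w") as out:
            out.write(text)
        # Each part is small enough for a single dissecting pass.
        subprocess.check_call(["lilypond-book", "--output=out", part])

The resulting part-*.tex files then have to be stitched back together with
\input by hand. As for coding it inside lilypond-book itself: the line that
dies, m = res[type].search (s[index:endex]), copies the slice on every call.
A compiled pattern accepts pos and endpos arguments, so a change along these
lines (a guess at a fix, not an actual patch) would search the same region
without the per-snippet copy:

    # Hypothetical replacement around lilypond-book line 1355:
    m = res[type].search (s, index, endex)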

Best,
Nikita.




_______________________________________________
bug-lilypond mailing list
bug-lilypond@gnu.org
http://lists.gnu.org/mailman/listinfo/bug-lilypond