Re: FOP 1.1 out of memory

2012-10-26 Thread Glenn Adams
On Thu, Oct 25, 2012 at 6:11 PM, Stefan Hinz stefan.h...@oracle.com wrote:

  Thanks, Glenn, and thanks, Luis!

 A combination of -nocs and -Xmx4096m made the build succeed. The resulting
 PDF is 5742 pages, which might explain the issues I had. My fop wrapper
 script looks like this:

 #!/bin/sh
 export FOP_OPTS=-Xmx4096m
 exec /usr/bin/fop -nocs "$@"

 BTW, is the -nocs option new to FOP 1.1? It doesn't seem to work with FOP
 1.0.


yes, complex script (CS) support is new in 1.1


Re: FOP 1.1 out of memory

2012-10-26 Thread Stefan Hinz

Thanks for confirming, Glenn!

On 26.10.2012 10:59, Glenn Adams wrote:


On Thu, Oct 25, 2012 at 6:11 PM, Stefan Hinz stefan.h...@oracle.com wrote:

Thanks, Glenn, and thanks, Luis!

A combination of -nocs and -Xmx4096m made the build succeed. The
resulting PDF is 5742 pages, which might explain the issues I had.
My fop wrapper script looks like this:

#!/bin/sh
export FOP_OPTS=-Xmx4096m
exec /usr/bin/fop -nocs "$@"

BTW, is the -nocs option new to FOP 1.1? It doesn't seem to work
with FOP 1.0.


yes, complex script (CS) support is new in 1.1




--
Cheers,

Stefan Hinz stefan.h...@oracle.com, MySQL Documentation Manager

Phone: +49-30-82702940, Fax: +49-30-82702941, http://dev.mysql.com/doc

ORACLE Deutschland B.V. & Co. KG
Registered Office: Riesstr. 25, 80992 Muenchen, Germany
Commercial Register: Local Court Of Munich, HRA 95603
Managing Director: Jürgen Kunz

General Partner: ORACLE Deutschland Verwaltung B.V.
Hertogswetering 163/167, 3543 AS Utrecht, Niederlande
Register Of Chamber Of Commerce: Midden-Niederlande, No. 30143697
Managing Directors: Alexander van der Ven, Astrid Kepper, Val Maher

-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org



Re: FOP 1.1 out of memory

2012-10-25 Thread Stefan Hinz

  
  
Thanks, Glenn, and thanks, Luis!

A combination of -nocs and -Xmx4096m made the build succeed. The
resulting PDF is 5742 pages, which might explain the issues I had.
My fop wrapper script looks like this:

#!/bin/sh
export FOP_OPTS=-Xmx4096m
exec /usr/bin/fop -nocs "$@"

BTW, is the -nocs option new to FOP 1.1? It doesn't seem to work
with FOP 1.0.

Again, thanks, and have a good day!

Cheers,

Stefan
On 25.10.2012 00:44, Glenn Adams wrote:

You might also try disabling complex script support, about which
see [1], if you don't require use of the complex text path.
Somewhat greater memory may be consumed when CS is enabled (which
it is by default in 1.1).

[1] http://xmlgraphics.apache.org/fop/1.1/complexscripts.html#Disabling+complex+scripts

On Thu, Oct 25, 2012 at 6:03 AM, Luis Bernardo lmpmberna...@gmail.com
wrote:
  

You can control how much memory the jvm uses by using the
-Xmx flags. So I think you can try that first.

The only situation where I know that FOP runs out of memory
(also in a machine with 8 GB) is when you have a very long
paragraph (and I mean a paragraph with 200K+ words). Then
the line breaking algorithm has to hold the full paragraph
in memory and decide on the optimal break points and that
very likely will use all available memory. Do you know if
you have anything like that in your book? Having thousands
of pages by itself should not be an issue. It really depends
on the content and on how the line breaks happen (if you
insert line breaks FOP uses a lot less memory than if you
don't).

  
  
  On 10/24/12 7:49 PM, Stefan Hinz wrote:
  
I like FOP 1.1 a lot: Unlike previous versions, it tells
you which page it's processing, which can make debugging
easier, and also it gives you that warm fuzzy feeling
that you're somewhat in control of things. :-)

However, with really big books, I'm hitting a wall, like
this:

Okt 24, 2012 8:21:16 PM org.apache.fop.events.LoggingEventListener
processEvent
INFO: Rendered page #2630.
Exception in thread "main" java.lang.OutOfMemoryError:
GC overhead limit exceeded

That is, FOP stops at about 2600 pages with an out of
memory error. On Stackoverflow (http://preview.tinyurl.com/94qute5),
there's an indication why this happens:

"This message means that for some reason the garbage
collector is taking an excessive amount of time (by
default 98% of all CPU time of the process) and recovers
very little memory in each run (by default 2% of the
heap).
This effectively means that your program stops doing any
progress and is busy running only the garbage collection
at all time."

Does this mean it's a FOP 1.1 bug, or would there be
anything I could do to give it/Java more memory and
prevent it from failing?

The error happens on an 8 GB RAM 4-core machine. At the
time FOP ended, there was like 2 GB RAM left.

  
  
  


  
-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org

  

  


  




Re: FOP 1.1 out of memory

2012-10-25 Thread Stefan Hinz

  
  
Hi Mehdi,

  As for your FO, you need to provide a little more
  information:

  1) Tables in the FO tend to consume quite a lot of memory,
  especially when they span multiple pages, are you using them for
  laying out the documents?

No, but we have many long tables in that huge (5,742 pages) MySQL
5.5 Manual. Some of them have overflow issues (table columns
overlapping), which might add to the problem I was facing.


  2) Fonts take up more room than they really should; are you
using custom fonts? Are you embedding fonts?

No, nothing.

  
  
  Those are the usual culprits that come to mind (though others
may wish to add to them), if you can upload the FO, then someone
will take a look at it.

Should I still do that? As said in the previous reply to Luis and
Glenn, I was able to overcome the build failure with -Xmx4096m and
-nocs. Maybe, as a rule of thumb, give FOP 1 GB per 1000 pages? :-) 
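[Editorial sketch: that back-of-the-envelope guess can be written down as a tiny helper. heap_for_pages is a hypothetical name, and the ratio is only Stefan's rough observation from this one build, not an official FOP sizing rule.]

```shell
# Encode the rough "1 GB per 1000 pages" guess above: round the page
# count up to the next thousand and emit the matching -Xmx flag.
heap_for_pages() {
    pages=$1
    gb=$(( (pages + 999) / 1000 ))
    echo "-Xmx${gb}g"
}

heap_for_pages 5742   # prints -Xmx6g
```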

Thanks again for FOP 1.1 -- it seems to be a heck of a lot quicker
than FOP 1.0!

Cheers,

Stefan


  
  
  Hope that helps,

  Mehdi

On 24 October 2012 19:49, Stefan Hinz stefan.h...@oracle.com wrote:

I like FOP 1.1 a lot: Unlike previous versions, it tells you
which page it's processing, which can make debugging
easier, and also it gives you that warm fuzzy feeling
that you're somewhat in control of things. :-)

However, with really big books, I'm hitting a wall, like
this:

Okt 24, 2012 8:21:16 PM org.apache.fop.events.LoggingEventListener
processEvent
INFO: Rendered page #2630.
Exception in thread "main" java.lang.OutOfMemoryError:
GC overhead limit exceeded

That is, FOP stops at about 2600 pages with an out of
memory error. On Stackoverflow (http://preview.tinyurl.com/94qute5),
there's an indication why this happens:

"This message means that for some reason the garbage
collector is taking an excessive amount of time (by
default 98% of all CPU time of the process) and recovers
very little memory in each run (by default 2% of the
heap).
This effectively means that your program stops doing any
progress and is busy running only the garbage collection
at all time."

Does this mean it's a FOP 1.1 bug, or would there be
anything I could do to give it/Java more memory and
prevent it from failing?

The error happens on an 8 GB RAM 4-core machine. At the
time FOP ended, there was like 2 GB RAM left.


  


  

  




Re: FOP 1.1 out of memory

2012-10-24 Thread mehdi houshmand
Hi Stefan,

Thanks for your support, people have worked hard to give you that warm
fuzzy feeling, so it's nice to know that work is appreciated.

As for your FO, you need to provide a little more information:

1) Tables in the FO tend to consume quite a lot of memory, especially when
they span multiple pages, are you using them for laying out the documents?

2) Fonts take up more room than they really should; are you using custom
fonts? Are you embedding fonts?

Those are the usual culprits that come to mind (though others may wish to
add to them), if you can upload the FO, then someone will take a look at it.

Hope that helps,

Mehdi

On 24 October 2012 19:49, Stefan Hinz stefan.h...@oracle.com wrote:

 I like FOP 1.1 a lot: Unlike previous versions, it tells you which page
 it's processing, which can make debugging easier, and also it gives you
 that warm fuzzy feeling that you're somewhat in control of things. :-)

 However, with really big books, I'm hitting a wall, like this:

 Okt 24, 2012 8:21:16 PM org.apache.fop.events.LoggingEventListener
 processEvent
 INFO: Rendered page #2630.
 Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit
 exceeded

 That is, FOP stops at about 2600 pages with an out of memory error. On
 Stackoverflow (http://preview.tinyurl.com/94qute5), there's an
 indication why this happens:

 "This message means that for some reason the garbage collector is taking
 an excessive amount of time (by default 98% of all CPU time of the process)
 and recovers very little memory in each run (by default 2% of the heap).
 This effectively means that your program stops doing any progress and is
 busy running only the garbage collection at all time."

 Does this mean it's a FOP 1.1 bug, or would there be anything I could do
 to give it/Java more memory and prevent it from failing?

 The error happens on an 8 GB RAM 4-core machine. At the time FOP ended,
 there was like 2 GB RAM left.






Re: FOP 1.1 out of memory

2012-10-24 Thread Luis Bernardo


You can control how much memory the jvm uses by using the -Xmx flags. So 
I think you can try that first.


The only situation where I know that FOP runs out of memory (also in a 
machine with 8 GB) is when you have a very long paragraph (and I mean a 
paragraph with 200K+ words). Then the line breaking algorithm has to 
hold the full paragraph in memory and decide on the optimal break points 
and that very likely will use all available memory. Do you know if you 
have anything like that in your book? Having thousands of pages by 
itself should not be an issue. It really depends on the content and on 
how the line breaks happen (if you insert line breaks FOP uses a lot 
less memory than if you don't).
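[Editorial sketch: one quick way to screen for the very-long-paragraph case Luis describes. This assumes plain text where paragraphs are separated by blank lines; a real check on .fo input would have to look inside fo:block elements with an XML-aware tool. count_long_paragraphs is an illustrative name.]

```shell
# Report any blank-line-separated paragraph whose word count exceeds
# a threshold, using awk's paragraph mode (RS set to empty string).
count_long_paragraphs() {
    # $1 = file, $2 = word-count threshold
    awk -v limit="$2" '
        BEGIN { RS = ""; FS = "[ \t\n]+" }
        NF > limit { printf "paragraph %d: %d words\n", NR, NF }
    ' "$1"
}

# Tiny demonstration with a low threshold:
printf 'short one.\n\nthis paragraph has five words\n' > /tmp/sample.txt
count_long_paragraphs /tmp/sample.txt 3   # flags paragraph 2 (5 words)
```

For the real case one would run it against the book's text with a threshold on the order of 200000 words.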


On 10/24/12 7:49 PM, Stefan Hinz wrote:
I like FOP 1.1 a lot: Unlike previous versions, it tells you which 
page it's processing, which can make debugging easier, and also it 
gives you that warm fuzzy feeling that you're somewhat in control of 
things. :-)


However, with really big books, I'm hitting a wall, like this:

Okt 24, 2012 8:21:16 PM org.apache.fop.events.LoggingEventListener 
processEvent

INFO: Rendered page #2630.
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead
limit exceeded


That is, FOP stops at about 2600 pages with an out of memory error. On 
Stackoverflow (http://preview.tinyurl.com/94qute5), there's an 
indication why this happens:


"This message means that for some reason the garbage collector is
taking an excessive amount of time (by default 98% of all CPU time of
the process) and recovers very little memory in each run (by default
2% of the heap).
This effectively means that your program stops doing any progress and
is busy running only the garbage collection at all time."


Does this mean it's a FOP 1.1 bug, or would there be anything I could 
do to give it/Java more memory and prevent it from failing?


The error happens on an 8 GB RAM 4-core machine. At the time FOP 
ended, there was like 2 GB RAM left.





-
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org



Re: FOP 1.1 out of memory

2012-10-24 Thread Glenn Adams
You might also try disabling complex script support, about which see [1],
if you don't require use of the complex text path. Somewhat greater memory
may be consumed when CS is enabled (which it is by default in 1.1).

[1]
http://xmlgraphics.apache.org/fop/1.1/complexscripts.html#Disabling+complex+scripts

On Thu, Oct 25, 2012 at 6:03 AM, Luis Bernardo lmpmberna...@gmail.com wrote:


 You can control how much memory the jvm uses by using the -Xmx flags. So I
 think you can try that first.

 The only situation where I know that FOP runs out of memory (also in a
 machine with 8 GB) is when you have a very long paragraph (and I mean a
 paragraph with 200K+ words). Then the line breaking algorithm has to hold
 the full paragraph in memory and decide on the optimal break points and
 that very likely will use all available memory. Do you know if you have
 anything like that in your book? Having thousands of pages by itself should
 not be an issue. It really depends on the content and on how the line
 breaks happen (if you insert line breaks FOP uses a lot less memory than if
 you don't).


 On 10/24/12 7:49 PM, Stefan Hinz wrote:

 I like FOP 1.1 a lot: Unlike previous versions, it tells you which page
 it's processing, which can make debugging easier, and also it gives you
 that warm fuzzy feeling that you're somewhat in control of things. :-)

 However, with really big books, I'm hitting a wall, like this:

 Okt 24, 2012 8:21:16 PM org.apache.fop.events.LoggingEventListener
 processEvent
 INFO: Rendered page #2630.
 Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit
 exceeded

 That is, FOP stops at about 2600 pages with an out of memory error. On
 Stackoverflow (http://preview.tinyurl.com/94qute5), there's an
 indication why this happens:

 "This message means that for some reason the garbage collector is taking
 an excessive amount of time (by default 98% of all CPU time of the process)
 and recovers very little memory in each run (by default 2% of the heap).
 This effectively means that your program stops doing any progress and is
 busy running only the garbage collection at all time."

 Does this mean it's a FOP 1.1 bug, or would there be anything I could do
 to give it/Java more memory and prevent it from failing?

 The error happens on an 8 GB RAM 4-core machine. At the time FOP ended,
 there was like 2 GB RAM left.


