RE: Out of Memory Error from Java

2009-01-26 Thread Steffanina, Jeff
David,
Thanks for the additional input.  As your excellent article suggests,
the solution is not as easy to implement as the concept implies.

Specifically, my region-body is primarily a single four-column table
that is eventually followed by some summary text.  When the table
contains more than 4,500 lines I get the out-of-memory error.
Currently, the entire invoice is one page-sequence.  Your approach
appears to solve that problem and will keep the paging intact.  My
problem is finding a way to create multiple page-sequences inside a
single table without ruining the spacing.

I am working on that now.

Thanks..Any additional thoughts/examples would be greatly
appreciated.
Here is my code. [The XSL-FO markup that was here, covering LINE 64 through
LINE 479, was stripped by the mailing-list archive; only the markers
"LINE 64 =", "LINE 335 =", "LINE 479 =" and a stray attribute value "false"
survive.]

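Since the original markup did not survive the archive, here is a minimal hypothetical sketch of the approach being attempted: an XSLT 1.0 template that chunks the invoice lines into groups of 1,000 and emits one fo:page-sequence per group, repeating the fo:table-column definitions in each chunk so the four columns stay aligned across sequences. All names (invoice, line, invoice-page) are invented for illustration and are not taken from the original stylesheet.

```xml
<!-- Hypothetical sketch, not the original code: one fo:page-sequence per
     1,000 invoice lines. Each chunk repeats the same column definitions,
     which keeps column widths (and hence spacing) identical across all
     generated page sequences. -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:fo="http://www.w3.org/1999/XSL/Format">

  <xsl:template match="invoice">
    <!-- Every 1,000th line starts a new chunk -->
    <xsl:for-each select="line[position() mod 1000 = 1]">
      <fo:page-sequence master-reference="invoice-page">
        <fo:flow flow-name="xsl-region-body">
          <fo:table table-layout="fixed" width="100%">
            <fo:table-column column-width="proportional-column-width(2)"/>
            <fo:table-column column-width="proportional-column-width(1)"/>
            <fo:table-column column-width="proportional-column-width(1)"/>
            <fo:table-column column-width="proportional-column-width(1)"/>
            <fo:table-body>
              <!-- The current line plus the next 999 sibling lines -->
              <xsl:apply-templates select=".
                  | following-sibling::line[position() &lt; 1000]"/>
            </fo:table-body>
          </fo:table>
        </fo:flow>
      </fo:page-sequence>
    </xsl:for-each>
  </xsl:template>

</xsl:stylesheet>
```

The summary text after the table would go into one final page-sequence of its own. The visible cost of this approach is that each chunk is a separate table, so keep-together constraints cannot span two chunks.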
Jeff 


-Original Message-
From: DavidJKelly [mailto:dke...@scriptorium.com] 
Sent: Friday, January 23, 2009 8:24 AM
To: fop-users@xmlgraphics.apache.org
Subject: RE: Out of Memory Error from Java


At the risk of providing a redundant answer, I would also like to point out a
paper I wrote on a specific method for creating multiple page sequences in
the DITA Open Toolkit.  It provides code examples and explanations for some
of the "gotchas" in using multiple page sequences.  There may be information
in this approach that you or others in this community would find helpful.

http://www.scriptorium.com/whitepapers/xslfo/index.html

Regards,
David Kelly



Steffanina, Jeff wrote:
> 
> 
> I read the article that you referenced below.   It suggests using
> multiple page sequences.  Currently, I have only one.
> 
> I will make that change and test it.
> 
> THANKS for the info!
> 
> 
> 
> Jeff 
> 
> 
> 
> 
> 
> -Original Message-
> From: Chris Bowditch [mailto:bowditch_ch...@hotmail.com] 
> Sent: Thursday, January 22, 2009 4:11 AM
> To: fop-users@xmlgraphics.apache.org
> Subject: Re: Out of Memory Error from Java
> 
> Steffanina, Jeff wrote:
> 
>> I have:
>> FOP 0.95
>> Linux
>> Java 1.5
>> My Java starts with memory set as:  -Xmx192m
>> 
>> I can print an unlimited number of individual invoices in a single Java
>> process (i.e. the user chooses the option to print).  For example, print
>> 2,000 invoices where each invoice contains 100 lines of details.
>> Therefore, print a total of 200,000 lines of detail.
>> 
>> However, I receive an "Out of Memory" error from Java when I attempt to
>> print a single invoice that contains more than 4,500 lines of detail.
> 
> Have you read:
> http://xmlgraphics.apache.org/fop/0.94/running.html#memory
> 
> Specifically the part about breaking up the FO into multiple page
> sequences.
> 
>> 
>> Other than continuing to increase the amount of memory at startup, is
>> there something I can do to prevent the error when printing a single,
>> large invoice?
> 
> Regards,
> 
> Chris
> 
> 
> 
> -
> To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
> For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org
> 
> 
> 
> 
> 

-- 
View this message in context:
http://www.nabble.com/Out-of-Memory-Error-from-Java-tp21591957p21624377.html
Sent from the FOP - Users mailing list archive at Nabble.com.







RE: Out of Memory Error from Java

2009-01-23 Thread DavidJKelly

At the risk of providing a redundant answer, I would also like to point out a
paper I wrote on a specific method for creating multiple page sequences in
the DITA Open Toolkit.  It provides code examples and explanations for some
of the "gotchas" in using multiple page sequences.  There may be information
in this approach that you or others in this community would find helpful.

http://www.scriptorium.com/whitepapers/xslfo/index.html

Regards,
David Kelly



Steffanina, Jeff wrote:
> 
> 
> I read the article that you referenced below.   It suggests using
> multiple page sequences.  Currently, I have only one.
> 
> I will make that change and test it.
> 
> THANKS for the info!
> 
> 
> 
> Jeff 
> 
> 
> 
> 
> 
> -Original Message-
> From: Chris Bowditch [mailto:bowditch_ch...@hotmail.com] 
> Sent: Thursday, January 22, 2009 4:11 AM
> To: fop-users@xmlgraphics.apache.org
> Subject: Re: Out of Memory Error from Java
> 
> Steffanina, Jeff wrote:
> 
>> I have:
>> FOP 0.95
>> Linux
>> Java 1.5
>> My Java starts with memory set as:  -Xmx192m
>> 
>> I can print an unlimited number of individual invoices in a single Java
>> process (i.e. the user chooses the option to print).  For example, print
>> 2,000 invoices where each invoice contains 100 lines of details.
>> Therefore, print a total of 200,000 lines of detail.
>> 
>> However, I receive an "Out of Memory" error from Java when I attempt to
>> print a single invoice that contains more than 4,500 lines of detail.
> 
> Have you read:
> http://xmlgraphics.apache.org/fop/0.94/running.html#memory
> 
> Specifically the part about breaking up the FO into multiple page
> sequences.
> 
>> 
>> Other than continuing to increase the amount of memory at startup, is 
>> there something I can do to prevent the error when printing a single, 
>> large invoice? 
> 
> Regards,
> 
> Chris
> 
> 
> 
> 
> 
> 
> 
> 






RE: Out of Memory Error from Java

2009-01-22 Thread Steffanina, Jeff

I read the article that you referenced below.   It suggests using
multiple page sequences.  Currently, I have only one.

I will make that change and test it.

THANKS for the info!



Jeff 





-Original Message-
From: Chris Bowditch [mailto:bowditch_ch...@hotmail.com] 
Sent: Thursday, January 22, 2009 4:11 AM
To: fop-users@xmlgraphics.apache.org
Subject: Re: Out of Memory Error from Java

Steffanina, Jeff wrote:

> I have:
> FOP 0.95
> Linux
> Java 1.5
> My Java starts with memory set as:  -Xmx192m
> 
> I can print an unlimited number of individual invoices in a single Java
> process (i.e. the user chooses the option to print).  For example, print
> 2,000 invoices where each invoice contains 100 lines of details.
> Therefore, print a total of 200,000 lines of detail.
> 
> However, I receive an "Out of Memory" error from Java when I attempt to
> print a single invoice that contains more than 4,500 lines of detail.

Have you read:
http://xmlgraphics.apache.org/fop/0.94/running.html#memory

Specifically the part about breaking up the FO into multiple page
sequences.

> 
> Other than continuing to increase the amount of memory at startup, is 
> there something I can do to prevent the error when printing a single, 
> large invoice? 

Regards,

Chris








Re: Out of Memory Error from Java

2009-01-22 Thread Chris Bowditch

Steffanina, Jeff wrote:


> I have:
> FOP 0.95
> Linux
> Java 1.5
> My Java starts with memory set as:  -Xmx192m
> 
> I can print an unlimited number of individual invoices in a single Java 
> process (i.e. the user chooses the option to print).  For example, print 
> 2,000 invoices where each invoice contains 100 lines of details.  
> Therefore, print a total of 200,000 lines of detail.
> 
> However, I receive an "Out of Memory" error from Java when I attempt to 
> print a single invoice that contains more than 4,500 lines of detail.
Have you read: http://xmlgraphics.apache.org/fop/0.94/running.html#memory

Specifically the part about breaking up the FO into multiple page sequences.
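Broken up that way, the FO document has roughly this shape (a hypothetical minimal skeleton; the master name "A4" and the block contents are invented, not from Jeff's stylesheet). FOP lays out each fo:page-sequence as a unit and can release its memory when the sequence is done, which is why several short sequences need far less peak memory than one very long one:

```xml
<!-- Hypothetical skeleton of an FO document with two page sequences
     instead of one. FOP renders each fo:page-sequence independently,
     so memory for a finished sequence can be reclaimed. -->
<fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
  <fo:layout-master-set>
    <fo:simple-page-master master-name="A4"
        page-height="29.7cm" page-width="21cm" margin="2cm">
      <fo:region-body/>
    </fo:simple-page-master>
  </fo:layout-master-set>
  <fo:page-sequence master-reference="A4">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>detail lines 1 to 1000 ...</fo:block>
    </fo:flow>
  </fo:page-sequence>
  <fo:page-sequence master-reference="A4">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>detail lines 1001 to 2000 ...</fo:block>
    </fo:flow>
  </fo:page-sequence>
</fo:root>
```

Page numbering continues across sequences by default (initial-page-number is "auto"), so pagination stays intact.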



> Other than continuing to increase the amount of memory at startup, is 
> there something I can do to prevent the error when printing a single, 
> large invoice? 


Regards,

Chris






Out of Memory Error from Java

2009-01-21 Thread Steffanina, Jeff
I have:
FOP 0.95
Linux
Java 1.5
My Java starts with memory set as:  -Xmx192m

I can print an unlimited number of individual invoices in a single Java
process (i.e. the user chooses the option to print).  For example, print
2,000 invoices where each  invoice contains 100 lines of details.
Therefore, print a total of 200,000 lines of detail.

However, I receive an "Out of Memory" error from Java when I attempt to
print a single invoice  that contains more than 4,500 lines of detail.

Other than continuing to increase the amount of memory at startup, is
there something I can do to prevent the error when printing a single,
large invoice?  


Jeff 


Re: Out of Memory error

2008-12-17 Thread vsyamala

Thank you Sean and Andreas. I will look into those possibilities.

Vsyamala


Andreas Delmelle-2 wrote:
> 
> On 12 Dec 2008, at 15:28, Griffin,Sean wrote:
> 
> Hi Sean, Vsyamala,
> 
>> I was assuming you were embedding FOP into your application and  
>> calling directly through Java.  Since it appears you're launching  
>> from the command-line, it's up to the FOP command-line program on  
>> how it streams the output.  Since you're specifying file names as  
>> your I/O, it stands to reason that you'd be reading from/writing to  
>> File I/O streams, so the use of a ByteArrayOutputStream in the  
>> PDFStream.add() method might be a red herring and perfectly normal.
> 
> Indeed. In general, a stack trace is next to useless for an OOMError.  
> It will only tell you at what point in the process the error occurred,  
> but it doesn't really say anything about the actual cause.
> If the error were to occur during a StringBuffer.append(), chances  
> would be very slim that it's actually the StringBuffer that is  
> responsible. It just means that append() needed to allocate a few  
> extra bytes of memory, but other objects/code already used up the  
> available heap space.
> 
> In this case, the PDFStream.add() method is FOP-internal and, if I
> understand correctly, no cause for concern. With an FO file of 45 MB, I'm
> suspecting the issue is caused by the fact that the entire document
> consists of a single fo:page-sequence, which is a very well-known
> limitation at the moment. If that is the case, the only workaround would
> be to introduce more structure in the document, so that you get sequences
> of 10-15 pages max. Well, even 40-50 pages should work nicely, and with
> 2GB of heap, I guess this could even be a lot more...
> 
> 
> Cheers
> 
> Andreas
> 
> 
> 
> 

-- 
View this message in context: 
http://www.nabble.com/Out-of-Memory-error-tp20962360p21063152.html
Sent from the FOP - Users mailing list archive at Nabble.com.





Re: Out of Memory error

2008-12-12 Thread Andreas Delmelle

On 12 Dec 2008, at 15:28, Griffin,Sean wrote:

Hi Sean, Vsyamala,

> I was assuming you were embedding FOP into your application and calling
> directly through Java.  Since it appears you're launching from the
> command-line, it's up to the FOP command-line program on how it streams
> the output.  Since you're specifying file names as your I/O, it stands to
> reason that you'd be reading from/writing to File I/O streams, so the use
> of a ByteArrayOutputStream in the PDFStream.add() method might be a red
> herring and perfectly normal.


Indeed. In general, a stack trace is next to useless for an OOMError.  
It will only tell you at what point in the process the error occurred,  
but it doesn't really say anything about the actual cause.
If the error were to occur during a StringBuffer.append(), chances  
would be very slim that it's actually the StringBuffer that is  
responsible. It just means that append() needed to allocate a few  
extra bytes of memory, but other objects/code already used up the  
available heap space.


In this case, the PDFStream.add() method is FOP-internal and, if I
understand correctly, no cause for concern. With an FO file of 45 MB, I
suspect the issue is caused by the fact that the entire document consists
of a single fo:page-sequence, which is a very well-known limitation at the
moment. If that is the case, the only workaround would be to introduce
more structure in the document, so that you get sequences of 10-15 pages
max. Well, even 40-50 pages should work nicely, and with 2GB of heap, I
guess this could even be a lot more...



Cheers

Andreas




RE: Out of Memory error

2008-12-12 Thread Griffin,Sean
Vsyamala,
I was assuming you were embedding FOP into your application and calling 
directly through Java.  Since it appears you're launching from the 
command-line, it's up to the FOP command-line program on how it streams the 
output.  Since you're specifying file names as your I/O, it stands to reason 
that you'd be reading from/writing to File I/O streams, so the use of a 
ByteArrayOutputStream in the PDFStream.add() method might be a red herring and 
perfectly normal.

If you haven't already, you'll want to read this, which may point you in a 
direction to finding the source of your problem: 
http://xmlgraphics.apache.org/fop/0.94/running.html#memory.

Sean


-Original Message-
From: vsyamala [mailto:vsyam...@gmail.com] 
Sent: Friday, December 12, 2008 8:12 AM
To: fop-users@xmlgraphics.apache.org
Subject: RE: Out of Memory error


Hi Sean,

Thanks for responding. I am using FOP 0.95 version and generating .fo file
from XSL transformation using xsltproc processor:

xsltproc --nonet --novalid --output Part5.fo PDFPublish_External.xsl
PDF_Book_Part5.xml

And then using Fop 0.95 version to convert .fo file to .pdf

call Fop -q -r -c fop.xconf Part5.fo PDF_Part5.pdf

I am not sure whether I am using a memory stream or a file stream. How do I
check that? And how would I use the FileInputStream method, or write the
output to a file on disk instead of keeping it in memory?

Thanks,
Vsyamala

--
CONFIDENTIALITY NOTICE This message and any included attachments are from 
Cerner Corporation and are intended only for the addressee. The information 
contained in this message is confidential and may constitute inside or 
non-public information under international, federal, or state securities laws. 
Unauthorized forwarding, printing, copying, distribution, or use of such 
information is strictly prohibited and may be unlawful. If you are not the 
addressee, please promptly delete this message and notify the sender of the 
delivery error by e-mail or you may call Cerner's corporate offices in Kansas 
City, Missouri, U.S.A at (+1) (816)221-1024.




RE: Out of Memory error

2008-12-12 Thread vsyamala

Hi Sean,

Thanks for responding. I am using FOP 0.95 version and generating .fo file
from XSL transformation using xsltproc processor:

xsltproc --nonet --novalid --output Part5.fo PDFPublish_External.xsl
PDF_Book_Part5.xml

And then using Fop 0.95 version to convert .fo file to .pdf

call Fop -q -r -c fop.xconf Part5.fo PDF_Part5.pdf

I am not sure whether I am using a memory stream or a file stream. How do I
check that? And how would I use the FileInputStream method, or write the
output to a file on disk instead of keeping it in memory?

Thanks,
Vsyamala



Griffin,Sean wrote:
> 
> Vsyamala,
> There are a variety of factors that could contribute to your OOM error,
> including, most significantly, the size of your page sequences within your
> FO, the types of images that you're embedding, the version of FOP that
> you're using, and whether you're writing to the PDF in memory or into a
> file.  300 images @ 48KB each doesn't sound extreme, but a 45MB FO file is
> pretty big.  It's too hard to say whether a 45MB FO file with 300 images
> at 48KB each should be able to run in 2GB memory...there's just too many
> variables that aren't mentioned.  So I would recommend a
> divide-and-conquer approach.  Remove the image references from the FO and
> try again.  Still run out of memory or come close to it?  If so, it must
> not be the images.  Different versions of FOP handle images in different
> ways.  For example, in v0.20.5 (maybe 0.95 as well), JPG images were
> directly embedded into the PDF whereas PNG images were transcoded into
> bitmaps and compressed into the PDF.  As a consequence, JPG images used a
> lot less processor power but created, generally, larger PDFs.
> 
> I just noticed from your stack trace that it appears you're writing to a
> ByteArrayOutputStream.  That could be a big problem right there.  You're
> wanting to store the entire 12+MB PDF in memory in addition to the memory
> needed for FOP to function?  A much better option would be to write that
> file to disk.  Likewise, are you feeding the FO into FOP from a memory
> stream, a file stream, or as the result of an XML/XSLT transformation?  Of
> all options, if you're rendering files this big, you should use the
> FileInputStream method.
> 
> Sean
> 
> -Original Message-
> From: vsyamala [mailto:vsyam...@gmail.com] 
> Sent: Thursday, December 11, 2008 1:10 PM
> To: fop-users@xmlgraphics.apache.org
> Subject: Out of Memory error
> 
> 
> Hi,
> 
> I am trying to generate a PDF from a 45MB .fo file; fop produces a pdf of
> about 12MB and then fails with "out of memory" errors. The -Xmx and -Xms
> options are set to 2048MB (that's the max I could go on the machine). And
> this .fo file refers to about 300 images of approx. 48KB each. FOP should
> be able to generate the PDF. I am not sure if the images are causing this
> issue. Does anyone know if images are the issue? Here is the error:
> 
> Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
>     at java.util.Arrays.copyOf(Unknown Source)
>     at java.io.ByteArrayOutputStream.write(Unknown Source)
>     at java.io.OutputStream.write(Unknown Source)
>     at org.apache.fop.pdf.PDFStream.add(PDFStream.java:60)
>     at org.apache.fop.render.pdf.PDFRenderer.concatenateTransformationMatrix(PDFRenderer.java:839)
>     at org.apache.fop.render.AbstractPathOrientedRenderer.renderReferenceArea(AbstractPathOrientedRenderer.java:539)
>     at org.apache.fop.render.AbstractRenderer.renderBlock(AbstractRenderer.java:560)
>     at org.apache.fop.render.pdf.PDFRenderer.renderBlock(PDFRenderer.java:1329)
>     at org.apache.fop.render.AbstractRenderer.renderBlocks(AbstractRenderer.java:526)
>     at org.apache.fop.render.AbstractRenderer.renderBlock(AbstractRenderer.java:573)
>     at org.apache.fop.render.pdf.PDFRenderer.renderBlock(PDFRenderer.java:1329)
>     at org.apache.fop.render.AbstractRenderer.renderBlocks(AbstractRenderer.java:526)
> 
> Thanks,
> Vsyamala
> 
>  
> -- 
> View this message in context:
> http://www.nabble.com/Out-of-Memory-error-tp20962360p20962360.html
> Sent from the FOP - Users mailing list archive at Nabble.com.
> 
> 
> 

RE: Out of Memory error

2008-12-11 Thread Griffin,Sean
Vsyamala,
There are a variety of factors that could contribute to your OOM error, 
including, most significantly, the size of your page sequences within your FO, 
the types of images that you're embedding, the version of FOP that you're 
using, and whether you're writing the PDF to memory or to a file.  300 
images @ 48KB each doesn't sound extreme, but a 45MB FO file is pretty big.  
It's too hard to say whether a 45MB FO file with 300 images at 48KB each should 
be able to run in 2GB memory...there are just too many variables that aren't 
mentioned.  So I would recommend a divide-and-conquer approach.  Remove the 
image references from the FO and try again.  Still run out of memory or come 
close to it?  If so, it must not be the images.  Different versions of FOP 
handle images in different ways.  For example, in v0.20.5 (maybe 0.95 as well), 
JPG images were directly embedded into the PDF whereas PNG images were 
transcoded into bitmaps and compressed into the PDF.  As a consequence, JPG 
images used a lot less processor power but created, generally, larger PDFs.

I just noticed from your stack trace that it appears you're writing to a 
ByteArrayOutputStream.  That could be a big problem right there.  You're 
wanting to store the entire 12+MB PDF in memory in addition to the memory 
needed for FOP to function?  A much better option would be to write that file 
to disk.  Likewise, are you feeding the FO into FOP from a memory stream, a 
file stream, or as the result of an XML/XSLT transformation?  Of all options, 
if you're rendering files this big, you should use the FileInputStream method.

Sean

-Original Message-
From: vsyamala [mailto:vsyam...@gmail.com] 
Sent: Thursday, December 11, 2008 1:10 PM
To: fop-users@xmlgraphics.apache.org
Subject: Out of Memory error


Hi,

I am trying to generate a PDF from a 45MB .fo file; fop produces a pdf of about
12MB and then fails with "out of memory" errors. The -Xmx and -Xms options are
set to 2048MB (that's the max I could go on the machine). And this .fo file
refers to about 300 images of approx. 48KB each. FOP should be able to
generate the PDF. I am not sure if the images are causing this issue. Does
anyone know if images are the issue? Here is the error:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Unknown Source)
    at java.io.ByteArrayOutputStream.write(Unknown Source)
    at java.io.OutputStream.write(Unknown Source)
    at org.apache.fop.pdf.PDFStream.add(PDFStream.java:60)
    at org.apache.fop.render.pdf.PDFRenderer.concatenateTransformationMatrix(PDFRenderer.java:839)
    at org.apache.fop.render.AbstractPathOrientedRenderer.renderReferenceArea(AbstractPathOrientedRenderer.java:539)
    at org.apache.fop.render.AbstractRenderer.renderBlock(AbstractRenderer.java:560)
    at org.apache.fop.render.pdf.PDFRenderer.renderBlock(PDFRenderer.java:1329)
    at org.apache.fop.render.AbstractRenderer.renderBlocks(AbstractRenderer.java:526)
    at org.apache.fop.render.AbstractRenderer.renderBlock(AbstractRenderer.java:573)
    at org.apache.fop.render.pdf.PDFRenderer.renderBlock(PDFRenderer.java:1329)
    at org.apache.fop.render.AbstractRenderer.renderBlocks(AbstractRenderer.java:526)

Thanks,
Vsyamala

 







Out of Memory error

2008-12-11 Thread vsyamala

Hi,

I am trying to generate a PDF from a 45MB .fo file; fop produces a pdf of about
12MB and then fails with "out of memory" errors. The -Xmx and -Xms options are
set to 2048MB (that's the max I could go on the machine). And this .fo file
refers to about 300 images of approx. 48KB each. FOP should be able to
generate the PDF. I am not sure if the images are causing this issue. Does
anyone know if images are the issue? Here is the error:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Unknown Source)
    at java.io.ByteArrayOutputStream.write(Unknown Source)
    at java.io.OutputStream.write(Unknown Source)
    at org.apache.fop.pdf.PDFStream.add(PDFStream.java:60)
    at org.apache.fop.render.pdf.PDFRenderer.concatenateTransformationMatrix(PDFRenderer.java:839)
    at org.apache.fop.render.AbstractPathOrientedRenderer.renderReferenceArea(AbstractPathOrientedRenderer.java:539)
    at org.apache.fop.render.AbstractRenderer.renderBlock(AbstractRenderer.java:560)
    at org.apache.fop.render.pdf.PDFRenderer.renderBlock(PDFRenderer.java:1329)
    at org.apache.fop.render.AbstractRenderer.renderBlocks(AbstractRenderer.java:526)
    at org.apache.fop.render.AbstractRenderer.renderBlock(AbstractRenderer.java:573)
    at org.apache.fop.render.pdf.PDFRenderer.renderBlock(PDFRenderer.java:1329)
    at org.apache.fop.render.AbstractRenderer.renderBlocks(AbstractRenderer.java:526)

Thanks,
Vsyamala

 





AW: AW: Font autodetection yields Out-of-Memory error

2008-09-09 Thread Ingo Maas
Hi Jeremias,

If the loop condition in PFMInputStream.readString() only checks for 0
(zero) as a terminator, then that loop will run forever when the (last) zero
before EOF is missing: it keeps reading the EOF value again and again,
growing the StringBuffer character by character until the heap is exhausted
and an OutOfMemoryError is thrown.

I don't know whether the missing zero is outside the specs, but FOP can (and
should, IMHO) be a little more careful at this point.
Greetings, Ingo

-Original Message-
From: Jeremias Maerki [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, 9 September 2008 13:16
To: fop-users@xmlgraphics.apache.org
Subject: Re: AW: Font autodetection yields Out-of-Memory error

Please note that an OutOfMemoryError can occur anywhere. A stack trace
in such a case is not a reliable indicator of the problem.

I don't think there's a problem with any specific font. PFM files are
rarely larger than a couple of kilobytes. I'd rather increase the heap
size (as a work-around) or start up a profiler to find out what exactly
uses so much memory (or rather doesn't release it in time). FOP's font
handling is known to be a bit memory-hungry. I've experienced an
OutOfMemoryError myself occasionally during font auto-detection but so
far I haven't been able to put my finger on it.

Ingo, you can generate a heap dump if you add the following VM
parameter:
-XX:+HeapDumpOnOutOfMemoryError
The generated file can then be used to figure out memory problems with
the right tools (like IBM's Heap Analyzer or YourKit). If you want you
can put it somewhere on a server and send me a link. I can then take a
look. Don't send those files by mail: they are huge!

On 09.09.2008 11:41:20 Ingo Maas wrote:
> Hi Adrian,
> 
> the fonts are original Adobe fonts, which I took for tests from older
> Acrobat Reader. Therefore, they are trustworthy, at least more than those
> Ghostscript fonts, which have failed for me in metrics generation.
> 
> Having a look at PFMInputStream.readString(), it is evident what is going
> wrong. The stream only catches zero terminators but not EOF (i.e. -1).
> Wouldn't it make sense to catch EOF here as well?
> 
> Greetings, Ingo
> 
> -Original Message-
> From: Adrian Cumiskey [mailto:[EMAIL PROTECTED] 
> Sent: Monday, 8 September 2008 18:15
> To: fop-users@xmlgraphics.apache.org
> Subject: Re: Font autodetection yields Out-of-Memory error
> 
> Hi Ingo,
> 
> Looking at the stack trace it doesn't look to be a problem with font
> autodetection, more to do with 
> a parsing problem on one of your PFM files.  I'm guessing it must be an
> erroneous one as I wasn't 
> able to reproduce the problem with the PFM font file that I tried.
> 
> Could you try isolating which PFM file is the offending one?  You should be
> able to see which font file is causing you the problem by changing your
> LOGLEVEL variable towards the bottom of your fop launcher file to "info"
> or "debug" and then relaunching.
> 
> Adrian.
> 
> Ingo Maas wrote:
> > Hallo,
> > 
> >  
> > 
> > In font autodetection there seems to be a bug, introduced sometime in 
> > June/July. I hoped that this has been fixed in 0.95 or current trunk, 
> > but it has not. See stacktrace for current trunk below.
> > 
> >  
> > 
> > Regards, Ingo
> > 
> >  
> > 
> > Stacktrace: java.lang.OutOfMemoryError: Java heap space
> >   at java.lang.AbstractStringBuilder.expandCapacity(Unknown Source)
> >   at java.lang.AbstractStringBuilder.append(Unknown Source)
> >   at java.lang.StringBuffer.append(Unknown Source)
> >   at org.apache.fop.fonts.type1.PFMInputStream.readString(PFMInputStream.java:102)
> >   at org.apache.fop.fonts.type1.PFMFile.loadHeader(PFMFile.java:135)
> >   at org.apache.fop.fonts.type1.PFMFile.load(PFMFile.java:108)
> >   at org.apache.fop.fonts.type1.Type1FontLoader.read(Type1FontLoader.java:102)
> >   at org.apache.fop.fonts.FontLoader.getFont(FontLoader.java:171)
> >   at org.apache.fop.fonts.FontLoader.loadFont(FontLoader.java:120)
> >   at org.apache.fop.fonts.FontLoader.loadFont(FontLoader.java:98)
> >   at org.apache.fop.fonts.autodetect.FontInfoFinder.find(FontInfoFinder.java:251)
> > 
> > …
> > 
> >  
> > 
> > -- 
> > 
> > Ernst Basler + Partner GmbH
> > Ingo Maas
> > Tuchmacherstraße 47
> > DE-14482

Re: AW: Font autodetection yields Out-of-Memory error

2008-09-09 Thread Jeremias Maerki
Please note that an OutOfMemoryError can occur anywhere. A stack trace
in such a case is not a reliable indicator of the problem.

I don't think there's a problem with any specific font. PFM files are
rarely larger than a couple of kilobytes. I'd rather increase the heap
size (as a work-around) or start up a profiler to find out what exactly
uses so much memory (or rather doesn't release it in time). FOP's font
handling is known to be a bit memory-hungry. I've experienced an
OutOfMemoryError myself occasionally during font auto-detection but so
far I haven't been able to put my finger on it.

Ingo, you can generate a heap dump if you add the following VM
parameter:
-XX:+HeapDumpOnOutOfMemoryError
The generated file can then be used to figure out memory problems with
the right tools (like IBM's Heap Analyzer or YourKit). If you want you
can put it somewhere on a server and send me a link. I can then take a
look. Don't send those files by mail: they are huge!
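For example, when running FOP from the command line, the flag goes on the java invocation itself. The heap size, dump path, and classpath below are placeholders for illustration, not FOP defaults:

```shell
# Work-around: raise the heap; diagnostic: write a heap dump on OutOfMemoryError.
# Adjust -Xmx, the dump path and the classpath to your installation.
java -Xmx512m \
     -XX:+HeapDumpOnOutOfMemoryError \
     -XX:HeapDumpPath=/tmp \
     -cp build/fop.jar org.apache.fop.cli.Main -fo input.fo -pdf output.pdf
```

The resulting .hprof file can then be opened in a heap analysis tool as described above.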

On 09.09.2008 11:41:20 Ingo Maas wrote:
> Hi Adrian,
> 
> the fonts are original Adobe fonts, which I took for tests from older
> Acrobat Reader. Therefore, they are trustworthy, at least more than those
> Ghostscript fonts, which have failed for me in metrics generation.
> 
> Having a look at PFMInputStream.readString(), it is evident what is going
> wrong. The stream only catches zero terminators but not EOF (i.e. -1).
> Wouldn't it make sense to catch EOF here as well?
> 
> Greetings, Ingo
> 
> -Original Message-
> From: Adrian Cumiskey [mailto:[EMAIL PROTECTED] 
> Sent: Monday, 8 September 2008 18:15
> To: fop-users@xmlgraphics.apache.org
> Subject: Re: Font autodetection yields Out-of-Memory error
> 
> Hi Ingo,
> 
> Looking at the stack trace it doesn't look to be a problem with font
> autodetection, more to do with 
> a parsing problem on one of your PFM files.  I'm guessing it must be an
> erroneous one as I wasn't 
> able to reproduce the problem with the PFM font file that I tried.
> 
> Could you try isolating which PFM file is the offending one?  You should be
> able to see which font 
> file is causing you the problem by changing your LOGLEVEL variable towards
> the bottom of your fop 
> launcher file to "info" or "debug" and then relaunching.
> 
> Adrian.
> 
> Ingo Maas wrote:
> > Hello,
> > 
> >  
> > 
> > There seems to be a bug in font autodetection, introduced sometime in 
> > June/July. I hoped that it had been fixed in 0.95 or the current trunk, 
> > but it has not. See the stack trace for the current trunk below.
> > 
> >  
> > 
> > Regards, Ingo
> > 
> >  
> > 
> > Stacktrace: java.lang.OutOfMemoryError: Java heap space
> >   at java.lang.AbstractStringBuilder.expandCapacity(Unknown Source)
> >   at java.lang.AbstractStringBuilder.append(Unknown Source)
> >   at java.lang.StringBuffer.append(Unknown Source)
> >   at org.apache.fop.fonts.type1.PFMInputStream.readString(PFMInputStream.java:102)
> >   at org.apache.fop.fonts.type1.PFMFile.loadHeader(PFMFile.java:135)
> >   at org.apache.fop.fonts.type1.PFMFile.load(PFMFile.java:108)
> >   at org.apache.fop.fonts.type1.Type1FontLoader.read(Type1FontLoader.java:102)
> >   at org.apache.fop.fonts.FontLoader.getFont(FontLoader.java:171)
> >   at org.apache.fop.fonts.FontLoader.loadFont(FontLoader.java:120)
> >   at org.apache.fop.fonts.FontLoader.loadFont(FontLoader.java:98)
> >   at org.apache.fop.fonts.autodetect.FontInfoFinder.find(FontInfoFinder.java:251)
> > …
> > 
> >  
> > 
> > -- 
> > 
> > Ernst Basler + Partner GmbH
> > Ingo Maas
> > Tuchmacherstraße 47
> > DE-14482 Potsdam
> > 
> >  
> > 
> > +49 331 74 75 9 0  (Zentrale)
> > +49 331 74 75 9 78 (Direkt)
> > +49 331 74 75 9 90 (Fax)
> > 
> >  
> > 
> > Geschäftsführer: Dr. Ludger Paus
> > Amtsgericht Potsdam - HRB 6362 P
> > USt-Id Nr. 152768391
> > 
> >  
> > 
> > mailto:[EMAIL PROTECTED]
> > http://www.ebp.de
> > 
> >  
> > 
> >  
> > 
> >  
> > 
> 




Jeremias Maerki


-
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]



Re: AW: Font autodetection yields Out-of-Memory error

2008-09-09 Thread Adrian Cumiskey

Hi Ingo,

You definitely have a bad font somewhere on your system.  I have just committed a small change to 
the latest FOP code which should make the string parsing a little more robust by catching the EOF 
case when an erroneous PFM file is encountered.  Hopefully you will be able to locate 
the offending font now.
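The EOF handling Ingo suggested could look roughly like the following. This is only a sketch (the class name and the exact behaviour on EOF are illustrative, not the committed FOP patch): read until a NUL terminator, but treat -1 (EOF) as a terminator too, so a truncated or malformed PFM file can no longer grow the buffer forever.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadStringSketch {

    /**
     * Reads bytes up to a NUL terminator, but also stops at EOF (-1)
     * instead of appending forever on a truncated file.
     */
    static String readString(InputStream in) throws IOException {
        StringBuilder buf = new StringBuilder();
        int ch = in.read();
        while (ch > 0) {                  // stop on NUL (0) ...
            buf.append((char) ch);
            ch = in.read();
        }
        if (ch == -1 && buf.length() == 0) {
            // nothing at all before EOF: the file is truncated
            throw new IOException("Unexpected end of stream");
        }
        return buf.toString();            // ... or on EOF (-1)
    }

    public static void main(String[] args) throws IOException {
        // A PFM-like byte sequence: one NUL-terminated string, then one
        // that is cut off by EOF instead of being terminated.
        byte[] data = {'A', 'r', 'i', 'a', 'l', 0, 'B', 'o', 'l', 'd'};
        InputStream in = new ByteArrayInputStream(data);
        System.out.println(readString(in)); // prints "Arial" (NUL-terminated)
        System.out.println(readString(in)); // prints "Bold" (EOF-terminated)
    }
}
```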


Adrian.

Ingo Maas wrote:

Hi Adrian,

the fonts are original Adobe fonts, which I took for tests from older
Acrobat Reader. Therefore, they are trustworthy, at least more than those
Ghostscript fonts, which have failed for me in metrics generation.

Having a look at PFMInputStream.readString(), it is evident what is going
wrong. The stream only catches zero terminators but not EOF (i.e. -1).
Wouldn't it make sense to catch EOF here as well?

Greetings, Ingo

-Original Message-
From: Adrian Cumiskey [mailto:[EMAIL PROTECTED] 
Sent: Monday, 8 September 2008 18:15
To: fop-users@xmlgraphics.apache.org
Subject: Re: Font autodetection yields Out-of-Memory error

Hi Ingo,

Looking at the stack trace it doesn't look to be a problem with font
autodetection, more to do with 
a parsing problem on one of your PFM files.  I'm guessing it must be an
erroneous one as I wasn't 
able to reproduce the problem with the PFM font file that I tried.


Could you try isolating which PFM file is the offending one?  You should be
able to see which font 
file is causing you the problem by changing your LOGLEVEL variable towards
the bottom of your fop 
launcher file to "info" or "debug" and then relaunching.


Adrian.

Ingo Maas wrote:

Hello,

 

There seems to be a bug in font autodetection, introduced sometime in 
June/July. I hoped that it had been fixed in 0.95 or the current trunk, 
but it has not. See the stack trace for the current trunk below.


 


Regards, Ingo

 


Stacktrace: java.lang.OutOfMemoryError: Java heap space
  at java.lang.AbstractStringBuilder.expandCapacity(Unknown Source)
  at java.lang.AbstractStringBuilder.append(Unknown Source)
  at java.lang.StringBuffer.append(Unknown Source)
  at org.apache.fop.fonts.type1.PFMInputStream.readString(PFMInputStream.java:102)
  at org.apache.fop.fonts.type1.PFMFile.loadHeader(PFMFile.java:135)
  at org.apache.fop.fonts.type1.PFMFile.load(PFMFile.java:108)
  at org.apache.fop.fonts.type1.Type1FontLoader.read(Type1FontLoader.java:102)
  at org.apache.fop.fonts.FontLoader.getFont(FontLoader.java:171)
  at org.apache.fop.fonts.FontLoader.loadFont(FontLoader.java:120)
  at org.apache.fop.fonts.FontLoader.loadFont(FontLoader.java:98)
  at org.apache.fop.fonts.autodetect.FontInfoFinder.find(FontInfoFinder.java:251)
…

 


--

Ernst Basler + Partner GmbH
Ingo Maas
Tuchmacherstraße 47
DE-14482 Potsdam

 


+49 331 74 75 9 0  (Zentrale)
+49 331 74 75 9 78 (Direkt)
+49 331 74 75 9 90 (Fax)

 


Geschäftsführer: Dr. Ludger Paus
Amtsgericht Potsdam - HRB 6362 P
USt-Id Nr. 152768391

 


mailto:[EMAIL PROTECTED]
http://www.ebp.de

 

 

 







AW: Font autodetection yields Out-of-Memory error

2008-09-09 Thread Ingo Maas
Hi Adrian,

the fonts are original Adobe fonts, which I took for tests from older
Acrobat Reader. Therefore, they are trustworthy, at least more than those
Ghostscript fonts, which have failed for me in metrics generation.

Having a look at PFMInputStream.readString(), it is evident what is going
wrong. The stream only catches zero terminators but not EOF (i.e. -1).
Wouldn't it make sense to catch EOF here as well?

Greetings, Ingo

-Original Message-
From: Adrian Cumiskey [mailto:[EMAIL PROTECTED] 
Sent: Monday, 8 September 2008 18:15
To: fop-users@xmlgraphics.apache.org
Subject: Re: Font autodetection yields Out-of-Memory error

Hi Ingo,

Looking at the stack trace it doesn't look to be a problem with font
autodetection, more to do with 
a parsing problem on one of your PFM files.  I'm guessing it must be an
erroneous one as I wasn't 
able to reproduce the problem with the PFM font file that I tried.

Could you try isolating which PFM file is the offending one?  You should be
able to see which font 
file is causing you the problem by changing your LOGLEVEL variable towards
the bottom of your fop 
launcher file to "info" or "debug" and then relaunching.

Adrian.

Ingo Maas wrote:
> Hello,
> 
>  
> 
> There seems to be a bug in font autodetection, introduced sometime in 
> June/July. I hoped that it had been fixed in 0.95 or the current trunk, 
> but it has not. See the stack trace for the current trunk below.
> 
>  
> 
> Regards, Ingo
> 
>  
> 
> Stacktrace: java.lang.OutOfMemoryError: Java heap space
>   at java.lang.AbstractStringBuilder.expandCapacity(Unknown Source)
>   at java.lang.AbstractStringBuilder.append(Unknown Source)
>   at java.lang.StringBuffer.append(Unknown Source)
>   at org.apache.fop.fonts.type1.PFMInputStream.readString(PFMInputStream.java:102)
>   at org.apache.fop.fonts.type1.PFMFile.loadHeader(PFMFile.java:135)
>   at org.apache.fop.fonts.type1.PFMFile.load(PFMFile.java:108)
>   at org.apache.fop.fonts.type1.Type1FontLoader.read(Type1FontLoader.java:102)
>   at org.apache.fop.fonts.FontLoader.getFont(FontLoader.java:171)
>   at org.apache.fop.fonts.FontLoader.loadFont(FontLoader.java:120)
>   at org.apache.fop.fonts.FontLoader.loadFont(FontLoader.java:98)
>   at org.apache.fop.fonts.autodetect.FontInfoFinder.find(FontInfoFinder.java:251)
> …
> 
>  
> 
> -- 
> 
> Ernst Basler + Partner GmbH
> Ingo Maas
> Tuchmacherstraße 47
> DE-14482 Potsdam
> 
>  
> 
> +49 331 74 75 9 0  (Zentrale)
> +49 331 74 75 9 78 (Direkt)
> +49 331 74 75 9 90 (Fax)
> 
>  
> 
> Geschäftsführer: Dr. Ludger Paus
> Amtsgericht Potsdam - HRB 6362 P
> USt-Id Nr. 152768391
> 
>  
> 
> mailto:[EMAIL PROTECTED]
> http://www.ebp.de
> 
>  
> 
>  
> 
>  
> 





Re: Font autodetection yields Out-of-Memory error

2008-09-08 Thread Adrian Cumiskey

Hi Ingo,

Looking at the stack trace it doesn't look to be a problem with font autodetection, more to do with 
a parsing problem on one of your PFM files.  I'm guessing it must be an erroneous one as I wasn't 
able to reproduce the problem with the PFM font file that I tried.


Could you try isolating which PFM file is the offending one?  You should be able to see which font 
file is causing you the problem by changing your LOGLEVEL variable towards the bottom of your fop 
launcher file to "info" or "debug" and then relaunching.


Adrian.

Ingo Maas wrote:

Hello,

 

There seems to be a bug in font autodetection, introduced sometime in 
June/July. I hoped that it had been fixed in 0.95 or the current trunk, 
but it has not. See the stack trace for the current trunk below.


 


Regards, Ingo

 


Stacktrace: java.lang.OutOfMemoryError: Java heap space
  at java.lang.AbstractStringBuilder.expandCapacity(Unknown Source)
  at java.lang.AbstractStringBuilder.append(Unknown Source)
  at java.lang.StringBuffer.append(Unknown Source)
  at org.apache.fop.fonts.type1.PFMInputStream.readString(PFMInputStream.java:102)
  at org.apache.fop.fonts.type1.PFMFile.loadHeader(PFMFile.java:135)
  at org.apache.fop.fonts.type1.PFMFile.load(PFMFile.java:108)
  at org.apache.fop.fonts.type1.Type1FontLoader.read(Type1FontLoader.java:102)
  at org.apache.fop.fonts.FontLoader.getFont(FontLoader.java:171)
  at org.apache.fop.fonts.FontLoader.loadFont(FontLoader.java:120)
  at org.apache.fop.fonts.FontLoader.loadFont(FontLoader.java:98)
  at org.apache.fop.fonts.autodetect.FontInfoFinder.find(FontInfoFinder.java:251)
…

 


--

Ernst Basler + Partner GmbH
Ingo Maas
Tuchmacherstraße 47
DE-14482 Potsdam

 


+49 331 74 75 9 0  (Zentrale)
+49 331 74 75 9 78 (Direkt)
+49 331 74 75 9 90 (Fax)

 


Geschäftsführer: Dr. Ludger Paus
Amtsgericht Potsdam - HRB 6362 P
USt-Id Nr. 152768391

 


mailto:[EMAIL PROTECTED]
http://www.ebp.de

 

 

 







Font autodetection yields Out-of-Memory error

2008-09-08 Thread Ingo Maas
Hello,

 

There seems to be a bug in font autodetection, introduced sometime in June/July. I
hoped that it had been fixed in 0.95 or the current trunk, but it has not. See the
stack trace for the current trunk below.

 

Regards, Ingo 

 

Stacktrace: java.lang.OutOfMemoryError: Java heap space
  at java.lang.AbstractStringBuilder.expandCapacity(Unknown Source)
  at java.lang.AbstractStringBuilder.append(Unknown Source)
  at java.lang.StringBuffer.append(Unknown Source)
  at org.apache.fop.fonts.type1.PFMInputStream.readString(PFMInputStream.java:102)
  at org.apache.fop.fonts.type1.PFMFile.loadHeader(PFMFile.java:135)
  at org.apache.fop.fonts.type1.PFMFile.load(PFMFile.java:108)
  at org.apache.fop.fonts.type1.Type1FontLoader.read(Type1FontLoader.java:102)
  at org.apache.fop.fonts.FontLoader.getFont(FontLoader.java:171)
  at org.apache.fop.fonts.FontLoader.loadFont(FontLoader.java:120)
  at org.apache.fop.fonts.FontLoader.loadFont(FontLoader.java:98)
  at org.apache.fop.fonts.autodetect.FontInfoFinder.find(FontInfoFinder.java:251)
…

 

-- 

Ernst Basler + Partner GmbH
Ingo Maas
Tuchmacherstraße 47
DE-14482 Potsdam

 

+49 331 74 75 9 0  (Zentrale)
+49 331 74 75 9 78 (Direkt)
+49 331 74 75 9 90 (Fax) 

 

Geschäftsführer: Dr. Ludger Paus
Amtsgericht Potsdam - HRB 6362 P
USt-Id Nr. 152768391

 

  mailto:[EMAIL PROTECTED]
  http://www.ebp.de

 

 

 



Re: Out Of Memory Error

2007-02-06 Thread Jay Bryant
I routinely make much larger files (600+ pages with lots of 
images, so 6+ MB filesize). Try the items mentioned in the following link:

http://xmlgraphics.apache.org/fop/0.93/running.html#memory

I have also used FOP to make PDF files of more than 2,000 pages, but those 
didn't have images (they were data catalogs for a data warehouse).

Jay Bryant
Bryant Communication Services
  - Original Message - 
  From: [EMAIL PROTECTED] 
  To: fop-users@xmlgraphics.apache.org 
  Sent: Tuesday, February 06, 2007 4:54 PM
  Subject: Out Of Memory Error


  Hello All, 

  I am trying to convert a huge XML document (approximately 110 pages long, 
777KB) into a PDF document using an FO transformation. But it gives me the 
following error.

  Exception in thread "main" java.lang.OutOfMemoryError 

  Can anybody guide me as to what is going wrong? Is there a limit on the size 
of the document that can be transformed? If so, can it be increased?

  Thanks, 
  Sindhu 



  Email Firewall made the following annotations 
  -
  --- NOTICE --- 
  This message is for the designated recipient only and may contain 
confidential, privileged or proprietary information. If you have received it in 
error, please notify the sender immediately and delete the original and any 
copy or printout. Unintended recipients are prohibited from making any other 
use of this e-mail. Although we have taken reasonable precautions to ensure no 
viruses are present in this e-mail, we accept no liability for any loss or 
damage arising from the use of this e-mail or attachments, or for any delay or 
errors or omissions in the contents which result from e-mail transmission. 
  -



Re: Out Of Memory Error

2007-02-06 Thread Roland Neilands

Check the FAQ:
http://xmlgraphics.apache.org/fop/0.93/running.html#memory

Regards,
Roland 



[EMAIL PROTECTED] wrote:


Hello All,

I am trying to convert a huge XML document (approximately 110 pages 
long, 777KB) into a PDF document using an FO transformation. But it 
gives me the following error.


Exception in thread "main" java.lang.OutOfMemoryError

Can anybody guide me as to what is going wrong? Is there a limit on 
the size of the document that can be transformed? If so, can it be 
increased?


Thanks,
Sindhu








Out Of Memory Error

2007-02-06 Thread Krishna . Cherla
Hello All,

I am trying to convert a huge XML document (approximately 110 pages long,
777KB) into a PDF document using an FO transformation. But it gives me
the following error.

Exception in thread "main" java.lang.OutOfMemoryError

Can anybody guide me as to what is going wrong? Is there a limit on the
size of the document that can be transformed? If so, can it be
increased?

Thanks,
Sindhu 





Re: out of memory error

2006-08-30 Thread Jeremias Maerki

On 30.08.2006 17:30:09 Naveen Bandhu wrote:
> 
> Currently we are using FOP version 0.20.5, and get an Out of Memory error. I
> have seen the fix in trunk but we can't use the trunk because of the changes
> we would have to make to our XSL files to make them compatible with the trunk version. 

That's a strange excuse for not upgrading. The newer FOP releases are
much more conformant to the XSL spec, whereas 0.20.5 did many things
wrong. Usually, a stylesheet is updated for the latest release in a
couple of hours.

> As per the fix in trunk, PageSequence.java has a new method, public void
> releasePageSequence(). I would like to make the changes to fop_0.20.5; where
> should I start? Which classes need to be modified? I am trying to debug to
> find it myself but it seems to be time consuming and I'm worried about side
> effects.

Frankly, that's the wrong approach. You'd invest a lot of time in a
software that's no longer maintained. The latest releases are coming
from a new line of development where lots of things have changed. The
whole layout engine is new. It might not at all be possible to port the
"releasePageSequence()" changes to FOP 0.20.5.

Furthermore, you don't tell us anything about the document you're having
problems with (type of document, size, number of images, number of
page-sequences...). We don't know if you've tried all the tips and
work-arounds we published. I would at least give FOP 0.92beta (or FOP
Trunk) a try.

> Thanks, Naveen 
> 
>  
> 
>  Leena,
> you don't say what FOP version you're using, but would you please get the
> latest code (FOP Trunk) from SVN and recheck if you get any improvements?
> I've just committed a fix for a memory leak which might improve your
> situation. Please report back on the results.
>
> On 03.08.2006 09:06:34 Madala, Leena (GE Healthcare, consultant) wrote:
> > Hello team,
> >
> > I'm facing a similar issue. I'm getting an out of memory error when trying
> > to generate a report. I have checked the FAQ and tried all the options
> > provided there...
> >
> > I tried increasing the Java virtual memory to 1024M and also removed page
> > citations in the generated report. I also tried with the new beta version
> > of fop.jar. It didn't help.
> >
> > The report to be generated will produce approximately 70 pages for 40
> > records in the database. It works fine for 40 records. We have to generate
> > a similar report for 700 records. The report doesn't contain any images.
> > It has a table every 2 pages. Please help me resolve the issue.
> >
> > Thanks for your time
> >
> > Thanks & Regards,
> > Leena Madala
>
> Jeremias Maerki



Jeremias Maerki





Re: out of memory error

2006-08-30 Thread Naveen Bandhu



Currently we are using FOP version 0.20.5, and get an Out of Memory error. I
have seen the fix in trunk but we can't use the trunk because of the changes we
would have to make to our XSL files to make them compatible with the trunk
version. As per the fix in trunk, PageSequence.java has a new method, public
void releasePageSequence(). I would like to make the changes to fop_0.20.5;
where should I start? Which classes need to be modified? I am trying to debug
to find it myself but it seems to be time consuming and I'm worried about side
effects.

Thanks, Naveen

> Leena,
> you don't say what FOP version you're using, but would you please get the
> latest code (FOP Trunk) from SVN and recheck if you get any improvements?
> I've just committed a fix for a memory leak which might improve your
> situation. Please report back on the results.
>
> On 03.08.2006 09:06:34 Madala, Leena (GE Healthcare, consultant) wrote:
> > Hello team,
> >
> > I'm facing a similar issue. I'm getting an out of memory error when trying
> > to generate a report. I have checked the FAQ and tried all the options
> > provided there...
> >
> > I tried increasing the Java virtual memory to 1024M and also removed page
> > citations in the generated report. I also tried with the new beta version
> > of fop.jar. It didn't help.
> >
> > The report to be generated will produce approximately 70 pages for 40
> > records in the database. It works fine for 40 records. We have to generate
> > a similar report for 700 records. The report doesn't contain any images.
> > It has a table every 2 pages. Please help me resolve the issue.
> >
> > Thanks for your time
> >
> > Thanks & Regards,
> > Leena Madala
>
> Jeremias Maerki


Re: out of memory error

2006-08-10 Thread Louis . Masters

Most of our reports/docs use a single
table in a single page sequence.  My Windows boxes run with a max
JVM of ~1.5G and are generally OK.  Our reports range from one or
two pages to several hundred.

Under JDK 1.5 and/or Tomcat, I have seen
several posts about memory issues with class loading and the perm gen space,
so if that sounds like your environment, you may want to search the archives for those.

-Lou
~~
LOG-NET, Inc.
The Logistics Network Management System
~~
230 Half Mile Road
Third Floor
Red Bank, NJ 07701
PH: 732-758-6800
FAX: 732-747-7497
http://www.LOG-NET.com
~~
CONFIDENTIAL & PRIVILEGED
Unless otherwise indicated or if obvious from the nature of the content,
the information contained herein is privileged and confidential information/work
product. The communication is intended for the use of the individual or
entity named above.  If the reader of this transmission is not the
intended recipient, you are  hereby notified that any dissemination,
distribution or copying of this communication is strictly prohibited.  If
you have received this communication in error, please notify the sender
immediately by telephone (732-758-6800) or by electronic mail ([EMAIL PROTECTED]),
and destroy any copies, electronic, paper or otherwise, which you may have
of this communication.  Thank you.
~~





Andreas L Delmelle <[EMAIL PROTECTED]> wrote on 08/09/2006 01:07 PM
To: fop-users@xmlgraphics.apache.org
Subject: Re: out of memory error

On Aug 9, 2006, at 14:34, Luis Ferro wrote:

Hi,

> I'm using the latest fop version from svn as of yesterday...
>
> In the attachment i send the files i use to create a PDF with the index and
> one chapter of the book. The index references are all turned off but it
> still gets out of heap...
>
> When trying to assemble the PDF in command line (environment in the
> attachment also)... it always gives the same error of lack of heap memory...

Please check whether your FOs contain multiple fo:page-sequences.
Yes, a memory leak was fixed, but no, it won't matter if you cram a
large number of tables (or one large table) into one single
page-sequence...


Cheers,

Andreas





Re: out of memory error

2006-08-09 Thread Luis Ferro

That sequence should be the "chapter A" ;)

Chapter C has about the double of "A"'s size...

LF

(i will update the file with a new one with less... warnings/errors as soon
as i can - it started as a 0.20.5 system, and still needs some tweaking)
-- 
View this message in context: 
http://www.nabble.com/out-of-memory-error-tf2044079.html#a5738048
Sent from the FOP - Users forum at Nabble.com.





Re: out of memory error

2006-08-09 Thread Jeremias Maerki
Not sure that playing with the page-masters will really help. I don't
think they have an influence here.

I've just committed a helper XSLT [1] which can split Luis' FO file at
page-sequence boundaries. It reveals that one of the page-sequences
alone makes up 18MB out of the 23MB of the full FO file. I guess that is
what FOP chokes on: Just too much data in one page-sequence and FOP
currently cannot free any objects while inside a page-sequence.

[1] http://svn.apache.org/viewvc?rev=430134&view=rev

I'd try to find a way to further split up that large page-sequence. This
should enable FOP to free memory and handle this file with less heap
space.
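A quick way to check how a document is chunked, before reaching for the splitter stylesheet, is simply to count the fo:page-sequence elements. The class below is a throwaway sketch of mine, not part of FOP:

```java
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

/** Counts fo:page-sequence elements: one huge page-sequence is exactly
 *  the case where FOP cannot free any memory until the sequence ends. */
public class PageSequenceCheck {

    static final String FO_NS = "http://www.w3.org/1999/XSL/Format";

    static int countPageSequences(InputSource src) throws Exception {
        final int[] count = {0};
        SAXParserFactory factory = SAXParserFactory.newInstance();
        factory.setNamespaceAware(true);   // needed to match on the FO namespace
        factory.newSAXParser().parse(src, new DefaultHandler() {
            @Override
            public void startElement(String uri, String local,
                                     String qName, Attributes atts) {
                if (FO_NS.equals(uri) && "page-sequence".equals(local)) {
                    count[0]++;
                }
            }
        });
        return count[0];
    }

    public static void main(String[] args) throws Exception {
        InputSource src = new InputSource(args[0]); // path or URL of the .fo file
        System.out.println("fo:page-sequence count: " + countPageSequences(src));
    }
}
```

A count of 1 for a multi-hundred-page document is a strong hint that the stylesheet should emit more page-sequences.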

On 09.08.2006 20:13:43 Andreas L Delmelle wrote:
> On Aug 9, 2006, at 19:27, Luis Ferro wrote:
> 
> Hi,
> 
> > It has LOADS of page sequences... it's a book with 2500 pages where
> > every 3 or 4 pages, the "template" changes from a one-column page to a
> > two-column page...
> >
> > Is there a better way of doing this "swap" of columns?
> 
> Not that I'm aware of... (you're using conditional-page-masters, right?)
> Can you do us a favor and try if using just one simple-page-master  
> makes a difference? Does it also consume a large amount of memory?
> 
> Tinker a bit with the properties on each of the page-masters, see if
> that changes anything... Never mind the output, it's simply to narrow
> down the search area. If the test succeeds with one simple-page-master,
> or with different settings, we'll get a better idea of where
> to start looking.
> 
> Thanks in advance!
> 
> >
> > Right now... with 477 pages, rendering it occupies 1.1Gb of memory... (my
> > machine has a top of 1.5Gb addressable to Java...)...
> >
> > Will try now to simplify everything as best as i can...
> 
> Jeremias recently added a MemoryEater to the trunk with which we can  
> test, using one of your FO fragments and copying it a given number of  
> times.
> Choose one representative page-sequence, and the structure of your  
> conditional-page-master-alternatives, post them in a Bugzilla --so  
> only those people that are interested need to download it-- and we'll  
> have a look.
> 
> > How can i test if there is memory leaks somewhere (i'm a programmer  
> > but i'm
> > very very green in java)?
> 
> There is no easy way, I'm afraid. You can use a profiling tool on  
> one of the sessions --the JDK comes with some profiling facilities,  
> if you're a console-geek ;)-- to have a look at what the reference  
> trees in the heap look like at a certain point in the process, but  
> you'd still need some basic understanding of the process to figure  
> out which active references are totally unnecessary.
> 
> If you're willing to invest time in this, of course you'd be welcome  
> to do so. If you have any questions or remarks, or need help  
> interpreting the results of a profiling session, just direct them to  
> fop-dev, or use Bugzilla to track the issue.
> 
> Thanks again!
> 
> 
> Cheers,
> 
> Andreas


Jeremias Maerki





Re: out of memory error

2006-08-09 Thread Andreas L Delmelle

On Aug 9, 2006, at 19:27, Luis Ferro wrote:

Hi,

It has LOADS of page sequences... it's a book with 2500 pages where
every 3 or 4 pages, the "template" changes from a one-column page to a
two-column page...


Is there a better way of doing this "swap" of columns?


Not that I'm aware of... (you're using conditional-page-masters, right?)
Can you do us a favor and try if using just one simple-page-master  
makes a difference? Does it also consume a large amount of memory?


Tinker a bit with the properties on each of the page-masters, see if
that changes anything... Never mind the output, it's simply to narrow
down the search area. If the test succeeds with one simple-page-master,
or with different settings, we'll get a better idea of where
to start looking.


Thanks in advance!



Right now... with 477 pages, rendering it occupies 1.1Gb of memory... (my
machine has a top of 1.5Gb addressable to Java...)...

Will try now to simplify everything as best as i can...


Jeremias recently added a MemoryEater to the trunk with which we can  
test, using one of your FO fragments and copying it a given number of  
times.
Choose one representative page-sequence, and the structure of your  
conditional-page-master-alternatives, post them in a Bugzilla --so  
only those people that are interested need to download it-- and we'll  
have a look.


How can i test if there is memory leaks somewhere (i'm a programmer  
but i'm

very very green in java)?


There is no easy way, I'm afraid. You can use a profiling tool on  
one of the sessions --the JDK comes with some profiling facilities,  
if you're a console-geek ;)-- to have a look at what the reference  
trees in the heap look like at a certain point in the process, but  
you'd still need some basic understanding of the process to figure  
out which active references are totally unnecessary.


If you're willing to invest time in this, of course you'd be welcome  
to do so. If you have any questions or remarks, or need help  
interpreting the results of a profiling session, just direct them to  
fop-dev, or use Bugzilla to track the issue.


Thanks again!


Cheers,

Andreas





Re: out of memory error

2006-08-09 Thread Luis Ferro

It has LOADS of page sequences... it's a book with 2500 pages where every 3 or
4 pages, the "template" changes from a one-column page to a two-column page...

Is there a better way of doing this "swap" of columns?

Right now... with 477 pages, rendering it occupies 1.1Gb of memory... (my
machine has a top of 1.5Gb addressable to Java...)...

Will try now to simplify everything as best as i can...

How can i test if there are memory leaks somewhere (i'm a programmer but i'm
very very green in java)?

Cheers...
LF
-- 
View this message in context: 
http://www.nabble.com/out-of-memory-error-tf2044079.html#a5730306
Sent from the FOP - Users forum at Nabble.com.





Re: out of memory error

2006-08-09 Thread Andreas L Delmelle

On Aug 9, 2006, at 14:34, Luis Ferro wrote:

Hi,


I'm using the latest fop version from svn as of yesterday...

In the attachment i send the files i use to create a PDF with the index and
one chapter of the book. The index references are all turned off but it
still gets out of heap...

When trying to assemble the PDF in command line (environment in the
attachment also)... it always gives the same error of lack of heap memory...


Please check whether your FOs contain multiple fo:page-sequences.
Yes, a memory leak was fixed, but no, it won't matter if you cram a
large number of tables (or one large table) into one single
page-sequence...



Cheers,

Andreas




Re: out of memory error

2006-08-09 Thread Luis Ferro

A report as such shouldn't pose a problem... 

Edit "fop.bat" and add the "-Xmx1024m" argument before the "-cp" switch, as I
did... it worked like a charm with 0.92beta/trunk...

To build the report, try running fop.bat at a command prompt... (it worked
for me, while through Tomcat it was giving problems, but that is another issue)
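As a sketch, the edited invocation inside fop.bat would look roughly like this (the classpath and main class name here are assumptions; check your own fop.bat for the exact values):

```shell
# Raise the JVM heap ceiling before -cp, as described above.
# -Xmx sets the maximum heap size, -Xms the initial heap size.
java -Xmx1024m -Xms256m -cp "build\fop.jar;lib\*" org.apache.fop.cli.Main book.fo book.pdf
```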

Cheers...
LF





Re: out of memory error

2006-08-09 Thread Luis Ferro

I've added the switch -Xmx1500m in the fop.bat file in both versions, just
before the "-cp" switch.

Now, FOP trunk is crashing with some errors in the .fo file (which I will
correct) and FOP 0.20.5 is stopping somewhere because of this message:

[ERROR]
file:/D:/sites/mapaterapeutico/extranet/makebook/output/makebook_done_with_images_1.fo:3978:33
No meaningful layout in block after many attempts. Infinite loop is assumed.
Processing halted.

Hope this "sheds some light" on the problems...
LF





Re: out of memory error

2006-08-09 Thread Luis Ferro

I'm using the latest FOP version from SVN as of yesterday...

In the attachment I send the files I use to create a PDF with the index and
one chapter of the book. The index references are all turned off, but it
still runs out of heap...

When trying to assemble the PDF on the command line (environment in the
attachment also)... it always gives the same error of lack of heap memory...

I've tried with -XX:+AggressiveHeap as someone suggested, and with
several options to increase the heap... but it still fails after processing
the full .fo file (when it stops echoing errors and, I suppose, starts
crunching)...

(BTW, the problem also happens with FOP 0.20.5)

The machine has 2GB of RAM and is a Pentium 4 HT...

I'm completely lost on how to make this work... :(

Thanks in advance,
Luis Ferro
Portugal
http://www.nabble.com/user-files/232/memory_problem_thrunk_svn_20060807.zip
memory_problem_thrunk_svn_20060807.zip 





Re: out of memory error

2006-08-03 Thread Jeremias Maerki
Leena,

you don't say what FOP version you're using, but would you please get
the latest code (FOP Trunk) from SVN and recheck if you get any
improvements? I've just committed a fix for a memory-leak which might
improve your situation. Please report back on the results.

On 03.08.2006 09:06:34 Madala, Leena (GE Healthcare, consultant) wrote:
> Hello team,
>  
> I'm facing similar issue. I'm getting out of memory error, when trying
> to generate a report.
> I have checked FAQ and tried all the options provided there...
>  
> I tried increasing Java virtual memory to 1024M and also removed page
> citation in the report generated. I also tried with new beta version of
> fop.jar.
> It didn't help.
>  
> Report which is to be generated will generate approximately 70 pages for
> 40 records in database. It works fine for 40 records.
> We have to generate a similar report for 700 records. Report doesn't
> contain any images. It has table for every 2 pages.
> Please help me resolve the issue.
>  
> Thanks for your time
>  
> Thanks & Regards,
> Leena Madala


Jeremias Maerki





out of memory error

2006-08-03 Thread Madala, Leena (GE Healthcare, consultant)

Hello team,

I'm facing a similar issue. I'm getting an out of memory error when trying
to generate a report. I have checked the FAQ and tried all the options
provided there...

I tried increasing the Java virtual memory to 1024M and also removed the
page citation from the generated report. I also tried the new beta version
of fop.jar. It didn't help.

The report to be generated produces approximately 70 pages for 40 records
in the database. It works fine for 40 records. We have to generate a
similar report for 700 records. The report doesn't contain any images. It
has a table every 2 pages.

Please help me resolve the issue.

Thanks for your time

Thanks & Regards,
Leena Madala

Re: out of memory error

2006-07-26 Thread Jeremias Maerki
Yes. I'm pretty sure you won't need to clear the image cache anymore.
BTW, I'd appreciate feedback concerning memory consumption if anyone
runs the latest FOP Trunk with many images. It would be good to know if
I really hit the right nail.

On 26.07.2006 18:27:39 Dominic Brügger wrote:
> Is the problem with the image cache that I described a few weeks ago  
> now solved too?
> 
> 
> On 26.07.2006, at 15:07, Jeremias Maerki wrote:
> 
> > Change was last week.
> >
> > On 26.07.2006 14:38:39 Rick Roen wrote:
> >> Jeremias - I am running from trunk code from about a month ago.   
> >> Have you
> >> done some memory adjustments since then?


Jeremias Maerki





Re: out of memory error

2006-07-26 Thread Dominic Brügger
Is the problem with the image cache that I described a few weeks ago now solved too?

On 26.07.2006, at 15:07, Jeremias Maerki wrote:

> Change was last week.
>
> On 26.07.2006 14:38:39 Rick Roen wrote:
>> Jeremias - I am running from trunk code from about a month ago.  Have you
>> done some memory adjustments since then?
>
> Jeremias Maerki

Re: out of memory error

2006-07-26 Thread Jeremias Maerki
Change was last week.

On 26.07.2006 14:38:39 Rick Roen wrote:
> Jeremias - I am running from trunk code from about a month ago.  Have you
> done some memory adjustments since then?

Jeremias Maerki





Re: out of memory error

2006-07-26 Thread Chris Bowditch

Rick Roen wrote:


Jeremias - I am running from trunk code from about a month ago.  Have you
done some memory adjustments since then?  I remember I used to have extra
large resulting PDF files when I had an image that did not have a color
table (or something ?), but the trunk code brought a 225KB file back down to
25KB.


Jeremias has made some memory optimisations to the way images are 
handled within FOP in the last week.


Chris






RE: out of memory error

2006-07-26 Thread Rick Roen
Thanks everyone for your responses.

The size does not appear large to me either, 25+ pages, but nevertheless it
does run out of memory. I have only three images (small) and several tables
with embedded tables, plus the page-number-citation for "page x of y".

FYI in case it helps someone else: the -XX:+AggressiveHeap switch seems to do
what the combination of -Xmx and -Xms does, but automatically.  The only
documentation I could find at Sun suggested it was only for multiple-processor
machines with lots of RAM; however, it seems to work for me with only one
processor.

The advantage is that it allocates the heap dynamically according to the
available RAM.  In my case, the client PCs have from 320MB to 1+GB of RAM
and they are all running locally from a batch file on the server.  Although
I could set -Xmx and -Xms to the lowest common denominator, the -XX switch
seems to take care of this on whatever machine it is running on. 

Jeremias - I am running from trunk code from about a month ago.  Have you
done some memory adjustments since then?  I remember I used to have extra
large resulting PDF files when I had an image that did not have a color
table (or something ?), but the trunk code brought a 225KB file back down to
25KB.

Rick

-Original Message-
From: Jeremias Maerki [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, July 26, 2006 2:00 AM
To: fop-users@xmlgraphics.apache.org
Subject: Re: out of memory error

20 to 25 pages does not really count as "extra large". Does your
document contain a few images? If yes and assuming you use FOP 0.92beta
or later, you might want to try the latest code from FOP Trunk (from
Subversion) where I've fixed two memory leaks which dramatically
improves memory usage of FOP when using bitmap images.

I've never used the AggressiveHeap option before. I can't even tell from
the descriptions I found what it really does. If it helps, cool, but
sometimes these measures just cover an underlying problem that should be
fixed at some point.






RE: out of memory error

2006-07-26 Thread Bert Strobbe
We use the -Xmx1024m option, which allows us to generate a PDF of 1000
pages, with a table of contents, bookmarks, "page x of y", and a
lot of other internal links. It takes about 20 min on a Linux server to
generate this PDF with FOP 0.92 (we generate an internal XML file, which
is converted with a self-written XSL to FO, so this conversion is
included in the time). Without the option, we get the same memory
problem. With FOP 0.20.5, we never succeeded in generating this kind of
PDF, with or without the -Xmx option.

Bert


-Original Message-
From: Pascal Sancho [mailto:[EMAIL PROTECTED] 
Sent: Wednesday, July 26, 2006 9:03 AM
To: fop-users@xmlgraphics.apache.org; [EMAIL PROTECTED]
Subject: RE: out of memory error

> -Original Message-
> From: Rick Roen [mailto:[EMAIL PROTECTED] 
> 
> Update...
> 
> I was googling in the wrong place.  When I looked in Java I 
> found the switch "-XX:+AggressiveHeap" which fixed the problem.
> 
> Is this the best switch to use, or is there something better?

Hi Rick,
You probably omitted the 'm' from the '-Xmx' option.
-Xmx is the usual way to increase the maximum memory for the JVM.
You can use -Xms in conjunction to define the initial memory.

See [1] for the full list of Java options.

Pascal

[1]
http://java.sun.com/j2se/1.4.2/docs/tooldocs/windows/java.html#options




RE: out of memory error

2006-07-26 Thread Pascal Sancho
> -Original Message-
> From: Rick Roen [mailto:[EMAIL PROTECTED] 
> 
> Update...
> 
> I was googling in the wrong place.  When I looked in Java I 
> found the switch "-XX:+AggressiveHeap" which fixed the problem.
> 
> Is this the best switch to use, or is there something better?

Hi Rick,
You probably omitted the 'm' from the '-Xmx' option.
-Xmx is the usual way to increase the maximum memory for the JVM.
You can use -Xms in conjunction to define the initial memory.

See [1] for the full list of Java options.

Pascal

[1]
http://java.sun.com/j2se/1.4.2/docs/tooldocs/windows/java.html#options




Re: out of memory error

2006-07-26 Thread Jeremias Maerki
20 to 25 pages does not really count as "extra large". Does your
document contain a few images? If yes and assuming you use FOP 0.92beta
or later, you might want to try the latest code from FOP Trunk (from
Subversion) where I've fixed two memory leaks which dramatically
improves memory usage of FOP when using bitmap images.

I've never used the AggressiveHeap option before. I can't even tell from
the descriptions I found what it really does. If it helps, cool, but
sometimes these measures just cover an underlying problem that should be
fixed at some point.


On 25.07.2006 23:58:51 Rick Roen wrote:
> Update.
> 
>  
> 
> I was googling in the wrong place.  When I looked in Java I found the switch
> "-XX:+AggressiveHeap" which fixed the problem.
> 
>  
> 
> Is this the best switch to use, or is there something better?
> 
>  
> 
> Rick
> 
>  
> 
> 
> 
> From: Rick Roen [mailto:[EMAIL PROTECTED] 
> Sent: Tuesday, July 25, 2006 4:12 PM
> To: fop-users@xmlgraphics.apache.org
> Subject: out of memory error
> 
>  
> 
> I'm running FOP from the current build of about a month ago on XP Pro SP2
> 1.5GB RAM
> 
>  
> 
> I have a command line routine that runs sales documents (packing list,
> invoice etc.) from xml  .-> xslt (using Saxon 8 )->pdf.
> 
>  
> 
> This works with no problem except when I have an extra large document (maybe
> 20 or 25 pages, I'm not really sure since it does not work) when I get an
> error: 
> 
>  
> 
> "Exception in thread "main" java.lang.OutOfMemoryError: Java heap space"
> 
>  
> 
> I am using "page-number-citation" at the end of the document to print "page
> x of y" at the bottom of each page.  I seem to recall that this might force
> the entire document to be created in memory.  Does anyone know if this could
> be the problem, or is there some other way to reduce the memory consumption?
> 
>  
> 
> Thanks,
> 
>  
> 
> Rick
> 



Jeremias Maerki





RE: out of memory error

2006-07-25 Thread Rick Roen








Update...

I was googling in the wrong place.  When I looked in Java I found the
switch "-XX:+AggressiveHeap" which fixed the problem.

Is this the best switch to use, or is there something better?

Rick

-Original Message-
From: Rick Roen [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, July 25, 2006 4:12 PM
To: fop-users@xmlgraphics.apache.org
Subject: out of memory error

I'm running FOP from the current build of about a month ago on XP Pro SP2,
1.5GB RAM.

I have a command line routine that runs sales documents (packing list,
invoice etc.) from XML -> XSLT (using Saxon 8) -> PDF.

This works with no problem except when I have an extra large document
(maybe 20 or 25 pages, I'm not really sure since it does not work), when I
get an error: 

"Exception in thread "main" java.lang.OutOfMemoryError: Java heap space"

I am using "page-number-citation" at the end of the document to print "page
x of y" at the bottom of each page.  I seem to recall that this might force
the entire document to be created in memory.  Does anyone know if this
could be the problem, or is there some other way to reduce the memory
consumption?

Thanks,

Rick








out of memory error

2006-07-25 Thread Rick Roen








I'm running FOP from the current build of about a month ago on XP Pro SP2,
1.5GB RAM.

I have a command line routine that runs sales documents (packing list,
invoice etc.) from XML -> XSLT (using Saxon 8) -> PDF.

This works with no problem except when I have an extra large document
(maybe 20 or 25 pages, I'm not really sure since it does not work), when I
get an error: 

"Exception in thread "main" java.lang.OutOfMemoryError: Java heap space"

I am using "page-number-citation" at the end of the document to print "page
x of y" at the bottom of each page.  I seem to recall that this might force
the entire document to be created in memory.  Does anyone know if this
could be the problem, or is there some other way to reduce the memory
consumption?

 

Thanks,

 

Rick
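For reference, the "page x of y" pattern Rick describes is typically built with a forward reference like the sketch below; the id name is hypothetical. Because the citation points at an id near the end of the document, the formatter cannot resolve it until the whole document has been laid out, which is why it tends to keep everything in memory:

```xml
<fo:page-sequence master-reference="body-page"
                  xmlns:fo="http://www.w3.org/1999/XSL/Format">
  <fo:static-content flow-name="xsl-region-after">
    <fo:block text-align="center">
      Page <fo:page-number/> of <fo:page-number-citation ref-id="last-page"/>
    </fo:block>
  </fo:static-content>
  <fo:flow flow-name="xsl-region-body">
    <fo:block>Document content goes here...</fo:block>
    <!-- Empty block at the very end carries the id the citation resolves to. -->
    <fo:block id="last-page"/>
  </fo:flow>
</fo:page-sequence>
```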