On Thu, 19 Jun 2008, Singh, Saurabh (IT) wrote:
We have a large amount of data fetched from the database: around 10000 rows, with 33 columns per row. Also, we do not want to increase the heap size of the JVM.
Your two options are:
* increase the JVM heap size
* use a less memory-hungry format, e.g. CSV
Can you suggest something along the lines of writing the rows incrementally into the Excel sheet? I mean, say, after every 1000 records I write, I close the output stream object.
Alas, the Excel file format doesn't work like that: you need to keep going back and adding/changing earlier records as you write the later ones, so the whole workbook has to be held in memory until the end. The Excel file format was also never really designed to hold the sorts of data volumes you're talking about, which is why you're hitting issues - it's just not optimised for it.
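If CSV is acceptable to whoever consumes the file, the rows really can be streamed out one at a time with flat memory use. A minimal sketch (the file name, row/column counts, and cell values here are placeholders for illustration, not from the original post):

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class CsvExport {
    public static void main(String[] args) throws IOException {
        // Stream rows straight to disk instead of building a workbook in
        // memory; heap usage stays flat no matter how many rows there are.
        try (BufferedWriter out = new BufferedWriter(new FileWriter("export.csv"))) {
            for (int row = 0; row < 10000; row++) {
                StringBuilder line = new StringBuilder();
                for (int col = 0; col < 33; col++) {
                    if (col > 0) line.append(',');
                    // Placeholder cell value; in real code this would come
                    // from the database result set.
                    line.append('r').append(row).append('c').append(col);
                }
                out.write(line.toString());
                out.newLine();
            }
        }
    }
}
```

Excel opens CSV files directly, so for plain tabular data this often sidesteps the problem entirely.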
I'd suggest you just whack up the JVM heap size - the default is really rather small.
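For the record, the maximum heap is set with the `-Xmx` flag; the value and the class name below are illustrative only:

```shell
# Raise the maximum heap to 512 MB for this run; pick a value
# that suits your data volume. "ExportApp" is a made-up class name.
java -Xmx512m -cp . ExportApp
```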
Nick
---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
