I may have missed the point of this email, but I sort of assumed the following.

Assumptions:
1. Files are very large and machine-generated.
2. The source of the files is a third party.
3. You don't want humans to ever look at the files.
4. Breaking the files up arbitrarily (which you would have to do if 
   your processing is memory bound) is not an option. 
5. All your machines have different memory profiles and hence can handle
   different file sizes in Xalan.

Solutions:

1. Get the user to break up the large file into smaller pieces.
   But if you are going to pre-parse a large file anyway, you might as well
   not use Xalan and just do the transforms yourself in SAX (see the
   sketch after this list).

2. Change Xalan to handle arbitrary-length files (I'm not suggesting
   this or saying it's easy; it's just a stated option).

Wishing everyone a Merry Xmas and a very happy new year etc... *8-)

Greg

-----Original Message-----
From: Andrew Welch [mailto:[EMAIL PROTECTED]
Sent: Tuesday, 16 December 2003 22:28
To: [EMAIL PROTECTED]
Subject: RE: Memory Consumption for Large XML Transformation (50MB -
1GB)



> I'm hoping we'll start doing some of that this year, 
> actually... I'd hoped to see some of it last year, but XDM 
> has been taking much more effort than anticipated.

There's not long left of the year now... or are you a few weeks ahead of
the rest of us? :)

I was just wondering how it was possible that a 50MB XML file could not
be broken up into smaller pieces.  If it can't, then it surely is poor
design - if the 1GB file can't be broken up, then someone should give up
somewhere.
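
For what it's worth, a minimal sketch of what that breaking up could look
like, assuming the file is a flat run of repeated <record> elements under
a single root. The element names, output file names, and chunk size are
invented for illustration; each chunk gets its own wrapper root so it is
a well-formed document that Xalan can then process on its own:

import java.io.FileOutputStream;
import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.sax.SAXTransformerFactory;
import javax.xml.transform.sax.TransformerHandler;
import javax.xml.transform.stream.StreamResult;
import org.xml.sax.Attributes;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.AttributesImpl;
import org.xml.sax.helpers.DefaultHandler;

public class Splitter extends DefaultHandler {
    private static final int RECORDS_PER_CHUNK = 10000; // assumed chunk size

    private final SAXTransformerFactory stf =
            (SAXTransformerFactory) SAXTransformerFactory.newInstance();
    private TransformerHandler out;  // identity serializer for current chunk
    private int depth = 0, records = 0, chunk = 0;

    private void openChunk() throws SAXException {
        try {
            out = stf.newTransformerHandler(); // identity transform
            out.setResult(new StreamResult(
                    new FileOutputStream("chunk" + chunk++ + ".xml")));
        } catch (Exception e) {
            throw new SAXException(e);
        }
        out.startDocument();
        out.startElement("", "records", "records", new AttributesImpl());
    }

    private void closeChunk() throws SAXException {
        out.endElement("", "records", "records");
        out.endDocument();
        out = null;
        records = 0;
    }

    public void startElement(String uri, String local, String qName,
                             Attributes atts) throws SAXException {
        if (depth == 1 && out == null) openChunk(); // new record, new chunk
        if (depth >= 1) out.startElement(uri, local, qName, atts);
        depth++;
    }

    public void endElement(String uri, String local, String qName)
            throws SAXException {
        depth--;
        if (depth >= 1) out.endElement(uri, local, qName);
        if (depth == 1 && ++records == RECORDS_PER_CHUNK) closeChunk();
    }

    public void characters(char[] ch, int start, int length)
            throws SAXException {
        if (depth > 1) out.characters(ch, start, length);
    }

    public void endDocument() throws SAXException {
        if (out != null) closeChunk(); // flush the final partial chunk
    }

    public static void main(String[] args) throws Exception {
        SAXParserFactory f = SAXParserFactory.newInstance();
        f.setNamespaceAware(true);
        f.newSAXParser().parse(args[0], new Splitter());
    }
}

Each chunk can then be fed to Xalan separately, on whichever machine has
enough memory for it.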


