>Sounds like it may just be trying to load the entire 187 MB into memory 

No. My initial approach was to try to load the entire file; obviously, that 
would consume a lot of memory and cause problems. Instead, I'm using 
<cfloop file=""> to load one line of XML into memory at a time for 
processing (thankfully, each child element of the parent sits on its own 
line). 
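
Roughly the pattern, as a minimal sketch (the path and element name here are 
just placeholders for illustration): 

<!--- Read the XML one line at a time; only the current line is in memory. --->
<cfset xmlPath = "C:\data\bigfeed.xml">

<cfloop file="#xmlPath#" index="line">
    <!--- Each child element lives on its own line, so a single line
          can be handled on its own. --->
    <cfif FindNoCase("<record", line)>
        <cfset node = XmlParse(line)>
        <!--- process the node here --->
    </cfif>
</cfloop>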

Reading http://forums.adobe.com/thread/43968 makes it seem as if the 
GCOverheadLimit error is thrown because the Java garbage collector is taking 
too long to reclaim memory; that is consistent with the behavior I have seen. 
If I queue up multiple files to be processed simultaneously (with parallel 
threads), the smaller files finish fine, but output for the large file slows 
to a crawl the closer jrun.exe gets to the maximum heap size. Finally the 
GCOverheadLimit error is thrown, the <cfthread action="join"> runs, and I see 
the dumped error. 
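
For reference, the threading pattern is along these lines (variable and thread 
names are made up); the error only surfaces after the join: 

<!--- Spawn one thread per file, then block on the join. --->
<cfset threadNames = []>

<cfloop array="#fileList#" index="currentFile">
    <cfset tName = "proc_" & Hash(currentFile)>
    <cfset ArrayAppend(threadNames, tName)>

    <cfthread action="run" name="#tName#" theFile="#currentFile#">
        <!--- Each thread reads its own file one line at a time. --->
        <cfloop file="#attributes.theFile#" index="line">
            <!--- process line --->
        </cfloop>
    </cfthread>
</cfloop>

<!--- Wait for every thread to finish (or error out). --->
<cfthread action="join" name="#ArrayToList(threadNames)#" />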

It's as if the garbage collector can't effectively reclaim memory allocated 
inside the threaded processing, or at least can't release it until all the 
threads are joined?

Belch. I had so much hope for the parallel processing, but it seems to be 
obscuring errors and creating more problems than a straightforward, linear 
pass over the files would. 
