Nice...

-----Original Message-----
From: Matt Robertson [mailto:websitema...@gmail.com] 
Sent: Saturday, February 19, 2011 9:06 PM
To: cf-talk
Subject: Re: Big XML files processing Really s-l-o-w. Solution?


Here's the update:

Jochem's StAX-based solution worked so well -- and it let me re-use
existing code, since I was already processing XML -- that I never
bothered testing the xml2csv utility.
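For anyone who wants the gist of the technique, here's a rough sketch
of the StAX splitting idea in CFML. This is NOT Jochem's actual
xmlSplitter.cfc -- the "record" element name and the file paths are
placeholders, so adjust them to your own document structure:

<cfscript>
    // Sketch only: stream the big file with StAX instead of parsing
    // it into one giant DOM, and write each <record> to its own file.
    inputFactory  = createObject("java", "javax.xml.stream.XMLInputFactory").newInstance();
    outputFactory = createObject("java", "javax.xml.stream.XMLOutputFactory").newInstance();
    fileStream    = createObject("java", "java.io.FileInputStream").init("c:\data\bigfile.xml");
    eventReader   = inputFactory.createXMLEventReader(fileStream);
    fileCount     = 0;

    while (eventReader.hasNext()) {
        event = eventReader.nextEvent();

        // each <record> start tag begins a new discrete file
        if (event.isStartElement()
                and event.asStartElement().getName().getLocalPart() eq "record") {
            fileCount    = fileCount + 1;
            stringWriter = createObject("java", "java.io.StringWriter").init();
            eventWriter  = outputFactory.createXMLEventWriter(stringWriter);
            eventWriter.add(event);

            // copy events straight through until the matching </record>
            depth = 1;
            while (depth gt 0) {
                event = eventReader.nextEvent();
                if (event.isStartElement()) depth = depth + 1;
                if (event.isEndElement())   depth = depth - 1;
                eventWriter.add(event);
            }
            eventWriter.close();
            fileWrite("c:\data\split\record#fileCount#.xml", stringWriter.toString());
        }
    }
    eventReader.close();
    fileStream.close();
</cfscript>

Because nothing bigger than one record is ever held in memory, the
split runs in seconds no matter how large the source file is.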

The original code, processing a 45 MB file, yielded an insert time of
90 seconds per record, for a projected total processing time of 66
hours.

Using code that incorporated xmlSplitter.cfc, the routine created 2682
discrete XML files in approximately 45 seconds.  From there, the
insert loop did its work, pulling out the XML and pouring it into
roughly 100 db table fields at ... 20 records per second.
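In case it helps anyone, the insert loop boils down to something like
this. The datasource, paths, element names and column names are made
up for illustration; the real query maps roughly 100 columns the same
way:

<cfset fileList = directoryList("c:\data\split", false, "path", "*.xml")>
<cfloop array="#fileList#" index="thisFile">
    <!--- each split file is small, so xmlParse is cheap here --->
    <cfset rec = xmlParse(fileRead(thisFile)).xmlRoot>
    <cfquery datasource="myDSN">
        INSERT INTO import_record ( first_name, last_name )
        VALUES (
            <cfqueryparam value="#rec.firstName.xmlText#" cfsqltype="cf_sql_varchar">,
            <cfqueryparam value="#rec.lastName.xmlText#" cfsqltype="cf_sql_varchar">
        )
    </cfquery>
    <!--- delete each split file once its row is safely in --->
    <cfset fileDelete(thisFile)>
</cfloop>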

Total time to read in the 45 MB XML file, create the discrete files,
read them, insert them into the db, and delete them (one at a time as
I went along in the loop) was 192,250 ms.

A bit more than three minutes.

That's an acceptable level of improvement.

I owe you one, Jochem.

-- 
--m@Robertson--
Janitor, The Robertson Team
mysecretbase.com


