From: Matt Robertson [mailto:websitema...@gmail.com]
Sent: Saturday, February 19, 2011 8:06 PM
To: cf-talk
Subject: Re: Big XML files processing Really s-l-o-w. Solution?

Here's the update:

Jochem's StAX-based solution worked so well -- and it allowed me to
re-use existing code since I was already processing xml -- that I
didn't bother to consider testing the xml2csv utility.

Original code processing a 45mb file yielded an insert time of 90
seconds per record...

Nice...

Nice report! I'll try and remember that the next time I have to parse huge
XML files. :-)

~|
Order the Adobe Coldfusion Anthology now!
http://www.amazon.com/Adobe-Coldfusion-Anthology/dp/1430272155/?tag=houseoffusion
Archive:
On Thu, Feb 17, 2011 at 8:56 AM, Rick Faircloth wrote:
>
> Good to know!
Indeed. Thanks to Marco and Mark. I plan on trying out both this and
Jochem's solution starting maybe as soon as tomorrow and certainly
through the long U.S. holiday weekend. Maybe try both and see which
is faster. I'll...
-Original Message-
From: Marco Antonio C. Santos [mailto:marcoacsan...@gmail.com]
Sent: Thursday, February 17, 2011 9:54 AM
To: cf-talk
Subject: Re: Big XML files processing Really s-l-o-w. Solution?

In our company I found a solution to parse big XML files in ColdFusion
application...
...'re working on pans out.
-Original Message-
From: Matt Robertson [mailto:websitema...@gmail.com]
Sent: Thursday, February 17, 2011 9:42 AM
To: cf-talk
Subject: Re: Big XML files processing Really s-l-o-w. Solution?
On Wed, Feb 16, 2011 at 8:08 PM, Rick Faircloth wrote:
> Perhaps this will be of some help.
Thanks Rick, Interesting stuff. I don't work with xml so much so a
lot of this is new, and I hate new :-). I haven't been a full-time
programmer for a few years now so I haven't kept up with everything...

...this is just the result of tinkering around with
what I've been finding and attempting to get some usable output.
Perhaps this will be of some help.
Rick
-Original Message-
From: Matt Robertson [mailto:websitema...@gmail.com]
Sent: Wednesday, February 16, 2011 10:18 PM
To: cf-talk
On Wed, Feb 16, 2011 at 3:49 PM, Jochem van Dieten wrote:
> Would it help if you split the file into individual records before
> processing them?
I think it would help immensely. I'd prefer to fight my way
thru this in CF if at all possible and things are just complex enough
that I may n...
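Jochem's suggestion -- splitting the big export into individual records before processing -- can be sketched at the Java layer CF runs on. This is a minimal line-oriented splitter, under the assumptions that each record element starts and ends on its own line and that the element name (here the placeholder "record") is not a prefix of any other element in the feed:

```java
import java.io.BufferedReader;
import java.util.ArrayList;
import java.util.List;

public class RecordSplitter {
    // Collect each <record>...</record> chunk as its own string, scanning
    // line by line so the whole document is never parsed as one DOM.
    // Assumes: one record open/close tag per line; "record" is a placeholder
    // name and must not be a prefix of other element names in the feed.
    public static List<String> split(BufferedReader in) throws Exception {
        List<String> records = new ArrayList<>();
        StringBuilder current = null;
        String line;
        while ((line = in.readLine()) != null) {
            if (line.contains("<record")) {
                current = new StringBuilder();   // start of a new record
            }
            if (current != null) {
                current.append(line).append('\n');
            }
            if (line.contains("</record>")) {
                records.add(current.toString()); // record complete
                current = null;
            }
        }
        return records;
    }
}
```

Each chunk can then be handed to the existing per-record XML parsing code, which keeps memory flat regardless of file size.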
On Mon, Feb 14, 2011 at 11:44 PM, Matt Robertson wrote:
> I am tasked with doing an import on a file whose size can essentially
> be unlimited. We've been able to handle in the 10-15mb range but it
> recently ballooned to 100 mb, and it's going to get larger. Processing
> time seems to be about 6...
> -Original Message-
> From: Matt Robertson [mailto:websitema...@gmail.com]
> Sent: Wednesday, February 16, 2011 12:22 PM
> To: cf-talk
> Subject: Re: Big XML files processing Really s-l-o-w. Solution?
>
> ...using
> MySQL's load data infile, the time was reduced to less than a second.
>
> Check out the page above for details on usage and syntax.
>
> hth,
>
> Rick
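Rick's LOAD DATA INFILE approach can be driven from the JDBC layer that CF sits on. A minimal sketch, assuming a plain comma-separated export; the file path, table name, and delimiters below are placeholders, and MySQL Connector/J needs allowLoadLocalInfile enabled on the connection URL for the LOCAL variant:

```java
import java.sql.Connection;
import java.sql.Statement;

public class LoadDataSketch {
    // Build the LOAD DATA statement. '/tmp/records.csv' and 'records' are
    // placeholder names; the delimiters assume a simple double-quoted CSV.
    public static String loadDataSql(String csvPath, String table) {
        return "LOAD DATA LOCAL INFILE '" + csvPath + "'"
             + " INTO TABLE " + table
             + " FIELDS TERMINATED BY ',' ENCLOSED BY '\"'"
             + " LINES TERMINATED BY '\\n'";
    }

    // Execute it over an open JDBC connection. LOCAL means the file is read
    // on the client side; drop LOCAL if the file lives on the MySQL server.
    public static void loadCsv(Connection conn, String csvPath, String table)
            throws Exception {
        try (Statement st = conn.createStatement()) {
            st.execute(loadDataSql(csvPath, table));
        }
    }
}
```

The win comes from the server ingesting the whole file in one pass instead of a round trip per INSERT, which matches the "90 seconds per record down to under a second" report.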
>
> -----Original Message-----
> From: Matt Robertson [mailto:websitema...@gmail.com]
> Sent: Wednesday, February 16, 2011 12:22 PM
> To: cf-talk
> Subject: Re: Big XML files processing Really s-l-o-w. Solution?
Hey everybody, we're in a bind here and I pitched the idea to the
client of bringing in a ringer. Is there someone out there who is
interested in building a direct-to-SQL Server or direct-to-mySQL-based
solution?
Solution would have to, on a scheduled basis, grab a file in a
location on the server...
Very interesting stuff, guys. Very much appreciated. Should have
come here before I did the project as opposed to afterward. You'd
think I'd know better by now.
--
--m@Robertson--
Janitor, The Robertson Team
mysecretbase.com
~~~
That is probably the solution I was thinking of too, I expect Ben or someone
just wrapped it up into a UDF or CFC or posted it on one of the
aforementioned sites.
Russ
On Tue, Feb 15, 2011 at 12:32 AM, Michael Grant wrote:
>
> That actually looks like a pretty nice solution if bulkinsert isn't an
> option.
On Mon, Feb 14, 2011 at 7:24 PM, Alan Rother wrote:
>
> When in doubt, ask Ben Nadel (or check his blog)... Words to live by
Processing the XML with Java rather than CFML will speed things up.
Check riaforge.org or cflib.org and the adobe exchange; somewhere there is
an XML parser that does just this and may give you the boost you want.
Russ
On Tue, Feb 15, 2011 at 12:15 AM, Casey Dougall <
ca...@uberwebsitesolutions.c
When in doubt, ask Ben Nadel (or check his blog)... Words to live by

I had the same issue, huge variable XML file, lots of variations, had a hell
of a time making it work in SQL Server due to the complexity.

Used this, works amazingly well, all things considered, executes
very quickly:

http://www.
On Mon, Feb 14, 2011 at 7:07 PM, Matt Robertson wrote:
>
> Client is running mySQL, but a SQL Server could probably be arranged
> if we have to. Kind of a painful investment for them, though.
>
>
Well seems there is support for this in mysql as well but examples use
something similar to BulkInsert...
>
> > you can make use of BulkInsert
+1. I love love love bulkinsert.
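For the SQL Server route being endorsed here, a BULK INSERT can be issued the same way over JDBC. A sketch with placeholder table and file names and CSV-style delimiters; note that unlike MySQL's LOCAL variant, the path is resolved on the SQL Server machine, not the client:

```java
import java.sql.Connection;
import java.sql.Statement;

public class BulkInsertSketch {
    // Build a SQL Server BULK INSERT statement. 'records' and the file path
    // are placeholders; delimiters assume a simple comma-separated export.
    public static String bulkInsertSql(String table, String csvPath) {
        return "BULK INSERT " + table
             + " FROM '" + csvPath + "'"
             + " WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n')";
    }

    // Execute over an open JDBC connection; the server account needs
    // read access to the file path on the database host.
    public static void load(Connection conn, String table, String csvPath)
            throws Exception {
        try (Statement st = conn.createStatement()) {
            st.execute(bulkInsertSql(table, csvPath));
        }
    }
}
```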
On Mon, Feb 14, 2011 at 3:27 PM, Mark Mandel wrote:
> Either that, or start looking at something like StaX -
> http://stax.codehaus.org/Home
thx for the tip on Stax but it frankly looks a bit out of my league
for this project.
--
--m@Robertson--
Janitor, The Robertson Team
mysecretbase.com
~~~
Client is running mySQL, but a SQL Server could probably be arranged
if we have to. Kind of a painful investment for them, though.
I was kind of hoping you guys would find a bonehead mistake that would
magically clear this up. I'm out of CF-based ideas.
Speaking of which, I tried shortening th...
On Mon, Feb 14, 2011 at 6:27 PM, Mark Mandel wrote:
>
> Either that, or start looking at something like StaX -
> http://stax.codehaus.org/Home
>
> Pulling that large a file in memory is going to suck a lot, no matter
> what you do.
>
> Mark
>
>
yeah, I don't know, still seems better equipped for...
Either that, or start looking at something like StaX -
http://stax.codehaus.org/Home
Pulling that large a file in memory is going to suck a lot, no matter
what you do.
Mark
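Mark's point is the heart of the StAX approach: a pull parser only ever holds the current event in memory, so file size largely stops mattering. A minimal Java sketch that counts records while streaming; the element name "record" is a placeholder for whatever the real per-row element is called:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.Reader;

public class StaxCount {
    // Stream the document with StAX, visiting one event at a time instead
    // of building a DOM. "record" is a placeholder element name.
    public static int countRecords(Reader in) throws Exception {
        XMLStreamReader r =
            XMLInputFactory.newInstance().createXMLStreamReader(in);
        int count = 0;
        while (r.hasNext()) {
            // r.next() advances to the next event and returns its type.
            if (r.next() == XMLStreamConstants.START_ELEMENT
                    && "record".equals(r.getLocalName())) {
                count++;
            }
        }
        r.close();
        return count;
    }
}
```

In place of counting, the START_ELEMENT branch is where a real import would read the record's attributes and child text and fire its insert, which is presumably how the "re-use existing code" result above was achieved.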
On Tue, Feb 15, 2011 at 10:24 AM, Casey Dougall wrote:
>
> On Mon, Feb 14, 2011 at 5:44 PM, Matt Robertson wrote:
On Mon, Feb 14, 2011 at 5:44 PM, Matt Robertson wrote:
>
> My question is how do I reduce processing time... is there anything I
> could be doing better?
>
> I am tasked with doing an import on a file whose size can essentially
> be unlimited. We've been able to handle in the 10-15mb range but it...
My question is how do I reduce processing time... is there anything I
could be doing better?
I am tasked with doing an import on a file whose size can essentially
be unlimited. We've been able to handle in the 10-15mb range but it
recently ballooned to 100 mb, and it's going to get larger. Processing
time seems to be about 6...