You can provide an upload using a CF page, rename the file to something fixed, or copy it. Or, if you want to be very clever, script a DTS task which checks a folder and figures out which files it hasn't imported yet. The scripting is in this bizarre VBScript stuff (not quite VB, not quite JavaScript), but it's not impossible to use, just close to it.
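The folder-checking idea above can be sketched roughly as follows. This is Python rather than the VBScript an ActiveX Script task would actually use, and the "one imported filename per line" log file is an assumed convention, not something DTS provides:

```python
import os

def find_unimported(folder, imported_log):
    """Return CSV files in `folder` not yet listed in the log file.

    `imported_log` is a plain-text file with one imported filename per
    line (an assumed bookkeeping convention; inside a real DTS package
    this logic would live in a VBScript ActiveX Script task).
    """
    if os.path.exists(imported_log):
        with open(imported_log) as f:
            seen = {line.strip() for line in f if line.strip()}
    else:
        seen = set()
    return sorted(f for f in os.listdir(folder)
                  if f.lower().endswith(".csv") and f not in seen)

def mark_imported(filename, imported_log):
    """Append a filename to the log once its import has succeeded."""
    with open(imported_log, "a") as f:
        f.write(filename + "\n")
```

Running `find_unimported` at the top of each package run and `mark_imported` at the end gives you the "which files haven't I seen yet" behaviour without caring what the client named the upload.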
 
You can then run the DTS package using the command-line utility DTSRUN via CFEXECUTE (thinking about it, you could pass the filename in as a parameter on the command line).
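A rough sketch of what that CFEXECUTE call would shell out to, in Python for illustration. The server name, package name, and the `FileName` global variable are made-up placeholders; DTSRUN's `/A name:typeid=value` switch (typeid 8 = string) is how a value gets passed into the package:

```python
import subprocess

def build_dtsrun_args(server, package, csv_path):
    """Build a DTSRUN command line, passing the uploaded filename in as
    a DTS global variable. All names here are illustrative placeholders."""
    return [
        "DTSRUN",
        "/S", server,                       # SQL Server instance
        "/E",                               # trusted (Windows) authentication
        "/N", package,                      # saved package name
        "/A", "FileName:8=%s" % csv_path,   # global variable, typeid 8 = string
    ]

def run_dts(server, package, csv_path):
    """Roughly what CFEXECUTE would do: shell out and wait for the exit code."""
    return subprocess.call(build_dtsrun_args(server, package, csv_path))
```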
 
Anything that involves importing more than a couple of hundred lines of data makes for a big, potentially slow CF page to run.
 
DTS will always be vastly faster than anything else you can write. The stuff we import hits a whole sequence of triggers to prevent duplication and also to consolidate the data into aggregate tables. We also store each data line in a set of secondary tables so we can roll it back out and/or back in again if there's a problem. So 25 mins to import a little under 150,000 lines (twice) is, well, pretty quick.
 
The problem will be figuring out when the DTS package has finished.
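One workaround for the "has it finished?" problem is to have the package write a flag file as its very last step and poll for it from CF. The flag-file convention is an assumption on my part (DTSRUN itself only gives you an exit code when it returns); sketched in Python:

```python
import os
import time

def wait_for_flag(flag_path, timeout_s=1800, poll_s=5):
    """Poll for a 'done' flag file that the DTS package is assumed to
    write as its final step. Returns True if the flag appeared before
    the timeout, False otherwise."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if os.path.exists(flag_path):
            return True
        time.sleep(poll_s)
    return False
```

The same idea works with a status row in a control table instead of a file, which also survives the SQL box and the web box being different machines.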
 
The only other way is to do a SQL INSERT ... SELECT and use a dynamic data source to link to your uploaded CSV file. At least that way you are using the RDBMS to process the data. Could be a slow query though; CF might think it's timed out when it just hasn't finished.
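One common way to do that on SQL Server 2000 is OPENROWSET with the Jet text driver, which queries the CSV in place. The table and path names below are placeholders, ad-hoc OPENROWSET access has to be enabled on the server, and a schema.ini in the upload folder controls column typing; here the T-SQL is just composed as a string in Python:

```python
def build_csv_insert(target_table, folder, csv_name):
    """Compose an INSERT ... SELECT over an uploaded CSV using
    OPENROWSET with the Jet text driver. Names are placeholders."""
    return (
        "INSERT INTO %s\n"
        "SELECT * FROM OPENROWSET(\n"
        "    'Microsoft.Jet.OLEDB.4.0',\n"
        "    'Text;Database=%s',\n"
        "    'SELECT * FROM %s')" % (target_table, folder, csv_name)
    )
```

You would then fire the resulting statement at SQL Server from a single CFQUERY, so all the row processing happens in the database rather than in CF.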
 
How does CFMX do dynamic data sources?
 
regards
 
 
Steve
----- Original Message -----
Sent: Thursday, June 10, 2004 9:44 AM
Subject: RE: [ cf-dev ] Importing large amounts of data to sql

Hi Steve,

 

Thanks for your input, it's appreciated, although my main issue with using DTS is that the files are being uploaded by the client, so the size of the file is unknown, as is the filename.

Also, the data is required as quickly as possible after the upload, so I am after a solution which is fast, dynamic, and can replace the existing cfhttp process.

-----Original Message-----
From: Steve Powell [mailto:[EMAIL PROTECTED]
Sent: 09 June 2004 20:49
To: [EMAIL PROTECTED]
Subject: Re: [ cf-dev ] Importing large amounts of data to sql

 

Would you not be better off using a Data Transformation Services (DTS) package inside SQL Server?

 

We regularly import around 80MB of data exported from our overnight processing systems for live reports. It usually takes about 25 mins on a fairly lowly SQL box.

 

If all your CF is doing is reading the CSV and stuffing the data into columns in a SQL table, a DTS package will be vastly more efficient.

 

If the file is a standard format, you can set up the DTS using its wizard and save the package. Then just replace the file each time and run the package again.

 

You can script inside DTS packages and get clever if you so wish (hierarchical data sets and so on), but that's less straightforward. There are some good books on DTS. It's a vastly misunderstood and underestimated component inside SQL Server.

----- Original Message -----

Sent: Wednesday, June 09, 2004 5:48 PM

Subject: RE: [ cf-dev ] Importing large amounts of data to sql

Does this help?

http://www.houseoffusion.com/cf_lists/index.cfm?method=messages&threadid=18243&forumid=4

Rafe
