Hi Steve,

Thanks for your input, it's appreciated. My main issue with using DTS is that the files are uploaded by the client, so both the size of the file and the filename are unknown in advance.

Also, the data is required as quickly as possible after the upload, so I am after a solution that is fast and dynamic and can replace the existing cfhttp process.
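One way to keep the filename dynamic would be to build a T-SQL BULK INSERT statement at run time from the uploaded file's path and hand it to the database. A rough Python sketch of the idea (the table name, path, and CSV options here are placeholders, and the path must come from trusted server-side code, never raw user input):

```python
def build_bulk_insert(table, csv_path):
    """Build a T-SQL BULK INSERT statement for a CSV file whose
    name is only known at run time (e.g. a client upload).
    NOTE: table and csv_path are assumed to come from trusted
    server-side code, not directly from user input."""
    # Escape any embedded single quotes in the path for T-SQL.
    safe_path = csv_path.replace("'", "''")
    return (
        f"BULK INSERT {table} "
        f"FROM '{safe_path}' "
        "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', FIRSTROW = 2)"
    )

# Example: the filename arrives with the upload, so the statement
# is assembled per request rather than hard-coded.
sql = build_bulk_insert("dbo.Uploads", r"D:\uploads\client_file.csv")
print(sql)
```

The same statement-building step could equally be done inside a cfquery; the point is that nothing about the file needs to be known until the upload lands.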

-----Original Message-----
From: Steve Powell [mailto:[EMAIL PROTECTED]
Sent: 09 June 2004 20:49
To: [EMAIL PROTECTED]
Subject: Re: [ cf-dev ] Importing large amounts of data to sql

Would you not be better off using a Data Transformation Services (DTS) package inside SQL Server?

We regularly import around 80 MB of data exported from our overnight processing systems for live reports. It usually takes about 25 minutes on a fairly lowly SQL box.

If all your CF is doing is reading the CSV and stuffing the data into columns in a SQL table, a DTS package will be vastly more efficient.

If the file is a standard format, you can set up the DTS package using its wizard and save the package. Then just replace the file each time and run the package again.
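Once the package is saved, re-running it after swapping the file in place can be done from the command line with SQL Server's dtsrun utility. A minimal sketch of assembling that command (the server and package names below are made up, and you'd normally shell this out from a scheduled job or CF process):

```python
def dtsrun_command(server, package, trusted=True):
    """Assemble a dtsrun command line to re-execute a saved DTS
    package after the source file has been replaced.
    'server' and 'package' are placeholders for your own names."""
    cmd = ["dtsrun", f"/S{server}", f"/N{package}"]
    if trusted:
        cmd.append("/E")  # /E = use a trusted (Windows) connection
    return cmd

# Example: run the saved import package against a named server.
cmd = dtsrun_command("SQLBOX01", "ImportDailyCSV")
print(" ".join(cmd))
```

In practice you would pass this list to the OS (e.g. via a scheduler or cfexecute) rather than print it; /U and /P switches replace /E when SQL authentication is needed.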

You can script inside DTS packages and get clever if you so wish (hierarchical data sets and so on), but that's less straightforward. There are some good books on DTS. It's a vastly misunderstood and underestimated component of SQL Server.

----- Original Message -----

Sent: Wednesday, June 09, 2004 5:48 PM

Subject: RE: [ cf-dev ] Importing large amounts of data to sql

Does this help?

http://www.houseoffusion.com/cf_lists/index.cfm?method=messages&threadid=18243&forumid=4

Rafe
