Would you not be better off using a Data
Transformation Services (DTS) package inside SQL Server?
We regularly import around 80MB of data exported
from our overnight processing systems for live reports. It usually takes about
25 minutes on a fairly lowly SQL box.
If all your CF is doing is reading the CSV and
stuffing the data into columns in a SQL table, a DTS package will be vastly more
efficient.
If the file is a standard format, you can set up
the DTS package using its wizard and save the package. Then just replace the file each
time and run the package again.
You can script inside DTS packages and get clever
if you so wish (hierarchical data sets and so on), but that's less straightforward.
There are some good books on DTS. It's a vastly misunderstood and
underestimated component inside SQL Server.
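As a rough sketch of the "replace the file and run the package again" step, a saved DTS package can be re-run from the command line with SQL Server 2000's dtsrun utility. The server and package names below are placeholders, not from the original post:

```
REM Re-run a saved DTS package after dropping the new CSV in place
REM /S = server, /E = trusted (Windows) connection, /N = package name
dtsrun /S MYSQLBOX /E /N NightlyCsvImport
```

This could be scheduled as a SQL Agent job or called after the export completes, so the import runs without any ColdFusion involvement.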
