David,
        Unless you need to transform the data as you import it, you will
probably have better luck with a bulk insert.  Use bcp to create a format
file that you can use with the BULK INSERT operation.  You can run your
tests with bcp to make sure it will do the job, then script it out in
Query Analyzer.
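
A minimal sketch of that two-step approach, assuming a hypothetical target
table dbo.ImportTable and example file paths -- the table, database, server
name, and paths are placeholders, not anything from your setup:

```sql
-- Step 1: generate a format file from the target table with bcp, run at a
-- command prompt (-c = character data, -t "|" = pipe field terminator,
-- -r "\n" = LF row terminator, -T = trusted connection):
--
--   bcp MyDb.dbo.ImportTable format nul -c -t "|" -r "\n" -f C:\data\import.fmt -S MyServer -T

-- Step 2: load the file with BULK INSERT from Query Analyzer:
BULK INSERT dbo.ImportTable
FROM 'C:\data\bigfile.txt'
WITH (
    FORMATFILE = 'C:\data\import.fmt',
    TABLOCK    -- table lock enables minimally logged, faster loads
);
```

If the file's columns line up with the table one-for-one, you can skip the
format file entirely and just give BULK INSERT the delimiters directly:
FIELDTERMINATOR = '|', ROWTERMINATOR = '\n'.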

Thanks,
Eric



"David Brown" <[EMAIL PROTECTED]>
12/19/2003 11:02 AM
Please respond to sql

        To:     SQL <[EMAIL PROTECTED]>
        cc:
        Subject:        DTS package

I have a pipe delimited text file (17 columns) with LF as the row
delimiter.  One of the files that I need to import is over 56 megs.  I
have tried DTS to import the file, but get around 1000 rows per second.

Is bulk copy the best approach, or a straight import through a DTS package?

I am looking for the fastest way to import it.  After 4 minutes I had
86,000 rows out of about a million.

David
