You can't use BCP to load data without some form of input file, can
you?  So what could you do with BCP that you can't do by landing the
data in a CSV file and then loading it using some other tool?

   I think that DRH's point was that the functionality provided by the
command-line tool is nothing more than an INSERT statement that is
prepared once and executed many times.  What could be more programmatic
than that?  And if that approach isn't fast enough for your purposes -
and I mean isn't actually fast enough, not merely slower than what you
assume is necessary for acceptable performance - then I submit to you
that SQLite is the wrong database for you.
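
As a concrete illustration of that pattern, here is a minimal sketch in
Perl with DBD::SQLite (the database file, table, and sample data are
all hypothetical): the INSERT is prepared once, executed per row, and
the batch is wrapped in a single transaction so SQLite syncs the
journal once per batch rather than once per row.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical scratch database and table.
    my $dbh = DBI->connect("dbi:SQLite:dbname=scratch.db", "", "",
                           { RaiseError => 1, AutoCommit => 1 });
    $dbh->do("CREATE TABLE IF NOT EXISTS t (a INTEGER, b TEXT)");

    # Prepare the INSERT once...
    my $sth = $dbh->prepare("INSERT INTO t (a, b) VALUES (?, ?)");

    # ...then bind and execute it many times inside one transaction.
    my @rows = ([1, "one"], [2, "two"], [3, "three"]);  # stand-in data
    $dbh->begin_work;
    $sth->execute(@$_) for @rows;
    $dbh->commit;

Without the explicit transaction, each execute would commit (and sync
to disk) individually, and that sync is usually the real bottleneck.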

   -Tom

> -----Original Message-----
> From: Anderson, James H (IT) [mailto:[EMAIL PROTECTED] 
> Sent: Monday, December 18, 2006 9:29 AM
> To: sqlite-users@sqlite.org
> Subject: RE: [sqlite] Is there a method for doing bulk insertion?
> 
> I was hoping there was the equivalent of Sybase's BCP program. I was
> also hoping something programmatic was available, i.e., not something
> from the command shell. Maybe a little background would help.
> 
> I'm planning on using the Perl package DBD::SQLite. My department is
> a big Sybase user, but because of the nature of our workload we
> experience a lot of contention in both the transaction log and tempdb
> (the database that houses temporary tables). I'm investigating the
> feasibility of transferring data into SQLite, doing all the data
> manipulations there, and then transferring it back to the appropriate
> Sybase tables. I suspect this could be a big win for a number of our
> applications.
> 
> But if it can be avoided, I don't want to do a CSV conversion, nor do
> I want to shell out of the code to invoke this.
> 
> jim
> 
> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] 
> Sent: Monday, December 18, 2006 9:12 AM
> To: sqlite-users@sqlite.org
> Subject: Re: [sqlite] Is there a method for doing bulk insertion?
> 
> "Anderson, James H \(IT\)" <[EMAIL PROTECTED]> wrote:
> > ....or do I have to creation a gazillion insert statements?
> > 
> 
> The sqlite3 command-line shell has a ".import" command which
> can be used to read CSV data.  But the way this works internally
> is that the command-line shell constructs an INSERT statement,
> parses each line of the CSV file and binds the values to that
> INSERT statement, then runs the INSERT statement for each line.
> So at the end of the day, a bunch of INSERT statements are still
> getting evaluated - you just don't see them.
> 
> On my workstation, an INSERT statement can be parsed, compiled,
> and evaluated in 25-40 microseconds.  That's about 30000 rows
> per second.  How much performance do you need?
> 
> --
> D. Richard Hipp  <[EMAIL PROTECTED]>
> 
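Taken together, the two messages above point at the no-CSV, no-shell
route Jim is after: do in Perl exactly what DRH says the shell's
".import" does internally - prepare one INSERT and drive it from
whatever produces the rows. A minimal sketch, assuming DBD::Sybase as
the source (the server, credentials, query, and table layout are all
hypothetical):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection details and schema throughout.
    my $syb  = DBI->connect("dbi:Sybase:server=PRODSRV;database=work",
                            "user", "secret", { RaiseError => 1 });
    my $lite = DBI->connect("dbi:SQLite:dbname=scratch.db", "", "",
                            { RaiseError => 1 });
    $lite->do(
        "CREATE TABLE IF NOT EXISTS staging (id INTEGER, name TEXT)");

    my $src = $syb->prepare("SELECT id, name FROM source_table");
    my $ins = $lite->prepare(
        "INSERT INTO staging (id, name) VALUES (?, ?)");

    # Fetch from Sybase and re-execute the prepared INSERT per row,
    # all inside one SQLite transaction - no CSV file, no shelling out.
    $src->execute;
    $lite->begin_work;
    while (my @row = $src->fetchrow_array) {
        $ins->execute(@row);
    }
    $lite->commit;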
