Can't use REPLACE.  The table contains a little more data than what's
supplied in the twice-weekly logfile.  The REPLACE command will delete the
old record before inserting its replacement, which in this case would
lose the data in the other fields.
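
To spell it out with a made-up example (the table and column names here
are invented, just for illustration):

  -- Only `hits` comes in from the logfile; `notes` is maintained by
  -- hand and has to survive the load.
  CREATE TABLE stats (
    id    INT          NOT NULL PRIMARY KEY,
    hits  INT          NOT NULL,
    notes VARCHAR(200) NOT NULL DEFAULT ''
  );

  -- REPLACE deletes the existing row and inserts a fresh one, so any
  -- column not named falls back to its default -- `notes` gets wiped:
  REPLACE INTO stats (id, hits) VALUES (42, 100);

  -- UPDATE only touches the columns it names, so `notes` survives:
  UPDATE stats SET hits = 100 WHERE id = 42;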

p

On Wed, Jul 09, 2003 at 03:50:42PM +0200, Rudy Metzger wrote:
> 
> Paul,
> 
> Did you try using REPLACE instead of UPDATE/INSERT? Could give you some
> more speed enhancement.
> 
> Cheers
> /rudy
> 
> -----Original Message-----
> From: Paul Chvostek [mailto:[EMAIL PROTECTED] 
> Sent: Wednesday, July 9, 2003 15:21
> To: cmars
> Cc: [EMAIL PROTECTED]
> Subject: Re: Bulk loading data
> 
> On Tue, Jul 08, 2003 at 09:36:11PM -0700, cmars wrote:
> > 
> > I want to insert massive amounts of data in near real-time into a
> > MySQL database.  The data is in a structured binary format, and I have
> > code that parses the data into logical data structures.
> ...
> > How can I load data directly into the database at a lower, more direct
> > level than LOAD FILE?  What are my options to integrate my data parser
> > with MySQL?
> 
> Other than patching mysqld, I can't see how you'd get an additional
> interface to the database files.
> 
> I have a similar (though certainly not identical) issue with data I'm
> dealing with -- about 4 million records twice a week.  Source data is in
> files with fixed-length fields with a newline after each record, and
> contains some data that must be updated, and some that must be inserted.
> (I.e. I receive replacements not diffs.)
> 
> My solution was lowbrow, but it works well.  I pipe the source file
> through a huge honkin' sed script that grabs the fields and converts
> each record into SQL statements, with sed's output piped through the
> mysql text client.  (sed 's/^(...)(..)(....)/ etc etc/')
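>
> Fleshed out, it's something along these lines -- the record layout,
> table and column names here are invented, just to show the shape of it:
>
>   #!/bin/sh
>   # Hypothetical record layout: 3-char id, 2-char code, 4-char count.
>   # Plain basic-regex sed, so the capture groups are \(...\).
>   # mysql picks up user/password from ~/.my.cnf; -f = keep going on errors.
>   fields='^\(...\)\(..\)\(....\).*$'
>   upd='UPDATE log SET code="\2", cnt=\3 WHERE id=\1;'
>   ins='INSERT INTO log (id, code, cnt) VALUES (\1, "\2", \3);'
>   sed "s/$fields/$upd $ins/" records.dat | mysql -f mydb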
> 
> Each record turns into two statements, first an UPDATE (which simply
> matches nothing if the record doesn't exist), second an INSERT (which
> fails with a duplicate-key error if it does).  I obviously run the
> mysql client with the -f option so the expected errors don't abort the
> run.  This method lets me keep the database live while the update runs,
> whereas LOAD FILE would require that I flush the data before inserting.
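>
> So for one input record, say "117ab0042", what goes down the pipe looks
> roughly like this (same invented layout as above):
>
>   UPDATE log SET code="ab", cnt=0042 WHERE id=117;
>   INSERT INTO log (id, code, cnt) VALUES (117, "ab", 0042);
>
> If id 117 already exists, the UPDATE does the real work and the INSERT
> bounces off the primary key -- that's the error -f swallows.  If it
> doesn't, the UPDATE matches nothing and the INSERT does the work.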
> 
> Sure, I'm pushing far more goop through the pipe than the data I'm
> updating, but it's just a pipe, and the run happens at 4AM when nobody
> is watching.
> 
> My db server is a 400MHz P-I and my disks are slow, but it takes me over
> an hour to run the file, so I'm not sure how this solution would fare if
> run a hundred or a thousand times a day....

-- 
  Paul Chvostek                                             <[EMAIL PROTECTED]>
  Operations / Abuse / Whatever
  it.canada, hosting and development                   http://www.it.ca/


-- 
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe:    http://lists.mysql.com/[EMAIL PROTECTED]
