On Tue, Jul 08, 2003 at 09:36:11PM -0700, cmars wrote:
>
> I want to insert massive amounts of data in near real-time into a MySQL
> database.  The data is in a structured binary format, and I have code
> that parses the data into logical data structures.
> ...
> How can I load data directly into the database at a lower more direct
> level than LOAD FILE?  What are my options to integrate my data parser
> with MySQL?
Other than patching mysqld, I can't see how you'd get an additional
interface to the database files.

I have a similar (though certainly not identical) issue with the data I'm
dealing with -- about 4 million records twice a week.  Source data is in
files with fixed-length fields and a newline after each record, and it
contains some data that must be updated and some that must be inserted.
(I.e. I receive replacements, not diffs.)

My solution was lowbrow, but works well.  I pipe the source file through a
huge honkin' sed script that grabs the fields and converts each record into
SQL statements, with sed's output piped through the mysql text client.
(sed 's/^\(...\)\(..\)\(....\)/ etc etc/')  Each record turns into two
statements: first an UPDATE (which simply matches nothing if the record
doesn't exist yet), then an INSERT (which fails with a duplicate-key error
if it does).  I obviously run the mysql client with the -f option so it
carries on past those errors.  There's a rough sketch of the whole pipeline
at the bottom of this message.

This method lets me keep the database live while the update runs, whereas
LOAD FILE would require that I flush the data before inserting.  Sure, I'm
pushing far more goop through the pipe than the data I'm actually changing,
but it's just a pipe, and the run happens at 4AM when nobody is watching.

My db server is a 400MHz P-I and my disks are slow; it takes me over an
hour to run the file, so I'm not sure how this solution would fare a
hundred or a thousand times a day....

-- 
Paul Chvostek                                        <[EMAIL PROTECTED]>
Operations / Abuse / Whatever
it.canada, hosting and development                   http://www.it.ca/
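Here's a minimal sketch of that kind of pipeline.  The record layout, table
name and column names (mytable with id/name/qty) are invented purely for the
example; swap in your own widths and schema.  It also assumes the data
contains no quote characters and that id is a primary or unique key, so the
INSERT half gets rejected when the row already exists.

  #!/bin/sh
  # Toy layout: 8-char id, 20-char name, 10-char qty, one record per line.
  # Each input record becomes "UPDATE ...; INSERT ...;" on a single output
  # line, so the mysql client sees two statements per record.  MySQL treats
  # the double-quoted values as string literals by default.
  sed -e 's/^\(.\{8\}\)\(.\{20\}\)\(.\{10\}\)$/UPDATE mytable SET name="\2", qty="\3" WHERE id="\1"; INSERT INTO mytable (id, name, qty) VALUES ("\1", "\2", "\3");/' \
      datafile.txt \
    | mysql -f somedb        # add -u/-p/-h options as needed

The -f is what makes the duplicate-key errors from the INSERT half harmless;
without it the client quits at the first error.  Real data would also need
quote-escaping and trimming of the space padding in the fixed-width fields.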