I'm loading a CSV file with five fields into MySQL.  It loads up fine.
The problem comes when I try to refresh the data with updates.

I grab the file and convert it to CSV every 12 hours.  The data covers
only the past 24 hours, so basically I'm trying to build a permanent
archive.  I tried creating a PRIMARY KEY on the first four fields to
filter out duplicates.  The problem is that with that key in place, no
new data gets written to the database, not even clearly NEW records.

The table is warlog and the fields are time, attacker, coords, defender,
status (obviously a game).  A record should be unique on the first four
fields, or at the very least on time and coords.  Whenever I run LOAD
DATA INFILE the second time around, no records get written if any
indexes are present.  With no indexes I get duplicates.  If I add an
index when the duplicates are already there, I only end up with the
first set of data and no updates.
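
In case it helps, the table definition is roughly this (column types
are approximate; the key is the part I keep changing):

  CREATE TABLE warlog (
      `time`   DATETIME    NOT NULL,
      attacker VARCHAR(64) NOT NULL,
      coords   VARCHAR(32) NOT NULL,
      defender VARCHAR(64) NOT NULL,
      status   VARCHAR(32),
      -- composite key meant to filter out duplicate rows
      PRIMARY KEY (`time`, attacker, coords, defender)
  );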

Is there something about primary keys I should know?  I've created this
database with them and tried every combination of key columns, but I
can't seem to get the update part to work.

-- 
j
http://decision.csl.uiuc.edu/~bambenek
