Re: mysql import or write my own Perl parser

2005-04-19 Thread Eric Bergen
Both LOAD DATA INFILE and mysqlimport support the IGNORE option. If you are loading a file that contains duplicate rows into a table with a unique index, the duplicates will be silently ignored. On 4/19/05, newbie c <[EMAIL PROTECTED]> wrote: > thanks for the reply! I am not too concerned about …
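A minimal sketch of both forms; the database, table, and file names (mydb, dupes, /tmp/dupes.txt) are made up for illustration:

    # LOAD DATA INFILE with IGNORE: rows that would collide with a
    # unique index are skipped instead of aborting the whole load.
    mysql mydb -e "LOAD DATA INFILE '/tmp/dupes.txt' IGNORE INTO TABLE dupes
                   FIELDS TERMINATED BY '\t';"

    # The mysqlimport equivalent; the table name is derived from the
    # file's basename, so /tmp/dupes.txt loads into the table dupes.
    mysqlimport --ignore --fields-terminated-by='\t' mydb /tmp/dupes.txt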

Re: mysql import or write my own Perl parser

2005-04-19 Thread newbie c
thanks for the reply! I am not too concerned about cutting out the columns as I may need the other information later. I was just wondering: does it make a difference if both of the columns that I am interested in have entries that are NOT unique? Also, at what point does one need a parser or use …

Re: mysql import or write my own Perl parser

2005-04-18 Thread Eric Bergen
awk is probably the best tool I can think of for cutting columns out of a text file. Something like awk -F \\t '{ print $2 "," $3 }' my_file could be used to pick the second and third columns out of a file prior to importing it. -Eric On 4/18/05, newbie c <[EMAIL PROTECTED]> wrote: > Hi, > > I am …
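As a sketch, the whole preprocessing step might look like the following (file, database, and table names are hypothetical, and the output is kept tab-separated so the MySQL loader defaults still apply):

    # Keep only the second and third tab-separated columns.
    awk -F '\t' '{ print $2 "\t" $3 }' my_file > my_table.txt

    # Import the trimmed file; mysqlimport derives the table name
    # (my_table) from the file's basename. --local reads the file
    # from the client side rather than the server host.
    mysqlimport --local mydb my_table.txt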

mysql import or write my own Perl parser

2005-04-18 Thread newbie c
Hi, I am about to create a database and there are a number of files that I need to load into it. They are tab-delimited files. One of the files contains about 4 or 5 columns. I am only interested in the second and the third column right now, but I will load the whole table. The val…
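For reference, a first-pass load of a whole tab-delimited file into an existing table could look like the sketch below; the names (mydb, my_table, my_file) are invented, and LOCAL makes the server read the file from the client machine:

    # Load every column of a tab-delimited file into an existing table.
    # A literal \N in the file is interpreted as SQL NULL.
    mysql mydb -e "LOAD DATA LOCAL INFILE 'my_file'
                   INTO TABLE my_table
                   FIELDS TERMINATED BY '\t'
                   LINES TERMINATED BY '\n';"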