Jason Ferguson wrote:
The data is split into about 60 files, average file size of 5 MB (varying
from 1 to 10 MB). Since there are many files, I'm trying to minimize the
required work (if there was just one consolidated file, no problem).
Jason
snippety-snip
Hi Jason
If it's not too late:
The work can be automated easily with the right tools.
Jason Ferguson [EMAIL PROTECTED] wrote on 09/26/2005 10:58:02 PM:
Many thanks for the earlier response to why LOAD DATA INFILE wasn't working
for me. However, another problem has appeared.
In the file I am reading, 2 of the fields are SUPPOSED to be float values.
However, in several places, they are set to UNKNOWN. This seems to cause
LOAD to abort.
Is there a
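One common fix (my own sketch, not necessarily what was suggested in the part of this message that got cut off) is to turn UNKNOWN into NULL before loading: if the two float columns are nullable, replacing UNKNOWN with \N in the input makes LOAD DATA INFILE store NULL. A minimal, column-aware version, assuming tab-delimited lines and that the float fields are columns 2 and 3 (both assumptions):

```python
def clean_line(line, float_cols=(2, 3), sep="\t"):
    # Replace the literal UNKNOWN with \N (MySQL's NULL marker for
    # LOAD DATA INFILE) in the named columns only.
    # float_cols are 1-based positions; adjust to the real file layout.
    fields = line.rstrip("\n").split(sep)
    for col in float_cols:
        if col <= len(fields) and fields[col - 1] == "UNKNOWN":
            fields[col - 1] = r"\N"
    return sep.join(fields) + "\n"
```

If the server is recent enough to allow user variables in LOAD DATA INFILE, the same effect is possible without preprocessing, along the lines of `(c1, @f1, @f2) SET f1 = NULLIF(@f1, 'UNKNOWN'), f2 = NULLIF(@f2, 'UNKNOWN')` (column names hypothetical).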
On 9/26/05, Jasper Bryant-Greene [EMAIL PROTECTED] wrote:
Jason Ferguson
You'll have to edit your input file. There will always be instances
where some field is quirky and you need to fix it/them/entire rows.
Don't expect the input file to be perfect.
I'd also suggest that you have a test database on a test machine that is
devoted entirely to getting your tables
Then you are in for quite a lot of editing work. I've done it a lot
myself. Don't expect your project to be easy. Look for automated ways to
edit the data according to your needs and the actual table structure.
Bob Cochran