RE: Large Data Import

2002-11-21 Thread Lorenzo Curtis
The problem was with the ending record/line break character.

I added:
LINES TERMINATED BY '\r\n'

and the import worked wonderfully! And it only took a little over ten
minutes to process.
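
(For the archives, the complete statement ended up as something like the
following, i.e. the command from my earlier message plus the fix:)

mysql> LOAD DATA LOCAL INFILE '/home/warehouse/importfile.txt'
    -> INTO TABLE importtable
    -> FIELDS TERMINATED BY ','
    -> ENCLOSED BY ''
    -> LINES TERMINATED BY '\r\n';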

Thanks for the help.

-Original Message-
From: Roger Abrahamsson [mailto:[EMAIL PROTECTED]] 
Sent: Wednesday, November 20, 2002 10:04 AM
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Subject: Re: Large Data Import


Well, I've done imports of about 50,000-100,000 posts from other
databases. I went another way there: using perl/vi I generated a file of
mysql commands that I then ran. It's maybe a bit slower, but you get a
complete log in case some things don't work out, as long as you have
logging turned on.

I would guess that most likely you have problems with quotation marks,
special tokens or the like that interfere.

Anyway, one way of doing it would be something like this (it wraps each
comma-separated field in double quotes and emits one INSERT statement
per input line):

cat file | perl -e 'while($x=<STDIN>){chomp $x; $x=~s/,/\",\"/g; print "insert into XX values(\""; print $x; print "\");\n"; }' > file.sql
cat file.sql | mysql
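
One variant on that last step, if you want the errors captured next to
the statements (a sketch; dbname is a placeholder, and it assumes the
mysql client's --force and -v options: --force continues past SQL errors
instead of aborting at the first one, -v echoes each statement as it
runs):

mysql --force -v dbname < file.sql > import.log 2>&1

Every statement that fails should then leave its error message in
import.log right after the statement itself, so you can see exactly
which rows were the problem.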

As for limitations: I've run databases with 12 million records in mysql
with no problems, except that selects with criteria against that
database tended to become slow. :-)

 /Roger


Lorenzo Curtis wrote:

MySQL Gurus:
 
I am working on a process to import data from a VMS database (SJ2) to
MySQL. The table that I am importing has 10378507 (over 10 million)
records in it. I have the records exported to a comma-delimited text
file, as the fields within the database do not contain any commas at all.
 
The command I am using to do the import is:
mysql> LOAD DATA LOCAL INFILE '/home/warehouse/importfile.txt'
    -> INTO TABLE importtable
    -> FIELDS TERMINATED BY ','
    -> ENCLOSED BY '';
 
The process runs through (it takes about an hour into a non-indexed
MyISAM table). At the end I get the following message:
Query OK, 5184509 rows affected
Records: 5184509  Deleted: 0  Skipped: 0  Warnings: 13376589
 
I have logging turned on, but the log file only shows the MySQL 
commands that I entered without showing me what all the warnings are.
 
Now my questions:
How can I see the errors that are occurring that cause half of the
records not to import?
Has anyone ever done an import of over 10 million records before?
Is there a limitation in MySQL?
 
Thank you in advance for your assistance with this issue.
 
Lorenzo Curtis
Dead River Company
[EMAIL PROTECTED]
www.deadriver.com


