I'm loading the data through the command below:
mysql -f -u root -p enwiki < enwiki.sql
The version is MySQL 5.0.51a-community
I've disabled the primary key, so there are no indexes. The CPU has 2
cores and 2 GB of memory.
The import fell over overnight with a "table full" error as it hit 1T (I
Simon,
Why don't you split the file and use the LOAD DATA INFILE command, which
would improve performance while loading into an empty table with keys
disabled?
regards
anandkl
On 6/5/08, Simon Collins [EMAIL PROTECTED] wrote:
I'm loading the data through the command below: mysql -f -u root -p enwiki < enwiki.sql
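As a rough illustration of the split-and-load idea (the file names, chunk size, and the table name `text` are assumptions, not from the thread), the chunks and their LOAD DATA statements could be generated like this, with the statements printed for review rather than executed directly:

```shell
# Toy stand-in for a converted CSV dump; the real file would be huge.
printf '%s\n' "1,'page one'" "2,'page two'" "3,'page three'" > enwiki.csv

# Split into fixed-size chunks (2 lines each here, millions in practice).
split -l 2 enwiki.csv chunk_

# Emit one LOAD DATA statement per chunk; pipe this output into mysql to run.
for f in chunk_*; do
  echo "LOAD DATA LOCAL INFILE '$f' INTO TABLE text FIELDS TERMINATED BY ',';"
done
```

Printing the statements first lets you sanity-check one chunk before committing to a multi-day load.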
You could load the data into several smaller tables and combine them
into a MERGE table, which would have no real effect on the schema.
Ade
Simon Collins wrote:
I'm loading the data through the command below:
mysql -f -u root -p enwiki < enwiki.sql
The version is MySQL 5.0.51a-community
I've
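A sketch of the MERGE-table approach Ade describes, with assumed table and column names (the real table has an id field and a blob field, per the original post):

```sql
-- Load each piece into its own identical MyISAM table...
CREATE TABLE enwiki_p1 (id INT, body LONGBLOB) ENGINE=MyISAM;
CREATE TABLE enwiki_p2 (id INT, body LONGBLOB) ENGINE=MyISAM;

-- ...then present them as one table via the MERGE engine.
CREATE TABLE enwiki_all (id INT, body LONGBLOB)
  ENGINE=MERGE UNION=(enwiki_p1, enwiki_p2) INSERT_METHOD=LAST;
```

Queries against enwiki_all see all the rows, so the schema the research users work with stays a single logical table.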
I can do that - if the LOAD DATA INFILE command definitely improves
performance and splitting the file does the same, I have no problem with
doing this. It just seems strange that the problem is with the way the
import file is configured. I thought the problem would be somehow with
the table getting
PROTECTED]> wrote:
From: Simon Collins <[EMAIL PROTECTED]>
Subject: Re: Large import into MYISAM - performance problems
To: mysql@lists.mysql.com
Date: Thursday, June 5, 2008, 3:05 PM
I'm loading the data through the command below: mysql -f -u root -p
enwiki < enwiki.sql
The version
Simon,
In my experience LOAD DATA INFILE is a lot faster than running a sql
file through the client.
I would parse the sql file and create a csv file with just the columns of
your table, and then use LOAD DATA INFILE with the created csv file.
Olaf
On 6/5/08 4:52 AM, Simon Collins [EMAIL PROTECTED] wrote:
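For simple one-row-per-INSERT dumps, the SQL-to-CSV conversion Olaf suggests could be sketched with sed (the statement shape and the table name `text` are assumptions - a real mysqldump with multi-row extended inserts and embedded quotes needs a proper parser):

```shell
# Toy sample of single-row INSERT statements.
cat > sample.sql <<'EOF'
INSERT INTO text VALUES (1,'first page');
INSERT INTO text VALUES (2,'second page');
EOF

# Strip the INSERT wrapper, keeping just the column values as CSV rows.
sed -n 's/^INSERT INTO text VALUES (\(.*\));$/\1/p' sample.sql > sample.csv
cat sample.csv
```

The resulting sample.csv holds one comma-separated row per statement, ready for LOAD DATA INFILE.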
At 10:30 AM 6/5/2008, you wrote:
Simon,
In my experience LOAD DATA INFILE is a lot faster than running a sql
file through the client.
I would parse the sql file and create a csv file with just the columns of
your table, and then use LOAD DATA INFILE with the created csv file.
Olaf
Olaf,
Using
Olaf, Mike
Thanks for the input, the blob data is just text, I'll have a go at
using the load data command
Regards
Simon
mos wrote:
At 10:30 AM 6/5/2008, you wrote:
Simon,
In my experience LOAD DATA INFILE is a lot faster than running a sql file
through the client.
I would parse the sql file
Even more so when you compare it to a script executing the inserts
instead of the mysql client...
Olaf
On 6/5/08 12:06 PM, mos [EMAIL PROTECTED] wrote:
At 10:30 AM 6/5/2008, you wrote:
Simon,
In my experience LOAD DATA INFILE is a lot faster than running a sql file
through the client.
I would parse
Dear all,
I'm presently trying to import the full wikipedia dump for one of our
research users. Unsurprisingly it's a massive import file (2.7T)
Most of the data is importing into a single MyISAM table which has an id
field and a blob field. There are no constraints / indexes on this
table.
Hi Simon,
How are you doing this import into your table?
On 6/4/08, Simon Collins [EMAIL PROTECTED] wrote:
Dear all,
I'm presently trying to import the full wikipedia dump for one of our
research users. Unsurprisingly it's a massive import file (2.7T)
Most of the data is importing into a single
Simon,
As someone else mentioned, how are you loading the data? Can you
post the SQL?
You have an id field, so is that not the primary key? If so, the
slowdown could be maintaining the index - try adding up to 30% of your
available RAM to key_buffer_size in your my.cnf file.
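If the id column really is a primary key, that advice would amount to something like this in my.cnf (600M is simply ~30% of the 2 GB mentioned earlier in the thread, an assumed figure):

```ini
[mysqld]
# roughly 30% of the machine's 2 GB of RAM for the MyISAM key buffer
key_buffer_size = 600M
```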
Hi,
Break up the file into small chunks and then import one by one.
On Wed, Jun 4, 2008 at 10:12 PM, Simon Collins
[EMAIL PROTECTED] wrote:
Dear all,
I'm presently trying to import the full wikipedia dump for one of our
research users. Unsurprisingly it's a massive import file (2.7T)
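The chunked import could be sketched like this (a toy file and chunk size for illustration; splitting a real .sql dump on an arbitrary line count assumes no statement spans a chunk boundary):

```shell
# Toy stand-in for the big dump file.
seq 1 10 > toy.sql

# Split into 4-line chunks: part_aa, part_ab, part_ac.
split -l 4 toy.sql part_

# Print the per-chunk import commands; run them one by one in practice.
for f in part_*; do
  echo "mysql -f -u root -p enwiki < $f"
done
```

Importing chunk by chunk also means a failure partway leaves you a known restart point instead of a multi-day rerun.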