You really should reply to the list and not just me. I may not be able to
help you or I may not have the time.
----- Original Message -----
From: "Lisa M. FitzGerald" <[EMAIL PROTECTED]>
To: "Rolf Hopkins" <[EMAIL PROTECTED]>
Sent: Wednesday, February 28, 2001 9:07
Subject: Re: load large data files?
> Rolf,
>
> -The file I'm loading is a text file with 18 tab-delimited fields.
> -I am not out of disk space or anywhere near.
I didn't think you would be.
> -All the columns match the type definitions for the table. I admit
> that I haven't looked at every row in both data files, but
> the file was dumped (somehow) out of a mysql db by someone else
> and now I am trying to recreate the database on my server. It
> came out of mysql; it should go back in.
That all depends on how it was dumped. I do not know the answer to this, but
I suggest you check the whole file to see if there is some sort of problem.
Check for things like use of reserved words for table/column names, unescaped
special characters, etc.
Now, it is a huge file, so what you may be able to do is break it in half and
keep halving it until you find the problem/s.
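As a rough sketch (the 18-field count and the file path are taken from what
you've posted; adjust to your setup), something like this would flag
malformed rows and chop the file into pieces:

  # flag any row that doesn't have exactly 18 tab-separated fields
  awk -F'\t' 'NF != 18 {print NR ": " NF " fields"}' /opt/usr/local/mysql/mrna1.txt

  # split into 100000-line pieces named mrna1_aa, mrna1_ab, ...
  split -l 100000 /opt/usr/local/mysql/mrna1.txt mrna1_

Whichever piece hangs on loading contains the bad data; halve that piece
again until you corner the offending row/s.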
> -When the process 'hangs', mysqladmin says it's still running
> even though the sizes of the table files haven't changed in hours.
>
> bin/mysqladmin -uroot -p process
> Enter password:
>
> +----+------+-----------+-----+-----------+-------+-------+-----------------------------------------------------------------------+
> | Id | User | Host      | db  | Command   | Time  | State | Info                                                                  |
> +----+------+-----------+-----+-----------+-------+-------+-----------------------------------------------------------------------+
> |  2 | root | localhost | hg5 | Query     | 55734 |       | LOAD DATA INFILE "/opt/usr/local/mysql/mrna1.txt" INTO TABLE hg5.mrna |
> | 11 | root | localhost |     | Processes | 0     |       |                                                                       |
> +----+------+-----------+-----+-----------+-------+-------+-----------------------------------------------------------------------+
>
> When it is in this state, 'running' but not actually doing anything, one
> CPU on my machine is fully utilized, I can't do anything else in any open
> mysql session I have running, and I can't open a new mysql session.
>
> The disk on my machine (where the mysql code lives but not the db)
> is also grinding away, but not seeming to do anything.
>
> If I kill it with mysqladmin kill 2, it shows up as killed in the
> processlist, but never actually dies. I still can't do anything else in
> any mysql session.
>
> I then try to kill the server using mysqladmin -u root -p shutdown.
> It says it did it, but it doesn't. The server keeps running. I can see
> it running in top, one CPU is still fully utilized, and if I try to
> re-start the server it says the server is still running.
>
> The only way I have found to kill the server is to use UNIX kill for all
> the mysql processes.
>
> I have done this 6 or 7 times now, always with the same pattern as above.
> The load hangs, I can't kill the process, and I can't shut down the server
> gracefully.
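A side note: rather than killing every mysql process, it is gentler to send
a plain TERM signal to just the main mysqld pid, which is written to a .pid
file in the data directory. The path below is a guess for your layout, and
"yourhostname" is a placeholder:

  kill `cat /home/lmfitzg/MYSQL/db/yourhostname.pid`

mysqld traps TERM and does a normal shutdown. Avoid kill -9 where possible;
it skips cleanup and is a good way to end up with tables needing repair.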
>
> I've already loaded 338 tables with data from the same source and they all
> loaded fine. It's just these last two that fail. They sent individual
> files with each of the 340 table definitions, and 340 files with the data
> for each table.
>
> I'm not sure what you meant about putting a symbolic link to the data in
> the database. The whole database is already accessed by a
> symbolic link from the mysql/var directory to where I have built
> the database. The individual table files built by mysql are binary,
> so clearly mysql is formatting them some how. So if I just have a
> tab-delimited text file, can I just point to it rather than loading it in?
Sorry, maybe I wasn't clear. I didn't mean a symbolic link. For example, if
your files were pictures or music, you could just store the files on your
hard drive and store the URL/location in the database instead. But obviously
they're not.
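Just to make that concrete (table and column names invented for the
example):

  mysql> CREATE TABLE songs (id INT NOT NULL PRIMARY KEY,
      ->                     title VARCHAR(100),
      ->                     location VARCHAR(255));
  mysql> INSERT INTO songs VALUES (1, 'Some Song', '/data/music/some_song.mp3');

The file itself stays on disk and the table only records where to find it.
But as you say, that approach doesn't apply to plain tab-delimited rows.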
>
> Right now I've divided one of the files into many pieces and am trying to
> load the parts, but I don't think this should be necessary and I'd rather
> not have to do this every time I reload data updates.
As mentioned above, a good idea. But just be careful about how you dump the
DB in future. If you do go the piecewise route, the loop below may save some
typing.
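This assumes pieces named mrna1_aa, mrna1_ab, etc. (split's default
suffixes) sitting in the current directory; you will be prompted for the
password on each iteration:

  for f in mrna1_*; do
    bin/mysql -uroot -p hg5 -e "LOAD DATA INFILE '`pwd`/$f' INTO TABLE mrna"
  done

LOAD DATA INFILE interprets relative paths against the server's directories,
hence the `pwd` to make each path absolute.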
> more observations:
> If I re-start the server and try to access the table, it says there are no
> rows, even though the actual table files are quite large.
>
> database files
> -rw-rw---- 1 lmfitzg pronet 82500000 Feb 27 12:25
> /home/lmfitzg/MYSQL/db/hg5/mrna.ISD
> -rw-rw---- 1 lmfitzg pronet 144540672 Feb 27 14:33
> /home/lmfitzg/MYSQL/db/hg5/mrna.ISM
> -rw-rw---- 1 lmfitzg pronet 9073 Feb 27 10:49
> /home/lmfitzg/MYSQL/db/hg5/mrna.frm
>
> note the time on the ISM file keeps incrementing even though the file
> itself is not changing size, and the .ISD file is changing neither size
> nor timestamp.
>
> I ran myisamchk on the .ISM file.
Ouch, that is a no-no. Use isamchk on ISAM files and myisamchk on MyISAM
files. What version of mysql do you have? Should you do an upgrade, and can
you upgrade your tables to MyISAM?
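If you do upgrade, and assuming a 3.23 server, the conversion is a one-liner
per table (3.23 syntax; substitute each table name):

  mysql> ALTER TABLE mrna TYPE=MyISAM;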
>
> bin/isamchk /home/lmfitzg/MYSQL/db/hg5/mrna.ISM
> Checking ISAM file: /home/lmfitzg/MYSQL/db/hg5/mrna.ISM
> Data records: 0 Deleted blocks: 0
> - check file-size
> bin/isamchk: warning: Size of indexfile is: 144540672 Should be: 2048
> bin/isamchk: warning: Size of datafile is: 82500000 Should be: 0
> - check delete-chain
> - check index reference
> - check data record references index: 1
> - check data record references index: 2
> - check data record references index: 3
> - check data record references index: 4
> - check data record references index: 5
> - check data record references index: 6
> - check data record references index: 7
> - check data record references index: 8
> - check data record references index: 9
> - check data record references index: 10
> - check data record references index: 11
> - check data record references index: 12
> - check data record references index: 13
> ISAM-table '/home/lmfitzg/MYSQL/db/hg5/mrna.ISM' is useable but should be fixed
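For what it's worth, a table reported as useable-but-should-be-fixed can
normally be repaired with isamchk's recover mode. Take the server down first
and back up the .ISD/.ISM files before running it:

  bin/isamchk -r /home/lmfitzg/MYSQL/db/hg5/mrna.ISM

That said, isamchk is also reporting 0 data records against an 82 MB
datafile, so whatever a repair salvages, the real task is still finding what
makes the load hang.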
>
> Any ideas?
>
>
> Lisa
>
>
> Rolf Hopkins wrote:
>
> > I'm presuming that hard drive space isn't a problem.
> >
> > What is it you're trying to import that's so large? Regular data?
> > Graphics? Music? Can you store the files on your hard drive instead and
> > just store the link in the database? Are the fields of the table you are
> > importing into of an appropriate type?
> >
> > >
> > > I am trying to load two tables from space-delimited files.
> > > Both are quite large.
> > > Neither ever seems to finish. I have allowed them to run for days.
> > > The file sizes of the table files increase for a while and stop,
> > > but the mysql cursor never returns, 'mysqladmin processlist'
> > > says the process is still running, top says the process mysql is still
> > > taking up an entire CPU, but nothing is happening. When this
> > > happens I can't do anything else in the database.
> > >
> > > I have tried LOAD DATA INFILE from the mysql prompt,
> > > and mysqlimport from the command line (which launches LOAD
> > > DATA INFILE), both with the same results.
> > >
> > > file size
> > > file 1: 342580425 FAILS
> > > file 2: 233952880 FAILS
> > > file 3: 156121911 WORKS
> > >
> > > I am running Sun Solaris 8, which allows file sizes over 2 GB. Is
> > > there a file size limit for mysql? Is there some setting I need to
> > > allow import of large files? Is there some different/better way
> > > to import data than what I have tried?
> > >
> > > The only way I have found to stop the process is to kill the server.