hotshot, so the basis of my assumption is a *hunch* only).
I'm not sure (yet) if a single-threaded operation would run into an
i/o bottleneck. I didn't run mysqlimport using --use-threads=1 just
yet (will do if I have the time), but when I ran it with
--use-threads=4 the import (of
0:52 AM
> To: mysql@lists.mysql.com
> Subject: mysqlimport --use-threads / mysqladmin processlist
>
> I'm in the middle of migrating an InnoDB database to an NDBCluster. I
> use mysqldump to first create two dumps, the first one contains only
> the database schema, the second one contain
E=NDB, etc.) import it and after this I import
the InnoDB data *as is* using mysqlimport.
I use it like this:
mysqlimport --local --use-threads=4 db dir/*.txt
(dir of course contains the tab-delimited data I dumped before.)
The import starts, and I check its progress via mysqladmin, like
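For reference, a minimal sketch of such a progress check with the standard
mysqladmin client (user and interval are assumptions; -i repeats the listing
every N seconds):
mysqladmin -u root -p -i 5 processlist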
>-Original Message-
>From: Carsten Pedersen [mailto:cars...@bitbybit.dk]
>Sent: Monday, January 03, 2011 1:48 PM
>To: Jerry Schwartz
>Cc: 'mos'; mysql@lists.mysql.com
>Subject: Re: mysqlimport doesn't work for me
>
>It's been a long time since I
It's been a long time since I used mysqlimport, but you might want to try:
- using "--fields-terminated-by" rather than "--fields-terminated"
- losing (or escaping) the backticks in --columns=
- checking my.cnf to see if the client settings are the same for mysql>
3071 | 299522 |
+--+--++
4 rows in set (0.03 sec)
This does not work:
localhost >TRUNCATE t_dmu_history;
localhost >quit
C:\Users\Jerry\Documents\Access MySQL
Production>mysqlimport --columns=`dm_history_dm_id`,`dm_history_customer_id`
--fields-t
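Applying Carsten's suggestions above, a corrected invocation might look
something like this (a hedged sketch only; the database name, data file name,
and field terminator are assumptions, since the original command is cut off):
mysqlimport --local --fields-terminated-by=, --columns=dm_history_dm_id,dm_history_customer_id dbname t_dmu_history.txt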
Thanks. That works great.
On 10-Mar-09, at 9:36 PM, Rob Wultsch wrote:
On Tue, Mar 10, 2009 at 7:16 PM, René Fournier
wrote:
OK, I've managed to do the same thing with just the mysql command
line
program:
mysql -h 192.168.0.224 -u root -p alba2 <
/Backup/Latest/alba2_2009-03-10_0
zcat /Backup/Latest/alba2_2009-03-10_00h45m.Tuesday.sql.gz | mysql -h
192.168.0.224 -u root -p alba2
Cheers
Claudio
2009/3/11 Rob Wultsch
> On Tue, Mar 10, 2009 at 7:16 PM, René Fournier
> wrote:
>
> > OK, I've managed to do the same thing with just the mysql command line
> > program:
> >
>
On Tue, Mar 10, 2009 at 7:16 PM, René Fournier wrote:
> OK, I've managed to do the same thing with just the mysql command line
> program:
>
>mysql -h 192.168.0.224 -u root -p alba2 <
> /Backup/Latest/alba2_2009-03-10_00h45m.Tuesday.sql
>
> Works great. However, the sql file is normally gz
info if we had
ill intents.
Sent from my Verizon Wireless BlackBerry
-Original Message-
From: Darryle Steplight
Date: Tue, 10 Mar 2009 22:20:26
To: René Fournier
Cc: mysql
Subject: Re: mysqlimport remote host problem
Hi Rene,
Just a heads-up. You might want to keep your use
hat, each night, copies a small database to my laptop
>> on the local network. I'm having trouble getting it to work. Here's my
>> syntax so far (run on the server):
>>
>> mysqlimport --host=192.168.0.224 --user=root --password alba2
>> alba2_2009-03-10_00h45m.
é Fournier wrote:
I'm writing a script that, each night, copies a small database to my
laptop on the local network. I'm having trouble getting it to work.
Here's my syntax so far (run on the server):
mysqlimport --host=192.168.0.224 --user=root --password alba2
alba2_2009-0
I'm writing a script that, each night, copies a small database to my
laptop on the local network. I'm having trouble getting it to work.
Here's my syntax so far (run on the server):
mysqlimport --host=192.168.0.224 --user=root --password alba2
alba2_2009-03-10_00h45m.Tues
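Note that mysqlimport reads per-table, tab-delimited text files and maps each
file to a table named after the file (minus its extension), so pointing it at a
whole-database dump file won't work. For a nightly copy, the usual pattern is
the mysqldump-into-mysql pipe that appears later in this thread; a hedged
sketch, reusing the host and database from the message (for cron, credentials
would normally come from an option file rather than -p prompts):
mysqldump alba2 | mysql -h 192.168.0.224 -u root -p alba2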
Thanks for the resource!
However, I am loading into a MyISAM table, while logging is disabled.
As a matter of fact, I was able to load the data in the end, after
disabling the keys. But now, as I try to create the keys, it takes a
huge amount of time. There are 250 million rows in one of the
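One commonly suggested workaround (a hedged sketch, not from this thread; the
table name and buffer size are placeholders): give the session a larger MyISAM
sort buffer before the key rebuild, so ENABLE KEYS can repair by sorting rather
than through the key cache:
SET SESSION myisam_sort_buffer_size = 268435456;
ALTER TABLE big_table ENABLE KEYS;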
Hello Shuly,
Try this.
http://www.mysqlperformanceblog.com/2008/07/03/how-to-load-large-files-safely-into-innodb-with-load-data-infile/
On Tue, Feb 24, 2009 at 1:08 PM, Shuly Avraham wrote:
> Hi,
>
> I have a huge table with about 50 million rows.
> I dumped the table using mysqldump -T ,
Hi,
I have a huge table with about 50 million rows.
I dumped the table using mysqldump -T , as a text dump, and am now trying
to import it into a database on another server, but it keeps hanging.
Are there any options or server variables I can set to help out with it?
Thanks,
Shuly.
;
Then I issued the command inside the computer_inventory data folder as follows:
mysqlimport -u root -p computer_inventory disposed.csv
And got the error:
mysqlimport: Error: Data truncated for column 'mot_id' at row 1, when using
table: disposed
What am I doing wrong?
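One common cause of this error (an assumption, since the file contents aren't
shown) is that mysqlimport defaults to tab-separated fields, so a
comma-separated file is loaded as one big value into the first column. A
hedged retry with the separators spelled out:
mysqlimport --fields-terminated-by=',' --lines-terminated-by='\n' -u root -p computer_inventory disposed.csv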
[EMAIL PROTECTED]> wrote:
> >
> > Hi,
> >
> > I've been trying to import a 10G dump file using mysqlimport
> > and it is eventually failing because it runs out of tmpdir
> > space -- I get Errcode: 28.
> >
> > I was surprised that it was using a temp fi
MySQL uses tmpdir
whenever there is any index creation.
regards
anandkl
On 8/21/08, jthorpe <[EMAIL PROTECTED]> wrote:
>
> Hi,
>
> I've been trying to import a 10G dump file using mysqlimport
> and it is eventually failing because it runs out of tmpdir
> space --
Hi,
I've been trying to import a 10G dump file using mysqlimport
and it is eventually failing because it runs out of tmpdir
space -- I get Errcode: 28.
I was surprised that it was using a temp file at all. I've
looked in the documentation and other sources but have not been
ab
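Since the temporary files come from index creation, a common fix (a hedged
sketch; the path is a placeholder and requires a server restart) is to point
tmpdir at a filesystem with enough free space, either in my.cnf or on the
mysqld command line:
[mysqld]
tmpdir = /data/mysqltmp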
I read about mysqlimport & LOAD DATA INFILE for MySQL, but I can't find a
way to import a text file using column lengths instead of delimiters.
My text file contains fixed-length columns:
<><--><---><-><>
I can use MS Excel to conv
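For what it's worth, LOAD DATA INFILE does have a fixed-row mode: if both the
FIELDS TERMINATED BY and ENCLOSED BY values are empty strings, rows are split
by the columns' display widths, so no spreadsheet preprocessing is needed as
long as the table's column widths match the file. A hedged sketch (table and
file names are placeholders):
LOAD DATA LOCAL INFILE 'fixed.txt' INTO TABLE mytable
FIELDS TERMINATED BY '' ENCLOSED BY ''
LINES TERMINATED BY '\n';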
qt4x11 wrote:
Hi-
I'm using the command 'mysqlimport -u usr -ppassh -h mysqlserver -P 3306 -v
db --local $workdir/$filename' to import a table into mysql from a file
$filename.
The data in $filename looks something like:
test test
where there is a blank space between the
Hi-
I'm using the command 'mysqlimport -u usr -ppassh -h mysqlserver -P 3306 -v
db --local $workdir/$filename' to import a table into mysql from a file
$filename.
The data in $filename looks something like:
test test
where there is a blank space between the two 'test
Scott Hamm wrote:
>
> Line 48:
>
> "48", "14.729606", "10.1.1.22", "10.182.167.209", "TCP", "pop3 >
> [SYN,
> ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460"
>
Is line 48 different from the other lines?
-
Hey Scott -
I don't think you can use ',' with mysqlimport as a field separator if it is
part of the data. Use something else - I used the pipe | character...
This is what worked for me:
C:\>mysqlimport --fields-enclosed-by=""" --fields-terminated-by=|
--lines-terminated-by="
I've been trying to import fields that contains the comma character ','
inside double quotes '"', with the results following:
mysqlimport --fields-optionally-enclosed-by=""" --fields-terminated-by=,
--lines-terminated-by="\r\n" --ignore
On 2006-10-04 Scott Hamm wrote:
> OBJECTIVE:
> INPUT:
> E524541015.txt:20061004|,535,999|Scot|Hamm|,410|||101 Walter
> Rd|Dover|MD|76709|,041|
...
> Been trying to get mysqlimport to use these characters to no avail, how do
> I get around to it?
I can't answer your
l,
ord int(8) not null,
error int(3) not null,
error1 int(3),
error2 int(3),
error3 int(3),
error4 int(3),
error5 int(3),
unique key(filename,ord),
index(filename)
);
/*
Been trying to get mysqlimport to use these characters to no avail, how do I
get around to it?
*/
I don't think you can do this with mysqlimport. It wouldn't be hard to do
with Perl or PHP, though, and that could be automated any way you want with
a shell script.
Regards,
Jerry Schwartz
Global Information Incorporated
195 Farmington Ave.
Farmington, CT 06032
860.674.
I need to update a table with the contents of a CSV file regularly. I've
used mysqlimport to load all the initial data, but I have a problem with
using it for updates. The data in the CSV file does not contain all of the
data in the table, there is a field that is updated by another applic
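A common way around this (a hedged sketch with made-up table and column names)
is to load the CSV into a staging table and then update only the columns the
file actually carries:
CREATE TEMPORARY TABLE csv_stage (id INT PRIMARY KEY, price DECIMAL(10,2), qty INT);
LOAD DATA LOCAL INFILE 'update.csv' INTO TABLE csv_stage FIELDS TERMINATED BY ',';
UPDATE target t JOIN csv_stage s ON s.id = t.id SET t.price = s.price, t.qty = s.qty;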
30.89
> >2006-08-3
> > ^DJI
> > 11242.6
> >2006-08-3
> > ^IXIC
> > 2092.34
> >2006-08-3
> > ^GSPC
> > 1280.27
> >2006-08-3
>
table for this data is in this format
>
> | 2006-08-02 | 20.72 | 8.81 | 10.08 | 22.19 | 20.48 | 23.19 | 28.52 |
> 96.21 | 18.87 | 32.14 | 10.31 | 30.95 | 11199.93 | 2078.81 | 1278.55 |
>
> Is there a way to get mysqlimport to pull the data from specific
> column/row to insert into a
table for this data is in this format
| 2006-08-02 | 20.72 | 8.81 | 10.08 | 22.19 | 20.48 | 23.19 | 28.52 |
96.21 | 18.87 | 32.14 | 10.31 | 30.95 | 11199.93 | 2078.81 | 1278.55 |
Is there a way to get mysqlimport to pull the data from specific
column/row to insert into a specified field? Trying
mysqldump -u [user] -h [host] -p [database] [table] > fixme.sql
This is for one table.
As I need it for all my tables in all my databases, I'd have to write a
script for that.
And as --tab uses less space, I prefer the --tab option for mysqldump.
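A minimal sketch of such a script (assumptions: credentials come from an
option file, and the target directory must be writable by the MySQL server,
since --tab makes the server write the .txt files):
for db in $(mysql -N -e 'SHOW DATABASES'); do
  mkdir -p /backup/"$db"
  mysqldump --tab=/backup/"$db" "$db"
done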
Gabriel PREDA wrote:
Try:
ALTER TABLE `tbl_name` DISABLE KEYS;
-- now insert in the TXT file
ALTER TABLE `tbl_name` ENABLE KEYS;
I think this is what you were looking for!
That would have been a possibility. I did it this way now:
...
echo "set sql_log_bin=0; set foreign_key_checks=0; u
Try:
ALTER TABLE `tbl_name` DISABLE KEYS;
-- now insert in the TXT file
ALTER TABLE `tbl_name` ENABLE KEYS;
I think this is what you were looking for!
--
Gabriel PREDA
Senior Web Developer
On Thursday 20 July 2006 04:10 am, Dominik Klein wrote:
> Hello
>
> When I re-insert dumped data with "mysql < file.sql", I can simply put
> "set foreign_key_checks=0;" at the beginning of the file and this works
> fine.
So do it that way
> So if there's any other well-known solution for per-tabl
Hello
When I re-insert dumped data with "mysql < file.sql", I can simply put
"set foreign_key_checks=0;" at the beginning of the file and this works
fine.
How can I achieve this when inserting a text file that is read with
mysqlimport?
I tried to put the mentioned s
I imported the data of the table using mysqlimport.
I used the following command line.
$ mysqlimport -ukawabe -paaa -h192.168.1.92 --local kawabe
"C:\index_test.txt"
kawabe.index_test:
Records: 4 Deleted: 0 Skipped: 4 Warnings: 2
I want to show the contents of warn
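SHOW WARNINGS only reports warnings from the current session, so they can't be
retrieved in a separate connection after mysqlimport finishes. One workaround
(a sketch reusing the details from the message above) is to run the equivalent
LOAD DATA and SHOW WARNINGS in a single mysql invocation:
mysql --local-infile=1 -ukawabe -paaa -h192.168.1.92 kawabe -e "LOAD DATA LOCAL INFILE 'C:/index_test.txt' INTO TABLE index_test; SHOW WARNINGS;"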
On 2/22/06, sheeri kritzer <[EMAIL PROTECTED]> wrote:
> A batch script or shell script can easily be written to do this.
>
> -Sheeri
>
> On 2/20/06, Daniel Kasak <[EMAIL PROTECTED]> wrote:
> > I've got some import scripts that are giving me trouble.
> >
> > Some MOFOs keep changing the format of th
A batch script or shell script can easily be written to do this.
-Sheeri
On 2/20/06, Daniel Kasak <[EMAIL PROTECTED]> wrote:
> I've got some import scripts that are giving me trouble.
>
> Some MOFOs keep changing the format of the data they give us, and
sometimes I lose half the records. When
rt routine is being triggered from MS Access ...
> and come to think of it, I'm using 'load data infile' and not
> 'mysqlimport', but anyway, you get the idea. AFAIK there is no way to
> trigger anything useful via ODBC. I could write a Perl script, chuck it
>
running on
Linux.
Unfortunately the import routine is being triggered from MS Access ...
and come to think of it, I'm using 'load data infile' and not
'mysqlimport', but anyway, you get the idea. AFAIK there is no way to
trigger anything useful via ODBC. I could write a
[EMAIL PROTECTED]
> Sent: Monday, February 20, 2006 11:21 PM
> To: mysql@lists.mysql.com
> Subject: mysqlimport, \r\n and \n
>
> I've got some import scripts that are giving me trouble.
>
> Some MOFOs keep changing the format of the data they give us,
> and sometimes
I've got some import scripts that are giving me trouble.
Some MOFOs keep changing the format of the data they give us, and
sometimes I lose half the records. When this happens, I change the line
terminator from \r\n to \n ... or from \n to \r\n.
It's starting to get to me. Is there any way
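One blunt workaround (a hedged sketch; file and database names are
placeholders) is to normalize the line endings before every import so the
terminator is always \n. Note that tr -d '\r' also strips any carriage returns
embedded in the data itself, so this is only safe if the data never contains
them:
tr -d '\r' < mytable.txt > /tmp/clean/mytable.txt
mysqlimport --local --lines-terminated-by='\n' dbname /tmp/clean/mytable.txt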
Hello.
See:
http://dev.mysql.com/doc/refman/5.0/en/the-dbug-package.html
P. Evans wrote:
> Hello Listers,
> Can anyone explain what the valid values are for the 'debug options' of
> mysqlimport? The manuals just say:
> --debug[=debug_options], -# [debug_
Hello Listers,
Can anyone explain what the valid values are for the 'debug options' of
mysqlimport? The manuals just say:
--debug[=debug_options], -# [debug_options]
Write a debugging log. The debug_options string often is 'd:t:o,file_name'.
What is d:
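The string follows the DBUG package syntax referenced in the reply above: 'd'
turns on debug output, 't' adds a function call trace, and 'o,file_name' sends
the output to a file. So a typical invocation would be something like the
hedged sketch below (names are placeholders); note, as mentioned elsewhere in
the archive, that the client must be built with debugging support for this to
produce anything:
mysqlimport --debug='d:t:o,/tmp/mysqlimport.trace' -u user -p dbname tablefile.txt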
one like the one at
http://www.hotscripts.com/Detailed/28161.html (note, I claim no
liability, I just found that script doing a web search).
-Sheeri
On 11/22/05, Jacek Becla <[EMAIL PROTECTED]> wrote:
> Hi,
>
> Is there a way to load a section of an input file into mysql
> (MyISAM table)
Hi,
Is there a way to load a section of an input file into mysql
(MyISAM table) using mysqlimport or LOAD DATA INTO?
The input data is in relatively large ascii files (10 million
rows per file), and I'd like to break the load into smaller
pieces rather than load the whole file at once. Of cou
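One way to do that outside the server (a hedged sketch; file, table, and chunk
size are placeholders) is to split the file and load each chunk with LOAD DATA:
split -l 1000000 bigfile.txt chunk_
for f in chunk_*; do
  mysql --local-infile=1 dbname -e "LOAD DATA LOCAL INFILE '$f' INTO TABLE mytable"
done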
Hi Guys,
I have been searching for the answer to this question for a while. The
answer is so obvious, yet there was no useful source of documentation
that showed it. I am using the load data infile syntax rather than the
command line utility. To get the warnings the "show warnings" command
is s
Hello.
Similar questions are often asked on the list, but I don't remember
any solution for old versions.
"michael watson (IAH-C)" <[EMAIL PROTECTED]> wrote:
> Hi
>
> The subject says it all! My mysqlimport command reports 43 warnings,
> but
Hi
The subject says it all! My mysqlimport command reports 43 warnings,
but I have no idea how to access them. "SHOW WARNINGS" was only
implemented after MySQL version 4.1, and I have 4.0.15-standard-log.
Any help?
Many thanks
Mick
ning,load.trace
>
> Nothing I do in the 'debug' param seems to actually
> output any information.
>
> Here's my command:
>
> mysqlimport -v -h [host] [database]
> [table_and_file_name].csv -u [username] -p
>
> Here's my version info:
the 'debug' param seems to actually
output any information.
Here's my command:
mysqlimport -v -h [host] [database]
[table_and_file_name].csv -u [username] -p
Here's my version info:
mysqlimport Ver 3.4 Distrib 4.0.18, for pc-linux
(i686)
Here's my output to STDOUT/STDERR:
Hi,
Anyone know how to find out what are the rows that are reported by
mysqlimport as "deleted"?
Thanks
HT
Yes I am appending to the end of an existing database. So why are rows 1
to N locked if I'm only adding rows at N+1? Wouldn't the write
privileges apply to rows being modified? And during this period even an
interactive mysql shell hangs until the mysqlimport completes. i.e. I
the d
Hi Joseph,
> I have been trying to use mysqlimport to load a primarily read only
database with data at regular intervals. My problem occurs when my
tables are myisam. In this case all access to the database and the
tables blocks until mysqlimport completes. The -lock-tables=false
parameter d
I have been trying to use mysqlimport to load a primarily read only
database with data at regular intervals. My problem occurs when my
tables are myisam. In this case all access to the database and the
tables blocks until mysqlimport completes. The -lock-tables=false
parameter does not help. Is
Please do not consider the previous email. I found my error.
mysqlimport is behaving exactly as expected.
Regards, Clodoaldo Pinto
On Sat, 29 Jan 2005 15:53:48 -0200, Clodoaldo Pinto
<[EMAIL PROTECTED]> wrote:
> I need to update a table with mysqlimport.
>
> I would like to u
I need to update a table with mysqlimport.
I would like to update the table lines with the same unique key and
maintain the other lines untouched.
When I use "mysqlimport --replace" the table lines with the same
unique key are updated but the others are deleted.
Is it the expected beh
omplete sense). It would be
>nice if the documentation for mysqlimport would at least make note of this.
>Thanks for the response.
"Settles, Aaron" <[EMAIL PROTECTED]> wrote:
Sure enough... I just installed the latest 4.1 Linux binaries; I didn't
realize that the server itself had to be compiled with debugging enabled
(although now that I realize that, it makes complete sense). It would be
nice if the documentation for mysqlimport would at least make note of
lize the debug switch with mysqlimport so that I can figure
>out why I'm getting errors on the data I'm importing, but I have yet to
>figure out a way to do this. I've tried to read the sparse documentation
>concerning this feature and no debug file is ever produced. I
I'm trying to utilize the debug switch with mysqlimport so that I can figure
out why I'm getting errors on the data I'm importing, but I have yet to
figure out a way to do this. I've tried to read the sparse documentation
concerning this feature and no debug file is ever p
Hi,
I'm just wondering if anyone has noticed or can confirm this: I tried to load
11 million records using mysqlimport with both 4.1.3b and 4.0.20.
4.1.3b took 1.5 hours, but 4.0.20 took over 10 hours. This
includes loading the data and then building the index.
Thanks
Haitao
Subject: HP-UX 11.11/4.0.20 mysqlimport BUS ERROR
>Description:
Installed mysql from the binary download on mysql.com
according to the INSTALL-BINARY instructions. Attempted
to use mysqlimport as described in the online documentation
(http://dev.mysql.com/
k.CSV;
mysqlimport --fields-terminated-by=',' --ignore-lines=1 db_name Bank.CSV;
done
Something tells me that greater minds have a better way.
--
___ Patrick Connolly
{~._.~}
_( Y )_ Good judgment comes from experience
(:_~*~_:) Experience comes fr
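For what it's worth, a slightly tidier version of that loop (a sketch; it
assumes each CSV file is named after the table it should load into, since
mysqlimport derives the table name from the file name):
for f in *.CSV; do
  mysqlimport --local --fields-terminated-by=',' --ignore-lines=1 db_name "$f"
done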
At 12:03 -0300 7/5/04, j llarens wrote:
Hi people
I'm facing a (not huge) problem with mysqlimport.
The mysql version I'm using is MySQL
4.0.11a-gamma'-Max'
For updating a #29000 records table from fixed-length
ASCII file, I'm using a php script that gets a record
and exe
Hi people
I'm facing a (not huge) problem with mysqlimport.
The mysql version I'm using is MySQL
4.0.11a-gamma'-Max'
For updating a #29000 records table from fixed-length
ASCII file, I'm using a php script that gets a record
and executes an UPDATE for each one: pretty S
[EMAIL PROTECTED] wrote:
> is that a problem?
> I want to avoid FTPing db.txt files and then running mysqlimport on them
You should run mysqlimport on the Windows box and specify the MySQL server host with the -h
option.
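A minimal sketch of that (host name, credentials, and file name are
assumptions):
mysqlimport --local -h linux-server -u username -p dbname db.txt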
>
> - Original Message -
> From: "Egor Egorov" <[EMA
is that a problem?
I want to avoid FTPing db.txt files and then running mysqlimport on them
- Original Message -
From: "Egor Egorov" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Wednesday, June 02, 2004 4:38 PM
Subject: Re: [mysql-php] mysqlimport error
> "ni
"nikos" <[EMAIL PROTECTED]> wrote:
>
> I'm trying
>
> mysqlimport --local -d --fields-enclosed-by="|" --fields-terminated-by=";" -
> -lines-terminted-by="\n" -unikos -p mydb c://temp//programs.txt
>
> but i get an
> Er
Hello list
I'm trying
mysqlimport --local -d --fields-enclosed-by="|" --fields-terminated-by=";" -
-lines-terminted-by="\n" -unikos -p mydb c://temp//programs.txt
but I get an
Error: File 'c:/temp/programs.txt' not found (Errcode: 2), when using
Can you run a find and replace to double up the backslashes?
\ --> \\
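A hedged one-liner for that replacement with sed (file names are placeholders):
sed 's/\\/\\\\/g' data.csv > data_escaped.csv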
-Original Message-
From: Hans van Dalen
To: [EMAIL PROTECTED]
Sent: 4/28/04 3:46 AM
Subject: mysqlimport and "\" as data
Hi Group,
Does anybody have any expierence with mysqlimport and a comma separat
Hi Group,
Does anybody have any experience with mysqlimport and a comma-separated
file whose data contains field data with a '\'? For example I have the
data in the column path: C:\temp.
When I import this I get something like: c:||emp ...
Does anybody know how to solve this problem? This
"Ron McKeever" <[EMAIL PROTECTED]> wrote:
> I'm trying to use mysqlimport instead of LOAD DATA INFILE from a shell
> script. I notice an option for mysqlimport is not working or I'm doing it
> wrong.
>
> This works with LOAD DATA INFILE :
> mysql -e "
I'm trying to use mysqlimport instead of LOAD DATA INFILE from a shell
script. I notice an option for mysqlimport is not working or I'm doing it
wrong.
This works with LOAD DATA INFILE :
mysql -e "LOAD DATA INFILE 'x' INTO TABLE x IGNORE 2 LINES"
but when i try:
mys
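The mysqlimport counterpart of IGNORE 2 LINES is the --ignore-lines option
(assuming a client recent enough to have it), so a hedged equivalent of the
LOAD DATA command above would be:
mysqlimport --ignore-lines=2 dbname x.txt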
EP wrote:
I am wondering:
I can see the MySQL data files for my various databases.
What technically prevents me from simply copying those files and using
copies
- to move my database to another file structure or server
- to back-up my current db
Yes, I did put my finger in the electrical socke
On Tue, 13 Jan 2004, EP wrote:
> I am wondering:
>
> I can see the MySQL data files for my various databases.
>
> What technically prevents me from simply copying those files and using copies
> - to move my database to another file structure or server
> - to back-up my current db
Copying will not
In the last episode (Jan 13), EP said:
> I am wondering:
>
> I can see the MySQL data files for my various databases.
>
> What technically prevents me from simply copying those files and using
> copies
> - to move my database to another file structure or server
> - to back-up my current db
Abso
I am wondering:
I can see the MySQL data files for my various databases.
What technically prevents me from simply copying those files and using copies
- to move my database to another file structure or server
- to back-up my current db
Yes, I did put my finger in the electrical socket as a kid.
> --local is a valid option for mysqlimport in 3.23.49 according to the
> manual. (Source: http://www.cict.fr/app/mysql/manual.html#mysqlimport)
> What is the entire command you are using?
Hi,
My command is:
mysqlimport -p -L ilk gwarancje.txt
And I get error:
"mysqlimport: E
Paul,
--local is a valid option for mysqlimport in 3.23.49 according to the
manual. (Source: http://www.cict.fr/app/mysql/manual.html#mysqlimport)
What is the entire command you are using?
Matt
-Original Message-
From: Paweł Filutowski [mailto:[EMAIL PROTECTED]
Sent: Tuesday, December
I tried this option but I got the following error:
"mysqlimport: Error: The used command is not allowed with this MySQL
version, when using table: gwarancje"
MySQL version is 3.23.49
What Can I do ??
Regards
- Original Message -
From: "Matt Griffin" <[EMAIL
If the file is on the same machine as your shell is running on,
specify --local when running mysqlimport.
Matt
-Original Message-
From: Paweł Filutowski [mailto:[EMAIL PROTECTED]
Sent: Tuesday, December 09, 2003 10:27 AM
To: [EMAIL PROTECTED]
Subject: Problem with mysqlimport
I tried
Subject: Problem with mysqlimport
I tried to import from a text file (columns divided by tabs) like this:
5724KF2003CSDEUROPAPARKAN2003-12-12MarcinTamkanono
.
.
.
On PHPTriad (under Windows 2000) it works perfectly!
I use the command: mysqlimport database file.txt
But under Linux (RedHat) i
Good question - mysqlimport seems to work silently and not report any errors. I had
read that this was to be improved in version 4 - has it?
appreciate answers
Thomas
-Original Message-
From: Greg G [mailto:[EMAIL PROTECTED]
Sent: 02 December 2003 15:22
To: [EMAIL PROTECTED]
Subject
I'm having a tough time with the debug options for mysqlimport. I've
checked the docs, but they're as clear as mud. I've tried a number of
combinations of -#d:t:o,filename and everything else I can think of, but
I can't get any debug information.
What I'm really
Good morning,
We recently put a new server online (FreeBSD) with MySQL version 4.0.15.
I have a couple of AWK scripts which used to work properly (on another
FreeBSD server with 3.23.n MySQL) with mysqlimport. Now I am getting the
following error...
/usr/local/bin/mysqlimport: Error: The used
May Yang <[EMAIL PROTECTED]> wrote:
> Dear everyone,
>
> I'd like to ask you a question, thanks in advance.
>
> Q: How long will it take to import 100GB data into MySQL DB by command "mysqlimport"
> ?
>
Depends on how many records are to be insert
Dear everyone,
I'd like to ask you a question, thanks in advance.
Q: How long will it take to import 100GB data into MySQL DB by command "mysqlimport"?
Best regards,
May Yang
"Cersosimo, Steve" <[EMAIL PROTECTED]> wrote:
>
> Am I wrong to assume mysqlimport is supposed to emulate the LOAD DATA INFILE
> syntax? I cannot find the command line option to turn on the CONCURRENT
> flag.
>
CONCURRENT currently is not supported by mysqlimpo
Am I wrong to assume mysqlimport is supposed to emulate the LOAD DATA INFILE
syntax? I cannot find the command line option to turn on the CONCURRENT
flag.
Steve Cersosimo
"When all is said and done, more is said than done"
"Will Tyburczy" <[EMAIL PROTECTED]> wrote:
>
> I've been unable to load data from files into existing tables. When I give the command:
> mysqlimport [database] [filename] I keep getting the following error.
>
> mysqlimport: Error: Can't get stat of
Check file permissions.
The file must be readable by the user 'mysql'.
Will Tyburczy wrote:
I've been unable to load data from files into existing tables. When I give the command: mysqlimport [database] [filename] I keep getting the following error.
mysqlimport: Error: Can't ge
I've been unable to load data from files into existing tables. When I give the command:
mysqlimport [database] [filename] I keep getting the following error.
mysqlimport: Error: Can't get stat of '[filename]' (Errcode: 13), when using table:
[table]
I get a similar error whe
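Errcode 13 is "permission denied" on the server side; the usual fixes (hedged
sketches, with a placeholder path) are either making the file readable by the
mysqld user, as the reply above says, or letting the client send the file with
--local:
chmod o+r /path/to/filename
mysqlimport database /path/to/filename
# or:
mysqlimport --local database filename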
Carl Anthony-uzoeto <[EMAIL PROTECTED]> wrote:
>
> I need to periodically load a dumpfile from another DB into
> mysql. Now, since this is a cronjob, and for which I would
> need to avoid interactivity, I DO NEED to run this as a
> non-root user.
>
I have set up such a user, and have granted th
ERT
statements for each table in the database. When I try to recreate this db
on another server using mysqlimport:
mysqlimport -u root -p dbname 'dbname.dump'
mysqlimport is not the complement of mysqldump, it is a command-line
interface to the LOAD DATA INFILE statement.
To import the dum
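A minimal sketch of that (reusing the names from the quoted command):
mysql -u root -p dbname < dbname.dump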
I try to recreate this db
on another server using mysqlimport:
mysqlimport -u root -p dbname 'dbname.dump'
I keep getting the error:
Error: Table 'dbname.dbname' doesn't exist, when using table: dbname
My question: Why is mysqlimport interpreting the dbname argument a