RE: mysqlimport --use-threads / mysqladmin processlist
I'm skeptical that --use-threads can ever be very effective. What order are the rows in? They are probably in PRIMARY KEY order, which means that the INSERTing threads will be fighting over similar spots in the table. Is it I/O bound when it is single-threaded? If so, then there can't be any improvement with --use-threads. Etc.

Suggest you file a bug with bugs.mysql.com. If nothing else, the documentation should say more than it does.

-----Original Message-----
From: Róbert Kohányi [mailto:kohanyi.rob...@gmail.com]
Sent: Tuesday, July 24, 2012 10:52 AM
To: mysql@lists.mysql.com
Subject: mysqlimport --use-threads / mysqladmin processlist

--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe: http://lists.mysql.com/mysql
Re: mysqlimport --use-threads / mysqladmin processlist
Yes, the rows are in primary key order; however, each row contains a specific integer primary key. I'm not inserting NULLs into a table whose primary key is auto-increment, so I don't see why concurrent inserts would fight over similar spots (although I'm admittedly not a MySQL hotshot, so the basis of my assumption is a *hunch* only).

I'm not sure (yet) whether a single-threaded operation would run into an I/O bottleneck. I haven't run mysqlimport with --use-threads=1 just yet (I will if I have the time), but when I ran it with --use-threads=4, the import (of a ~500 MB dump) took more time than running four different processes (I split my tab-delimited dumps with split into four even pieces and imported those in four different sessions).

Anyway, it seems that a simple import (from a dump which isn't tab-delimited, but contains complete or extended INSERTs) takes the same amount of time as a mysqlimport using --use-threads=4, and as it turns out splitting my tab-delimited dump is too complex to handle gracefully, because my data contains newline characters all over the place, so I've dropped the idea of this whole mysqlimport thing for now. (I'll try the method of migrating an InnoDB database to an NDB Cluster described here[1] instead.)

If I have the time I'll write up a bug report, or a documentation enhancement request, for this. Thanks for the input!

Regards,
Kohányi Róbert

[1]: http://johanandersson.blogspot.se/2012/04/mysql-cluster-how-to-load-it-with-data.html

On Wed, Jul 25, 2012 at 6:49 PM, Rick James rja...@yahoo-inc.com wrote:
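Splitting a --tab dump by raw line count is risky precisely because of those embedded newlines: with the default escaping (if I recall it correctly), SELECT ... INTO OUTFILE writes an embedded newline as a backslash followed by a real newline, so one record can span several physical lines. Below is a sketch of a record-aware round-robin splitter — a hypothetical helper, not something mysqldump or mysqlimport ships — that treats a line ending in an odd number of backslashes as a continuation of the same record:

```shell
# Sketch only: split a tab-delimited dump into 4 chunks on *record*
# boundaries.  Assumes mysqldump --tab's default escaping, where an
# embedded newline is dumped as a backslash followed by a real newline,
# so a line ending in an odd number of backslashes continues the record.
# Demo input: 4 records; the 2nd contains one escaped newline.
printf 'a\tb\nc\\\nd\ne\tf\ng\th\n' > dump.txt

awk '
  { buf = buf $0
    n = 0
    while (n < length(buf) && substr(buf, length(buf) - n, 1) == "\\") n++
    if (n % 2 == 1) { buf = buf "\n"; next }   # escaped newline: record continues
    print buf > ("chunk" (rec++ % 4) ".txt")   # round-robin across 4 chunk files
    buf = "" }
' dump.txt
```

Each chunk file can then be renamed to the table name and fed to a separate mysqlimport session, which is the four-process approach that outperformed --use-threads=4 above.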
mysqlimport --use-threads / mysqladmin processlist
I'm in the middle of migrating an InnoDB database to an NDB Cluster. I use mysqldump to first create two dumps: the first one contains only the database schema, the second one contains only tab-delimited data (via mysqldump --tab). I edit my InnoDB schema here and there (ENGINE=InnoDB to ENGINE=NDB, etc.), import it, and after this I import the InnoDB data *as is* using mysqlimport. I use it like this:

mysqlimport --local --use-threads=4 db dir/*.txt

(dir of course contains the tab-delimited data I dumped before.) The import starts, and I check its progress via mysqladmin, like this:

mysqladmin --sleep=1 processlist

This is what I see: http://pastebin.com/raw.php?i=M23fWVjc

Only a single process seems to be loading my data. Is this what I *should* see, or, in my case using 4 threads, should I see four processes? I'm not asking which one will be faster; I'm simply confused because I don't know what to expect. If I start four different mysqlimport processes, each one importing different files, then I can see four different processes in the mysql processlist.

If it matters, here is my server version (I use the official binaries):

Server version: 5.5.25a-ndb-7.2.7-gpl MySQL Cluster Community Server (GPL)

Regards,
Kohányi Róbert
mysqlimport doesn't work for me
<sigh> This works:

localhost> TRUNCATE t_dmu_history;
Query OK, 0 rows affected (0.41 sec)

localhost> LOAD DATA LOCAL INFILE 't_dmu_history.txt' INTO TABLE t_dmu_history
    FIELDS TERMINATED BY ','
    (`dm_history_dm_id`,`dm_history_customer_id`);
Query OK, 876211 rows affected (25.16 sec)
Records: 876211  Deleted: 0  Skipped: 0  Warnings: 0

localhost> SELECT * FROM t_dmu_history LIMIT 4;
+------------------+------------------+------------------------+
| t_dmu_history_id | DM_History_DM_ID | DM_History_Customer_ID |
+------------------+------------------+------------------------+
|                1 |            13071 |                 299519 |
|                2 |            13071 |                 299520 |
|                3 |            13071 |                 299521 |
|                4 |            13071 |                 299522 |
+------------------+------------------+------------------------+
4 rows in set (0.03 sec)

This does not work:

localhost> TRUNCATE t_dmu_history;
localhost> quit

C:\Users\Jerry\Documents\Access MySQL Production> mysqlimport --columns=`dm_history_dm_id`,`dm_history_customer_id` --fields-terminated=',' --local --password=xxx --pipe --user=access --verbose maintable_usa t_dmu_history.txt
Connecting to localhost
Selecting database maintable_usa
Loading data from LOCAL file: C:/Users/Jerry/Documents/Access MySQL Production/t_dmu_history.txt into t_dmu_history
maintable_usa.t_dmu_history: Records: 876211  Deleted: 0  Skipped: 0  Warnings: 1752422
Disconnecting from localhost

localhost> SELECT * FROM t_dmu_history LIMIT 4;
+------------------+------------------+------------------------+
| t_dmu_history_id | DM_History_DM_ID | DM_History_Customer_ID |
+------------------+------------------+------------------------+
|                1 |            13071 |                   NULL |
|                2 |            13071 |                   NULL |
|                3 |            13071 |                   NULL |
|                4 |            13071 |                   NULL |
+------------------+------------------+------------------------+
4 rows in set (0.00 sec)

=============================

Before you ask, the mysql CLI is also using a named pipe.

Windows Vista 32-bit
MySQL version 5.1.31-community
mysqlimport Ver 3.7 Distrib 5.1.31, for Win32 (ia32)

What am I missing?

Regards,

Jerry Schwartz
Global Information Incorporated
195 Farmington Ave.
Farmington, CT 06032
860.674.8796 / FAX: 860.674.8341
E-mail: je...@gii.co.jp
Web site: www.the-infoshop.com
Re: mysqlimport doesn't work for me
It's been a long time since I used mysqlimport, but you might want to try:

- using --fields-terminated-by rather than --fields-terminated
- losing (or escaping) the backticks in --columns=
- checking my.cnf to see if the client settings are the same for mysql and mysqlimport
- checking user privileges (are you using the same account in both instances?)
- checking the line delimiter and --lines-terminated-by

FWIW, I always prefer tab-delimited files over comma-separated ones. This gets around a lot of i18n issues.

/ Carsten

Den 03-01-2011 19:33, Jerry Schwartz skrev:
RE: mysqlimport doesn't work for me
-----Original Message-----
From: Carsten Pedersen [mailto:cars...@bitbybit.dk]
Sent: Monday, January 03, 2011 1:48 PM
To: Jerry Schwartz
Cc: 'mos'; mysql@lists.mysql.com
Subject: Re: mysqlimport doesn't work for me

> It's been a long time since I used mysqlimport, but you might want to try:
> - using --fields-terminated-by rather than --fields-terminated

[JS] Good catch! Unfortunately, it didn't fix the problem:

Connecting to localhost
Selecting database maintable_usa
Deleting the old data from table t_dmu_history
Loading data from LOCAL file: C:/Users/Jerry/Documents/Access_MySQL Tests/t_dmu_history.txt into t_dmu_history
maintable_usa.t_dmu_history: Records: 1  Deleted: 0  Skipped: 0  Warnings: 2
Disconnecting from localhost

I don't know if there's any way to find out what those warnings are.

> - losing (or escaping) the backticks in --columns=

[JS] The loons who designed this system were fond of putting spaces in the table names. I've tried this particular table with and without the backticks. Many of the table and field names are in Japanese, too. I shudder to think how that will work out.

> - checking my.cnf to see if the client settings are the same for mysql and mysqlimport

[JS] Good thought.

> - checking user privileges (are you using the same account in both instances?)

[JS] Yes.

> - checking the line delimiter and --lines-terminated-by
> FWIW, I always prefer tab-delimited files over comma-separated ones. This gets around a lot of i18n issues.

[JS] No doubt.
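Two warnings per row with every second column coming back NULL is the classic signature of a field terminator that never matches (on Windows cmd.exe, the quotes in --fields-terminated=',' are passed through literally rather than stripped as a Unix shell would). A small illustration, with awk standing in for the server's field parser — this is a hypothetical demo, not mysqlimport itself:

```shell
# Hypothetical demo: when the field separator does not match the data,
# the whole line lands in the first field and the remaining fields come
# back empty/NULL -- the same symptom as the warnings above.
printf '13071,299519\n' > row.txt
ok=$(awk -F',' '{print $2}' row.txt)     # correct separator: second field found
bad=$(awk -F'\t' '{print $2}' row.txt)   # wrong separator: second field empty
echo "ok=$ok bad=$bad"
```

If the terminator is right, the second column should populate; if it is wrong, every row produces one warning per unfilled column.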
Re: mysqlimport remote host problem
zcat /Backup/Latest/alba2_2009-03-10_00h45m.Tuesday.sql.gz | mysql -h 192.168.0.224 -u root -p alba2

Cheers
Claudio

2009/3/11 Rob Wultsch wult...@gmail.com:
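For an unattended nightly job, it's worth making the pipeline fail loudly if either side dies, so a half-imported dump doesn't pass silently. A defensive sketch — the demo round-trips a file locally instead of talking to a real server, and the mysql command at the end is the one from the thread, shown as a comment:

```shell
# Sketch for a nightly job: gzip -dc (like zcat) decompresses to stdout
# without removing the .gz, and pipefail aborts the script if either
# side of the pipe fails.
set -e
(set -o pipefail) 2>/dev/null && set -o pipefail   # bash/ksh option; absent in some /bin/sh

printf 'CREATE DATABASE IF NOT EXISTS alba2;\n' > demo.sql
gzip -c demo.sql > demo.sql.gz        # .gz created; demo.sql is kept
out=$(gzip -dc demo.sql.gz)           # decompression leaves demo.sql.gz in place
echo "$out"

# Real job would be:
# gzip -dc /Backup/Latest/alba2_2009-03-10_00h45m.Tuesday.sql.gz | mysql -h 192.168.0.224 -u root -p alba2
```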
Re: mysqlimport remote host problem
Thanks. That works great.

On 10-Mar-09, at 9:36 PM, Rob Wultsch wrote:
mysqlimport remote host problem
I'm writing a script that, each night, copies a small database to my laptop on the local network. I'm having trouble getting it to work. Here's my syntax so far (run on the server):

mysqlimport --host=192.168.0.224 --user=root --password alba2 alba2_2009-03-10_00h45m.Tuesday.sql

Which produces:

---
mysqlimport: Error: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '-03-10_00h45m IGNORE 0 LINES' at line 1, when using table: alba2_2009-03-10_00h45m
---

The sql file is produced by automysqlbackup... Not sure what I'm missing, probably something obvious. Anyway, here's the first part of the sql file I want to import:

-- MySQL Administrator dump 1.4
--
-- Server version 5.0.67-log

/*!40101 SET @OLD_CHARACTER_SET_CLIENT=@@CHARACTER_SET_CLIENT */;
/*!40101 SET @OLD_CHARACTER_SET_RESULTS=@@CHARACTER_SET_RESULTS */;
/*!40101 SET @OLD_COLLATION_CONNECTION=@@COLLATION_CONNECTION */;
/*!40101 SET NAMES utf8 */;
/*!40014 SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS=0 */;
/*!40014 SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS=0 */;
/*!40101 SET @OLD_SQL_MODE=@@SQL_MODE, SQL_MODE='NO_AUTO_VALUE_ON_ZERO' */;

--
-- Create schema alba2
--

CREATE DATABASE IF NOT EXISTS alba2;
USE alba2;

--
-- Definition of table `alba2`.`accounts`
--

DROP TABLE IF EXISTS `alba2`.`accounts`;
CREATE TABLE `alba2`.`accounts` (
  `id` smallint(5) unsigned NOT NULL auto_increment,
  `fp` varchar(40) NOT NULL,
  `created` int(10) unsigned NOT NULL default '0',
  `status` enum('Active','Inactive') NOT NULL default 'Active',
  `account_name` varchar(40) NOT NULL,
  `account_full_name` varchar(40) NOT NULL,
  `address` varchar(40) NOT NULL,
  `city` varchar(40) NOT NULL,
  `province` varchar(10) NOT NULL,
  `postcode` varchar(10) NOT NULL,
  UNIQUE KEY `id` (`id`)
) ENGINE=MyISAM AUTO_INCREMENT=10 DEFAULT CHARSET=utf8;

--
-- Dumping data for table `alba2`.`accounts`
--
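The error makes sense once you know that mysqlimport is a wrapper around LOAD DATA INFILE: it expects a file of row data, not a SQL dump, and it takes the target table name from the file name up to the first period. The shell expansion below mimics that derivation (a sketch of the behavior, not mysqlimport's actual code):

```shell
# mysqlimport derives the table name from the file name up to the first
# period, which is where the broken table name in the error came from;
# the unquoted dashes in it then break the generated LOAD DATA statement.
f='alba2_2009-03-10_00h45m.Tuesday.sql'
table=${f%%.*}
echo "$table"
```

Since the file here is a SQL dump rather than tab-delimited data, feeding it to the mysql client (mysql ... alba2 < file.sql) is the right tool regardless of the name.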
Re: mysqlimport remote host problem
OK, I've managed to do the same thing with just the mysql command line program:

mysql -h 192.168.0.224 -u root -p alba2 < /Backup/Latest/alba2_2009-03-10_00h45m.Tuesday.sql

Works great. However, the sql file is normally gzipped, so... Can I ungzip the file on the fly (and without removing the .gz version) and pipe the contents as I did above? (Yes, I'm UNIX-impaired.) Something like:

mysql -h 192.168.0.224 -u root -p alba2 < gzip -dc /Backup/Latest/alba2_2009-03-10_00h45m.Tuesday.sql.gz

But so that it works...

...Rene

On 10-Mar-09, at 7:38 PM, René Fournier wrote:
Re: mysqlimport remote host problem
Hi Rene, just a heads-up: you might want to keep your username/password credentials private.

On Tue, Mar 10, 2009 at 10:16 PM, René Fournier m...@renefournier.com wrote:
Re: mysqlimport remote host problem
Note the space after the -p: alba2 will be the default database *after* he is prompted for, and correctly gives, the password for r...@whateverhishostis. As he did not give the password, and is not connecting to a host that one could reach from the net, he really has not given out particularly useful info if we had ill intents.

Sent from my Verizon Wireless BlackBerry

-----Original Message-----
From: Darryle Steplight dstepli...@gmail.com
Date: Tue, 10 Mar 2009 22:20:26
To: René Fournier m...@renefournier.com
Cc: mysql mysql@lists.mysql.com
Subject: Re: mysqlimport remote host problem
Re: mysqlimport remote host problem
On Tue, Mar 10, 2009 at 7:16 PM, René Fournier m...@renefournier.com wrote:

Pipe is your friend. You probably want something like:

gzip -dc /Backup/Latest/alba2_2009-03-10_00h45m.Tuesday.sql.gz | mysql -h 192.168.0.224 -u root -p alba2
mysqlimport huge table
Hi,

I have a huge table with about 50 million rows. I dumped the table using mysqldump -T, as a text dump, and am now trying to import it into a database on another server, but it keeps hanging. Are there any options or server variables I can set to help out with it?

Thanks,
Shuly.
Re: mysqlimport huge table
Hello Shuly,

Try this: http://www.mysqlperformanceblog.com/2008/07/03/how-to-load-large-files-safely-into-innodb-with-load-data-infile/

On Tue, Feb 24, 2009 at 1:08 PM, Shuly Avraham sh...@cshl.edu wrote:

--
Baron Schwartz, Director of Consulting, Percona Inc.
Our Blog: http://www.mysqlperformanceblog.com/
Our Services: http://www.percona.com/services.html
Re: mysqlimport huge table
Thanks for the resource! However, I am loading into a MyISAM table, with logging disabled. As a matter of fact, I was able to load the data in the end, after disabling the keys. But now, as I try to create the keys, it takes a huge amount of time: there are 250 million rows in one of the tables. My key_buffer_size is set to 4G, on a machine with 16G of RAM. Are there any other system variables I could set in order to speed it up?

Thanks,
Shuly.

On Feb 24, 2009, at 5:17 PM, Baron Schwartz wrote:
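Re-enabling keys on a MyISAM table of that size normally goes through MySQL's sort-based index rebuild ("Repair by sorting"), which is governed less by key_buffer_size than by the MyISAM sort settings. A hedged my.cnf sketch — the values and the tmpdir path are illustrative assumptions for a 16G box, not tuned recommendations:

```ini
[mysqld]
# Buffer used when rebuilding indexes by sorting (illustrative value):
myisam_sort_buffer_size   = 2G
# If the projected sort file exceeds this cap, MySQL falls back to the
# much slower key-cache method ("Repair with keycache"), so keep it large:
myisam_max_sort_file_size = 100G
# The on-disk sort spills here; placeholder path -- pick a filesystem
# with plenty of free space:
tmpdir = /data/mysqltmp
```

Whether the rebuild used the sort path or the keycache path shows up in the State column of SHOW PROCESSLIST while the ALTER/ENABLE KEYS runs.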
mysqlimport failed to import
I created a csv file entitled 'disposed.csv' and placed it in the computer_inventory data folder with the following inside: 1087 1046 1086 1161 1049 1178 1029 1030 1224 1044 1106 Now I created the table 'disposed' as follows: Create disposed ( Mot_id INT(4) UNIQUE NOT NULL ); Then I issued the command inside the computer_inventory data folder as follows: Mysqlimport -u root -p computer_inventory disposed.csv And got the error: Mysqlimport: Error: Data truncated for column 'mot_id' at row 1, when using table: disposed What am I doing wrong?
mysqlimport/load data infile is using a temp file - why?
Hi, I've been trying to import a 10G dump file using mysqlimport and it is eventually failing because it runs out of tmpdir space -- I get Errcode: 28. I was surprised that it was using a temp file at all. I've looked in the documentation and other sources but have not been able to find anything about this file. This is on v5.0.51. The table is innodb and has 75 columns with 65% of them tinyint, 20% float, and the rest char,datetime,int. The primary key is composite on two columns (int,char). The data file that is being imported is on the database server. I'll try this again and monitor the status parameters to see what it is doing, but I was wondering if anyone already knows? Thanks, John -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Re: mysqlimport/load data infile is using a temp file - why?
MySQL uses tmpdir whenever there is any index creation. regards anandkl On 8/21/08, jthorpe [EMAIL PROTECTED] wrote: Hi, I've been trying to import a 10G dump file using mysqlimport and it is eventually failing because it runs out of tmpdir space -- I get Errcode: 28.
Re: mysqlimport/load data infile is using a temp file - why?
You should increase the parameters named bulk_insert_buffer_size and max_allowed_packet. On 8/21/08, Ananda Kumar [EMAIL PROTECTED] wrote: MySQL uses tmpdir whenever there is any index creation. regards anandkl -- I'm a MySQL DBA in china. More about me just visit here: http://yueliangdao0608.cublog.cn
mysqlimport load data infile
I read about mysqlimport / LOAD DATA INFILE for MySQL, but I can't find a way to import a text file using the length of each column instead of a delimiter. My text file contains fixed-length columns. I can use MS Excel to convert all the files to .csv format and import them, but that would take a long time and I would have to escape the delimiter. So, is there a way to import a text file with fixed column sizes into MySQL? Thanks
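For what it's worth, LOAD DATA INFILE does have a fixed-row-format mode (used when both FIELDS TERMINATED BY and FIELDS ENCLOSED BY are set to empty strings), but it keys off the columns' display widths and is easy to get wrong. A small pre-processing script is often simpler. The sketch below is hypothetical (the widths and sample rows are made up); it slices each line at fixed offsets and emits tab-delimited rows that mysqlimport can load with its default settings.

```python
# Hedged sketch: slice fixed-width lines at given column widths and emit
# tab-delimited rows for mysqlimport. Widths and sample rows are made up;
# substitute your real record layout.
def fixed_width_to_tsv(lines, widths):
    rows = []
    for line in lines:
        pos, fields = 0, []
        for width in widths:
            fields.append(line[pos:pos + width].strip())
            pos += width
        rows.append("\t".join(fields))
    return rows

if __name__ == "__main__":
    sample = ["1087      Dell  2006",
              "1046      HP    2007"]
    for row in fixed_width_to_tsv(sample, [10, 6, 4]):
        print(row)
```

Write the output to a file named after the target table and feed it to mysqlimport as usual.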
Problem with mysqlimport and timestamp
Hi- I'm using the command 'mysqlimport -u usr -ppassh -h mysqlserver -P 3306 -v db --local $workdir/$filename' to import a table into mysql from a file $filename. The data in $filename looks something like: test test where there is a blank space between the two 'test's to represent an empty column. This column is of type datetime NULL DEFAULT NULL in the database. The blank space between the two 'test's gets imported as (err) instead of NULL. Is there a way I can import empty column data as NULL? I'd like there to be a NULL for every row for which this column is empty. I'm not sure how to do this given the format of my $filename. As a check, I tried inserting a row at the top of $filename like test The row was imported correctly, in that there was a NULL in the timestamp column in the database, as well as a NULL in the next column. So this may have something to do with the format of $filename, but I may not be able to control the format of that file. Thanks.
Re: Problem with mysqlimport and timestamp
qt4x11 wrote: Is there a way I can import empty column data as NULL? I'd like there to be a NULL for every row for which this column is empty. You should alter your data like so: test\N test That is, separate the fields with a tab and place \N wherever you want a NULL. As it is, the data is being misinterpreted, which is why the second import you mentioned worked--there was nothing after the first field to be erroneously put into the timestamp field. If the data comes from somewhere else you will need to parse it out and re-write it using proper formatting, I'm afraid. brian
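Brian's \N advice can be automated when the file arrives without proper formatting. A minimal sketch (the tab separator is an assumption about the incoming format): rewrite each line so empty fields become \N, which LOAD DATA and mysqlimport interpret as SQL NULL.

```python
# Hedged sketch: rewrite each line so empty fields become \N, which
# LOAD DATA / mysqlimport read as SQL NULL. The tab separator is an
# assumption about the incoming file format.
def nullify_empty_fields(line, sep="\t"):
    return sep.join(field if field else r"\N" for field in line.split(sep))

if __name__ == "__main__":
    # two fields with an empty datetime column between them
    print(nullify_empty_fields("test\t\ttest"))
```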
Re: [Fwd: search issue] mysqlimport problem with , inside fields
Scott Hamm wrote: Line 48: 48, 14.729606, 10.1.1.22, 10.182.167.209, TCP, pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 Is the line 48 is different than other lines? -- View this message in context: http://www.nabble.com/mysqlimport-problem-with-%2C-inside-fields-tf3249141.html#a9055104 Sent from the MySQL - General mailing list archive at Nabble.com. -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
mysqlimport problem with , inside fields
I've been trying to import fields that contain the comma character ',' inside double quotes ("), with the following results:

code
mysqlimport --fields-optionally-enclosed-by=\" --fields-terminated-by=, --lines-terminated-by=\r\n --ignore-lines=1 --user=root --password shark c:\documents and settings\shamm\desktop\result.csv
Enter password: **
mysqlimport: Error: Row 48 was truncated; it contained more data than there were input columns, when using table: result
Line 48: 48, 14.729606, 10.1.1.22, 10.182.167.209, TCP, pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460

mysql> desc result;
+-------------+------------------+------+-----+---------+-------+
| Field       | Type             | Null | Key | Default | Extra |
+-------------+------------------+------+-----+---------+-------+
| No          | int(10) unsigned | NO   |     |         |       |
| Time        | text             | NO   |     |         |       |
| Source      | text             | NO   |     |         |       |
| Destination | text             | NO   |     |         |       |
| Protocol    | text             | NO   |     |         |       |
| Info        | text             | NO   |     |         |       |
+-------------+------------------+------+-----+---------+-------+
6 rows in set (0.03 sec)

C:\Documents and Settings\shamm> mysql --version
mysql Ver 14.12 Distrib 5.0.26, for Win32 (ia32)
/code

-- `Twas brillig, and the slithy toves Did gyre and gimble in the wabe: All mimsy were the borogoves, And the mome raths outgrabe.
Re: mysqlimport problem with , inside fields
Hey Scott - I dont think you can use , with mysqlimport as a field separator if it is part of the data. use something else - I used the pipe | character... This is what worked for me: C:\mysqlimport --fields-enclosed-by= --fields-terminated-by=| --lines-terminated-by=\r\n --ignore-lines=1 --user=root --password test c:\result.csv --verbose=TRUE --ignore=tr ue Enter password: Connecting to localhost Selecting database test Loading data from SERVER file: c:/result.csv into result test.result: Records: 7 Deleted: 0 Skipped: 0 Warnings: 0 Disconnecting from localhost C:\type result.csv 48| 14.729606| 10.1.1.22| 10.182.167.209| TCP| pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 48| 14.729606| 10.1.1.22| 10.182.167.209| TCP| pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 48| 14.729606| 10.1.1.22| 10.182.167.209| TCP| pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 48| 14.729606| 10.1.1.22| 10.182.167.209| TCP| pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 48| 14.729606| 10.1.1.22| 10.182.167.209| TCP| pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 48| 14.729606| 10.1.1.22| 10.182.167.209| TCP| pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 48| 14.729606| 10.1.1.22| 10.182.167.209| TCP| pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 mysql select info from result; ++ | info | ++ | pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 | | pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 | | pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 | | pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 | | pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 | | pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 | | pop3 [SYN, ACK] Seq=0 Ack=1 Win=16384 Len=0 MSS=1460 | ++ 7 rows in set (0.00 sec) Let me know how it goes.. 
Anoop (anokun7) On 2/18/07, Scott Hamm [EMAIL PROTECTED] wrote: I've been trying to import fields that contain the comma character ',' inside double quotes. -- Thanks and best regards, Anoop
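If switching the delimiter at the source isn't an option, the file can be rewritten before import. A hedged sketch using Python's csv module (the sample line is adapted from the thread; skipinitialspace=True is an assumption based on the space after each comma in the example): parse the comma-separated input with quote handling, then emit tab-delimited output so mysqlimport needs no enclosure options at all.

```python
# Hedged sketch: let Python's csv module handle commas inside double-quoted
# fields, then re-emit the rows tab-delimited so mysqlimport needs no
# --fields-enclosed-by option. skipinitialspace=True is an assumption based
# on the space after each comma in the sample data.
import csv
import io

def csv_to_tsv(text):
    reader = csv.reader(io.StringIO(text), skipinitialspace=True)
    return ["\t".join(row) for row in reader]

if __name__ == "__main__":
    line = '48, 14.729606, 10.1.1.22, 10.182.167.209, TCP, "pop3 [SYN, ACK] Seq=0 Ack=1"\n'
    print(csv_to_tsv(line)[0])
```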
Mysqlimport complex question
/* OBJECTIVE: INPUT: E524541015.txt:20061004|,535,999|Scot|Hamm|,410|||101 Walter Rd|Dover|MD|76709|,041| WHERE error (,###) can be in any fields *AFTER* the first | char DESIRED OUTPUT: filename: E524541015.txt ord: 20061004 error: 535 error1: 999 error2: 410 error3: 041 error4: error5: Explanation: filename: 14 chars before : ord: 8 chars after : error#: 3 chars (first error MUST NOT be null) after , Table specifications: */ create table all_files ( ID int auto_increment primary key, filename varchar(255) not null, ord int(8) not null, error int(3) not null, error1 int(3), error2 int(3), error3 int(3), error4 int(3), error5 int(3), unique key(filename,ord), index(filename) ); /* Been trying to get mysqlimport to use these characters to no avail, how do I get around to it? */
Re: Mysqlimport complex question
On 2006-10-04 Scott Hamm wrote: OBJECTIVE: INPUT: E524541015.txt:20061004|,535,999|Scot|Hamm|,410|||101 Walter Rd|Dover|MD|76709|,041| ... Been trying to get mysqlimport to use these characters to no avail, how do I get around it? I can't answer your question but think that you're following the wrong approach; instead, follow the simple rule: use SQL for storing and calculating and a script language for parsing and formatting. It's quite easy with regular expressions (they can be made more readable with /x): cat input.txt | perl -n -e '/^(.{14}):(.{8})\|,(\d+),(\d+)\|[^\|]+\|[^\|]+\|,(\d+)\|.*\|,(\d+)\|$/ and print "INSERT INTO t VALUES (\"$1\",$2,$3,$4,$5,$6);\n";' The output is ready to be piped into the mysql client: INSERT INTO t VALUES ("E524541015.txt",20061004,535,999,410,041); bye, -christian-
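The same extraction works in any scripting language. A sketch of the parsing step (the field layout follows the OBJECTIVE above: a 14-char filename, ':', an 8-char ord, then '|'-separated fields where an error appears as ',NNN'; the padding to six mirrors the error..error5 columns):

```python
# Hedged sketch: pull the filename, ord, and up to six ",NNN" error codes
# out of one input line, per the layout described in the OBJECTIVE comment.
import re

def parse_line(line):
    head, rest = line.split("|", 1)
    filename, ord_ = head.split(":")
    errors = re.findall(r",(\d{3})", rest)          # ",NNN" in any later field
    return filename, ord_, errors + [None] * (6 - len(errors))

if __name__ == "__main__":
    print(parse_line(
        "E524541015.txt:20061004|,535,999|Scot|Hamm|,410|||101 Walter Rd|Dover|MD|76709|,041|"))
```

The returned tuple maps directly onto the all_files columns, with None becoming SQL NULL for the unused error slots.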
RE: Questions about using mysqlimport to update a table.
I don't think you can do this with mysqlimport. It wouldn't be hard to do with Perl or PHP, though, and that could be automated any way you want with a shell script. Regards, Jerry Schwartz Global Information Incorporated 195 Farmington Ave. Farmington, CT 06032 860.674.8796 / FAX: 860.674.8341 -Original Message- From: Joe User [mailto:[EMAIL PROTECTED] Sent: Wednesday, September 20, 2006 3:16 PM To: mysql@lists.mysql.com Subject: Questions about using mysqlimport to update a table.
Questions about using mysqlimport to update a table.
I need to update a table with the contents of a CSV file regularly, I've used mysqlimport to load all the initial data, but I have a problem with using it for updates. The data in the CSV file does not contain all of the data in the table, there is a field that is updated by another application as well. I need to be able to get updates to the data that is from the CSV file without deleting the data that is not present from those rows. If I run it with --ignore, I don't get the updates to rows that are already present, and if I run it with --replace, I lose the data that wasn't contained within the CSV file. I was really hoping to be able to use mysqlmport for this, since I need to schedule these updates fairly regularly and would like to be able to automate that process. Is there something I am missing that will make this work, or do I need to go about it in another way? Thanks
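One common workaround (not something mysqlimport can do by itself): load the CSV into a staging table, then merge with INSERT ... ON DUPLICATE KEY UPDATE so columns that are not in the CSV keep their current values. A hedged SQL sketch; the table and column names are hypothetical, and it assumes MySQL 4.1+ and a suitable unique key on the target table:

```sql
-- Hedged sketch: table and column names are hypothetical.
CREATE TABLE staging LIKE target;            -- or define only the CSV's columns
LOAD DATA LOCAL INFILE 'updates.csv'
    INTO TABLE staging
    FIELDS TERMINATED BY ',';
INSERT INTO target (id, col_a, col_b)
    SELECT id, col_a, col_b FROM staging
    ON DUPLICATE KEY UPDATE
        col_a = VALUES(col_a),
        col_b = VALUES(col_b);               -- columns not listed are untouched
DROP TABLE staging;
```

The whole sequence can go in a .sql file and be run from cron or a shell script, which keeps the scheduling requirement.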
Re: mysqlimport csv file import problem
On Sun, 27 Aug 2006, Jim Seymour wrote: Hi, I download a csv file from Yahoo in this format: ABIAX 20.63 2006-08-3 ACEIX 8.78 2006-08-3 CIGAX 10.08 2006-08-3 FSCTX 22.25 2006-08-3 GGOAX 20.55 2006-08-3 HWLAX 23.3 2006-08-3 HWMAX 28.74 2006-08-3 MLEIX 96.37 2006-08-3 NBPBX 18.98 2006-08-3 PSVIX 32.43 2006-08-3 PTRAX 10.3 2006-08-3 RGACX 30.89 2006-08-3 ^DJI 11242.6 2006-08-3 ^IXIC 2092.34 2006-08-3 ^GSPC 1280.27 2006-08-3 My table for this data is in this format | 2006-08-02 | 20.72 | 8.81 | 10.08 | 22.19 | 20.48 | 23.19 | 28.52 | 96.21 | 18.87 | 32.14 | 10.31 | 30.95 | 11199.93 | 2078.81 | 1278.55 | Is there a way to get mysqlimport to pull the data from specific column/row to insert into a specified field? Trying to find an easier way than typing all of the data into a text file for import. Thanks, Hi Jim, that needs some preprocessing, but 'sed' is your friend. You could use some shell script doing the work for you: #!/bin/sh BLANK=' ' TAB=' ' WHITESPACE=${BLANK}${TAB} DATE=$(cat mydatafile.csv | sed \ -e 3 !d \ -e s/^[$WHITESPACE]*// \ -e s/.*/''/) DATA=$(cat mydatafile.csv | sed \ -e /[A-Z]/ d \ -e /-/ d \ -e s/^[$WHITESPACE]*// \ -e s/.*/''/ | tr '\012' ',' | sed \ -e s/,*$//) echo INSERT INTO mytable VALUES(${DATE},$DATA); exit 0 # end of shell script The resulting queries can be piped into the mysql client. HTH, Thomas Spahni -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Re: mysqlimport csv file import problem
On Mon, 2006-08-28 at 13:47 +0200, Thomas Spahni wrote: that needs some preprocessing, but 'sed' is your friend. You could use some shell script doing the work for you. Thanks Thomas, From what I had read it looked like I would have to do something like this. This will save me a lot of time. Thanks Again, -- Jim Seymour [EMAIL PROTECTED]
mysqlimport csv file import problem
Hi, I download a csv file from Yahoo in this format: ABIAX 20.63 2006-08-3 ACEIX 8.78 2006-08-3 CIGAX 10.08 2006-08-3 FSCTX 22.25 2006-08-3 GGOAX 20.55 2006-08-3 HWLAX 23.3 2006-08-3 HWMAX 28.74 2006-08-3 MLEIX 96.37 2006-08-3 NBPBX 18.98 2006-08-3 PSVIX 32.43 2006-08-3 PTRAX 10.3 2006-08-3 RGACX 30.89 2006-08-3 ^DJI 11242.6 2006-08-3 ^IXIC 2092.34 2006-08-3 ^GSPC 1280.27 2006-08-3 My table for this data is in this format | 2006-08-02 | 20.72 | 8.81 | 10.08 | 22.19 | 20.48 | 23.19 | 28.52 | 96.21 | 18.87 | 32.14 | 10.31 | 30.95 | 11199.93 | 2078.81 | 1278.55 | Is there a way to get mysqlimport to pull the data from specific column/row to insert into a specified field? Trying to find an easier way than typing all of the data into a text file for import. Thanks, -- Jim Seymour [EMAIL PROTECTED]
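The sed approach above works; for reference, the same pivot is only a few lines in a general-purpose language. A hypothetical sketch (it assumes every input line is "SYMBOL price date" and that the symbols arrive in a stable order matching the table's column order):

```python
# Hedged sketch: pivot per-symbol quote lines ("SYMBOL price date") into one
# wide row per date, matching the one-row-per-day table shown above.
# Assumes the symbol order is stable between downloads.
def pivot_quotes(lines):
    prices, date = [], None
    for line in lines:
        _symbol, price, date = line.split()
        prices.append(price)
    return date, prices

if __name__ == "__main__":
    date, prices = pivot_quotes(["ABIAX 20.63 2006-08-3", "ACEIX 8.78 2006-08-3"])
    print(date + "\t" + "\t".join(prices))   # one tab-delimited row for import
```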
Re: How to disable foreign_key_checks when using mysqlimport?
Gabriel PREDA schrieb: Try: ALTER TABLE `tbl_name` DISABLE KEYS; -- now insert in the TXT file ALTER TABLE `tbl_name` ENABLE KEYS; I think this is what you were looking for ! That would have been a possibility. I did it this way now: ... echo "set sql_log_bin=0; set foreign_key_checks=0; use $db; load data infile \"$txtfilename\" into table $tablename fields enclosed by '\"';" | $MYSQL_BIN $MYSQL_CONNECT ... The script performs several checks before this statement and puts the txt file in the appropriate directory. Regards Dominik
Re: How to disable foreign_key_checks when using mysqlimport?
mysqldump -u [user] -h [host] -p [database] [table] > fixme.sql This is for one table. As I need it for all my tables in all my databases, I'd have to write a script for that. And as --tab uses less space, I prefer the --tab option for mysqldump.
How to disable foreign_key_checks when using mysqlimport?
Hello When I re-insert dumped data with mysql < file.sql, I can simply put set foreign_key_checks=0; at the beginning of the file and this works fine. How can I achieve this when inserting a text file that is read with mysqlimport? I tried to put the mentioned SQL statement in the txt file, but it does not affect anything. Background: One of my developers accidentally dropped a table yesterday which had to be re-created from a dump. I was sort of lucky, as the table was in a not-too-large database, so I could just open the dumpfile of that database and get the lines I needed to recreate the table and data. If I imagine this happening on a larger database, which results in several-GB dumpfiles, it would have taken MUCH longer to get that table back. So now I'm playing with mysqldump --tab, which gives nice per-table data and structure files. So if there's any other well-known solution for per-table dumpfiles, let me know. I'm not too keen on writing something myself right now. Thanks for your help. Dominik
Re: How to disable foreign_key_checks when using mysqlimport?
On Thursday 20 July 2006 04:10 am, Dominik Klein wrote: Hello When I re-insert dumped data with mysql < file.sql, I can simply put set foreign_key_checks=0; at the beginning of the file and this works fine. So do it that way So if there's any other well-known solution for per-table dumpfiles, let me know. I'm not too keen on writing something myself right now. mysqldump -u [user] -h [host] -p [database] [table] > fixme.sql mysql -u [user] -h [host] -p [database] < fixme.sql and for the paranoid, you can always use the ever efficient test database to test things first: mysql -u [user] -h [host] -p test < fixme.sql Thanks for your help. Dominik -- Chris White PHP Programmer Interfuel
Re: How to disable foreign_key_checks when using mysqlimport?
Try: ALTER TABLE `tbl_name` DISABLE KEYS; -- now insert in the TXT file ALTER TABLE `tbl_name` ENABLE KEYS; I think this is what you were looking for ! -- Gabriel PREDA Senior Web Developer -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
How can I show the contents of warning using mysqlimport?
I imported the data of the table using mysqlimport. I used the following command line: $ mysqlimport -ukawabe -paaa -h192.168.1.92 --local kawabe C:\index_test.txt kawabe.index_test: Records: 4 Deleted: 0 Skipped: 4 Warnings: 2 I want to show the contents of the warnings, so I executed the following: $ mysql -ukawabe -paaa -h192.168.1.92 -e "show warnings;" But I cannot show the contents of the warnings. If I use LOAD DATA INFILE ... I can show them. How can I show them for mysqlimport?
Re: mysqlimport, \r\n and \n
A batch script or shell script can easily be written to do this. -Sheeri On 2/20/06, Daniel Kasak [EMAIL PROTECTED] wrote: I've got some import scripts that are giving me trouble. Some MOFOs keep changing the format of the data they give us, and sometimes I lose half the records.
Re: mysqlimport, \r\n and \n
On 2/22/06, sheeri kritzer [EMAIL PROTECTED] wrote: A batch script or shell script can easily be written to do this. -Sheeri I would stick with the VBA solution, heh; if you use it you can even call Word and use save as, in fact converting your document to a plain ol' Windows text file... Then I guess you can go back to Access and just let it roll... -- Daniel da Veiga Computer Operator - RS - Brazil
RE: mysqlimport, \r\n and \n
Daniel, What about just running your import file through dos2unix before piping it to mysql? That way, it would leave \n alone (since that is unix format) and replace any occurrences of \r\n with plain old \n. -- George -Original Message- From: Daniel Kasak [mailto:[EMAIL PROTECTED] Sent: Monday, February 20, 2006 11:21 PM To: mysql@lists.mysql.com Subject: mysqlimport, \r\n and \n
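The dos2unix idea can also be done portably in a few lines, which helps when the import runs on Windows. A minimal sketch: collapse \r\n to \n so both input variants load identically with LINES TERMINATED BY '\n' (it deliberately leaves bare \r, the classic Mac convention, alone):

```python
# Hedged sketch: normalize Windows line endings to Unix ones before import,
# so LOAD DATA can always use LINES TERMINATED BY '\n'.
# Bare \r (classic Mac endings) is deliberately not touched here.
def normalize_newlines(data: bytes) -> bytes:
    return data.replace(b"\r\n", b"\n")

if __name__ == "__main__":
    mixed = b"a\t1\r\nb\t2\nc\t3\r\n"   # made-up two-column rows, mixed endings
    print(normalize_newlines(mixed))
```

Working on bytes rather than text avoids Python's own newline translation mangling the file on the way through.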
Re: mysqlimport, \r\n and \n
George Law wrote: Daniel, What about just running your import file through dos2unix before piping it to mysql? That way, it would leave \n alone (since that is unix format) and replace any occurrances of \r\n with plan old \n. I could think of a lot of things I could do if this were all running on Linux. Unfortunately the import routine is being triggered from MS Access ... and come to think of it, I'm using 'load data infile' and not 'mysqlimport', but anyway, you get the idea. AFAIK there is no way to trigger anything useful via ODBC. I could write a Perl script, chuck it in my cgi-bin folder, and opening Firefox from Access, passing the script some pointers to the file. But that's dodgy. I suppose while I'm working with Access I'd better get used to dodgy solutions, eh? ;-) -- Daniel Kasak IT Developer NUS Consulting Group Level 5, 77 Pacific Highway North Sydney, NSW, Australia 2060 T: (+61) 2 9922-7676 / F: (+61) 2 9922 7989 email: [EMAIL PROTECTED] website: http://www.nusconsulting.com.au -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Re: mysqlimport, \r\n and \n
A little less dodgy is to write a VBA routine within Access to do the filtering. Alternatively, if you don't want to trust access to do it, you can write a VBScript or JScript routine and run it through the shell (yes, Windoze has shell scripts, too) There are all kinds of things you can do. Shawn Green Database Administrator Unimin Corporation - Spruce Pine Daniel Kasak [EMAIL PROTECTED] wrote on 02/21/2006 04:13:28 PM: George Law wrote: Daniel, What about just running your import file through dos2unix before piping it to mysql? That way, it would leave \n alone (since that is unix format) and replace any occurrances of \r\n with plan old \n. I could think of a lot of things I could do if this were all running on Linux. Unfortunately the import routine is being triggered from MS Access ... and come to think of it, I'm using 'load data infile' and not 'mysqlimport', but anyway, you get the idea. AFAIK there is no way to trigger anything useful via ODBC. I could write a Perl script, chuck it in my cgi-bin folder, and opening Firefox from Access, passing the script some pointers to the file. But that's dodgy. I suppose while I'm working with Access I'd better get used to dodgy solutions, eh? ;-) -- Daniel Kasak IT Developer NUS Consulting Group Level 5, 77 Pacific Highway North Sydney, NSW, Australia 2060 T: (+61) 2 9922-7676 / F: (+61) 2 9922 7989 email: [EMAIL PROTECTED] website: http://www.nusconsulting.com.au -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
mysqlimport, \r\n and \n
I've got some import scripts that are giving me trouble. Some MOFOs keep changing the format of the data they give us, and sometimes I lose half the records. When this happens, I change the line terminator from \r\n to \n ... or from \n to \r\n. It's starting to get to me. Is there any way to ease my pain here, short of importing with one format, counting the number of records, truncating the table, importing with the other format, counting the number of records, and then selecting the format with the most records? -- Daniel Kasak IT Developer NUS Consulting Group Level 5, 77 Pacific Highway North Sydney, NSW, Australia 2060 T: (+61) 2 9922-7676 / F: (+61) 2 9922 7989 email: [EMAIL PROTECTED] website: http://www.nusconsulting.com.au
mysqlimport debug options
Hello Listers, Can anyone explain what are valid values for the 'debug options' on a mysqlimport ? The manuals just say : --debug[=debug_options], -# [debug_options] Write a debugging log. The debug_options string often is 'd:t:o,file_name'. What is d: ? t: ? o (ok,thats relatively easy but just to be sure...) ? This is mysql Ver 14.12 Distrib 5.0.15, for sun-solaris2.9 (sparc) using readline 5.0 Thanks Pierre - Yahoo! Shopping Find Great Deals on Holiday Gifts at Yahoo! Shopping
Re: mysqlimport debug options
Hello. See: http://dev.mysql.com/doc/refman/5.0/en/the-dbug-package.html -- Gleb Paharenko, MySQL AB / Ensita.NET, www.mysql.com -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Re: loading a subset of input file using mysqlimport?
As far as I can tell, there's no way to do what you specify. If you want less pre-processing, you can use the IGNORE n LINES command of LOAD DATA INFILE: http://dev.mysql.com/doc/refman/5.0/en/load-data.html The pre-processing involved with that is: 1) Decide how many parts you want to split your file into (say, x) 2) Figure out how many rows are in each part (y rows * x parts = 10 million rows) 3) copy the ascii file x-1 times 4) edit *1* of the files to only contain the first y rows 5) use LOAD DATA INFILE x times -- the first time you'll use LOAD DATA INFILE from the file you edited in step 4, the other x-1 times you'll use LOAD DATA INFILE IGNORE n LINES, where n is y for the 2nd batch, 2y for the 3rd batch, 3y for the 4th batch, 4y for the 5th batch, etc. If you really are worried about the pre-processing, make a perl script, or download one like the one at http://www.hotscripts.com/Detailed/28161.html (note, I claim no liability, I just found that script doing a web search). -Sheeri On 11/22/05, Jacek Becla [EMAIL PROTECTED] wrote: Hi, Is there a way to load a section of an input file into mysql (MyISAM table) using mysqlimport or LOAD DATA INTO? The input data is in relatively large ascii files (10 million rows per file), and I'd like to break the load into smaller pieces rather than load whole file at once. Of course I could pre-process each file and split it, but I'd like to avoid this extra I/O. Thanks, Jacek -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED] -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
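If a little pre-processing turns out to be acceptable after all, the copy-the-file-x-1-times bookkeeping above can be avoided by streaming the source once into y-line chunks and loading each chunk separately. A hedged sketch in Python; the function and file names are mine, not from the thread, and it assumes plain LF-terminated rows:

```python
import os

def split_file(path, lines_per_chunk, out_dir):
    """Stream `path` into numbered chunk files of at most
    `lines_per_chunk` lines each; returns the list of chunk paths.
    The source file is read exactly once."""
    chunks = []
    with open(path, "r") as src:
        buf = []
        for line in src:
            buf.append(line)
            if len(buf) == lines_per_chunk:
                chunks.append(_write_chunk(buf, out_dir, len(chunks)))
                buf = []
        if buf:  # trailing partial chunk
            chunks.append(_write_chunk(buf, out_dir, len(chunks)))
    return chunks

def _write_chunk(lines, out_dir, n):
    chunk_path = os.path.join(out_dir, "chunk_%03d.txt" % n)
    with open(chunk_path, "w") as f:
        f.writelines(lines)
    return chunk_path
```

Each chunk can then go through LOAD DATA INFILE on its own, with no IGNORE n LINES arithmetic needed.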
loading a subset of input file using mysqlimport?
Hi, Is there a way to load a section of an input file into mysql (MyISAM table) using mysqlimport or LOAD DATA INTO? The input data is in relatively large ascii files (10 million rows per file), and I'd like to break the load into smaller pieces rather than load whole file at once. Of course I could pre-process each file and split it, but I'd like to avoid this extra I/O. Thanks, Jacek -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Re: how to print warnings from mysqlimport...debug_options ??
Hi Guys, I have been searching for the answer to this question for a while. The answer is so obvious, yet there was no useful source of documentation that showed it. I am using the load data infile syntax rather than the command line utility. To get the warnings the show warnings command is sufficient. mysql> load data infile '/sttn/done/st062020' replace into table mm_stock; Query OK, 2739 rows affected, 21 warnings (0.20 sec) Records: 2739 Deleted: 0 Skipped: 0 Warnings: 21 mysql> show warnings; -Trev -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Getting warnings from mysqlimport with MySQL v4.0.15-standard-log
Hi The subject says it all! My mysqlimport command reports 43 warnings, but I have no idea how to access them. SHOW WARNINGS was only implemented in MySQL 4.1, and I have 4.0.15-standard-log. Any help? Many thanks Mick -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Re: Getting warnings from mysqlimport with MySQL v4.0.15-standard-log
Hello. Similar questions are often asked on the list, but I don't remember any solution for old versions. -- Gleb Paharenko, MySQL AB / Ensita.NET, www.mysql.com -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Re: how to print warnings from mysqlimport...debug_options ??
Hello. Are you sure that you have compiled in debugging support? You have an old version of MySQL, and there was a bug where MySQL programs reported that they had debugging support when really they did not. -- Gleb Paharenko, MySQL AB / Ensita.NET, www.mysql.com -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
how to print warnings from mysqlimport...debug_options ??
I can't seem to get ahold of the 6 warnings being generated when I import a csv file. My ~/.my.cnf file says this: [client] fields-terminated-by=, fields-enclosed-by=\ #debug=log.txt #debug=d,info,error,query,general,where:1,load.trace debug=warning,load.trace Nothing I do in the 'debug' param seems to actually output any information. Here's my command: mysqlimport -v -h [host] [database] [table_and_file_name].csv -u [username] -p Here's my version info: mysqlimport Ver 3.4 Distrib 4.0.18, for pc-linux (i686) Here's my output to STDOUT/STDERR: Connecting to [hostname] Selecting database [database] Loading data from LOCAL file: [table_and_file_name.csv into [table_name] [database].[table_name]: Records: 3 Deleted: 0 Skipped: 0 Warnings: 6 Disconnecting from [hostname] I want to see those 6 warnings. I've tried command-line '--debug=foo' as well. Thanks. __ Do you Yahoo!? Yahoo! Small Business - Try our new Resources site http://smallbusiness.yahoo.com/resources/ -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
How to find out which rows get deleted by mysqlimport
Hi, Does anyone know how to find out which rows are reported by mysqlimport as deleted? Thanks HT -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
re: mysqlimport blocks all access to database
Yes I am appending to the end of an existing database. So why are rows 1 to N locked if I'm only adding rows at N+1? Wouldn't a write lock apply only to the rows being modified? And during this period even an interactive mysql shell hangs until the mysqlimport completes. I.e. the database is pretty much inaccessible until mysqlimport completes. -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] Sent: Monday, March 28, 2005 12:30 AM To: [EMAIL PROTECTED] Subject: Re: mysqlimport blocks all access to database Hi Joseph, I have been trying to use mysqlimport to load a primarily read only database with data at regular intervals. My problem occurs when my tables are MyISAM. In this case all access to the database and the tables blocks until mysqlimport completes. The --lock-tables=false parameter does not help. Is this the normal operation? If not how can I still have concurrent read access to the database while mysqlimport is running? Correct me if I am wrong, but the write request takes priority. So ANY read request is queued until the write request finishes. Otherwise you would get inconsistent read results. Am I correct assuming that you are appending the imported data to the existing data in the database ? Best regards Nils Valentin Tokyo / Japan www.be-known-online.com -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
mysqlimport blocks all access to database
I have been trying to use mysqlimport to load a primarily read-only database with data at regular intervals. My problem occurs when my tables are MyISAM. In this case all access to the database and the tables blocks until mysqlimport completes. The --lock-tables=false parameter does not help. Is this the normal operation? If not, how can I still have concurrent read access to the database while mysqlimport is running?
Re: mysqlimport blocks all access to database
Hi Joseph, I have been trying to use mysqlimport to load a primarily read-only database with data at regular intervals. My problem occurs when my tables are MyISAM. In this case all access to the database and the tables blocks until mysqlimport completes. The --lock-tables=false parameter does not help. Is this the normal operation? If not, how can I still have concurrent read access to the database while mysqlimport is running? Correct me if I am wrong, but the write request takes priority. So ANY read request is queued until the write request finishes. Otherwise you would get inconsistent read results. Am I correct assuming that you are appending the imported data to the existing data in the database ? Best regards Nils Valentin Tokyo / Japan www.be-known-online.com -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
mysqlimport --replace
I need to update a table with mysqlimport. I would like to update the table lines with the same unique key and maintain the other lines untouched. When I use mysqlimport --replace the table lines with the same unique key are updated but the others are deleted. Is it the expected behavior? How to avoid the other lines being deleted? Using FC3. The server version is 4.0.23-standard and the mysqlimport version is Ver 3.4 Distrib 4.0.23. The command line: /usr/local/bin/mysqlimport --replace --local --fields-terminated-by=';' --fields-optionally-enclosed-by='' --lines-terminated-by='\n' --fields-escaped-by='' --host=localhost --user=* --password=* --verbose dbname /path/to/txt/Temp.txt > import.log 2> importErr.log import.log: Connecting to localhost Selecting database dbname Loading data from LOCAL file: /path/to/txt/Temp.txt into Temp dbname.Temp: Records: 2415 Deleted: 0 Skipped: 0 Warnings: 0 Disconnecting from localhost importErr.log is empty. Table structure: CREATE TABLE Temp ( team int(10) unsigned NOT NULL default '0', points int(11) NOT NULL default '0', wus int(11) NOT NULL default '0', day date NOT NULL default '0000-00-00', PRIMARY KEY (team, day) ) TYPE=MyISAM; Regards, Clodoaldo Pinto -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Re: mysqlimport --replace
Please do not consider the previous email. I found my error. mysqlimport is behaving exactly as expected. Regards, Clodoaldo Pinto On Sat, 29 Jan 2005 15:53:48 -0200, Clodoaldo Pinto [EMAIL PROTECTED] wrote: I need to update a table with mysqlimport. I would like to update the table lines with the same unique key and maintain the other lines untouched. When I use mysqlimport --replace the table lines with the same unique key are updated but the others are deleted. Is it the expected behavior? How to avoid the other lines being deleted? -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
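For readers who hit the same confusion: REPLACE (and therefore mysqlimport --replace) only touches rows whose unique key collides with an incoming row; rows whose key never appears in the input are left alone. A toy in-memory model of that semantics, not MySQL internals:

```python
def replace_rows(table, incoming):
    """Model mysqlimport --replace against a table keyed by its
    PRIMARY KEY: each incoming (key, row) pair overwrites the row
    with the same key, or is inserted if the key is new; rows whose
    key never appears in the input are left untouched."""
    for key, row in incoming:
        table[key] = row
    return table
```

With a PRIMARY KEY of (team, day) as in the thread's table, a row for a (team, day) pair already present is replaced, while other days' rows survive.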
Re: Problem using debug switch with mysqlimport
Hello. I've submitted a bug: http://bugs.mysql.com/7137 -- Gleb Paharenko, MySQL AB / Ensita.NET, www.mysql.com -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Re: Problem using debug switch with mysqlimport
Sure enough... I just installed the latest 4.1 linux binaries, I didn't realize that the server itself had to be compiled with the debug enabled (although now that I realize that it makes complete sense). It would be nice if the documentation for mysqlimport would at least make note of this. Thanks for the response.
Re: Problem using debug switch with mysqlimport
Hello. Check that your MySQL installation was compiled with debug support: mysqld --help If the --debug flag is listed with the options then you have debugging enabled. mysqladmin ver also lists the mysqld version as mysql ... --debug in this case. -- Gleb Paharenko, MySQL AB / Ensita.NET, www.mysql.com -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Problem using debug switch with mysqlimport
I'm trying to utilize the debug switch with mysqlimport so that I can figure out why I'm getting errors on the data I'm importing, but I have yet to figure out a way to do this. I've tried to read the sparse documentation concerning this feature and no debug file is ever produced. I've tried using it as indicated by the mysqlimport --help documentation such as: /usr/local/mysql/bin/mysqlimport --debug=d:t:o,mysql.dbg -uuser -ppassword --local database import_file As well as many variations of this. I've been unable to locate through Google or anything else anyone who has successfully utilized this feature. Does anyone out there have any input? Thanks. Aaron
Slower mysqlimport in 4.0.20?
Hi, I'm just wondering if anyone has noticed or can confirm this: I tried to load 11 million records using mysqlimport with both 4.1.3b and 4.0.20. 4.1.3b took 1.5 hours, but 4.0.20 took over 10 hours. This includes loading the data and then building the index. Thanks Haitao -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
mysqlimport - HP-UX 11.11 bus error (coredump)
Subject: HP-UX 11.11/4.0.20 mysqlimport BUS ERROR Description: Installed mysql from the binary download on mysql.com according to the INSTALL-BINARY instructions. Attempted to use mysqlimport as described in the online documentation (http://dev.mysql.com/doc/mysql/en/mysqlimport.html) using simple two line file imptest.txt. Received bus error and coredump as follows: mysql describe imptest; +---+-+--+-+-+---+ | Field | Type| Null | Key | Default | Extra | +---+-+--+-+-+---+ | id| int(11) | YES | | NULL| | | n | varchar(30) | YES | | NULL| | +---+-+--+-+-+---+ 2 rows in set (0.02 sec) mysql exit Bye % cat imptest.txt 100 Max Sydow 101 Count Dracula % mysqlimport --local test imptest.txt Bus error(coredump) % file core core: core file from 'mysqlimport' - received SIGBUS % uname -a HP-UX myhost B.11.11 U 9000/800 547706587 unlimited-user license % getconf KERNEL_BITS 64 How-To-Repeat: See above. Same problem occurs with either of these binaries. mysql-standard-4.0.20-hp-hpux11.11-hppa2.0w-64bit.tar.gz mysql-standard-4.0.20-hp-hpux11.11-hppa2.0w.tar.gz mysql-standard-4.0.17-hp-hpux11.11-hppa2.0w.tar.gz Fix: I need one. My data loader depends on mysqlimport! Submitter-Id: submitter ID Originator:Gary Lehr Organization: Computer Sciences Corporation MySQL support: none Synopsis: mysqlimport bus error on HP-UX 11.11 Severity: Serious. I can't deploy my app without this. Priority: High Category: mysql Class: sw-bug Release: mysql-4.0.20-standard (Official MySQL-standard binary) Server: /usr/local/mysql/bin/mysqladmin Ver 8.40 Distrib 4.0.20, for hp-hpux11.11 on hppa2.0w Copyright (C) 2000 MySQL AB MySQL Finland AB TCX DataKonsult AB This software comes with ABSOLUTELY NO WARRANTY. 
This is free software, and you are welcome to modify and redistribute it under the GPL license Server version 4.0.20-standard-log Protocol version10 Connection Localhost via UNIX socket UNIX socket /tmp/mysql.sock Uptime: 20 hours 25 min 40 sec Threads: 1 Questions: 30 Slow queries: 0 Opens: 12 Flush tables: 1 Open tables: 3 Queries per second avg: 0.000 C compiler: C++ compiler: Environment: System: HP-UX ndceqign B.11.11 U 9000/800 547706587 unlimited-user license Some paths: /usr/contrib/bin/perl /usr/bin/make /usr/local/bin/gcc /usr/bin/cc GCC: Reading specs from /opt/gcc/lib/gcc-lib/hppa2.0n-hp-hpux11.00/3.0.1 /specs Configured with: ./configure : (reconfigured) ./configure : (reconfigured) ./configure : (reconfigured) ./configure : (reconfigured) ./configure Thread model: single gcc version 3.0.1 Compilation info: CC='cc' CFLAGS='+DAportable' CXX='aCC' CXXFLAGS='+DAportable' LDFLAGS='' ASFLAGS='' LIBC: lrwxr-xr-x 1 root root 8 Mar 6 2002 /lib/libc.0 - ./libc.1 -r-xr-xr-x 1 binbin1867776 Dec 7 2001 /lib/libc.1 -r-xr-xr-x 1 binbin1814528 Jun 27 2003 /lib/libc.2 -r--r--r-- 1 binbin2537200 Jun 27 2003 /lib/libc.a lrwxr-xr-x 1 root root15 Mar 6 2002 /lib/libc.sl - /usr/lib/libc.2 lrwxr-xr-x 1 root root 8 Mar 6 2002 /usr/lib/libc.0 - ./libc.1 -r-xr-xr-x 1 binbin1867776 Dec 7 2001 /usr/lib/libc.1 -r-xr-xr-x 1 binbin1814528 Jun 27 2003 /usr/lib/libc.2 -r--r--r-- 1 binbin2537200 Jun 27 2003 /usr/lib/libc.a lrwxr-xr-x 1 root root15 Mar 6 2002 /usr/lib/libc.sl - /usr/lib/libc.2 Configure command: ./configure '--prefix=/usr/local/mysql' '--localstatedir=/usr/local/mysql/data' '--libexecdir=/usr/local/mysql/bin' '--with-comment=Official MySQL-standard binary' '--with-extra-charsets=complex' '--with-server-suffix=-standard' '--enable-thread-safe-client' '--enable-local-infile' '--disable-shared' '--with-embedded-server' '--with-innodb' 'CC=cc' 'CFLAGS=+DAportable' 'CXXFLAGS=+DAportable' 'CXX=aCC' Perl: This is perl, version 5.005_03 built for PA-RISC2.0 Gary F. Lehr, PhD. 
Technical Lead, North America GIS GPES Unix Engineering Newark, DE 302-391-8490 -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Specifying table with mysqlimport
I have a bunch of CSV files that I wish to import into a table. I can do that with LOAD DATA INFILE INTO TABLE .. but there's a bunch of them with different names (naturally) and I wish to do them in one go. From my understanding, there is no way to specify the table since it is inferred from the name of the file. Is there a smarter way to deal with this than making a shell script that copies the CSV files in turn to a name that matches the table I wish to add data to, and then use that name in a loop? If my table is Bank, this will work: for i in *CSV; do cp $i Bank.CSV; mysqlimport --fields-terminated-by=',' --ignore-lines=1 db_name Bank.CSV; done Something tells me that greater minds have a better way. -- ___ Patrick Connolly {~._.~} _( Y )_ Good judgment comes from experience (:_~*~_:) Experience comes from bad judgment (_)-(_) -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
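The copy loop above can be scripted without clobbering a single Bank.CSV name over and over: since mysqlimport infers the target table from the file's basename, it is enough to stage each input under the table's name in a scratch directory. A sketch in Python (paths and the table name are illustrative); on Unix, os.symlink could replace the copy entirely:

```python
import os
import shutil

def stage_for_table(csv_path, table, work_dir):
    """Copy csv_path to <work_dir>/<table>.CSV so that mysqlimport,
    which derives the target table from the basename, will load it
    into `table`. Returns the staged path to hand to mysqlimport."""
    staged = os.path.join(work_dir, table + ".CSV")
    shutil.copyfile(csv_path, staged)
    return staged
```

Each staged path would then be passed to one mysqlimport invocation per source file, much like the shell loop in the message.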
update problem with mysqlimport (bug/misuse?)
Hi people I'm facing a (not huge) problem with mysqlimport. The mysql version I'm using is MySQL 4.0.11a-gamma'-Max' For updating a ~29000-record table from a fixed-length ASCII file, I'm using a php script that gets a record and executes an UPDATE for each one: pretty SLOW. So I read carefully about mysqlimport and think that it is THE solution for the speed matter. BUT the fields that I don't include in the mysqlimport field list get EMPTY! Here are the details: This is the table: CREATE TABLE alfabeta ( id mediumint(5) unsigned NOT NULL default '0', nombre varchar(30) NOT NULL default '', presenta varchar(30) NOT NULL default '', precio float(7,2) unsigned NOT NULL default '0.00', precio_ponderado float(7,2) unsigned NOT NULL default '0.00', fecha_mod date NOT NULL default '0000-00-00', id_psico tinyint(1) NOT NULL default '0', anexo tinyint(1) unsigned NOT NULL default '0', pami char(1) NOT NULL default '0', id_laboratorio smallint(4) unsigned NOT NULL default '0', baja tinyint(1) unsigned NOT NULL default '0', heladera tinyint(1) unsigned NOT NULL default '0', id_troquel int(10) unsigned NOT NULL default '0', id_monodroga mediumint(6) unsigned NOT NULL default '0', id_acfa smallint(5) unsigned NOT NULL default '0', id_codbarra bigint(13) unsigned NOT NULL default '0', unidades smallint(4) unsigned NOT NULL default '0', importado tinyint(1) unsigned NOT NULL default '0', sifar char(1) NOT NULL default '', id_tamano smallint(2) unsigned NOT NULL default '0', id_tipounid smallint(4) unsigned NOT NULL default '0', id_tipovta tinyint(1) NOT NULL default '0', id_acciofar int(2) NOT NULL default '0', PRIMARY KEY (id), KEY id_troquel (id_troquel), KEY nombre (nombre,presenta) ) TYPE=MyISAM; The mysqlimport command: mysqlimport osdata -u owner -pofthedata --debug -c id,nombre,presenta,precio,id_psico,pami,id_laboratorio,baja,heladera,id_troquel,id_codbarra,unidades,importado,sifar,id_tamano,id_tipovta -r -v -l alfabeta.csv Note: the original file alfabeta.txt has fixed-length
fields, I use an awk script to convert to alfabeta.csv The question is that the field precio_ponderado, for example, comes from another source, NOT from alfabeta.csv, so I load it from another .csv in another .php script. But after the execution of the mysqlimport, it gets 0.00, when, well, I expect that mysqlimport leaves it unchanged because I don't list it in the -c clause. So, is that a bug, or is mysqlimport only useful for populating empty tables, and for that reason empties the values of the columns not listed? If so, I'll have to discard that sooo elegant solution and use a .php script to make a HUGE .sql file with UPDATE's (IdontwantitIdontwantitIdontwantit!) thanks in advance Jorge Llarens -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Re: update problem with mysqlimport (bug/misuse?)
At 12:03 -0300 7/5/04, j llarens wrote: Hi people I'm facing a (not huge) problem with mysqlimport. The mysql version I'm using is MySQL 4.0.11a-gamma'-Max' For updating a #29000 records table from fixed-lenght ASCII file, I'm using a php script that gets a record and executes and UPDATE for each one: pretty SLOW. SO I read carefully mysqlimport and think that is THE solution for the speed matter. BUT the fields that I don't include in the mysqlimport field list get EMPTY! mysqlimport is for adding new records (or replacing existing ones). It does not update existing records. -- Paul DuBois, MySQL Documentation Team Madison, Wisconsin, USA MySQL AB, www.mysql.com -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Re: [mysql-php] mysqlimport error
[EMAIL PROTECTED] wrote: is that a problem? I want to avoid ftp db.txt files and then mysqlimport them You should run mysqlimport on the Windows box and specify MySQL server host with -h option. -- Egor Egorov, MySQL AB / Ensita.net, www.mysql.com -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Re: [mysql-php] mysqlimport error
is that a problem? I want to avoid ftp db.txt files and then mysqlimport them - Original Message - From: Egor Egorov [EMAIL PROTECTED] To: [EMAIL PROTECTED] Sent: Wednesday, June 02, 2004 4:38 PM Subject: Re: [mysql-php] mysqlimport error Looks like you run mysqlimport on the server host but file is located on Windows box. -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
[mysql-php] mysqlimport error
Hello list I'm trying mysqlimport --local -d --fields-enclosed-by=| --fields-terminated-by=; - -lines-terminted-by=\n -unikos -p mydb c://temp//programs.txt but i get an Error: File 'c:/temp/programs.txt' not found (Errcode: 2), when using table: programs What I'm doing wrong? RH-9 - APACHE v2 - mysql 3.23.54 Thanx -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Re: [mysql-php] mysqlimport error
Looks like you run mysqlimport on the server host but the file is located on the Windows box. -- Egor Egorov, MySQL AB / Ensita.net, www.mysql.com -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
mysqlimport and \ as data
Hi Group, Does anybody have any experience with mysqlimport and a comma-separated file which contains field data with a \? For example I have the data in the column path: C:\temp. When I import this I get something like: c:||emp ... Does anybody know how to solve this problem? This \ occurs too much to do the job manually ;-) Thanks a lot. Hans -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
RE: mysqlimport and \ as data
Can you run a find and replace to double up the backslashes? \ -- \\ -Original Message- From: Hans van Dalen To: [EMAIL PROTECTED] Sent: 4/28/04 3:46 AM Subject: mysqlimport and \ as data Hi Group, Does anybody have any expierence with mysqlimport and a comma separated file with data wich contains field data with a: \. For example I have the data in de column path: C:\temp. When I import this I got something like : c:||emp ... Does anybody know how to solve this problem? This \ occurs to much to do the job manually ;-) Thanks a lot. Hans -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe: http://lists.mysql.com/[EMAIL PROTECTED] -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
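The doubling suggested above is easy to automate. A minimal sketch in Python (the helper names and the naive comma splitting are mine, not from the thread): LOAD DATA treats \ as its escape character, so doubling each backslash in the source data makes the literal value survive the import.

```python
def escape_backslashes(field):
    r"""Double every backslash so LOAD DATA's escape processing
    yields the original value (C:\temp arrives as C:\temp)."""
    return field.replace("\\", "\\\\")

def escape_line(line, sep=","):
    """Apply the escaping to every field of a separated line.
    Naive split: assumes the separator never occurs inside a field."""
    return sep.join(escape_backslashes(f) for f in line.split(sep))
```

Alternatively, mysqlimport's --fields-escaped-by='' disables escape processing altogether, as another poster's command elsewhere in this digest does.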
Re: mysqlimport question
Ron McKeever [EMAIL PROTECTED] wrote:

> I'm trying to use mysqlimport instead of LOAD DATA INFILE from a shell script, and one of its options is either not working or I'm using it wrong. This works with LOAD DATA INFILE:
>
>   mysql -e "LOAD DATA INFILE 'x' INTO TABLE x IGNORE 2 LINES"
>
> but when I try:
>
>   mysqlimport --ignore-lines=2 -uroot -ppassword month 0114.txt
>
> it says unknown option '--ignore-lines=2'. Am I doing this wrong? Also, will mysqlimport work only if the .txt file has the same name as the table in the db? (Mine is month.Jan04, but the files are always mmdd.txt.)

Your version of mysqlimport doesn't support this option. The --ignore-lines option was added in MySQL 4.0.2.

Victoria Reznichenko, MySQL AB / Ensita.net
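On clients older than 4.0.2 the same effect can be had by stripping the header lines before the import. A sketch, with hypothetical file and table names: two header lines are dropped with tail, and the file is renamed to match the target table, since mysqlimport derives the table name from the file name with its extension stripped.

```shell
# Hypothetical data file: two header lines followed by the real rows.
printf 'col_a\tcol_b\n-----\t-----\n1\tjan\n2\tfeb\n' > 0114.txt

# Drop the first two lines (equivalent of IGNORE 2 LINES), and rename:
# month.txt loads into table `month` because the extension is stripped.
tail -n +3 0114.txt > month.txt

cat month.txt
# mysqlimport can then be run without --ignore-lines, e.g.:
#   mysqlimport -uroot -ppassword db_name month.txt
```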
Re: why: mysqldump and mysqlimport?
On Tue, 13 Jan 2004, EP wrote:

> I am wondering: I can see the MySQL data files for my various databases. What technically prevents me from simply copying those files and using the copies - to move my database to another file structure or server - to back up my current db?

Copying will not work across certain versions, nor between system architectures, nor when porting to another RDBMS, nor when you do not have physical access to the files. For those cases mysqldump works fine.

cheers,
Tobias
Re: why: mysqldump and mysqlimport?
EP wrote:

> I am wondering: I can see the MySQL data files for my various databases. What technically prevents me from simply copying those files and using the copies - to move my database to another file structure or server - to back up my current db? Yes, I did put my finger in the electrical socket as a kid. But only several times.

A simple file copy works if:

a) no one is accessing the database at the time, and
b) there is no database corruption.

If you have database corruption, mysqldump will often be the first to know about it, since it selects all rows (as opposed to other mysql clients, which select specific rows). So by using mysqldump and watching the output (make it a cron job and it will email you the output), you can catch database corruption early and have a good chance to do something about it, instead of simply copying corrupt files day after day until you finally don't have a valid backup around.

Dan

--
Daniel Kasak
IT Developer, NUS Consulting Group
Level 5, 77 Pacific Highway, North Sydney, NSW, Australia 2060
T: (+61) 2 9922 7676 / F: (+61) 2 9922 7989
email: [EMAIL PROTECTED]
website: http://www.nusconsulting.com.au
mysqlimport question
I'm trying to use mysqlimport instead of LOAD DATA INFILE from a shell script, and one of its options is either not working or I'm using it wrong. This works with LOAD DATA INFILE:

  mysql -e "LOAD DATA INFILE 'x' INTO TABLE x IGNORE 2 LINES"

but when I try:

  mysqlimport --ignore-lines=2 -uroot -ppassword month 0114.txt

it says unknown option '--ignore-lines=2'. Am I doing this wrong? Also, will mysqlimport work only if the .txt file has the same name as the table in the db? (Mine is month.Jan04, but the files are always mmdd.txt.)
why: mysqldump and mysqlimport?
I am wondering: I can see the MySQL data files for my various databases. What technically prevents me from simply copying those files and using the copies

- to move my database to another file structure or server
- to back up my current db?

Yes, I did put my finger in the electrical socket as a kid. But only several times.

TIA
Eric
Re: why: mysqldump and mysqlimport?
In the last episode (Jan 13), EP said:

> I am wondering: I can see the MySQL data files for my various databases. What technically prevents me from simply copying those files and using the copies - to move my database to another file structure or server - to back up my current db?

Absolutely nothing. As long as you ensure there are no active writes while you make the copy, this is the fastest way to back up a database. The mysqlhotcopy script automates the process.

--
Dan Nelson [EMAIL PROTECTED]
Re: Problem with mysqlimport
> --local is a valid option for mysqlimport in 3.23.49 according to the manual. (Source: http://www.cict.fr/app/mysql/manual.html#mysqlimport) What is the entire command you are using?

Hi,

My command is:

  mysqlimport -p -L ilk gwarancje.txt

and I get the error: mysqlimport: Error: The used command is not allowed with this MySQL version, when using table: gwarancje. I ran this command from a Linux shell (logged in as root).

Regards
Problem with mysqlimport
I tried to import from a text file (columns divided by tabs) like this:

  5724KF2003CSDEUROPAPARKAN2003-12-12MarcinTamkanono
  . . .

On PHPTriad (under Windows 2000) it works perfectly! I use the command:

  mysqlimport database file.txt

But under Linux (Red Hat) I have a serious problem. The system reports:

  mysqlimport: Error: Can't get stat of '/root/mysql/gwarancje.txt' (Errcode: 13), when using table: gwarancje

where gwarancje is the name of the table. I tried to set privileges with chmod 777 or similar, but it still doesn't work. Does anybody have any ideas?

Regards, Paul
Re: Problem with mysqlimport
Is there a higher-level directory that does not allow execution? Does it work if you run it as root? Can you 'vi /root/mysql/gwarancje.txt'? Just some thoughts.

Jeff

Pawe Filutowski [EMAIL PROTECTED] wrote on 12/09/2003 10:27 AM:

> I tried to import from a text file (columns divided by tabs)... But under Linux (Red Hat) I have a serious problem. The system reports: mysqlimport: Error: Can't get stat of '/root/mysql/gwarancje.txt' (Errcode: 13), when using table: gwarancje ...
RE: Problem with mysqlimport
If the file is on the same machine your shell is running on, specify --local when running mysqlimport.

Matt

-----Original Message-----
From: Pawe Filutowski [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, December 09, 2003 10:27 AM
To: [EMAIL PROTECTED]
Subject: Problem with mysqlimport

> I tried to import from a text file (columns divided by tabs)... mysqlimport: Error: Can't get stat of '/root/mysql/gwarancje.txt' (Errcode: 13), when using table: gwarancje ... I tried to set privileges with chmod 777 or similar, but it still doesn't work. Does anybody have any ideas?
Re: Problem with mysqlimport
I tried this option but I got the following error:

  mysqlimport: Error: The used command is not allowed with this MySQL version, when using table: gwarancje

The MySQL version is 3.23.49. What can I do?

Regards

----- Original Message -----
From: Matt Griffin [EMAIL PROTECTED]
Sent: Tuesday, December 09, 2003 5:11 PM
Subject: RE: Problem with mysqlimport

> If the file is on the same machine your shell is running on, specify --local when running mysqlimport.
RE: Problem with mysqlimport
Paul,

--local is a valid option for mysqlimport in 3.23.49 according to the manual. (Source: http://www.cict.fr/app/mysql/manual.html#mysqlimport) What is the entire command you are using?

Matt

-----Original Message-----
From: Pawe Filutowski [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, December 09, 2003 12:06 PM
To: [EMAIL PROTECTED]
Subject: Re: Problem with mysqlimport

> I tried this option but I got the following error: mysqlimport: Error: The used command is not allowed with this MySQL version, when using table: gwarancje. The MySQL version is 3.23.49. What can I do?

Regards, Paul
RE: Debug options for mysqlimport?
Good question - mysqlimport seems to work silently and not report any errors. I had read that this was to be improved in version 4 - has it been? I'd appreciate answers.

Thomas

-----Original Message-----
From: Greg G [mailto:[EMAIL PROTECTED]]
Sent: 02 December 2003 15:22
To: [EMAIL PROTECTED]
Subject: Debug options for mysqlimport?

> I'm having a tough time with the debug options for mysqlimport. I've checked the docs, but they're as clear as mud... What I'm really looking for is the text of the warnings that were generated during an import.
Debug options for mysqlimport?
I'm having a tough time with the debug options for mysqlimport. I've checked the docs, but they're as clear as mud. I've tried a number of combinations of -#d:t:o,filename and everything else I can think of, but I can't get any debug information. What I'm really looking for is the text of the warnings that were generated during an import.

-Greg G
mysqlimport error
Good morning,

We recently put a new server online (FreeBSD) with MySQL version 4.0.15. I have a couple of AWK scripts which used to work properly with mysqlimport (on another FreeBSD server running MySQL 3.23.x). Now I am getting the following error:

  /usr/local/bin/mysqlimport: Error: The used command is not allowed with this MySQL version, when using table: tblClass10

I have googled and found that adding -L (for --local) to the line

  /usr/local/bin/mysqlimport --host=server --user=user --password=*** -f --fields-terminated-by=\t crm tblClass10.tmp

may fix this, but I couldn't find a clear explanation for why. Are there any other problems with the upgrade that I should be aware of?

Thanks!
Jay
Re: mysqlimport
May Yang [EMAIL PROTECTED] wrote:

> Q: How long will it take to import 100GB of data into a MySQL DB with the mysqlimport command?

It depends on how many records are to be inserted; it is hard to tell without knowing your hardware and record length. But in any case, it will take hours, or tens of hours.

Egor Egorov, MySQL AB / Ensita.net
mysqlimport
Dear everyone,

I'd like to ask you a question, thanks in advance.

Q: How long will it take to import 100GB of data into a MySQL DB with the "mysqlimport" command?

Best regards,
May Yang
Re: mysqlimport question
Cersosimo, Steve [EMAIL PROTECTED] wrote:

> Am I wrong to assume mysqlimport is supposed to emulate the LOAD DATA INFILE syntax? I cannot find the command-line option to turn on the CONCURRENT flag.

CONCURRENT is currently not supported by mysqlimport.

Victoria Reznichenko, MySQL AB / Ensita.net
mysqlimport question
Am I wrong to assume mysqlimport is supposed to emulate the LOAD DATA INFILE syntax? I cannot find the command-line option to turn on the CONCURRENT flag.

Steve Cersosimo
"When all is said and done, more is said than done"
mysqlimport 'can't get stat of [file]' error
I've been unable to load data from files into existing tables. When I give the command

  mysqlimport [database] [filename]

I keep getting the following error:

  mysqlimport: Error: Can't get stat of '[filename]' (Errcode: 13), when using table: [table]

I get a similar error when trying to use LOAD DATA INFILE from within mysql. The text files all have the extension .table. I'm sure that I'm just missing something simple, but I've spent a good amount of time trying to figure this out and haven't been able to. Any advice on what's happening or how to get around this would be really appreciated.

Thanks,
Will Tyburczy
Re: mysqlimport 'can't get stat of [file]' error
Check file permissions. The file must be readable by the user 'mysql'.

Will Tyburczy wrote:

> I've been unable to load data from files into existing tables. When I give the command mysqlimport [database] [filename], I keep getting the following error: mysqlimport: Error: Can't get stat of '[filename]' (Errcode: 13), when using table: [table] ...
Re: mysqlimport 'can't get stat of [file]' error
Will Tyburczy [EMAIL PROTECTED] wrote:

> I've been unable to load data from files into existing tables. When I give the command mysqlimport [database] [filename], I keep getting the following error: mysqlimport: Error: Can't get stat of '[filename]' (Errcode: 13), when using table: [table] ... Any advice on what's happening or how to get around this would be really appreciated.

$ perror 13
Error code 13: Permission denied

You don't have permission on that file.

Victoria Reznichenko, MySQL AB / Ensita.net
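Since Errcode 13 is just the OS-level "Permission denied", it can be reproduced and checked without touching MySQL at all. A sketch with a hypothetical file /tmp/demo.table:

```shell
# Create a sample data file readable only by its owner.
printf '1\talpha\n' > /tmp/demo.table
chmod 600 /tmp/demo.table

# A non-LOCAL import makes mysqld (typically running as user 'mysql')
# open the file server-side, so it must be readable by that user and
# every parent directory must be traversable. Relax the mode:
chmod 644 /tmp/demo.table

ls -l /tmp/demo.table | cut -c1-10
# -rw-r--r--

# Or avoid server-side reads entirely by letting the client send the
# file:  mysqlimport --local db /tmp/demo.table
```

The parent-directory point also explains the /root/mysql case earlier in this thread archive: even a chmod 777 file is unreachable to mysqld if /root itself is mode 700.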
Re: non-root users cannot run mysqlimport
Carl Anthony-uzoeto [EMAIL PROTECTED] wrote:

> I need to periodically load a dump file from another DB into MySQL. Since this is a cron job, and I need to avoid interactivity, I DO NEED to run this as a non-root user. I have set up such a user and have granted the user all privileges on the destination database. The user also has the important FILE privilege. But it seems only the root user can load these files. Any ideas on what's going on here and what I may be missing/overlooking?

You also need the INSERT privilege on the tables that you use for the data import.

Victoria Reznichenko, MySQL AB / Ensita.net
mysqldump and mysqlimport
I've been trying this for a few hours now and I'm not sure what's going on.

  mysqldump --host=localhost --user=root --password=mypass dbname > dbname.dump

This works fine; it creates a text file with CREATE TABLE blocks and INSERT statements for each table in the database. When I try to recreate this db on another server using mysqlimport:

  mysqlimport -u root -p dbname 'dbname.dump'

I keep getting the error:

  Error: Table 'dbname.dbname' doesn't exist, when using table: dbname

My question: why is mysqlimport interpreting the dbname argument as a table name? From the documentation (manual section 4.8.8), mysqlimport is invoked like this:

  shell> mysqlimport [options] database textfile1 [textfile2 ...]

It's not doing that. It's taking the database argument and using it as the name of a table instead. I checked to make sure I didn't have any CREATE TABLE dbname statements, to be sure it wasn't the dump file trying to do this, but it's not. The command line for some reason is trying to use a table called dbname.dbname. I assumed that mysqlimport would just create the necessary tables using the dump file. Has anyone had a similar problem?

Peter.
Re: mysqldump and mysqlimport
At 4:47 PM -0400 9/15/03, Peter Koutsoulias wrote:

> This works fine; it creates a text file with CREATE TABLE blocks and INSERT statements for each table in the database. When I try to recreate this db on another server using mysqlimport:
>
>   mysqlimport -u root -p dbname 'dbname.dump'

mysqlimport is not the complement of mysqldump; it is a command-line interface to the LOAD DATA INFILE statement. To import the dump file, use mysql:

  mysql -u root -p dbname < dbname.dump

> I keep getting the error: Error: Table 'dbname.dbname' doesn't exist, when using table: dbname ... I assumed that mysqlimport would just create the necessary tables using the dump file. Has anyone had a similar problem?

--
Paul DuBois, Senior Technical Writer
Madison, Wisconsin, USA
MySQL AB, www.mysql.com

Are you MySQL certified? http://www.mysql.com/certification/