On 2015/04/12 08:52, Pothanaboyina Trimurthy wrote:
The problem is, as mentioned, that the LOAD DATA is taking around 2 hours. I
have 2 timestamp columns; for one column I am passing the input through LOAD
DATA, and for the column DB_MODIFIED_DATETIME no input is provided. At
the end of the LOAD DATA
Hi All,
I am facing an issue with timestamp columns while working with MySQL LOAD
DATA INFILE. I am loading around a million records, which is taking around
2 hours to complete.
Before getting into more details about the problem, let me first share the
table structure.
CREATE TABLE
Hi all,
I have a question: I need to kill a LOAD DATA INFILE. Normally I use
SHOW PROCESSLIST and KILL with the PID, but it doesn't work.
any idea?
Thanks :D
{ name : Rafael Valenzuela,
open source: [Saiku Admin Console,Anaytical Sport],
location : Madrid Spain,
twitter : [@sowe https
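On the kill question above, the usual approach looks like this (a sketch; the Id value 1234 is hypothetical, and a LOCAL load may appear to hang until the client finishes sending the file):

```sql
-- List running statements and note the Id of the LOAD DATA session.
SHOW FULL PROCESSLIST;

-- KILL QUERY aborts just the running statement and keeps the connection;
-- plain KILL (or KILL CONNECTION) terminates the whole session.
KILL QUERY 1234;
```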
Hello Neubyr,
On 1/29/2014 7:16 PM, neubyr wrote:
I am trying to understand MySQL statement-based replication with the LOAD DATA
LOCAL INFILE statement.
According to manual -
https://dev.mysql.com/doc/refman/5.0/en/replication-features-load.html -
LOAD DATA LOCAL INFILE is replicated as LOAD DATA
Thanks for the details Shawn.
So row-based replication would avoid server-side LOAD DATA on the slave.
Unfortunately, the master is using MySQL 5.0, so I don't think it can
use row-based replication.
- thanks,
N
On Thu, Jan 30, 2014 at 7:48 AM, shawn l.green shawn.l.gr...@oracle.com wrote:
I am trying to understand MySQL statement-based replication with the LOAD DATA
LOCAL INFILE statement.
According to manual -
https://dev.mysql.com/doc/refman/5.0/en/replication-features-load.html -
LOAD DATA LOCAL INFILE is replicated as LOAD DATA LOCAL INFILE, however, I
am seeing it replicated
If I am not mistaken, there are some parameters to do what you are describing.
Check statement-based-replication and row-based-replication. I think that
this could help you.
Regards,
Antonio.
2013/12/18 11:07 -0500, Anthony Ball
I ran across a curious issue; I'd call it a bug but I'm sure others would
call it a feature.
I have a CSV file with a space between the closing quote and the comma, and
it causes MySQL to eat that field and the field after it as a single field.
Is there a setting I can use so that no whitespace intrudes?
Here is an example:
testa ,testb
create temporary table testa (a char(15), b char(5));
LOAD DATA LOCAL INFILE '/tmp/test.csv' INTO TABLE testa
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"';
Data in table is
mysql> select * from testa;
(1)
Yes, it is an issue I faced too. As a remedy, I searched the .csv itself for
the quote-space-comma pattern and replaced it with quote-comma.
(2)
The other way is: if all the values have a space before the comma, then you can
use the space plus comma in FIELDS TERMINATED BY
LOAD DATA LOCAL INFILE '/tmp
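Remedy (2) might look like this (a sketch built on the earlier test table; it only works if every field really is followed by a space before the comma):

```sql
-- Treat "space plus comma" as the field separator so the stray
-- whitespace is consumed by the delimiter instead of the field value.
LOAD DATA LOCAL INFILE '/tmp/test.csv' INTO TABLE testa
  FIELDS TERMINATED BY ' ,' OPTIONALLY ENCLOSED BY '"';
```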
Hello,
after switching from MySQL 5.0.67 to 5.1.59 we get the following problem:
we want to import data from a text file.
example of the textfile:
t...@test.com$yes$432145$xyz
The command is: LOAD DATA LOCAL INFILE 'textfile.txt' INTO TABLE TESTTABLE
FIELDS TERMINATED BY '$';
Selecting
mysql 5.1.53. I saw some posts on the internet saying there are some issues
in the older versions, but it keeps giving the same error.
thanks
Anand
On Mon, Dec 20, 2010 at 7:42 PM, who.cat win@gmail.com wrote:
I want to know: you have done LOAD DATA INFILE on the master, so why are you
trying to do
Hi guys,
I am facing a serious issue with my replication; I have tried so many things but
no luck.
My replication is running with MySQL 5.0.51a on the master and 5.0.90 on the slave.
We run LOAD DATA INFILE on the master to process some csv files and load them into
a table; it runs perfectly well on the master
Hi Anand,
Just try 'LOAD DATA LOCAL INFILE'; it may work.
Eric
2010/12/20 Anand anand@gmail.com:
Hi guys,
I am facing a serious issue with my replication; I have tried so many things but
no luck.
My replication is running with MySQL 5.0.51a on the master and 5.0.90 on the slave.
We run LOAD
On Mon, Dec 20, 2010 at 9:00 AM, Anand anand@gmail.com wrote:
Hi guys,
I am facing a serious issue with my replication; I have tried so many things
but no luck.
My replication is running with MySQL 5.0.51a on the master and 5.0.90 on the slave.
We run LOAD DATA INFILE on the master to process some
Hello,
I have a question about the execution cycle of LOAD DATA INFILE.
If I issue a large file via LDI LOCAL, I know that the file is copied to
the MySQL server and executed there.
But at what point does the statement finish from the sender's point of view?
1) When the file is successfully
: Monday, October 25, 2010 12:16 PM
To: mysql@lists.mysql.com
Subject: Load Data Infile Errors
Hello,
I have a question about the execution cycle of LOAD DATA INFILE.
If I issue a large file via LDI LOCAL, I know that the file is copied to
the MySQL server and executed there.
But at what point
Hi all,
I have the following script:
Load data
Local infile 'myData.csv'
Into table myTable
Fields terminated by ','
Enclosed by ''
Lines terminated by '\r\n'
(field1, field2,
)
When this is sourced directly from mysql it works fine, but when invoked
from php, I get the error
If you are using v4 of php this will never work
On Tue, Jun 1, 2010 at 10:24 AM, memo garcia mgar...@cistrans.cl wrote:
Hi all,
I have the following script:
Load data
Local infile 'myData.csv'
Into table myTable
Fields terminated by ','
Enclosed by ''
Lines terminated by '\r\n
[mailto:hiji...@gmail.com]
Sent: Tuesday, June 01, 2010 10:34 AM
To: mgar...@cistrans.cl
CC: mysql@lists.mysql.com
Subject: Re: load data in php
If you are using v4 of php this will never work
On Tue, Jun 1, 2010 at 10:24 AM, memo garcia mgar...@cistrans.cl wrote:
Hi all,
I have
: Mike [mailto:hiji...@gmail.com]
Sent: Tuesday, June 01, 2010 10:34 AM
To: mgar...@cistrans.cl
CC: mysql@lists.mysql.com
Subject: Re: load data in php
If you are using v4 of php this will never work
On Tue, Jun 1, 2010 at 10:24 AM, memo garcia mgar...@cistrans.cl wrote:
Hi all,
I
loading the data, load the data, then do an Alter Table .. add index
for all of the indexes?
Or is it faster to just leave the indexes in place prior to loading the
data?
I know if the table is empty and optimized, the non-unique indexes will be
built AFTER the data is loaded using Load
Then before loading:
alter table table_name disable keys;
load data
alter table table_name enable keys;
This will enable a faster data load and a faster index rebuild.
regards
anandkl
On Fri, Feb 26, 2010 at 8:03 AM, Baron Schwartz ba...@xaprb.com wrote:
Hi,
On Sun, Feb 21, 2010 at 1:42 PM, mos mo
for all of the indexes?
Or is it faster to just leave the indexes in place prior to loading the data.
I know if the table is empty and optimized, the non-unique indexes will be
built AFTER the data is loaded using Load Data Infile, but the unique and
primary indexes will be built
will
be built AFTER the data is loaded using Load Data Infile, but the unique
and primary indexes will be built as the data is being loaded and this
is going to slow down the import.
There is no point doing a Disable Indexes on the table because this only
affects non-unique indexes
At 05:40 AM 10/18/2009, John wrote:
Mike,
What behaviour you experience depends to some extent on what storage engine
you are using and on what other non-unique indexes you have on the tables.
With LOAD DATA INFILE on empty MyISAM tables all non-unique indexes are
created in a separate batch
Mike,
What behaviour you experience depends to some extent on what storage engine
you are using and on what other non-unique indexes you have on the tables.
With LOAD DATA INFILE on empty MyISAM tables all non-unique indexes are
created in a separate batch which makes it much faster if you have
I'm trying to speed up LOAD DATA INFILE and after some experimenting have
noticed this quirk.
BTW, all of the tables used below are empty and have identical table
structures. The value being loaded into the primary key column is 'NULL'.
Test1:
246 seconds to run Load Data Infile into a table
like to import the first file field to the second table
field, the second file field to the third table field,... just to have an index.
I'm using:
| LOAD DATA INFILE 'test.csv' INTO TABLE table
| FIELDS TERMINATED BY ','
| LINES STARTING BY '' TERMINATED BY '\n'
| (Page, Device, GROUP , ItemID
Group is a keyword in mysql:
You need to put backticks around it in your statement:
| LOAD DATA INFILE 'test.csv' INTO TABLE table
| FIELDS TERMINATED BY ','
| LINES STARTING BY '' TERMINATED BY '\n'
| (Page, Device, `GROUP` , ItemID, Item, Value);
On Mon, Jun 29, 2009 at 7:07 AM, Ralph
Johnny Withers schrieb:
Group is a keyword in mysql:
You need to put backticks around it in your statement:
| LOAD DATA INFILE 'test.csv' INTO TABLE table
| FIELDS TERMINATED BY ','
| LINES STARTING BY '' TERMINATED BY '\n'
| (Page, Device, `GROUP` , ItemID, Item, Value);
Ooookay
Hi,
mysql> create temporary table t(i int);
mysql> \! echo 1 > /tmp/data.txt
mysql> load data infile '/tmp/data.txt' into table t;
Query OK, 1 row affected (0.00 sec)
Records: 1  Deleted: 0  Skipped: 0  Warnings: 0
mysql> select * from t;
+------+
| i    |
+------+
|    1 |
+------+
1 row in set
Thank you, but the real problem occurs when you don't know the schema
of the table in advance. If data.txt has two columns, how can I
still load it in a temporary table? I'm asking this question because
I'd like to add an import csv feature to a web application. I know
that you can load data
Hi Alex,
It is true that to use LOAD DATA INFILE you do need to know the schema of the
table. I'm not sure how useful it would be to import arbitrary data if you
don't have some expectations about what that data is. There are a couple of
options for you:
1. Make sure your users upload a CSV
Hello,
Would anyone know how to load data infile into a temporary table?
Thank you,
Alex
--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe:http://lists.mysql.com/mysql?unsub=arch...@jab.org
Hi,
MySQL v4.1.22 on Linux 2.6.18-6-686
I have a dump file generated with mysqldump, created by a version 4.1.10 server.
I want to import the dump file into a different server. When I run
mysqldump --database mydb --debug mydumpfile.sql
I get the following:
-- MySQL dump 10.9
--
--
Hi,
On Thu, Apr 2, 2009 at 1:18 PM, Virgilio Quilario
virgilio.quila...@gmail.com wrote:
Hi,
MySQL v4.1.22 on Linux 2.6.18-6-686
I have a dump file generated with mysqldump, created by a version 4.1.10
server.
I want to import the dump file into a different server. When I run
mysqldump
Hi,
MySQL v4.1.22 on Linux 2.6.18-6-686
I have a dump file generated with mysqldump, created by a version 4.1.10 server.
I want to import the dump file into a different server. When I run
mysqldump --database mydb --debug mydumpfile.sql
I get the following:
-- MySQL dump 10.9
--
-- Host:
Hi,
I've been trying to import a 10G dump file using mysqlimport
and it is eventually failing because it runs out of tmpdir
space -- I get Errcode: 28.
I was surprised that it was using a temp file at all. I've
looked in the documentation and other sources but have not been
able to find
MySQL uses tmpdir whenever there is any index creation.
regards
anandkl
On 8/21/08, jthorpe [EMAIL PROTECTED] wrote:
Hi,
I've been trying to import a 10G dump file using mysqlimport
and it is eventually failing because it runs out of tmpdir
space -- I get Errcode: 28.
I was surprised
You should increase the parameters named bulk_insert_buffer_size and
max_allowed_packet.
On 8/21/08, Ananda Kumar [EMAIL PROTECTED] wrote:
MySQL uses tmpdir whenever there is any index creation.
regards
anandkl
On 8/21/08, jthorpe [EMAIL PROTECTED] wrote:
Hi,
I've been trying to
Can you please show us the content of the test.csv file? Also, is company
name a single column or two different columns?
If it's two different columns, then try this:
load data infile '/foo/test.csv' into table abc.test fields terminated by ','
(company, name);
On 6/28/08, bruce [EMAIL PROTECTED] wrote
Hi..
I've got an issue with doing a LOAD DATA INFILE cmd..
My test text tbl has a column named company name. I'm trying to figure out
how to use the LOAD DATA INFILE cmd to be able to extract the company name
col...
when I do:
load data infile '/foo/test.csv' into table abc.test (company name
Hi,
I would like to know if I can use LOAD DATA INFILE to update a table on the
server from a workstation.
I tried it but was unsuccessful. Is there any other way to do this from a
workstation?
Thanks.
Regards,
Velen
LOAD DATA LOCAL INFILE
http://dev.mysql.com/doc/refman/5.0/en/load-data.html
http://www.mysql.com/news-and-events/newsletter/2002-05/a12.html
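A minimal sketch of the suggestion (the file and table names are hypothetical; REPLACE makes incoming rows with duplicate keys overwrite existing ones, which is the closest LOAD DATA gets to an update):

```sql
-- LOCAL reads the file from the workstation running the client,
-- not from the server's filesystem.
LOAD DATA LOCAL INFILE 'c:/data/stock.csv'
REPLACE INTO TABLE stock
FIELDS TERMINATED BY ',';
```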
-Original Message-
From: Velen [mailto:[EMAIL PROTECTED]
Sent: Thursday, May 22, 2008 2:24 PM
To: mysql@lists.mysql.com
Subject: Load
Hi Everyone,
I am attempting to use this command: load data infile '/volumes/raider/
elks.test.txt' into table elksCurrent fields terminated by '\t' lines
terminated by '\n';
My table is created as such:
| elksCurrent | CREATE TABLE `elksCurrent` (
`FName` varchar(40) default NULL
On Mon, Apr 14, 2008 at 10:29 AM, Jason Pruim [EMAIL PROTECTED] wrote:
Hi Everyone,
I am attempting to use this command: load data infile
'/volumes/raider/elks.test.txt' into table elksCurrent fields terminated by
'\t' lines terminated by '\n';
My table is created
On Mon, Apr 14, 2008 at 10:47 AM, Rob Wultsch [EMAIL PROTECTED] wrote:
It is probably trying to insert a string of no length into the not null
field.
Try it with:
SET SQL_MODE = '';
The above should read: into the not null int field, while the server is in strict mode.
--
Rob Wultsch
[EMAIL PROTECTED]
is in strict
mode.
Hi Rob,
Where would I set that? I tried to add it to the LOAD DATA INFILE line
and it didn't like that... Should I try it before I do the LOAD DATA?
--
Rob Wultsch
[EMAIL PROTECTED]
wultsch (aim)
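To answer the "where" question above: SET is its own statement, run in the same session just before the import, for example (a sketch reusing the command from earlier in the thread):

```sql
-- Relax strict mode for this session only, then run the import.
SET SESSION sql_mode = '';

LOAD DATA INFILE '/volumes/raider/elks.test.txt' INTO TABLE elksCurrent
  FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n';
```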
Everyone,
I am attempting to use this command: load data infile '/volumes/
raider/elks.test.txt' into table elksCurrent fields terminated by
'\t' lines terminated by '\n';
My table is created as such:
| elksCurrent | CREATE TABLE `elksCurrent` (
`FName` varchar(40) default NULL,
`LName` varchar
On Mon, Apr 14, 2008 at 1:29 PM, Jason Pruim [EMAIL PROTECTED] wrote:
Hi Everyone,
I am attempting to use this command: load data infile
'/volumes/raider/elks.test.txt' into table elksCurrent fields terminated by
'\t' lines terminated by '\n';
[snip!]
The error that I'm getting
On Mon, Apr 14, 2008 at 3:33 PM, Jason Pruim [EMAIL PROTECTED] wrote:
On Apr 14, 2008, at 3:29 PM, Daniel Brown wrote:
That's because it's attempting to insert the name of the columns
from your CSV into MySQL --- and 'Record' is not a valid INT.
Replaced field name with 0 and had
On Apr 14, 2008, at 3:29 PM, Daniel Brown wrote:
On Mon, Apr 14, 2008 at 1:29 PM, Jason Pruim [EMAIL PROTECTED]
wrote:
Hi Everyone,
I am attempting to use this command: load data infile
'/volumes/raider/elks.test.txt' into table elksCurrent fields
terminated by
'\t' lines terminated
On Mon, Apr 14, 2008 at 3:45 PM, Daniel Brown [EMAIL PROTECTED] wrote:
Does your file actually have the characters \t \t \n at the end of
each row like that?
Send it to me as an attachment off-list and I'll help you figure
it out and then post back here for the MySQL archives.
On Apr 14, 2008, at 4:37 PM, Daniel Brown wrote:
On Mon, Apr 14, 2008 at 3:45 PM, Daniel Brown [EMAIL PROTECTED]
wrote:
Does your file actually have the characters \t \t \n at the end of
each row like that?
Send it to me as an attachment off-list and I'll help you figure
it out and
I read about mysqlimport / LOAD DATA INFILE for MySQL, but I can't find a
way to import a text file using the length of each column instead of a delimiter.
My text file contains fixed-length columns:
--
I can use MS Excel to convert all the files to .csv format and import, but it
would
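One way to handle fixed-length columns without converting the file is to read each whole line into a user variable and slice it in a SET clause (a sketch; the widths 10/5/8 and all table and column names are hypothetical):

```sql
-- Each line lands in @row; SUBSTRING carves out the fixed-width fields.
LOAD DATA INFILE '/tmp/fixed.txt' INTO TABLE mytable
  LINES TERMINATED BY '\n'
  (@row)
  SET col1 = TRIM(SUBSTRING(@row, 1, 10)),
      col2 = TRIM(SUBSTRING(@row, 11, 5)),
      col3 = TRIM(SUBSTRING(@row, 16, 8));
```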
The test box doesn't have incoming data when he's taking the snapshot. Lock
the production database while taking the snapshot and setting up replication, or
you will have this problem. I've tried all the methods (snapshot, dump,
hotcopy, etc.) and the issue is always the same. You can't bootstrap
Hi, I'm developing a PHP/MySQL app, and I use LOAD DATA INFILE to feed
data to MySQL. On the development server I don't get any errors and the app works
great, but when I upload to the production server, all the load data
infile statements fail with an error like this: can't stat
path/to/my_cvs_file
Mauricio Tellez wrote:
Hi, I'm developing a PHP/MySQL app, and I use LOAD DATA INFILE to feed
data to MySQL. On the development server I don't get any errors and the app works
great, but when I upload to the production server, all the load data
infile statements fail with an error like this: can't stat
You should paste the result of the command SHOW GRANTS FOR
'filasPOS'@'localhost', not the message pasted here.
On Jan 8, 2008 12:04 PM, Mauricio Tellez [EMAIL PROTECTED] wrote:
Hi, I'm developing a PHP/MySQL app, and I use load data infile to feed
data to MySQL. At the develop server I haven't
root user with the following command:
mysqladmin -u root password my_passwd
and if I connect with this user like:
$db = mysqli_connect("localhost", "root", "my_passwd");
mysqli_select_db($db, 'filasPOS');
I can use the LOAD DATA statement without errors. But when I tried to
connect with the user
Thanks a lot Joe, the LOAD DATA LOCAL did the trick. But just to be curious:
why does LOAD DATA work with the root user but not with the filasPOS user,
while LOAD DATA LOCAL works with both users? Thanks in advance.
2008/1/8, joe [EMAIL PROTECTED]:
forgot to mention that I am using 5.1
I use
On Nov 28, 2007 11:18 PM, B. Keith Murphy [EMAIL PROTECTED] wrote:
The reason I asked about the version is that it looks like there is a problem
replicating a LOAD DATA INFILE command from some versions of 4.x to 5.x
slaves.
Master and Slaves are 5.x. Hopefully I've figured out the issue.
When
,deletes) using LOAD DATA INFILE. Does this cause a
problem for replication?
Thanks,
Michael
of that could
cause this is that we are inserting some data on the master
(updates,inserts,deletes) using LOAD DATA INFILE. Does this cause a
problem for replication?
Thanks,
Michael
--
Keith Murphy
editor: MySQL Magazine
http://www.mysqlzine.net
What do you mean by falls out of sync?
LOAD DATA INFILE hasn't been a problem for me, and I use it a LOT.
It's so simple that I suspect something else. But then again, I don't
know what you mean by out of sync :)
On Nov 28, 2007 4:32 PM, B. Keith Murphy [EMAIL PROTECTED] wrote:
What versions
The reason I asked about the version is that it looks like there is a problem
replicating a LOAD DATA INFILE command from some versions of 4.x to 5.x
slaves.
Baron Schwartz wrote:
What do you mean by falls out of sync?
LOAD DATA INFILE hasn't been a problem for me, and I use it a LOT.
It's so
hi there,
i have a text file that i prepare:
insert into `sa2007` (`id`,`amount`,`state`) values
('','1.00','oh'),
('','2.00','il'),
('','4.00','ks')
How do I import this file into the sa2007 table from the command line? I tried
via phpMyAdmin, but it doesn't work (300-second timeout).
thnx,
T.
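Since the file already contains INSERT statements, it can be fed straight to the mysql client from the command line (a sketch; the user, database, and file names are assumptions):

```shell
# Pipe the prepared SQL file into the target database; -p prompts for the password.
mysql -u myuser -p mydatabase < inserts.sql
```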
On 11/13/07, Hiep Nguyen [EMAIL PROTECTED] wrote:
hi there,
i have a text file that i prepare:
insert into `sa2007` (`id`,`amount`,`state`) values
('','1.00','oh'),
('','2.00','il'),
('','4.00','ks')
how do i import this file to sa2007 table from the command line? i tried
via
hi all,
Right now I'm trying to migrate from DB2 running under Linux to MySQL v5.1.
I managed to export the DB2 structure and data into a DEL (ASCII) file,
but when I try to load the data from the DEL file into a MySQL table, it
generates an error.
Below is the LOAD DATA INFILE syntax I use:
LOAD
.
Dusan
Caleb Racey napsal(a):
Does anyone know how to get the load data infile command to load utf8
data?
I have setup a database as utf8 with a collation of utf8_general_ci,
the
mysqld server is started with --character-set-server=utf8. Server
variables
say character_set_database = utf8
Are you sure your file is encoded in utf8? The character set of your file must
be the same as the charset of your database.
Dusan
Caleb Racey napsal(a):
Does anyone know how to get the load data infile command to load utf8 data?
I have setup a database as utf8 with a collation of utf8_general_ci
Ananda Kumar wrote:
Hi,
Try this.
set session collation_database=latin1_swedish_ci;
set session character_set_database=latin1;
Rather:
set session collation_database=utf8_general_ci;
set session character_set_database=utf8;
Also, make sure you have these in my.cnf:
[client]
Does anyone know how to get the load data infile command to load utf8 data?
I have setup a database as utf8 with a collation of utf8_general_ci, the
mysqld server is started with --character-set-server=utf8. Server variables
say character_set_database = utf8. I use the sql below
LOAD DATA
Caleb Racey wrote:
Does anyone know how to get the load data infile command to load utf8 data?
I have setup a database as utf8 with a collation of utf8_general_ci, the
mysqld server is started with --character-set-server=utf8. Server variables
say character_set_database = utf8. I use the sql
Caleb Racey wrote:
Does anyone know how to get the load data infile command to load utf8 data?
I have setup a database as utf8 with a collation of utf8_general_ci, the
mysqld server is started with --character-set-server=utf8. Server variables
say character_set_database = utf8. I use the sql
Caleb Racey wrote:
On 10/26/07, Baron Schwartz [EMAIL PROTECTED] wrote:
Caleb Racey wrote:
It is indeed buggy and badly documented. It depends on the current
database's character set instead. Try this:
SET NAMES utf8;
SET character_set_database=utf8;
LOAD DATA INFILE...
Baron
Thanks
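Later MySQL releases also accept an explicit CHARACTER SET clause on LOAD DATA itself, which avoids depending on the session or database defaults (a sketch; the file and table names are hypothetical):

```sql
-- Declare the file's encoding directly on the statement.
LOAD DATA INFILE '/tmp/data.csv' INTO TABLE t
  CHARACTER SET utf8
  FIELDS TERMINATED BY ',';
```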
Hi Friend,
Today I was testing the 'Load data infile ...' command
( http://dev.mysql.com/doc/refman/5.0/en/loading-tables.html ) on my system.
I was surprised when I ran a select statement on that table. The
scenario is as follows:
In the text file which is to be loaded, I am having
Hi Ananda,
Ananda Kumar schrieb:
So you set collation_database=utf8_bin; what were your
character_set_database values?
character_set_database is utf8. The collation utf8_bin slows down
queries, but is necessary in dealing with multilingual information.
utf8_general_ci is faster, but can not
Okie, i will also try this, as we also load data from a flat file.
regards
anandkl
On 8/31/07, Harald Vajkonny [EMAIL PROTECTED] wrote:
Hi Ananda,
Ananda Kumar schrieb:
So you set the collation_database=utf8_bin, what was your
character_set_database values.
character_set_database
Hello,
I would like to import data from a utf8-encoded comma-separated file. I
created my database with DEFAULT CHARACTER SET utf8 COLLATE
utf8_general_ci and I started my mysql client with the
--default-character-set=utf8 option. Nevertheless, when I input primary
key fields which differ only in
Before you import, at the mysql prompt set the variables below and then try
the load again:
set session max_error_count=50;
set session collation_database=latin1_swedish_ci;
set session character_set_database=latin1;
regards
anandkl
On 8/30/07, Harald Vajkonny [EMAIL PROTECTED] wrote:
Hello,
Ananda Kumar schrieb:
Before you import at the mysql prompt set below variables and then try
again to load
set session max_error_count=50;
set session collation_database=latin1_swedish_ci;
set session character_set_database=latin1;
This is not what I need, because I use utf8 as well as
I would like to import data from a utf8-encoded comma-separated file. I
created my database with DEFAULT CHARACTER SET utf8 COLLATE
utf8_general_ci and I started my mysql client with the
--default-character-set=utf8 option. Nevertheless, when I input primary
key fields which differ only in one
Edward Kay schrieb:
Try using the SET NAMES 'utf8' statement [1] to tell MySQL that your client
is sending data in UTF-8. I believe that as your server is latin1, it will
assume this is the character set used by the command line client.
[1]
must be the same as the file's character set, and this condition is OK.
To be sure, I used this script:
USE database_with_correct_charset;
LOAD DATA ...;
And this worked fine for files with cp1250 and also with keybcs2 (I had
two databases, of course)
HTH,
Dusan
I used the latin collation and latin db character set to load data similar
to yours, and we got this done correctly.
If you're inserting multi-byte data, then you need to set the above parameters.
This was one of the solutions given by MySQL; I am not able to find the url. I
will search my notes and get
Ananda Kumar schrieb:
I used the latin collation and latin db character set to load data
similar to yours, and we got this done correctly.
If you're inserting multi-byte data, then you need to set the above
parameters. This was one of the solutions given by MySQL; I am not able
to find the url. I
Does anybody know how to restart my mysql server with the correct
character and collation settings, if this is the cause of my problem,
or whether there might be any other reason for it? My mysql version is
5.0.26-12, running on Suse Linux 10.2.
Meanwhile I managed to change the server
Strange. Did you exit and reconnect and then do the select?
On 8/30/07, Harald Vajkonny [EMAIL PROTECTED] wrote:
Ananda Kumar schrieb:
I used the latin collation and latin db character set, to load data
similar to you, and we got this done correctly.
If your inserting multi byte data, then u
Ananda Kumar schrieb:
Strange. Did you exit and reconnect and then do the select?
Yes, I tried it once more. I have to put the USE command before I change
session settings to latin to make it work without error (otherwise I
still get the duplicate message). But even after exiting I get the
). But even after exiting I get the
national characters displayed as two (or more) bytes.
Try to convert the file to latin1 if possible, create a database with
latin1 charset, create a table with the required structure (you can set the utf8
charset on string fields), and then load the data. What client do you use
Ananda Kumar schrieb:
Strange. Did you exit and reconnect and then do the select?
Yes, I tried it once more. I have to put the USE command before I change
session settings to latin to make it work without error (otherwise I
still get the duplicate message). But even after exiting I get the
national
Dušan Pavlica schrieb:
Try to convert file to latin1, if it's possible, create database with
latin1 charset, create table with required structure (you can set utf8
charset to string fields ) and then load data.
I can not convert the file into latin1, because it is multilingual (i.e.
European
Harald Vajkonny napsal(a):
Dušan Pavlica schrieb:
Try to convert file to latin1, if it's possible, create database with
latin1 charset, create table with required structure (you can set utf8
charset to string fields ) and then load data.
I can not convert the file into latin1
Dušan Pavlica schrieb:
What's the charset and collation of your primary field in the table?
With which command do I get the charset and collation information of a
single field in a table? SHOW CREATE TABLE returns:
...
) ENGINE=MyISAM DEFAULT CHARSET=utf8 |
But I believe it is utf8, because when
Harald Vajkonny schrieb:
In doing this I got another idea: Does anybody know the difference
between the collations utf8_general_ci, utf8_unicode_ci and utf8_bin?
I'll try these first and then get back to you about the results.
That was it. If I choose utf8_bin as collation everything works
Hi Harald,
So you set the
collation_database=utf8_bin, what was your character_set_database values.
regards
anandkl
On 8/30/07, Harald Vajkonny [EMAIL PROTECTED] wrote:
Harald Vajkonny schrieb:
In doing this I got another idea: Does anybody know the difference
between the collations
the Oracle data to a FIFO pipe (mknod) and
running a LOAD DATA INFILE against it
2. write a program that dynamically builds extended INSERT statements up to
the length of max_allowed_packet (similar to mysqldump -e)
Is either one significantly faster than the other? I know I could benchmark
it but I
On 7/23/07, Sid Lane [EMAIL PROTECTED] wrote:
is either one significantly faster than the other?
Yes, LOAD DATA INFILE is much faster.
are there additional (faster) approaches I have not thought of?
Not that I've found. I think you'd have to write directly to the C
API to beat LOAD DATA