field before it is loaded into the table directly.
will be quite hard to figure out on my own, I guess?
Thanks again!
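The fix Harrison suggested is not quoted in this snippet, but the usual way to skip (or transform) a field during LOAD DATA is to capture it in a user variable. A minimal sketch, with invented file, table, and column names; user variables in the column list and the SET clause need MySQL 5.0.3 or later:

```sql
-- Hypothetical layout: the text file has three fields, but only the
-- first and third map to table columns. The middle field is read into
-- @skipped and simply discarded; a SET clause could instead transform
-- a field before it is stored.
LOAD DATA INFILE '/tmp/mydata.txt'
INTO TABLE mytable
FIELDS TERMINATED BY '\t'
(col_a, @skipped, col_b);
```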
From: Harrison Fisk [EMAIL PROTECTED]
To: Jessica Svensson [EMAIL PROTECTED]
CC: mysql@lists.mysql.com
Subject: Re: LOAD DATA and skip columns in text file...
Date: Wed, 25 May
I can't tell you how much I love you right now :)
This works flawlessly!! Thanks a million times!
From: Harrison Fisk [EMAIL PROTECTED]
To: Jessica Svensson [EMAIL PROTECTED]
CC: mysql@lists.mysql.com
Subject: Re: LOAD DATA and skip columns in text file...
Date: Wed, 25 May 2005 20:05:40 -0400
thanks, that's what I feared already..
although ugly, I will do it that way, as I don't want to do any other
pre-processing on the fixed CSVs I receive.
moreover I just found out STR_TO_DATE isn't available in MySQL 4.0 anyway..
Michael
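Since STR_TO_DATE only arrived in MySQL 4.1, a common pre-4.1 workaround (a sketch, with invented staging-table and column names) is to load the MM/DD/YYYY string into a CHAR column of a staging table, then rebuild it with string functions while copying into the real table:

```sql
-- rawDate holds the MM/DD/YYYY text loaded by LOAD DATA INFILE;
-- rebuild it as ISO YYYY-MM-DD for the real DATE column.
INSERT INTO mytable (recordType, recordDate)
SELECT recordType,
       CONCAT(SUBSTRING(rawDate, 7, 4), '-',   -- year
              SUBSTRING(rawDate, 1, 2), '-',   -- month
              SUBSTRING(rawDate, 4, 2))        -- day
FROM staging;
```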
-Original Message-
From: Jigal van Hemert
using mysql 4.0.24 I'm stuck with a problem importing a CSV-file
converting the date in the CSV from MM/DD/ to a mysql table..
I'm trying:
LOAD DATA INFILE '/tmp/mydata.csv'
INTO TABLE mytable
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY ''
(recordType,
@lists.mysql.com
Subject: Re: Load data infile and text fields
Date: Sun, 10 Apr 2005 02:32:28 +0200
Cc: [EMAIL PROTECTED]
On Saturday, 2 April 2005 at 13:51, [EMAIL PROTECTED] wrote:
First of all I hope you can be patient with my English
I'm working with data import into mysql from a txt file. I'm using LOAD
DATA INFILE
command but I cannot correctly import a text column of 595 characters.
I receive this (very
@lists.mysql.com,
[EMAIL PROTECTED]
Subject: Re: Load data infile and text fields
Date: Mon, 4 Apr 2005 15:52:06 -0400
Stefano,
I'm copying this to the mailing list. I think it is a lot better if we
have
discussions of this kind on the mailing list so that others can also learn
from them, either
Hello.
Do you use a VARCHAR type for that column? Its maximum
length is limited to 255 characters. I think switching to the TEXT type
could solve the problem.
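A minimal sketch of that suggestion, assuming a table and column name:

```sql
-- TEXT holds up to 65,535 bytes, so a 595-character value fits easily.
ALTER TABLE mytable MODIFY long_col TEXT;
```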
[EMAIL PROTECTED] wrote:
First of all I hope you can be patient for my english
I'm working with data import into mysql
documents. Each record should have a single word file.
I'd like to write a query (I hope without using an API such as PHP or other languages)
that imports automatically all .doc files stored in a dir. Have any idea?
Thanks
Stefano
-- Original Message --
Subject: Re: Load data infile and text fields
From: [EMAIL PROTECTED]
Sent: Monday, April 04, 2005 12:24 PM
Subject: Re: Load data infile and text fields
Rhino, many thanks for your answer! My problem is that I need a field with
precision for a field of exactly 595 characters! The only text field type with
precision is the CHAR type, but its limit is 255
On Apr 4, 2005, at 3:52 PM, Rhino wrote:
Stefano,
I'm copying this to the mailing list. I think it is a lot better if we
have
discussions of this kind on the mailing list so that others can also
learn
from them, either now or in the future via the mailing list archive.
I'm glad to hear that you
What is the structure of the table you are importing to? you might have
merely hit the natural limit of the column type.
- michael dykman
On Sat, 2005-04-02 at 06:51, [EMAIL PROTECTED] wrote:
First of all I hope you can be patient for my english
I'm working with data import into mysql
Stefano,
The behaviour you are describing is normal, assuming that the column in your
MySQL table is defined as CHAR(255) or VARCHAR(255).
You didn't say which version of MySQL you are using. However, unless you are
using MySQL 5.0.3 or later, 255 is the largest size available for a CHAR or
I cannot stop or lock tables on the master; webapps write data to it constantly.
I am copying over the binlogs and applying them to the slave.
It is taking a long time so I just want to know if load data or copying tables
over would bring replication back to where it is.
I
Renato Golin
On Tuesday 29 March 2005 11:26, Shamim Shaik wrote:
Can I run load data from master on myisam tables where my table size is
approx 30G?
- stop slave
- on master do:
- lock tables
- tar cpf - /var/lib/mysql/tbl | ssh -C slave tar xpf - -C /var/lib/mysql/tbl
- start slave
- on master again:
-
On Tuesday 29 March 2005 11:44, Shamim Shaik wrote:
I cannot stop or lock tables on the master; webapps write data to it
constantly.
I am copying over the binlogs and applying them to the slave.
It is taking a long time so I just want to know if load data or copying
tables over would bring
On Tue, 29 Mar 2005, Shamim Shaik wrote:
Can I run load data from master on myisam tables where my table size is
approx 30G?
Is there a better way to do this ?
Hi,
1) LOAD DATA FROM MASTER: From the manual:
It acquires a global read lock on the master while taking the
snapshot,
At 10:07 PM 3/2/2005, you wrote:
Hello.
You may use ALTER TABLE .. DISABLE KEYS to speed up the loading process
on the MyISAM table.
I gave that a try but I had to cancel Alter Table ... Enable Keys after
49 hours. I find it amazing that it takes only 6.25 hours to load 450
million rows into the
Hello.
You may use ALTER TABLE .. DISABLE KEYS to speed up the loading process
on the MyISAM table.
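Spelled out, the suggested sequence looks like this (file and table names invented):

```sql
ALTER TABLE big_table DISABLE KEYS;   -- stop non-unique index updates

LOAD DATA INFILE '/tmp/big.csv'
INTO TABLE big_table
FIELDS TERMINATED BY ',';

ALTER TABLE big_table ENABLE KEYS;    -- rebuild the indexes in one pass
```

Note that ENABLE KEYS may rebuild by sorting or via the key cache depending on variables such as myisam_max_sort_file_size; with huge tables the rebuild itself can dominate, which is exactly the follow-up problem raised in this thread.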
mos [EMAIL PROTECTED] wrote:
I have a 50g CSV file that I am trying to import into an empty MyISAM
table. It appears to go fine except after 10 hours it hasn't completed. A
Show
At 10:07 PM 3/2/2005, you wrote:
Hello.
You may use ALTER TABLE .. DISABLE KEYS to speed up the loading process
on the MyISAM table.
That may work provided I can get the keys rebuilt later using FileSort and
not KeyCache.
You see the problem isn't in loading the data into the table which occurs
Hello.
Use 'SELECT INTO OUTFILE ...'
See:
http://dev.mysql.com/doc/mysql/en/select.html
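As a sketch of that suggestion (path and names invented), SELECT ... INTO OUTFILE mirrors the LOAD DATA options in reverse:

```sql
-- The server writes the file on its own host; the target file must
-- not already exist.
SELECT col1, col2
INTO OUTFILE '/tmp/result.csv'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
FROM mytable;
```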
shaun thornburgh [EMAIL PROTECTED] wrote:
Hi,
The following function loads data from a file:
http://dev.mysql.com/doc/mysql/en/load-data.html
Is there a function like this that I
From: shaun thornburgh [mailto:[EMAIL PROTECTED]
Hi,
The following function loads data from a file:
http://dev.mysql.com/doc/mysql/en/load-data.html
Is there a function like this that I can use to save the
results of a query to a CSV file for the user of my PHP
application to
shaun thornburgh [EMAIL PROTECTED] wrote on 02/15/2005
04:53:54 PM:
Hi,
I have a table with 26 fields, each row in this table must be unique. I
can't define all of the fields to be primary keys as the limit is 16.
Therefore before I insert data I have to check that each row is unique.
Hi,
Thanks for your reply, but the problem I am facing is that there may be
duplicate values in the uploaded file and I don't want these to appear in my
table...
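One hedged workaround for the 16-column index limit (all names invented, and not necessarily what was suggested later in the thread): index a hash of the whole row instead of the 26 columns, load the file into a staging table, and let INSERT IGNORE drop the duplicates.

```sql
-- Shown with three columns standing in for the full 26.
ALTER TABLE mytable ADD COLUMN row_hash CHAR(32) NOT NULL DEFAULT '',
                    ADD UNIQUE KEY (row_hash);

INSERT IGNORE INTO mytable (col1, col2, col3, row_hash)
SELECT col1, col2, col3,
       MD5(CONCAT_WS('#', col1, col2, col3))   -- one hash per row
FROM staging;
```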
From: Bastian Balthazar Bux [EMAIL PROTECTED]
To: shaun thornburgh [EMAIL PROTECTED]
Subject: Re: LOAD DATA INFILE using 4.0.17
Date
: Bastian Balthazar Bux [EMAIL PROTECTED]
To: shaun thornburgh [EMAIL PROTECTED]
Subject: Re: LOAD DATA INFILE using 4.0.17
Date: Tue, 15 Feb 2005 23:32:56 +0100
shaun thornburgh wrote:
Hi,
I have a table with 26 fields, each row in this table must be
unique. I can't define all
No, just every row needs to be unique. Sorry for the confusion...
From: Robert Dunlop [EMAIL PROTECTED]
To: shaun thornburgh
[EMAIL PROTECTED],mysql@lists.mysql.com
Subject: Re: LOAD DATA INFILE using 4.0.17
Date: Tue, 15 Feb 2005 15:06:19 -0800
So what you meant was every field in each row must
Richard Whitney mailto:[EMAIL PROTECTED] wrote on Tuesday, January 04, 2005 6:16 PM:
I think I'm bringing this up again but with different errors.
This: $sql = LOAD DATA INFILE '$file'
REPLACE INTO TABLE `jobs` FIELDS
TERMINATED BY '\\t' OPTIONALLY ENCLOSED BY '\' ESCAPED
Hello.
Similar problems are described at:
http://dev.mysql.com/doc/mysql/en/Connection_access.html
Richard Whitney [EMAIL PROTECTED] wrote:
I think I'm bringing this up again but with different errors.
This: $sql = LOAD DATA INFILE '$file' REPLACE INTO TABLE
Goutham
Thanks for your help. The problem in this case was line endings. I use an
Apple g4 for web work on system 10.2. By default BBEdit uses macintosh line
endings. MySQL does not recognize them. As soon as I changed the textfile
format to unix line endings, it imported the data without any
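For readers hitting the same problem: classic Mac OS line endings are a bare carriage return, and instead of converting the file, the terminator can (as documented for the FIELDS/LINES clauses) be declared in the statement itself. A sketch with invented names:

```sql
-- Tell LOAD DATA the lines end with a bare carriage return
-- (classic Mac OS) instead of \n.
LOAD DATA INFILE '/tmp/mac_file.txt' INTO TABLE mytable
  LINES TERMINATED BY '\r';
```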
Hi Rob,
LOAD DATA INFILE is not a very verbose command. It
doesn't spill out the exact root cause of the error.
Forgive me, if I seem to be wrong for anybody who had
a different opinion.
mysqlimport is a wrapper around LOAD DATA INFILE with
a lot of command line options. You might try your luck
Software Engineer,
Hewlett Packard
--- rob byrne [EMAIL PROTECTED] wrote:
I am trying to load data from a text file into a
table using the Load data
infile statement. No matter how I change data types
I seem only able to load
in the first row of data into the MySQL table and no
more. I have not
Hello.
Looks like LOAD DATA INFILE supports only string constants in
its syntax. I think it makes sense, because the security risk grows
when we can substitute variables in the file name.
Scott Klarenbach [EMAIL PROTECTED] wrote:
I can't seem to make the Load Data statement work inside of
Thanks !
-Original Message-
From: Gleb Paharenko [mailto:[EMAIL PROTECTED]
Sent: Wednesday, December 08, 2004 3:18 AM
To: [EMAIL PROTECTED]
Subject: Re: Load data question in cross database replication
Hello.
It seems to be a bug:
http://bugs.mysql.com/bug.php?id=6353
Sanjeev Sagar
It does not work for those who are using this.
Thanks for your reply.
-Original Message-
From: Gleb Paharenko [mailto:[EMAIL PROTECTED]
Sent: Friday, December 03, 2004 4:14 AM
To: [EMAIL PROTECTED]
Subject: Re: Load data question in cross database replication
Hello
Hello.
--replicate-rewrite-db is not taken into account while executing LOAD DATA
FROM MASTER. See:
http://dev.mysql.com/doc/mysql/en/LOAD_DATA_FROM_MASTER.html
Sanjeev Sagar [EMAIL PROTECTED] wrote:
--
For technical support contracts, goto https://order.mysql.com/?ref=ensita
Somewhere about Sat, 20-Nov-2004 at 06:27PM +0100 (give or take), Ferhat BINGOL
wrote:
| Hi,
|
| I have a 72 fields data txt file and I was inserting all data
| previously but now I need only some of them to dump into the table.
| I would like to select only 4 fields which are the 1st,
thanks to MySQL. :)
thank you again for your answer...
- Original Message -
From: Patrick Connolly [EMAIL PROTECTED]
To: Ferhat BINGOL [EMAIL PROTECTED]
Cc: mysql [EMAIL PROTECTED]
Sent: Sunday, November 21, 2004 10:00 AM
Subject: Re: LOAD DATA INFILE question...
Somewhere about Sat, 20
You can load the file to an intermediate table and then complete your
process using
INSERT INTO targetTable
SELECT col_1, col_5, col_28, col_71
FROM intermediateTABLE
Ferhat BINGOL wrote:
Hi,
I have a 72 fields data txt file and I was inserting all data previously
but now I need only some of
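Putting the suggestion together (the file path is invented, and intermediateTABLE is assumed to mirror all 72 fields of the text file):

```sql
-- Load everything raw, keep only the four wanted fields, then drop
-- the staging data.
LOAD DATA INFILE '/tmp/data.txt' INTO TABLE intermediateTABLE;

INSERT INTO targetTable
SELECT col_1, col_5, col_28, col_71
FROM intermediateTABLE;

DROP TABLE intermediateTABLE;
```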
Hi.
If your table has a unique index on field 'name',
then use
load data infile 'file' replace into table 'table';
Lewick, Taylor [EMAIL PROTECTED] wrote:
Can I perform an update on a table using load data infile..?
If I have the following table...
Name Score Rank
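A sketch of that advice using the table from the question (file and table names invented; REPLACE needs a PRIMARY KEY or UNIQUE index on name to detect the existing rows):

```sql
-- Rows in the file whose name already exists replace the old row,
-- which effectively updates score and rank.
LOAD DATA INFILE '/tmp/scores.txt'
REPLACE INTO TABLE scores
FIELDS TERMINATED BY '\t'
(name, score, rank);
```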
Hi.
See
http://dev.mysql.com/doc/mysql/en/LOAD_DATA_LOCAL.html
Martin Rytz [EMAIL PROTECTED] wrote:
Hi Richard,
Try looking at mysqlimport instead. I'm only taking a punt that it works
with that version but the manual doesn't say anything about it being
since a certain version. It works as at 3.23.58 so hopefully it may do
for .55
Regards
David Logan
Database Administrator
HP Managed
Richard Whitney wrote:
Hi!
Can someone point me in the right direction?
I have this that works in v.4x:
$sql = LOAD DATA LOCAL INFILE '$file' REPLACE INTO TABLE `members`
FIELDS TERMINATED BY '\\t' OPTIONALLY ENCLOSED BY '\' ESCAPED BY
'' LINES TERMINATED BY '\\r\\n';
When I try it using
At 09:24 AM 10/26/2004, you wrote:
I want to see the warnings when load data from text file using command
mysql> load data local infile 'mydata.txt' into table my_table;
When I got
Query OK, 1431 rows affected, 1506 warnings (0.27 sec)
Records: 1431  Deleted: 0  Skipped: 0  Warnings: 1506
I have to find
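From MySQL 4.1 on, the warning details can be listed right after the load with SHOW WARNINGS (this is not available in 4.0):

```sql
LOAD DATA LOCAL INFILE 'mydata.txt' INTO TABLE my_table;
SHOW WARNINGS LIMIT 20;   -- show the first 20 of the reported warnings
```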
[EMAIL PROTECTED] wrote:
I use the following code
use databasea
load data infile 'abcd.txt' into table databasea.tablename;
Data gets loaded in table, however at the end of each record I see a square
symbol, the symbol usually we see in compiled code.
This may be a linefeed or
mysql> load data infile 'abcd.txt' into table b.chicago;
ERROR 1062: Duplicate entry '[EMAIL PROTECTED]' for key 1
I think you have 2 e-mails that are equal in the file and the e-mail field is
declared as a primary key (which implies UNIQUE).
--
mack /
--
MySQL General Mailing List
For list archives:
What is the current value for you id field? Approximately how many records
are you inserting?
-Original Message-
From: [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: 8/20/04 3:16 AM
Subject: load data infile
Dear friends,
I am still getting errors. Load infile script. Guidance, please.
Dear friend,
I tried the IGNORE option so that data is loaded from the file to table,
still getting error pasted below. Any advice.
mysql> use b
Database changed
mysql> describe chicago
    -> ;
8408 4259 - Fax
-Original Message-
From: Michael Stassen [mailto:[EMAIL PROTECTED]
Sent: Tuesday, 10 August 2004 3:19 PM
To: Logan, David (SST - Adelaide)
Cc: MySQL List
Subject: Re: LOAD DATA LOCAL INFILE issue
Well, as you say, that error message means it's been disabled in either
[EMAIL PROTECTED] wrote:
My data in the text file isn't being loaded into the columns of the table. The data text file is
in the data directory of the server.
The rest of the commands are as follows; any guidance, please.
mysql> load data
    -> infile 'kemailsusa.txt'
    -> into table
    -> kemailsusa;
ERROR 1062:
Perhaps the problem is that there is no such option as --enable-local-infile
in the mysql client. I believe you want --local-infile. Client options are
detailed in the manual http://dev.mysql.com/doc/mysql/en/mysql.html.
Michael
Logan, David (SST - Adelaide) wrote:
Hi Folks,
I am having a few
Frome Street,
Adelaide 5000
Australia
+61 8 8408 4273 - Work
+61 417 268 665 - Mobile
+61 8 8408 4259 - Fax
-Original Message-
From: Michael Stassen [mailto:[EMAIL PROTECTED]
Sent: Tuesday, 10 August 2004 2:37 PM
To: Logan, David (SST - Adelaide)
Cc: MySQL List
Subject: Re: LOAD DATA LOCAL
For this type of custom loading you may want to explore a programming
language such as Java or C/C++ or Perl. Depending on your platform you could
even explore some third party tools.
-Original Message-
From: sean c peters
To: [EMAIL PROTECTED]
Sent: 8/4/04 3:27 PM
Subject: load data
sean c peters wrote:
But when I load a parent table, an auto_increment column
autogenerates a value that will be a foreign key in a child table. So i cant
create the file to load into the child table until after the parent table has
been loaded. Then i'll need to get back all the auto increment
Somewhere about Sun, 01-Aug-2004 at 11:31AM -0400 (give or take), Michael Stassen
wrote:
|
| Patrick Connolly wrote:
[...]
| Looks to me the mysql user should have no trouble with it:
|
| -rw-rw-r--1 pat pat 332 Jun 28 20:42 Orders.txt
|
| Every piece of the path to
rmck [EMAIL PROTECTED] wrote:
I thought I could wrap this LOAD DATA option in my perl script which could
speed things up... Any help is great
You can consider using bulk insert.
See http://dev.mysql.com/doc/mysql/en/INSERT.html and note that you can
insert more than a single row at
Somewhere about Sat, 31-Jul-2004 at 11:17AM -0400 (give or take), Michael Stassen
wrote:
| With LOCAL, the *client* reads the file on the client's machine.
| Without LOCAL, the *server* reads the file on the server's
| machine. Even though the client and server machines are the same
| in your
With LOCAL, the *client* reads the file on the client's machine. Without
LOCAL, the *server* reads the file on the server's machine. Even though
the client and server machines are the same in your case, those are still
different operations. There are restrictions on having the server do the
Resolved, used this syntax...
LOAD DATA INFILE '/path/from/root/to/file.csv' INTO TABLE ma0133 FIELDS
TERMINATED BY ',' OPTIONALLY ENCLOSED BY '' ESCAPED BY '\\' LINES
TERMINATED BY '\r\n'
- Phil.
Not sure about the 3.x series, but in 4.x, if you build your own mysql, you
need to explicitly enable that feature via a configure option. I think
binary builds all have it enabled though.
Issac
- Original Message -
From: David Brännlund [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent:
Hi Shawn,
I wondered if you might be able to help me with an SQL query.
I want to list all the internet sites I've surfed in my database.
Here's a query that matches the url with a urlid:
SELECT concat(usc.scheme, '://', us.server, up.path)
FROM url_visit uv
INNER JOIN url_servers us
ON
Figured it out! Took a gamble and ran the command below!
SELECT iu.time, INET_NTOA(iu.ip), concat(usc.scheme, '://', us.server, up.path)
FROM url_visit uv
INNER JOIN internet_usage iu
ON iu.urlid=uv.urlid
INNER JOIN url_servers us
ON us.id=uv.url_server_ID
INNER JOIN url_paths up
J S [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
cc: [EMAIL PROTECTED]
06/22/2004 07:55 AM  Subject: Re: load data into 2 tables and set id
06/23/2004 04:13 AM  Subject: Re: load data into 2 tables and set id
Shawn,
I uncovered a problem this morning. I wonder if you (or anyone else) can
help me out again?
mysql> select * from url_visit where url_scheme_ID=3 limit 10
06/23/2004 09:57 AM  Subject: Re: load data into 2 tables and set id
Hi Shawn,
Here's the url_Schemes table (it's the same as the url_paths and
url_servers). This means url_scheme_ID is part of a unique constraint/key ?
mysql> desc url_schemes
cc: [EMAIL PROTECTED]
06/23/2004 10:38  Subject: Re: load data into 2 tables and set id
Shawn,
Thanks for your reply below. I found it extremely useful. I have followed
your instructions and got good results up to the url_visits table.
I have a perl script to parse the values out of the log. The log has
3,770,246 lines and is gzipped. I then applied your SQL statements with the
Did you mean there to be duplicates in the url_visits? Do I need to use
IGNORE in the following SQL?
INSERT url_visit (url_server_ID, url_path_ID, querystring,
category)
SELECT us.ID, up.ID, if(bt.path_split > 0, SUBSTRING(bt.url,path),
NULL),
bt.category
FROM bulk_table bt
INNER JOIN url_servers
I think I fixed it!
INSERT IGNORE url_visit (url_server_ID, url_path_ID, querystring,
category)
SELECT DISTINCT us.ID, up.ID, if(bt.path_split > 0, SUBSTRING(bt.url,path),
NULL),
bt.category
FROM bulk_table bt
INNER JOIN url_servers us
ON us.server = bt.server
INNER JOIN url_paths up
on
Mos forgot to populate the url_id column in your user table. I would use
his same process but re-arrange it like this:
1) create table BIG_TABLE
2) load data infile
3) create table URL_TABLE (
url_id bigint not null auto_increment,
url varchar(25) not null primary key,
cc: [EMAIL PROTECTED]
06/13/2004 12:29  Subject: Re: load data into 2 tables
Shawn,
Thanks for helping on this. I really appreciate it.
No problem!!
Please post the structures of your big_table and your url_table
(whatever you called them) and I will help you to rewrite step 4 to count
how many times a URL appears in the big_table.
mysql> desc internet_usage;
welcome to a basic overview of bulk importing and normalizing as you go
[author's note: if you are seeing this thread for the first time and certain items seem to be introduced out of context, please review all previous posts in this thread. There has been a lot of
You have a file of SQL statements. LOAD DATA INFILE is for importing a file
of data (comma-separated, for example), not for reading a SQL file. You can
do this from the command line with
mysql name_of_db < /tmp/updates.txt
or within the mysql client program with
source '/tmp/updates.txt';
Thanks Michael - I was being a putz wasn't I!
Nik Belajcic wrote:
I have a strange problem importing data from a text file. There are 1353
rows in the text file (generated by a Perl script) but only 1000 get
imported into MySQL. I am clueless why would this be happening - it
seems as if there was a cutoff point at 1000 rows which, of course,
Hi,
I need some help please! I have 60GB of proxy logs to parse and load into
a mysql database. I've written a parsing script but I'm stuck now on how
to load the data in.
I have a database called PROXY_LOG with 2 tables:
USER_TABLE
user_id date_time url_id size
and
URL_TABLE
url_id url
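A hedged sketch of the normalize-after-bulk-load approach the replies converge on (the staging-table name is invented; it assumes URL_TABLE.url_id is AUTO_INCREMENT and url carries a UNIQUE key):

```sql
-- 1) fill the lookup table with each distinct url once
INSERT IGNORE INTO URL_TABLE (url)
SELECT DISTINCT url FROM bulk_table;

-- 2) resolve each raw row's url to its id while filling USER_TABLE
INSERT INTO USER_TABLE (user_id, date_time, url_id, size)
SELECT bt.user_id, bt.date_time, u.url_id, bt.size
FROM bulk_table bt
INNER JOIN URL_TABLE u ON u.url = bt.url;
```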
06/09/2004 05:00 PM  Subject: Re: load data into 2 tables and set id
sounds like --safe-mode has been turned on, check your my.cnf files
- hcir
J S said:
Hi,
I need some help please! I have 60GB of proxy logs to parse and load
into a mysql database. I've written a parsing script but I'm stuck now
on how to load the data in.
I have a database called PROXY_LOG with 2 tables:
USER_TABLE
user_id date_time url_id size
and
At 02:34 PM 6/9/2004, you wrote:
Hi,
I need some help please! I have 60GB of proxy logs to parse and load into
a mysql database. I've written a parsing script but I'm stuck now on how
to load the data in.
I have a database called PROXY_LOG with 2 tables:
USER_TABLE
user_id date_time url_id
: load data help
My question is, how can I take a log file that has 25 columns of data and
tell mysql to only load column 1, column 3, and column 7 from the raw log
file?
I'm not sure mysql can do this. I'd be more inclined to use cut
on a Linux system in the following fashion:
cut -d
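For the record, later MySQL versions can do this without cut: from 5.0.3 the column list of LOAD DATA may bind unwanted fields to throwaway user variables (table and column names invented):

```sql
-- col1, col3, and col7 are kept; every other field in the row is
-- read into @skip and discarded. The real statement needs one list
-- entry per field, i.e. 25 entries for a 25-column log file.
LOAD DATA INFILE '/var/log/raw.log' INTO TABLE logtbl
(col1, @skip, col3, @skip, @skip, @skip, col7);
```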