I was not able to load a function in udf_example.so. The response to
CREATE FUNCTION metaphon RETURNS STRING SONAME 'udf_example.so';
is
Error Code: 1126. Can't open shared library 'udf_example.so' (errno: 0
/usr/lib64/mysql/plugin/udf_example.so: cannot open shared object file: No
such file or directory)
You can have a look at XMLPipeDB.
http://sourceforge.net/projects/xmlpipedb/
Iñigo
On 03/09, Sayth Renshaw wrote:
> Hi
>
> Hoping someone can help me with some information or direction to a good
> resource
Hi
Hoping someone can help me with some information or direction to a good
resource for using XML with MySQL.
Specifically, I have a complex XML file; I would like to create a schema in
MySQL based on it (I have the XSD as well) and be able to upload new XML data
into the database as it's released to
/xml2json
>>
>> > Seems odd that XML has great document qualities but as a data
>> > format
>> > it seems rather hard to work with.
>>
>> Indeed.
>>
>>
>> > On Fri, 12 Dec 2014 9:55 PM Johan De Meersman
>> > wrote:
>
> > On Fri, 12 Dec 2014 9:55 PM Johan De Meersman
> wrote:
> >
> >>
> >> - Original Message -
> >> > From: "Sayth Renshaw"
> >> > Subject: Xml data import
> >> >
> >> > I have an xml data feed with xsd, it
ther hard to work with.
Indeed.
> On Fri, 12 Dec 2014 9:55 PM Johan De Meersman wrote:
>
>>
>> - Original Message -
>> > From: "Sayth Renshaw"
>> > Subject: Xml data import
>> >
>> > I have an xml data feed with xsd, it'
nal Message -
> > From: "Sayth Renshaw"
> > Subject: Xml data import
> >
> > I have an xml data feed with xsd, it's complex in elements not size. What
> > is the best way to get data into mysql, do I have to hack with xquery?
>
> That's going to dep
- Original Message -
> From: "Sayth Renshaw"
> Subject: Xml data import
>
> I have an xml data feed with xsd, it's complex in elements not size. What
> is the best way to get data into mysql, do I have to hack with xquery?
That's going to depend on
What is the best way to manage xml data feeds with mysql?
I have an xml data feed with xsd, it's complex in elements not size. What
is the best way to get data into mysql, do I have to hack with xquery?
My goal is to be able to create queries and send csv files out for analysis in
R and plots in gg
2012/11/15 00:30 +0100, Mogens Melander
I guess I'm still learning.
Does that mean that, if the last column in a load blabla. is a 0000-00-00
terminated by ^n it might error? Or are we talking ODBC?
Find it under LOAD DATA
If an empty field is parsed for a NOT NULL DATE o
eed
>> to
>> get the date difference (in days)
>> between the dates present in c1 and c2. Those days should be shown in
>> c3. Please help me out.
>>
>> On Wed, Nov 14, 2012 at 3:46 PM, wrote:
>>
>>> >>>> 2012/11/14 10:26 +0530,
I guess I'm still learning.
Does that mean that, if the last column in a load blabla. is a 0000-00-00
terminated by ^n it might error? Or are we talking ODBC?
On Wed, November 14, 2012 18:58, h...@tbbs.net wrote:
> 2012/11/14 18:27 +0530, sagar bs
> There are four columns in my table n
Wed, Nov 14, 2012 at 3:46 PM, wrote:
>
>> >>>> 2012/11/14 10:26 +0530, sagar bs >>>>
>> As i have the data with some 25 variables in csv file and i need to
>> import
>> to mysql.
>> The issue is that the date format in csv file is dd/mm/yyyy and mysql
>
2012/11/14 18:27 +0530, sagar bs
There are four columns in my table, named like account_name, c1, c2 and c3.
Account_name is the primary key; c1 and c2 contain two different dates, and in
column c2 there are a few fields showing 0000/00/00. Now i need to get the
date difference (in da
and c2. Those days should be shown in
c3. Please help me out.
On Wed, Nov 14, 2012 at 3:46 PM, wrote:
> >>>> 2012/11/14 10:26 +0530, sagar bs >>>>
> As i have the data with some 25 variables in csv file and i need to import
> to mysql.
> The issue is that
>>>> 2012/11/14 10:26 +0530, sagar bs >>>>
As i have the data with some 25 variables in csv file and i need to import
to mysql.
The issue is that the date format in csv file is dd/mm/yyyy and mysql takes
the date format like yyyy/mm/dd.
The number of variables in the
Did you change the target column to varchar before import ?
On Wed, November 14, 2012 10:23, sagar bs wrote:
> tried to import data as text, but it's showing Operation failed with
> exitcode 1
>
> On Wed, Nov 14, 2012 at 1:12 PM, Mogens Melander
> wrote:
>
>> Or you could
tried to import data as text, but it's showing Operation failed with
exitcode 1
On Wed, Nov 14, 2012 at 1:12 PM, Mogens Melander wrote:
> Or you could import the date as text and convert dates using:
>
> mysql> SELECT STR_TO_DATE('04/31/2004', '%m/%d/%Y');
Or you could import the date as text and convert dates using:
mysql> SELECT STR_TO_DATE('04/31/2004', '%m/%d/%Y');
-> '2004-04-31'
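For what it's worth, the conversion can also happen during the load itself via user variables; a sketch, with made-up table and column names:

```sql
-- Hypothetical table/columns; converts dd/mm/yyyy text dates at load time.
LOAD DATA INFILE '/tmp/data.csv'
INTO TABLE accounts
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(account_name, @d1, @d2)
SET c1 = STR_TO_DATE(@d1, '%d/%m/%Y'),
    c2 = STR_TO_DATE(@d2, '%d/%m/%Y');
```

Invalid dates come back NULL (with a warning) unless ALLOW_INVALID_DATES is in the SQL mode, so it's worth checking warnings after the load.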
On Wed, November 14, 2012 06:13, Larry Martell wrote:
> On Tue, Nov 13, 2012 at 9:56 PM, sagar bs wrote:
>> Hi,
On Tue, Nov 13, 2012 at 9:56 PM, sagar bs wrote:
> Hi,
>
> As i have the data with some 25 variables in csv file and i need to import
> to mysql.
> The issue is that the date format in csv file is dd/mm/yyyy and mysql takes
> the date format like yyyy/mm/dd.
> The number of
When you say locked, do queries on the other databases fail with an
error? If so, what's the error? Is it all queries, or just inserts?
Also, how are you doing your export and import?
Sent from my iPad
On Sep 10, 2012, at 2:38 AM, Roland RoLaNd wrote:
>
> Dear all,
>
> I realize th
Dear all,
I realize this is a very newbie question, so bear with me please.
I know that when you import/export a DB its tables are locked to ensure
consistency and no data corruption.
But why would other DBs on the same server get locked if I'm importing/exporting
one DB?
In other words
Hi,
I have some xml data conforming to the following DTD. It seems that it is
better to import the xml file into multiple tables, for example, an author
table and a journal table. I read the mysql manual section on "load xml",
but I don't see how to load xml into multiple tables and how to de
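If the server is 5.5 or later, LOAD XML INFILE may help, though it fills one table per statement, so each entity needs its own pass; a sketch, where the element and table names are guesses at the DTD:

```sql
-- One statement per target table; <author>/<journal> are assumed row elements.
LOAD XML INFILE '/tmp/feed.xml'
  INTO TABLE author
  ROWS IDENTIFIED BY '<author>';

LOAD XML INFILE '/tmp/feed.xml'
  INTO TABLE journal
  ROWS IDENTIFIED BY '<journal>';
```

De-duplication across tables would still have to happen afterwards, e.g. with INSERT ... SELECT from staging tables.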
>>> Jessica Bela 10/10/2011 4:47 PM >>>
Hi all,
how can I import onto my PC a MySQL database that has been created on another
PC and with other tools?
Assuming the source and destination are BOTH mysql databases:
mysqldump database > export.sql
...creates a file
Here is one way:
http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html
On Mon, Oct 10, 2011 at 4:47 PM, Jessica Bela wrote:
> Hi all,
> how can I import onto my PC a MySQL database that has been created on
> another PC and with other tools?
Hi all,
how can I import onto my PC a MySQL database that has been created on another
PC and with other tools?
a QIF
into any relational database, you would have to translate the
resulting data into a delimited txt file and then import. You may
want to check to see if Quickbooks has an API that you can use to
access the data natively rather than trying to move it around from
platform to pla
I could find would be importing them
into Excel first, then CSV out of Excel into MySQL, which sounds like a lot of
bother and not readily scriptable for routine use. I find it hard to believe
I'm the first one to ever attempt this!
I found this:
http://www.qbli
I don't think I have ever heard of anyone directly importing a QIF into any
relational database; you would have to translate the resulting data into a
delimited txt file and then import. You may want to check to see if Quickbooks
has an API that you can use to access the data natively r
lete duplicate, but in my
experience, it's easier to get it all and winnow out the bits you don't want
than to selectively import.
That said, we really only need the basic transaction info: date, payee, amount,
memo, category, account from, account to.
This is to reconcile the chart of
: mysql@lists.mysql.com
Subject: Import from Quicken 2004 Mac?
I'm looking for ways to import from QuickBooks 2010 for Mac. I've only just
started researching this, so feel free to "RTFM" me -- with a proper reference,
of course!
I'll be wanting to set up a process to do this pe
I'm looking for ways to import from QuickBooks 2010 for Mac. I've only just
started researching this, so feel free to "RTFM" me -- with a proper reference,
of course!
I'll be wanting to set up a process to do this periodically (and hopefully,
automagically) for new transactions
Administrative Services
Telephone: 515.281.6139 Fax: 515.281.6137
Email: kay.rozeb...@iowa.gov
-Original Message-
From: a.sm...@ukgrid.net [mailto:a.sm...@ukgrid.net]
Sent: Wednesday, August 03, 2011 3:25 PM
To: supr_star
Cc: mysql@lists.mysql.com
Subject: Re: very large import
On 8/3/2011 20:36, Nuno Tavares wrote:
The following page has some nice interesting stuff, assuming you have a
reasonable configuration in place (innodb_buffer_pool, etc[1])
http://download.oracle.com/docs/cd/E17952_01/refman-5.5-en/optimizing-innodb-bulk-data-loading.html
...
The same conte
The following page has some nice interesting stuff, assuming you have a
reasonable configuration in place (innodb_buffer_pool, etc[1])
http://download.oracle.com/docs/cd/E17952_01/refman-5.5-en/optimizing-innodb-bulk-data-loading.html
[1] http://gpshumano.blogs.dri.pt/2009/09/28/importing-wikim
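The core of that advice, sketched out (statement names per the linked manual page; verify the details for your server version):

```sql
-- Relax checks around a large InnoDB bulk load, then restore them.
SET autocommit = 0;
SET unique_checks = 0;
SET foreign_key_checks = 0;

-- ... run the bulk LOAD DATA / INSERT statements here ...

COMMIT;
SET unique_checks = 1;
SET foreign_key_checks = 1;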
Quoting supr_star :
Is there any way to speed up this process? by disabling indexes or
something? I can't afford to be down for 3 more days...
First stop, the mysql documentation:
http://dev.mysql.com/doc/refman/5.1/en/innodb-tuning.html
--
MySQL General Mailing List
For list archives
ith a lot more drive
space now, and the import is working, albeit very, very slowly. The ibdata1
file is growing at about 800MB-1GB per hour. It's been running for 24 hours
now, and it's only 1/4 of the way thru! Is there any way to speed up this
process? by disabling indexes o
On 2011/07/19 09:52 PM, andrewmchor...@cox.net wrote:
Hello
I am about to create a database in mysql. I would like to be able to import
some dbase3 (.dbf) files into the tables I will be defining. What is the
easiest way to import the table. Is there software that can be downloaded that
will
o be able to import
> some dbase3 (.dbf) files into the tables I will be defining. What is the
> easiest way to import the table. Is there software that can be downloaded
> that will allow me to do this?
>
> Andrew
>
> --
> MySQL General Mailing List
> For list archive
, Jul 19, 2011 at 3:52 PM, wrote:
> Hello
>
> I am about to create a database in mysql. I would like to be able to import
> some dbase3 (.dbf) files into the tables I will be defining. What is the
> easiest way to import the table. Is there software that can be downloaded
> tha
Hello
I am about to create a database in mysql. I would like to be able to import
some dbase3 (.dbf) files into the tables I will be defining. What is the
easiest way to import the table. Is there software that can be downloaded that
will allow me to do this?
Andrew
--
MySQL General Mailing
Hi all, I have upgraded a few test boxes and everything seems to work fine BUT
I
wanted to verify with the gurus if my syntax is correct so as to avoid any
future problems ;-)
The purpose is to dump all databases and users / user privileges from our
4.1.20
server and import it into our
>It seems that one of the tables we need to export and import
>contains rows which is used for dropdown menus.
>
>
>
>This has the following effect:
>
>
>
>. Each item in the "text" field is added in the field by
) it still writes the control characters causing each item to
be read as a different line and thus the import into Oracle fails.
Any idea on how we can resolve this as the process needs to be cronned to
run on a weekly basis and thus we need to get this process resolved.
You haven't describ
the tables we need to export and import
contains rows which is used for dropdown menus.
This has the following effect:
. Each item in the "text" field is added in the field by entering
the country name then pressing enter and then entering the
Thank you for the link, but seeing that I am still new to MySQL, this does
not mean anything to me.
From: prabhat kumar [mailto:aim.prab...@gmail.com]
Sent: 08 January 2010 4:22 PM
To: machiel.richards
Cc: mysql@lists.mysql.com
Subject: Re: FW: MySQL export and import into Oracle
http
http://dev.mysql.com/doc/refman/5.0/en/control-flow-functions.html#function_if
On Fri, Jan 8, 2010 at 5:52 PM, machiel.richards wrote:
> Hi guys,
>
>
>
> Can you please assist me in rewriting this query in order to run this
> against a mysql database?
>
>
>
> It seems that the decode function do
Hi guys,
Can you please assist me in rewriting this query in order to run this
against a mysql database?
It seems that the decode function does not exist in mysql.
select
decode(nvl(receive_email, 'No'), 'Yes', 'Yes', 'No') email_corr,
count(*) tot
from profiles
where email i
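If I read the Oracle query right, DECODE(NVL(...)) collapses to IF/IFNULL in MySQL; a sketch against the visible part of the query (the grouping is assumed, and the original WHERE clause is cut off in the archive):

```sql
-- DECODE(NVL(x,'No'),'Yes','Yes','No')  ==>  IF(IFNULL(x,'No')='Yes','Yes','No')
SELECT IF(IFNULL(receive_email, 'No') = 'Yes', 'Yes', 'No') AS email_corr,
       COUNT(*) AS tot
FROM profiles
-- the original WHERE clause is truncated in the archive; restore it here
GROUP BY email_corr;
```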
or should I export the DBs and then import
them?
Best regards,
Götz
--
Götz Reinicke
IT-Koordinator
Tel. +49 7141 969 420
Fax +49 7141 969 55 420
E-Mail goetz.reini...@filmakademie.de
Filmakademie Baden-Württemberg GmbH
Akademiehof 10
71638 Ludwigsburg
www.filmakademie.de
Eintragung A
A week or so ago, I was seeking a solution for breaking lines when
importing csv via the phpMyAdmin interface.
Found a slick solution:
http://csv2sql.evandavey.com/
create table in database, upload the csv file to above page, copy/paste
the resulting code into SQL field for the database (not the ta
My very first thought is to disable the constraint before the import and
re-enable it after that.
One way could be to set the foreign key checks to false, or to alter the
constraint and remove the 'cascade delete' part.
It's just a quick brainstorm, please verify the goodness of it, I still
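A sketch of the first suggestion (the setting is session-scoped, so it only affects the import connection; re-enabling it afterwards is essential):

```sql
SET FOREIGN_KEY_CHECKS = 0;
-- run the weekly import into table A here, e.g.:
-- LOAD DATA INFILE '/tmp/weekly.csv' INTO TABLE a ...;
SET FOREIGN_KEY_CHECKS = 1;
```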
Hello,
we have two tables associated with a foreign key constraint.
Table A with the primary key and table B with an "on delete cascade" constraint.
We want to delete datasets in Table B if the related dataset in Table A is
deleted - that works.
Now the Problem:
There is a weekly impo
il.com]
> Sent: Tuesday, May 19, 2009 2:37 PM
> To: n...@jammconsulting.com; mysql@lists.mysql.com
> Subject: RE: mysql not able to import mysqldump file
>
>
> Neil-
>
> http://wiki.seas.harvard.edu/geos-chem/index.php/Floating_point_math_issues
> so a value this large or s
com
> Subject: RE: mysql not able to import mysqldump file
> Date: Tue, 19 May 2009 13:40:43 -0500
>
> Gavin:
>
> Both servers are mysql-server.x86_64 5.0.45-7.el5
> installed using yum on CentOS 5.
>
> I was able to work around it by doing a global replace
> of tha
gt; Subject: RE: mysql not able to import mysqldump file
>
> Hi Niel,
>
> What version is the mysql dump from? Are you importing to a
> different version?
>
> Could you show the line from the file that is generating the error?
>
> Regards,
> Gavin Towey
>
>
To: mysql@lists.mysql.com
Subject: mysql not able to import mysqldump file
Hello:
I have a database with several double columns
in the tables.
I used mysqldump to dump the data from the primary
database and I am trying to import it into a
secondary database.
I am importing the data by passing
Hello:
I have a database with several double columns
in the tables.
I used mysqldump to dump the data from the primary
database and I am trying to import it into a
secondary database.
I am importing the data by passing the generated
sql file to the mysql command line client.
When I do that, I
most efficiently.
What do you think about the hardware and software requirements in
order to match the best combination?
The data will come from oracle, so it would be interesting to know how
I will have to import the data. Does this work in one go, or will I have
to divide the data into several parts for i
come from oracle, so it would be interesting to know how I will
have to import the data. Does this work in one go, or will I have to divide the
data into several parts for import?
Best Greetings,
Frank
Frank,
The fastest way to import data from Oracle would be as a CSV file and
then use LOAD DATA INFILE.
interesting to know how I will have
to import the data. Does this work in one go, or will I have to divide the data in
several parts for import?
Best Greetings,
Frank
--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe:http://lists.mysql.com/mysql?unsub
nloads/gui-tools/5.0.html
- basically all is working great - some tables import no problem -
except...
I'm trying to import an address table and in the summary it says that
there's a few problems like:
incorrect string value for column 'street' at row 655
0 rows transferred
Hi folks,
I'm trying to use MySQL Migration Toolkit 1.1 with MS SQL server 2005
http://dev.mysql.com/downloads/gui-tools/5.0.html
- basically all is working great - some tables import no problem - except...
I'm trying to import an address table and in the summary it says that
the
I created a csv file entitled 'disposed.csv' and placed it in
computer_inventory data folder with the following inside:
1087
1046
1086
1161
1049
1178
1029
1030
1224
1044
1106
Now I created the table 'disposed' as follows:
CREATE TABLE disposed (
  Mot_id INT(4) UNIQUE NOT NULL
);
Then I issued the c
Warren Windvogel wrote:
> Hi,
>
> Is there a way to force mysql to import a dump which contains a mysql
> reserved word as a field name?
>
> Regards
> Warren
>
You can use "sed" to replace column names with other.
for example:
sed 's/timestamp (timestamp) n
Hi,
Is there a way to force mysql to import a dump which contains a mysql
reserved word as a field name?
Regards
Warren
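Besides sed-ing the dump, MySQL accepts reserved words as identifiers when they are backtick-quoted, so quoting the offending name in the dump may be enough; a sketch with a hypothetical column:

```sql
-- `timestamp` as a column name only works when backtick-quoted.
CREATE TABLE events (
  `timestamp` DATETIME NOT NULL,
  message VARCHAR(255)
);
SELECT `timestamp`, message FROM events;
```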
--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Even more so when you compare to a script executing the inserts, instead of the
mysql client...
Olaf
On 6/5/08 12:06 PM, "mos" <[EMAIL PROTECTED]> wrote:
> At 10:30 AM 6/5/2008, you wrote:
>> Simon,
>>
>> In my experience load data infile is a lot faster than a sql file through
>> the client.
>> I
Olaf, Mike
Thanks for the input, the blob data is just text, I'll have a go at
using the load data command
Regards
Simon
mos wrote:
At 10:30 AM 6/5/2008, you wrote:
Simon,
In my experience load data infile is a lot faster than a sql file
through
the client.
I would parse the sql file an
At 10:30 AM 6/5/2008, you wrote:
Simon,
In my experience load data infile is a lot faster than a sql file through
the client.
I would parse the sql file and create a csv file with just the columns of
your table and then use load data infile using the created csv file
Olaf
Olaf,
Using a
AIL PROTECTED]> wrote:
> I can do - if the load data infile command definitely improves
> performance and splitting the file does the same I have no problem with
> doing this. It just seems strange that it's problems with the way the
> import file is configured. I thought the pr
AIL PROTECTED]> wrote:
From: Simon Collins <[EMAIL PROTECTED]>
Subject: Re: Large import into MYISAM - performance problems
To: mysql@lists.mysql.com
Date: Thursday, June 5, 2008, 3:05 PM
I'm loading the data through the command below mysql -f -u root -p
enwiki < enwiki.sql
I can do - if the load data infile command definitely improves
performance and splitting the file does the same I have no problem with
doing this. It just seems strange that it's problems with the way the
import file is configured. I thought the problem would be somehow with
the table ge
ommunity
I've disabled the primary key, so there are no indexes. The CPU has 2
cores and 2 Gigs memory.
The import fell over overnight with a "table full" error as it hit 1T
(I think this may be a file system problem). As it's not importing
anymore, show status isn't
sql -f -u root -p enwiki <
> enwiki.sql
>
> The version is MySQL 5.0.51a-community
>
> I've disabled the primary key, so there are no indexes. The CPU has 2 cores
> and 2 Gigs memory.
>
> The import fell over overnight with a "table full" error as it hit 1T
I'm loading the data through the command below mysql -f -u root -p
enwiki < enwiki.sql
The version is MySQL 5.0.51a-community
I've disabled the primary key, so there are no indexes. The CPU has 2
cores and 2 Gigs memory.
The import fell over overnight with a "table full"
Hi,
Break up the file into small chunks and then import one by one.
On Wed, Jun 4, 2008 at 10:12 PM, Simon Collins <
[EMAIL PROTECTED]> wrote:
> Dear all,
>
> I'm presently trying to import the full wikipedia dump for one of our
> research users. Unsurprisingly it's
and restart the
server. How much RAM do you have on your machine and how many CPU's do you
have? What version of MySQL are you using? Also can you post your "Show
Status" output after it has started to slow down? How much CPU is being
used after the import slows down?
Hi Simon,
How are you doing this import into your table?
On 6/4/08, Simon Collins <[EMAIL PROTECTED]> wrote:
>
> Dear all,
>
> I'm presently trying to import the full wikipedia dump for one of our
> research users. Unsurprisingly it's a massive import file (2.7T)
>
Dear all,
I'm presently trying to import the full wikipedia dump for one of our
research users. Unsurprisingly it's a massive import file (2.7T)
Most of the data is importing into a single MyISAM table which has an id
field and a blob field. There are no constraints / indexes on t
In this case, the command for the second suggestion is
gzip -d < slavesetup.sql.gz | mysql -u --password=
-Original Message-
From: Eramo, Mark [mailto:[EMAIL PROTECTED]
Sent: Monday, May 05, 2008 3:40 PM
To: Mysql
Cc: Kieran Kelleher
Subject: RE: Import of a mysldump file fails
Hi Kieran,
Try the following 2 things...
1) Add this to your my.cnf / my.ini in the [mysqld] section
max_allowed_packet=32M
(you might have to set this value higher based on your existing database).
2) If the import still does not work, try it like this as well.
mysql -u --password
MacMini Intel (aka slave)
which has a clean install of 5.0.51b and importing into the mysql
server. The import ran for a while and I then noticed at some stage (by
looking at top or "show processlist") that importing had completed.
However checking the databases, only some of them h
tools/migration-toolkit/
>
> Raj Mehrotra
> HCCS - Experts in Healthcare Learning
>
>
>
>
>
>
> -Original Message-
> From: Metalpalo [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, March 18, 2008 3:25 AM
> To: mysql@lists.mysql.com
> Subject: How to import or
Subject: How to import oracle dump?
Hello
I have got one question.
I need to convert an Oracle dump file and import it into a MySQL server. I have
found a utility, OraDump-to-MySQL, but it is not free and converts only
5 records from each table.
Can somebody help me ?
Thanks
--
View this message in
Hello
I have got one question.
I need to convert an Oracle dump file and import it into a MySQL server. I have
found a utility, OraDump-to-MySQL, but it is not free and converts only 5
records from each table.
Can somebody help me ?
Thanks
--
View this message in context:
http://www.nabble.com/How
On Mon, 16 Apr 2007 10:28:30 +1000, Chris wrote:
> What did *you* do differently this time?
>
> Obviously the user is different, but what about permissions? What were
> they before? 644 should have worked previously but since we don't know
> what they were before we can't tell you.
>
> It has n
e fields terminated
> by '\t' lines terminated by 'w'
>
> that has always worked for me.
>
> Hope it helps!
>
>
> On Jan 14, 2008, at 10:51 AM, Hiep Nguyen wrote:
>
> > hi everyone,
> >
> > i have a large ms excel data (text) file tha
Create a Database Link to your Spreadsheet.
6. Construct 'Append Query' to copy data from the XLS to the MySQL table
Have fun !!!
-Original Message-
From: Hiep Nguyen [mailto:[EMAIL PROTECTED]
Sent: Monday, January 14, 2008 10:52 AM
To: mysql@lists.mysql.com
Subject: import from
On Jan 14, 2008, at 10:51 AM, Hiep Nguyen wrote:
hi everyone,
i have a large ms excel data (text) file that i need to import to my
table in mysql. does anyone have a suggestion how to do this? i'm
trying to export to csv file, then import to my table, but i have so
many problems with
hi everyone,
i have a large ms excel data (text) file that i need to import to my table
in mysql. does anyone have a suggestion how to do this? i'm trying to
export to csv file, then import to my table, but i have so many problems
with delimiters
thanks
--
MySQL General Mailing Lis
Doesn't that (the trailing comma) depend upon whether or not you want the
default value for the (missing) field, as opposed to "" or 0 used for empty
fields?
Either way, you are right - you should be able to import the data.
Regards,
Jerry Schwartz
The Infoshop by Glo
From: Jason Pruim [mailto:[EMAIL PROTECTED]
Sent: Thursday, August 09, 2007 10:54 AM
To: Edward Kay
Cc: mysql@lists.mysql.com
Subject: Re: Import file into MySQL Database..
On Aug 9, 2007, at 10:11 AM, Edward Kay wrote:
-Original Message-
From: Jason Pruim [mailto:[EMAIL PROTECT
s.com
www.etudes-marche.com
> -Original Message-
> From: Jason Pruim [mailto:[EMAIL PROTECTED]
> Sent: Thursday, August 09, 2007 10:54 AM
> To: Edward Kay
> Cc: mysql@lists.mysql.com
> Subject: Re: Import file into MySQL Database..
>
>
> On Aug 9, 200
On Aug 9, 2007, at 10:11 AM, Edward Kay wrote:
-Original Message-
From: Jason Pruim [mailto:[EMAIL PROTECTED]
Sent: 09 August 2007 14:16
To: Gary Josack
Cc: mysql@lists.mysql.com
Subject: Re: Import file into MySQL Database..
On Aug 8, 2007, at 5:19 PM, Gary Josack wrote:
Try
> -Original Message-
> From: Jason Pruim [mailto:[EMAIL PROTECTED]
> Sent: 09 August 2007 14:16
> To: Gary Josack
> Cc: mysql@lists.mysql.com
> Subject: Re: Import file into MySQL Database..
>
>
>
> On Aug 8, 2007, at 5:19 PM, Gary Josack wrote:
>
>
On Aug 8, 2007, at 5:19 PM, Gary Josack wrote:
Try:
load data local infile '/volumes/raider/aml.master.8.6.07.csv'
into table test fields terminated by ',' enclosed by '"' lines
terminated by '\n' ignore 1 lines
(First, Last, Add1, Add2, City, State, Zip, XCode, Reason, Date);
If
Jason Pruim wrote:
First line of my .csv file is:
First,Last,Add1,Add2,City,State,Zip,Date,Xcode,Reason
DESCRIBE is:
mysql> describe test;
++-+--+-+-+---+
| Field | Type| Null | Key | Default | Extra |
++-+--+-+--
I believe that Excel files generally terminate lines with '\r\n', and
if you use terminated by '\n' it will cause this behavior. So try it
with lines terminated by '\r\n', or get TextPad or a similar editor
that can save the file as a unix file, using the same line
terminators that MySQL expects.
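Applied to the LOAD DATA statement from earlier in the thread, that suggestion is a one-token change:

```sql
LOAD DATA LOCAL INFILE '/volumes/raider/aml.master.8.6.07.csv'
INTO TABLE test
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'  -- Excel/Windows line endings instead of '\n'
IGNORE 1 LINES
(First, Last, Add1, Add2, City, State, Zip, XCode, Reason, Date);
```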
First line of my .csv file is:
First,Last,Add1,Add2,City,State,Zip,Date,Xcode,Reason
DESCRIBE is:
mysql> describe test;
++-+--+-+-+---+
| Field | Type| Null | Key | Default | Extra |
++-+--+-+-+---+
| First
Jason Pruim wrote:
Okay, so I have been going crazy trying to figure this out...
All I want to do is load an Excel file (which I can convert to just
about anything) into a MySQL database... Should be easy, right?
Here is the command that I have tried: LOAD DATA LOCAL INFILE
'/volumes/raider/A
Hi.
On Wednesday 08 August 2007 18:39, Jason Pruim wrote:
> Did some more testing, made a new table and matched the field names,
> now it will load it without any errors, it's just only importing the
> first row... Not the rest of the 934 records...
You are using ENCLOSED BY '"' in your SQL, which