[borrame] > alter table creditLine discard tablespace;
Query OK, 0 rows affected (0.30 sec)

(copy operation of the .cfg and .ibd files from the origin server)

(13:23:19) [borrame] > alter table creditLine import tablespace;
Query OK, 0 rows affected (44.35 sec)

2014-10-10 13:26:42 1657 [Note] InnoDB: Importing tablespace for table
'letsbonus/creditLine' that ...
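For reference, the full transportable-tablespace round trip behind the log above looks roughly like this (MySQL 5.6+; FLUSH TABLES ... FOR EXPORT is what produces the .cfg file on the origin server):

```sql
-- Origin server: quiesce the table and write the .cfg metadata file
FLUSH TABLES creditLine FOR EXPORT;
-- (copy creditLine.ibd and creditLine.cfg out of the datadir now)
UNLOCK TABLES;

-- Destination server: an identical CREATE TABLE must already exist
ALTER TABLE creditLine DISCARD TABLESPACE;
-- (place the copied .ibd and .cfg files into the destination datadir)
ALTER TABLE creditLine IMPORT TABLESPACE;
```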
On 05.10.2014 at 22:39, Jan Steinman wrote:
> I've had good experiences moving MyISAM files that way, but bad experiences
> moving InnoDB files. I suspect the latter are more aggressively cached.

Simply no, no and no again: independent of "innodb_file_per_table = 1" there
is *always* a global tablespace ...
On 05.10.2014 at 21:29, Tim Johnson wrote:
> I have a dual-boot OS X/Ubuntu 12.04 arrangement on a mac mini. The
> ubuntu system has failed and I am unable to boot it.
>
> I have one database on the ubuntu partition that was not backed up.
>
> I am able to mount the ubuntu partition with fuse-ext2 from Mac OS X,
> thus I can read and copy the mysql data ...
>
> So, this is a "Help me before I hurt myself" sort of question: Are
> there any caveats and gotchas to consider?

Do you know if the database was shut down properly? Or did Ubuntu crash and die
and your partition become unbootable while the database was in active use?
Either way, you need to mak...
2013/3/31 Norah Jones:
> Hi,
>
> To source a sqldump I can use the source command, but if I need to do the
> same thing from the command line without going to the mysql prompt, can I
> achieve that?

Hello,
You mean cat sqldump.sql | mysql -uwhatever -pwhatever whatever_database ?
Manuel.
On 29 Oct 2011, at 22:59, luci spam wrote:
> I have 2 servers (1 for development, 1 for service).
>
> I keep ADD/DELETE columns and CREATE/DELETE indexes on my development
> server, so these 2 servers have similar but different mysql data structures.
>
> I know there's an option to export structures only (like --no-data).
> Is there a way (except 3rd par...
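The option hinted at above is mysqldump's --no-data; a sketch (database and host names are placeholders):

```shell
# Dump schema only (no rows) from the development server
mysqldump --no-data dev_db > schema.sql
# The same idea gives a quick structure diff between the two servers:
# diff <(mysqldump --no-data -h dev_host dev_db) \
#      <(mysqldump --no-data -h service_host service_db)
```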
From: Paul DuBois
To: Nunzio Daveri
Cc: mysql@lists.mysql.com
Sent: Tue, August 3, 2010 12:09:05 AM
Subject: Re: Importing User credentials from mysql.sql file???

On Aug 2, 2010, at 3:57 PM, Nunzio Daveri wrote:
> Hello Gurus, I just upgraded several MySQL 4.1 servers to 5.1 and also wanted
> to know how to extract the user name, password and credentials from the
> mysql.sql file (around 22 of them per server - have 8 servers total)? The
> contract admin emailed me a sql file which is a dump of the default my...
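If the file really is a dump of the mysql grants database, one rough way to pull out just the account rows is to filter the INSERTs into the user table (the file name comes from the thread; the INSERT formatting is an assumption about how the dump was written):

```shell
# Keep only the user-table INSERTs from the dump
grep -E '^INSERT INTO `?user`? ' mysql.sql > user_accounts.sql
```

Where the old 4.1 server is still reachable, running SHOW GRANTS FOR each account is a cleaner way to carry privileges across an upgrade.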
Hi all,
I would like to populate one column of my table with XML files -
meaning each element in that column will be an XML file. Is this
possible, and how can it be done? Would it be more efficient to
store the files in the filesystem and instead keep a set of pointers
as elements in that column?
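Both options from the question can be sketched in DDL; which is more efficient depends mostly on file size and access patterns (table and column names are invented for illustration):

```sql
-- Option 1: store each XML document inline
CREATE TABLE docs_inline (
  id  INT AUTO_INCREMENT PRIMARY KEY,
  xml MEDIUMTEXT NOT NULL        -- up to 16MB per document
);

-- Option 2: keep the files on disk and store only a pointer
CREATE TABLE docs_pointer (
  id   INT AUTO_INCREMENT PRIMARY KEY,
  path VARCHAR(255) NOT NULL     -- filesystem location of the XML file
);
```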
Gary Smith wrote:

Patrice Olivier-Wilson wrote:
> Gave it a try, got this:
> MySQL said:
> #1062 - Duplicate entry '1' for key 1

Yeah, that's what I was saying about in my previous mail. It looks like
you've got a primary key on one of your columns, and you're attempting
to insert data into it with a duplicate primary ...

Patrice Olivier-Wilson wrote:
> I have data I need to keep in both db, just trying to merge.

There's two ways around this:
First is to not export the structure (uncheck structure). The second is
to export with "if not exists". This should (IIRC) do a create table if
not exists, so it'll do w...

Patrice Olivier-Wilson wrote:
> I have 2 databases, different domains. Both have a table named 'tips'...
> both have different contents in the table.
> Using phpMyAdmin for GUI.
> I want to export databaseA tips as sql (done) then import content into
> databaseB tips. But when I run that operation, databaseB says that
> there is a ...
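When both databases live on the same MySQL server, another way around the clashing primary keys is to copy only the data and let databaseB assign fresh ids (column names here are invented for illustration):

```sql
INSERT INTO databaseB.tips (title, body)
SELECT title, body
FROM databaseA.tips;
```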
On 17/12/2009 17:46, mos wrote:
"Load Data ..." is still going to be much faster.
Mike
Hiya
If you're on Linux and using LVM, look at mylvmbackup.
HTH
Brent Clark
--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe: http://lists.mysql.com/mys
Madison Kelly wrote:
> Hi all,
> I've got a fairly large set of databases I'm backing up each Friday. The
> dump takes about 12.5h to finish, generating a ~172 GB file. When I try
> to load it though, *after* manually dumping the old databases, it takes
> 1.5~2 days to load the same databases. I am guessing this is, at leas...

Gavin Towey wrote:
There are scripts out there such as the Maatkit mk-parallel-dump/restore that
can speed up this process by running in parallel.
However, if you're doing this every week on that large of a dataset, I'd just
use filesystem snapshots. Your backup/restore would then only take as long
as it takes for you to scp the database from one machine to another.
Regards,
Gavin Towey

-----Original Message-----
From: Madison Kelly [mailto:li...@alteeve.com]
Sent: Wednesday, December 16, 2009 12:56 PM
To: mysql@lists.mysql.com
Subject: Importing large databases faster
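The snapshot approach Gavin describes, in rough strokes (LVM volume names and sizes are illustrative; the tables should be flushed and locked for the instant the snapshot is taken):

```shell
# Snapshot the volume holding the datadir, copy it off, then drop it
lvcreate --snapshot --size 10G --name mysql_snap /dev/vg0/mysql
mount /dev/vg0/mysql_snap /mnt/snap
rsync -a /mnt/snap/ backuphost:/backups/mysql/
umount /mnt/snap
lvremove -f /dev/vg0/mysql_snap
```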
Thanks again for assistance. FYI, I did track this thread down:
http://ask.metafilter.com/57007/Missing-commas-in-CSV-file
(excerpt:
"Maybe there is a space or something in the 14th column of the first 15 rows."
posted on February 14, 2007
"It's a bug in Excel (not something you did wrong).")

Jerry Schwartz wrote:
[JS] This is just a shot in the dark, but Excel can be rather surprising when
it puts out a CSV file. Depending upon the data, and exactly how you've
specified the export, it can put double-quotes in unexpected places.
If you leave out the 17th line of data, what happens?
>-----Original Message-----
>From: Patrice Olivier-Wilson [mailto:b...@biz-comm.com]
>Sent: Sunday, September 27, 2009 10:19 AM
>Cc: 'mysql'
>Subject: Re: Newbie question: importing csv settings
>
>Back again... I have 192 records to import, and tried my extra line at
>the end hoping for a workaround, but nope, it failed at line 17 again.
>Invalid field count in CSV input on line 17.
>Anyone have an idea why this might be happening?

1. Try opening up the csv file in a text editor; viewing it in a spreadsheet
looks like it's hiding some extra formatting or lines that may be causing
problems.
2. Try importing through the mysql CLI. From the screenshot you posted, it
looks like PMA is parsing the file and creating an i...
Patrice Olivier-Wilson wrote:
Yep, typo ... :-(
I did ...
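A quick way to find the offending row is to count the fields on every line and print the ones that don't match; a sketch (naive: it ignores commas inside quoted fields, and the file name and column count are placeholders):

```shell
#!/bin/sh
# Report every row whose comma-separated field count differs from the
# expected count.
FIELDS=17          # expected column count; 17 is illustrative
FILE=import.csv    # hypothetical file name
awk -F',' -v n="$FIELDS" 'NF != n {print NR ": " NF " fields"}' "$FILE"
```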
Certified MySQL 5 Developer (CMDEV)
IBM Cognos BI Developer
Telephone +44 (0)7812 451238
Email j...@butterflysystems.co.uk

-----Original Message-----
From: Patrice Olivier-Wilson [mailto:b...@biz-comm.com]
Sent: 26 September 2009 17:08
To: mysql
Subject: Newbie question: importing csv settings
Greetings:
I have a project for which I need to import csv files into the db.
I can do so up to a point. The import will only do 16 lines,
consistently. The error is failing at line 17.
Steps:
create table fields in an Excel document, where they all match database fields;
enter information in several of t...
Subject: Importing CSV into MySQL

Hi,
I'm sure I'm missing something quite obvious here, but the caffeine
hasn't quite kicked in yet. As the subject says, I'm importing a csv
file into MySQL 5.1.36 on WinXP using phpMyAdmin 3.3.2 (Apache 2.2.11
and PHP 5.3.0, should it matter). I've done this man...
...will lead to data loss.
Regards,
Vinodh.k

On Sat, Jun 20, 2009 at 12:19 AM, Isart Montane wrote:
> Hi Jason,
>
> if you run mysql with -f it will ignore any errors and continue importing:
>
> cat aac.sql | mysql -f -u root AAC
>
> Isart

On Wed, Jun 17, 2009 at 8:59 PM, Jason Novotny wrote:
> Hi,
>
> I'm trying to import a dumpfile like so:
>
> cat aac.sql | mysql -u root AAC
>
> It all runs fine until I get something like:
>
> ERROR 1061 (42000) at line 5671: Duplicate key name 'FK_mediaZip_to_zipSet'
>
> Is there a way I can tell it to ignore or replace the key?
>
> Thanks, Jason
Ali, Saqib wrote:
> I exported a large data set from Microsoft SQL Server in CSV
> format. However, whenever I try to import that data into a MySQL server
> running on Linux, it adds a space between each character in each
> field.
> Essentially:
> Saqib Ali
> becomes
> S a q i b  A l i
> I have tried to use the dos2unix cmd on ...

> try "OPTIONALLY ENCLOSED BY ' " '"
already tried that. no help :(
saqib
http://doctrina.wordpress.com/
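The space-between-every-character symptom is the classic sign of a UTF-16 export: the "spaces" are the zero bytes of two-byte characters, which dos2unix does not touch. Converting the file to UTF-8 before LOAD DATA usually clears it (file names and the exact encoding are assumptions; a byte-order mark may also need stripping):

```shell
# Re-encode the SQL Server CSV export from UTF-16 to UTF-8
iconv -f UTF-16LE -t UTF-8 export.csv > export-utf8.csv
```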
[mailto:[EMAIL PROTECTED]
Sent: Monday, April 14, 2008 8:25 AM
To: mysql@lists.mysql.com
Subject: Importing - Adding Fields Into MySql From A List

Newbie question!
I have a list of field names from another database (not mysql) - like:
name
phone1
phone2
street
city
state
zip
info
etc (a bunch more fields)
Q: Is there a way I can add these to an existing empty/blank table?
Maybe I can use:
- phpMyAdmin ?
- sql commands with php - loop...
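A loop isn't strictly needed: the list can be turned directly into ALTER TABLE statements. A sketch (the table name and the blanket VARCHAR(255) type are placeholders to adjust per column):

```shell
# Generate one ALTER TABLE per field name in the list
awk '{print "ALTER TABLE mytable ADD COLUMN `" $1 "` VARCHAR(255);"}' fields.txt
```

The output can then be pasted into the mysql client or piped straight in.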
> ... my tables have escape slashes in front of every double and
> single quote character. I'm not sure if it's on the export or import where
> they get added in.
> I've looked through the phpMyAdmin online documentation, and I can't see
> any option to control the pre...
> It seems to me that if it adds them in when exporting, it should take
> them out when importing. Or vice versa, but in either case be consistent.
> I just want my database to be exactly as it is before any export or
> import options.
> I'm a little muddled as to where I'm making the mistake. Can anyone
> advise on the b...

Jed said:
If you're having phpMyAdmin build the export, it will add them because it's
constructing sql statements, and they have to be escaped.
I would take phpMyAdmin out of the equation and just use mysqldump. You
should have no problem doing something like creating a scratch table,
dumping it, and re-importing from the dump.
Other than that, it's specific to phpMyAdmin, so maybe ask those guys
what they did...

I joined their list through Sourceforge, but I haven't seen any mail
from it, and any mail I send gets bounced back to me. I'm not sure what
the issue is.
A little bit easier of a way to do this could be the following command:
mysql -u[user] -h[host] -p [database] < [mysql dump file]
Make sure the database you're importing into is EMPTY (or at least the
TABLES you are importing to are empty...)
On 10/19/07, Werner Van Belle <[EMAI
Hello,
If it is a dump you can pipe it into mysql. If you have a csv-like file you
can import it with LOAD DATA LOCAL INFILE and the like. An example below:
DROP TABLE IF EXISTS EnsgDescriptions;
CREATE TABLE IF NOT EXISTS EnsgDescriptions
  (stable_id VARCHAR(128) PRIMARY KEY,
   description VARCHAR(255));  -- the length was cut off in the archive; 255 is a guess
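The LOAD DATA statement itself was cut off in the archive; a minimal companion to the CREATE TABLE above, assuming a tab-separated input file (the file name is illustrative):

```sql
LOAD DATA LOCAL INFILE 'ensg_descriptions.txt'
INTO TABLE EnsgDescriptions
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(stable_id, description);
```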
Hi,
I have a 250mb dump and need to extract some data.
I know how to export a single table, but not import a single table using
mysqldump.
Any ideas?
Thanks
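One way without re-dumping: slice the table's section out of the dump file by the comment markers mysqldump writes before each table. A sketch (the table name is illustrative; the next table's marker line is printed too, harmlessly, as a comment):

```shell
# Extract a single table's DDL and data from a full mysqldump file
sed -n '/^-- Table structure for table `mytable`/,/^-- Table structure for table/p' dump.sql > mytable.sql
```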
--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe:http://lists.mysql.com/[EMAIL PROTECTED]
Hi, sam
You can try to export the table to a file from Excel in CSV format,
and then import the data from this file into mysql.
Something like this should help you:
LOAD DATA INFILE 'yourtabledata.txt' INTO TABLE yourtable FIELDS
TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n';
sam rumaizan wrote:
I have created a table in mysql with 12 fields:
Field1, Field2, Field3, Field4, ... Field12.
I have an excel sheet with 12 columns and 150 rows.
My question is: how can I import all of the columns from the excel sheet to my
table without losing any information?
In the category of terrible, horrible, no good, very bad (but at least
documented) software behavior, I bumped into this today:
http://bugs.mysql.com/bug.php?id=14770
where the LOAD DATA INFILE command does not respect the default value of a
column if no value is supplied in the file. Instead, it
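Until that bug is fixed, one workaround is to list only the columns that are actually in the file and supply the default explicitly in a SET clause (a sketch with invented table and column names; DEFAULT(col) is evaluated per row):

```sql
LOAD DATA INFILE 'data.txt' INTO TABLE t
  (col_a, col_b)
  SET col_c = DEFAULT(col_c);
```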
I'm in the process of moving a MySQL database with about 170 tables from my PC
to a new MacBook Pro. On my PC, I exported each database table as a SQL file,
then copied a folder containing all these files to my Mac.
I'd like to know if there's a way to import all these SQL files into a database
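Since the exports are plain SQL, they can all be fed to the mysql client in one pass; a sketch (paths and names are placeholders, and the target database must already exist):

```shell
cat /path/to/exported/*.sql | mysql -u root -p target_db
```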
Subject: Importing Text File Into mySQL

I have a text file with over 500K rows of data in it. The problem is
that the data is not separated by commas but instead occupies a certain
number of characters. So for instance:
ID           1-11
NAME         12-50
COMPANY_NAME 51-100
...
How would you parse and import this data into mysql?
Thanks for your help.

I am trying to figure out how this would work? How does LOAD DATA
figure out when one column begins and another ends when some of the
data are addresses with spaces in them?

I did a little shell script to do it. The key was the shell variable IFS.
Normally IFS=" "; to make it work right I set it as follows:
IFS="
"
Yes, that's a newline between the quotes.
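Another route is to convert the fixed-width layout into tab-separated text that LOAD DATA INFILE can ingest directly; a sketch (offsets follow the layout in the question: ID 1-11, NAME 12-50, COMPANY_NAME 51-100; file names are illustrative):

```shell
# Slice each line by column position and emit tab-separated fields
awk '{
  id      = substr($0,  1, 11)
  name    = substr($0, 12, 39)
  company = substr($0, 51, 50)
  gsub(/ +$/, "", id); gsub(/ +$/, "", name); gsub(/ +$/, "", company)
  print id "\t" name "\t" company
}' fixed.txt > fixed.tsv
```

LOAD DATA itself can also slice fixed-width rows, by reading each line into a user variable and applying SUBSTR/TRIM in the SET clause.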
> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
> Sent: Monday, October 30, 2006 1:05 PM
> To: Jerry Schwartz
> Subject: RE: utf8 importing problem
>
> Jerry,
> I checked the imported data (sql file) and the data are in
> utf8 coding.
> Is th...

...Ave.
Farmington, CT 06032
860.674.8796 / FAX: 860.674.8341

> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
> Sent: Saturday, October 28, 2006 2:22 AM
> To: mysql@lists.mysql.com
> Subject: utf8 importing problem
>
> I use a MySQL database with the utf8 character set and utf8_czech_ci
> collation. It works well on the Linux server, but when I try to
> export the data and import it into the same database running on an XP
> machine, the utf8 is gone. Instead of proper coding there are some
> strange characters.
> I used mysqldum...
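A common cause of this symptom is the character set defaulting differently on the two machines; making it explicit on both legs of the trip is a reasonable first try (a sketch, not a confirmed fix for this exact report; the database name is a placeholder):

```shell
mysqldump --default-character-set=utf8 mydb > dump.sql
mysql --default-character-set=utf8 mydb < dump.sql
```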
David Blomstrom wrote:
Hopefully this will be the last question in this series. :)
I want to copy a database from my PC to my Apple laptop. I installed MySQL's
GUI Tools on both computers, created a file named Backup.mpb on my PC, then put
a copy of it on my Mac. Now I'm trying to figure out how to get Backup.mbp int...
I don't think that is the problem, but what do you mean by a slow
connection? Can't you run the dos2unix command on the remote server?
The error occurred on line 2. Did you see any special word in that line?
Can you share that line with us? Remember that each version may
reserve different...

I dumped a database from a 4.0 mysql and am attempting to move it to a
server running 4.1 - using the command line:
$ mysql -u root -pmypassword empty4.1db < 4.0dump.sql
The result:
ERROR 1064 (42000) at line 2: You have an error in your SQL syntax; check
the manual that corresponds to your MySQL...
Hi,
No, unfortunately not...
Cheers,
Ian

> -----Original Message-----
> From: John Meyer [mailto:[EMAIL PROTECTED]
> Sent: 25 June 2006 05:41 PM
> To: mysql@lists.mysql.com
> Subject: Re: Re-importing a mysqldump file
>
> Ian Barnes wrote:
> > Is this possible? Or would the best way be to import the dumped file
> > into a temp table and then select out of the temp table into my
> > correct table?
>
> Any way to use a trigger?
> --
> Online library -- http://pueblonative.110mb.com
> 126 books and counting.
Hi,
I need to auto re-import a mysqldump file, but when importing it I need to
set a certain field to one value for all information imported. For example my
db looks like this:
Id
Name
Value
Serverid
Now, on the remote server, Name and Value get exported, and when I re-import
it here...
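The temp-table route asked about above can be sketched like this (column names are from the message; the table name is invented, and 42 stands in for the local server's id):

```sql
CREATE TABLE tmp_import LIKE mytable;
-- load the remote dump into tmp_import here, then:
INSERT INTO mytable (Name, Value, Serverid)
SELECT Name, Value, 42 FROM tmp_import;
DROP TABLE tmp_import;
```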
Scott Haneda wrote:
I have two chunks of data to import. One is in this format:
"01001 - AGAWAM, MA","01001",0,0,291,249,0,"42.070206","-72.622739"
where it is comma-separated and partially quoted.
The other is in this format:
"99502 ANCHORAGE,
AK","256","265","1424","1962","1131","528","643","6209","99502","61.096163",
"-1...
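Both formats fit LOAD DATA's OPTIONALLY ENCLOSED BY; fields that are enclosed may even contain the separator or line breaks, as in the second sample (table and file names are illustrative):

```sql
LOAD DATA LOCAL INFILE 'zips.csv' INTO TABLE zips
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
```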
Ian Barnes wrote:
Hi,
I am trying to import a 3.2Gb sql dump file back into my sql server (4.1.12)
and I'm coming across the following error:
mysql: Out of memory (Needed 178723240 bytes)
mysql: Out of memory (Needed 178719144 bytes)
That error comes up after about 30 minutes' worth of import, and I would guess
abou...

Hi Ian,
That error message comes from some single place trying to allocate 178MB...
http://dev.mysql.com/doc/refman/5.0/en/packet-too-large.html
HTH,
Dan

Sorry, forgot to send to the list as well. My reply is at the bottom.
> -----Original Message-----
> From: Ian Barnes [mailto:[EMAIL PROTECTED]
> Sent: 08 June 2006 09:58 PM
> To: 'Kishore Jalleda'
> Subject: RE: Importing 3Gb File
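The packet-too-large page Dan links suggests raising max_allowed_packet; on the import side that looks roughly like this (the value is illustrative, and the server's own limit must be at least as large):

```shell
mysql --max_allowed_packet=512M -u root -p mydb < dump.sql
```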
This is a good question, one I have wondered about myself.
Is there a way in mysql to "attach" to a session to issue a commit?

-----Original Message-----
From: sheeri kritzer [mailto:[EMAIL PROTECTED]
Sent: Friday, May 05, 2006 3:02 PM
To: Luke Vanderfluit
Cc: MySQL List
Subject: Re: importing a dumpfile from within the mysql client

On 5/4/06, Luke Vanderfluit <[EMAIL PROTECTED]> wrote:
[snip]
> I started this process remotely then went to the site to finish it.
> But when the dump finished (several hours later) I was not able to
> execute the following commands from my original location.
> mysql> SET FOREIGN_KEY_CHECKS = 1;
> mysq...

Hi.
I recently imported a dumpfile into mysql 4.1.18.
I did this using the 'source' syntax from inside the mysql client.
This is the syntax I used:
mysql> create database dbname;
mysql> use dbname;
-- unset the creation of bin-logs in my.cnf
mysql> SET SQL_LOG_BIN = 0;
then some tips to fast-import dump f...
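The "fast import" tips the message trails off into are commonly a set of session switches like these (a sketch; the file name is illustrative, and the checks should be restored afterwards):

```sql
SET SESSION FOREIGN_KEY_CHECKS = 0;
SET SESSION UNIQUE_CHECKS = 0;
SET SESSION SQL_LOG_BIN = 0;
source dump.sql;
SET SESSION UNIQUE_CHECKS = 1;
SET SESSION FOREIGN_KEY_CHECKS = 1;
```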
-----Original Message-----
From: Christopher Beale [mailto:[EMAIL PROTECTED]
Sent: Monday, 17 April 2006 7:46 PM
To: mysql@lists.mysql.com
Subject: Importing raw MySQL file

... off of the hard disk. Is there a way of importing these to my MySQL 3.2
server? (I believe that Arch Linux was running 5.0). I have tried simply
placing them in the mysql database folder but I get errors such as "Incorrect
information in file: './my0007/ee_pm.frm..." when I try...
How did you try to do it on the remote system?
I copied and pasted your query to a server running MySQL
4.1.12-standard-log, and another running MySQL 5.0.19-standard-log,
and they both created the table just fine.
Have you tried copying and pasting the table creation itself to see if
that works?