Re: Optimize fails due to duplicate rows error but no duplicates found

2018-02-23 Thread shawn l.green
. *From:* shawn l.green *Sent:* 13 February 2018 09:51:33 PM *To:* mysql@lists.mysql.com *Subject:* Re: Optimize fails due to duplicate rows error but no duplicates found Hello Machiel, On 2/13/2018 3:02 AM, Machiel Richards wrote: Good day guys, I

Re: Optimize fails due to duplicate rows error but no duplicates found

2018-02-13 Thread Machiel Richards
: Re: Optimize fails due to duplicate rows error but no duplicates found Hello Machiel, On 2/13/2018 3:02 AM, Machiel Richards wrote: > Good day guys, > > > I am hoping this mail finds you well. > > > I am at a bit of a loss here... > > > We are trying t

Re: Optimize fails due to duplicate rows error but no duplicates found

2018-02-13 Thread Machiel Richards
due to duplicate rows error but no duplicates found Hello Machiel, On 2/13/2018 3:02 AM, Machiel Richards wrote: > Good day guys, > > > I am hoping this mail finds you well. > > > I am at a bit of a loss here... > > > We are trying to run optimize again

Re: Optimize fails due to duplicate rows error but no duplicates found

2018-02-13 Thread shawn l.green
. However, after running for over an hour , the optimize fails stating there is a duplicate entry in the table. We have now spent 2 days using various methods but we are unable to find any duplicates in the primary key and also nothing on the unique key fields. Any idea on why optimize

Optimize fails due to duplicate rows error but no duplicates found

2018-02-13 Thread Machiel Richards
there is a duplicate entry in the table. We have now spent 2 days using various methods but we are unable to find any duplicates in the primary key and also nothing on the unique key fields. Any idea on why optimize would still be failing ? Regards

Re: duplicate rows in spite of multi-column unique constraint

2015-03-24 Thread shawn l.green
Hi Chris, On 3/24/2015 10:07 AM, Chris Hornung wrote: Thanks for the suggestions regarding non-printing characters, definitely makes sense as a likely culprit! However, the data really does seem to be identical in this case: mysql> select id, customer_id, concat('-', group_id, '-') from app_cu

Re: duplicate rows in spite of multi-column unique constraint

2015-03-24 Thread Chris Hornung
p_customergroupmembership where customer_id ='ajEiQA'; I suspect one of those group IDs has a trailing space or similar 'invible' character that makes it not identical. - Original Message - From: "Chris Hornung" To: "MySql" Sent: Monday, 23 M
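
A quick way to check the "invisible character" theory raised in this thread is to compare byte length against character length, or inspect the raw bytes with HEX(). A hedged sketch, reusing the table and column names visible in the thread:

```sql
-- Reveal trailing spaces or non-printing bytes in group_id values.
-- LENGTH() counts bytes, CHAR_LENGTH() counts characters; HEX() shows
-- the raw bytes, so a trailing 0x20 (space) or 0xC2A0 (NBSP) stands out.
SELECT id,
       group_id,
       LENGTH(group_id)      AS byte_len,
       CHAR_LENGTH(group_id) AS char_len,
       HEX(group_id)         AS raw_bytes
FROM app_customergroupmembership
WHERE customer_id = 'ajEiQA';
```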

Re: duplicate rows in spite of multi-column unique constraint

2015-03-24 Thread Johan De Meersman
-- Original Message - > From: "Chris Hornung" > To: "MySql" > Sent: Monday, 23 March, 2015 18:20:36 > Subject: duplicate rows in spite of multi-column unique constraint > Hello, > > I'm come across a situation where a table in our production D

duplicate rows in spite of multi-column unique constraint

2015-03-23 Thread Chris Hornung
, UNIQUE KEY `app_customergroupmembership_customer_id_31afe160_uniq` (`customer_id`,`group_id`), KEY `app_customergroupmembership_group_id_18aedd38e3f8a4a0` (`group_id`,`created`) ) ENGINE=InnoDB AUTO_INCREMENT=21951158253 DEFAULT CHARSET=utf8 COLLATE=utf8_bin Despite that, records with

Re: indexing on column having duplicate values

2014-05-28 Thread Reindl Harald
Am 28.05.2014 22:39, schrieb Rajeev Prasad: > (re-sending, i got err from yahoo) your previous message made it off-list to me *don't use reply-all on mailing lists* signature.asc Description: OpenPGP digital signature

Re: indexing on column having duplicate values

2014-05-28 Thread Rajeev Prasad
re will be > records which will have same value for the key field (other columns will be > different). > > so how can i do this? right now, i am getting error, about duplicate entries > and they are being discarded. All entries are important and I have to find a > way to locate

Re: indexing on column having duplicate values

2014-05-28 Thread Reindl Harald
for the key field (other columns will be >> different). >> >> so how can i do this? right now, i am getting error, about duplicate entries >> and they are being discarded. All entries are important and I have to find a >> way to locate records based on this key

Re: indexing on column having duplicate values

2014-05-28 Thread Reindl Harald
t; different). > > so how can i do this? right now, i am getting error, about duplicate entries > and they are being discarded. All entries are important and I have to find a > way to locate records based on this key field. who said that a key needs to be unique? just get phpMyAdmin to
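
As the reply points out, an index does not have to be unique: a plain KEY accepts duplicate values while still speeding lookups, unlike UNIQUE KEY or PRIMARY KEY. A minimal sketch (table and column names are illustrative, not from the thread):

```sql
-- A non-unique secondary index: duplicate key_field values are allowed.
ALTER TABLE mytable ADD INDEX idx_key_field (key_field);

-- Lookups by the (possibly duplicated) value still use the index.
SELECT * FROM mytable WHERE key_field = 'some_value';
```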

indexing on column having duplicate values

2014-05-28 Thread Rajeev Prasad
error, about duplicate entries and they are being discarded. All entries are important and I have to find a way to locate records based on this key field. thx for help. Rajeev -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com

Re: Join query returning duplicate entries

2013-04-04 Thread shawn green
Hello Trimurthy, On 4/4/2013 3:21 AM, Trimurthy wrote: Hi list, i wrote the following query and it is returning duplicate entries as shown below, can any one suggest me how to avoid this duplicate entries, with out using distinct. Query: select p.date,p.coacode,p.type,p.crdr

Re: Join query returning duplicate entries

2013-04-04 Thread Lucky Wijaya
Hi, sorry i tried to help but i hardly understand the use of join in your query since the joined table is not used anywhere. From: Trimurthy To: mysql@lists.mysql.com Sent: Thursday, 4 April 2013, 14:21 Subject: Join query returning duplicate entries Hi

Re: Join query returning duplicate entries

2013-04-04 Thread Johan De Meersman
- Original Message - > From: "Lucky Wijaya" > To: mysql@lists.mysql.com > Sent: Thursday, 4 April, 2013 10:51:50 AM > Subject: Re: Join query returning duplicate entries > > Hi, sorry i tried to help but i hardly understand the use of join in > your quer

Re: slave replication with lots of 'duplicate entry' errors

2013-02-15 Thread Manuel Arostegui
2013/2/14 Robert Citek > > > According to the client, nothing is writing to the slave and > everything is being logged at the master. I have not had the > opportunity to independently verified any of this, yet. I do know > that the slave is not in read-only mode, but rather "we promise not to >

Re: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Robert Citek
Agreed. Will do that along with several other possible changes. But for the moment, I'm still gathering information and coming up with plausible models. Will also be turning on general mysql logging on both Master and Slave, at least briefly, to see what statements are being run on both. Regard

Re: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Manuel Arostegui
int-in-time recovery from -- CHANGE MASTER TO MASTER_LOG_FILE='mysql-bin.000974', MASTER_LOG_POS=240814775; And if you're using the right IP, there's no reason to have duplicate entries unless someone is writing directly into the slave. Manuel.
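
The coordinates quoted above come from a dump taken with `mysqldump --master-data=2`, which records them as a comment in the dump header; the slave is then pointed at exactly that position. A hedged sketch of the usual sequence (host, user, and password are placeholders):

```sql
-- On the slave, after loading the dump, resume replication from the
-- binlog position recorded in the dump's header comment.
CHANGE MASTER TO
  MASTER_HOST     = 'master.example.com',
  MASTER_USER     = 'repl',
  MASTER_PASSWORD = 'secret',
  MASTER_LOG_FILE = 'mysql-bin.000974',
  MASTER_LOG_POS  = 240814775;
START SLAVE;
```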

Re: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Robert Citek
Yes. Except for a handful of static MyISAM tables. But the tables that are experiencing the issues are all InnoDB and large (a dozen or so fields, but lots of records.) Regards, - Robert On Thu, Feb 14, 2013 at 5:59 PM, Singer Wang wrote: > Are you using all InnoDB? > > S -- MySQL General Ma

Re: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Robert Citek
> read only mode? > If you're starting replication using the values provided by --master-data=2 > (which should be something like): > > -- Position to start replication or point-in-time recovery from > > -- CHANGE MASTER TO MASTER_LOG_FILE='mysql-bin.000974', > MAS

Re: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Robert Citek
On Thu, Feb 14, 2013 at 5:46 PM, Rick James wrote: >> Is it in read only mode? > Furthermore, are all users logging in as non-SUPER users? Note: root > bypasses the readonly flag! No. The user that is commonly used does have Super privileges. I am not sure why, but it does. Regards, - Rober

Re: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Singer Wang
> --master-data=2 > > (which should be something like): > > > > -- Position to start replication or point-in-time recovery from > > > > -- CHANGE MASTER TO MASTER_LOG_FILE='mysql-bin.000974', > > MASTER_LOG_POS=240814775; > > > > And if yo

RE: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Rick James
ysql > Subject: Re: slave replication with lots of 'duplicate entry' errors > > 2013/2/13 Robert Citek > > > On Wed, Feb 13, 2013 at 8:59 AM, Robert Citek > > > wrote: > > > Any other possibilities? Do other scenarios become likely if there > &

RE: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Rick James
ursday, February 14, 2013 2:59 PM > To: Rick James > Cc: mysql > Subject: Re: slave replication with lots of 'duplicate entry' errors > > On Thu, Feb 14, 2013 at 5:46 PM, Rick James > wrote: > >> Is it in read only mode? > > Furthermore, are all users logg

Re: slave replication with lots of 'duplicate entry' errors

2013-02-13 Thread Robert Citek
On Wed, Feb 13, 2013 at 8:59 AM, Robert Citek wrote: > Any other possibilities? Do other scenarios become likely if there > are two or more tables? > > Of those, which are the most likely? [from off-list responder]: > Other possibility: The replication is reading from master not from the point

RE: console input - finding duplicate entries

2012-06-15 Thread Daevid Vincent
> -Original Message- > From: Gary Aitken [mailto:my...@dreamchaser.org] > Sent: Thursday, June 14, 2012 2:58 PM > > I can get the table loaded by specifying REPLACE INTO TABLE, but that still > leaves me with not knowing where the duplicate records are. To find duplica
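
The standard way to locate the duplicate records that REPLACE INTO would silently overwrite is a GROUP BY ... HAVING query on the key columns. A sketch, assuming a hypothetical table `t` keyed on column `k`:

```sql
-- List each key value that appears more than once, with its count.
SELECT k, COUNT(*) AS cnt
FROM t
GROUP BY k
HAVING cnt > 1;
```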

Re: JOIN giving duplicate records

2012-04-04 Thread Halász Sándor
>>>> 2012/04/03 18:18 +0100, Tompkins Neil >>>> Before sending the table definition, and queries etc, can anyone advise why my query with four INNER JOIN might be give me back duplicate results e.g 100,UK,12121 100,UK,12121 Basically the qu

JOIN giving duplicate records

2012-04-03 Thread Tompkins Neil
Hi Before sending the table definition, and queries etc, can anyone advise why my query with four INNER JOIN might be give me back duplicate results e.g 100,UK,12121 100,UK,12121 Basically the query the statement AND (hotel_facilities.hotelfacilitytype_id = 47 OR

How to get the first data from a multiple or duplicate records

2011-10-18 Thread Gian Karlo C
Hello everyone, I would like to ask for idea and help on how to achieve my concern. Below is my SQL statement. Im joining 2 tables to get my results. Here's the sample results of what im getting. Name | Desc | Issue | ATime | Back | TotalTime | Ack | Res 123 | test | error | 2011-10-18 17:09:26

Re: myisamchk error (duplicate key records)

2011-09-19 Thread Johan De Meersman
- Original Message - > From: "Hank" > > I'm trying to rebuild an index after disabling all keys using > myisamchk and adding all 144 million records, so there is no current index on > the > table. Ahhh... I didn't realise that. > But in order to create the index, mysql has to do a fu

Re: myisamchk error (duplicate key records)

2011-09-19 Thread Hank
> > > Exactly - I can't create an index on the table until I remove the > > duplicate records. > > I was under the impression you were seeing this during a myisamchk run - > which indicates you should *already* have a key on that field. Or am I > interpreting that

Re: myisamchk error (duplicate key records)

2011-09-19 Thread Johan De Meersman
- Original Message - > From: "Hank" > > Exactly - I can't create an index on the table until I remove the > duplicate records. I was under the impression you were seeing this during a myisamchk run - which indicates you should *already* have a key on that

Re: myisamchk error (duplicate key records)

2011-09-19 Thread Hank
On Mon, Sep 19, 2011 at 7:19 AM, Johan De Meersman wrote: > - Original Message - > > From: "Hank" > > > > While running a -rq on a large table, I got the following error: > > > > myisamchk: warning: Duplicate key for record at 54381140 agains

Re: myisamchk error (duplicate key records)

2011-09-19 Thread Johan De Meersman
- Original Message - > From: "Hank" > > While running a -rq on a large table, I got the following error: > > myisamchk: warning: Duplicate key for record at 54381140 against > record at 54380810 > > How do I find which records are duplicated (wit

myisamchk error (duplicate key records)

2011-09-18 Thread Hank
While running a -rq on a large table, I got the following error: myisamchk: warning: Duplicate key for record at 54381140 against record at 54380810 How do I find which records are duplicated (without doing the typical self-join or "having cnt(*)>1" query)? This table has 144

Re: running a duplicate database

2011-09-09 Thread Rik Wasmus
> Am 09.09.2011 11:09, schrieb Dave Dyer: > > Is there a halfway house between a single database and a full > > master-slave setup? > > > > I have a database with one "piggish" table, and I'd like to direct > > queries that search the pig to a dupli

Re: running a duplicate database

2011-09-09 Thread Reindl Harald
Am 09.09.2011 11:09, schrieb Dave Dyer: > Is there a halfway house between a single database and a full master-slave > setup? > > I have a database with one "piggish" table, and I'd like to direct queries > that search the pig to a duplicate database, where it w

running a duplicate database

2011-09-09 Thread Dave Dyer
Is there a halfway house between a single database and a full master-slave setup? I have a database with one "piggish" table, and I'd like to direct queries that search the pig to a duplicate database, where it won't affect all the routine traffic. I could definitely d

Re: Deleting the duplicate values in a column

2011-05-09 Thread Aveek Misra
g count = 1; > > On May 9, 2011, at 5:45 PM, abhishek jain wrote: > >> hi, >> If we have a following mysql table: >> Name - ids >> A 1 >> B 1 >> C 2 >> D 3 >> >> I want to remove all duplicate occuran

Re: Deleting the duplicate values in a column

2011-05-09 Thread Aveek Misra
SELECT * from group by id having count = 1; On May 9, 2011, at 5:45 PM, abhishek jain wrote: > hi, > If we have a following mysql table: > Name - ids > A 1 > B 1 > C 2 > D 3 > > I want to remove all duplicate occurances and have

Deleting the duplicate values in a column

2011-05-09 Thread abhishek jain
hi, If we have a following mysql table: Name - ids A 1 B 1 C 2 D 3 I want to remove all duplicate occurances and have a result like Name - ids C 2 D 3 how can i do that with a query in mysql Pl. help asap -- Thanks and kind Regards
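
The result asked for here (only the ids that occur exactly once, i.e. C/2 and D/3) can be selected with GROUP BY ... HAVING, and the duplicated ids removed via a derived table. A sketch assuming the table is named `t` with columns `name` and `ids`:

```sql
-- Keep only ids that occur exactly once.
SELECT MIN(name) AS name, ids
FROM t
GROUP BY ids
HAVING COUNT(*) = 1;

-- Or delete every row whose id value is duplicated.
DELETE t FROM t
JOIN (SELECT ids FROM t GROUP BY ids HAVING COUNT(*) > 1) d
  ON t.ids = d.ids;
```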

Re: ERROR 1062 (23000): Duplicate entry '31592' for key 'PRIMARY'

2011-04-28 Thread misiaq
Corrupted table and / or index. A number of reasons could cause this issue: http://dev.mysql.com/doc/refman/5.1/en/corrupted-myisam-tables.html Regards, m "Adarsh Sharma" pisze: > Thanks , but there is no trigger on tables. > > Even I solved the problem after googling a link but cannot unders

Re: ERROR 1062 (23000): Duplicate entry '31592' for key 'PRIMARY'

2011-04-28 Thread Adarsh Sharma
INE=MyISAM AUTO_INCREMENT=31592 DEFAULT CHARSET=latin1 Today don't know why below error occurs when i am going insert some data in it : mysql> insert into login(user_id,log_status) values(2,1); ERROR 1062 (23000): Duplicate entry '31592' for key 'PRIMARY' I check the lat

ERROR 1062 (23000): Duplicate entry '31592' for key 'PRIMARY'

2011-04-28 Thread Adarsh Sharma
`user_id` (`user_id`) ) ENGINE=MyISAM AUTO_INCREMENT=31592 DEFAULT CHARSET=latin1 Today don't know why below error occurs when i am going insert some data in it : mysql> insert into login(user_id,log_status) values(2,1); ERROR 1062 (23000): Duplicate entry '31592' for key 
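
The usual cause of this symptom is an AUTO_INCREMENT counter that has fallen behind the highest existing primary-key value (for example after a crash on a MyISAM table), so the next generated value collides with an existing row. A hedged sketch of the common repair; the primary-key column name `id` is a placeholder, since it is not shown in the listing:

```sql
-- Find the current maximum key, then move the counter past it.
SELECT MAX(id) FROM login;
ALTER TABLE login AUTO_INCREMENT = 31593;  -- one above the observed MAX(id)
```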

Re: Duplicate entry '2' for key 1

2010-11-09 Thread Michael Dykman
. - michael dykman On Tue, Nov 9, 2010 at 3:36 PM, Ilham Firdaus wrote: > Dear friends. > > Anybody would be so nice to explain about meaning of this error message: > " > Duplicate entry '2' for key 1 > :. > It comes if we visit this: > http://www.otekno.biz/kn/cod

Duplicate entry '2' for key 1

2010-11-09 Thread Ilham Firdaus
Dear friends. Anybody would be so nice to explain about meaning of this error message: " Duplicate entry '2' for key 1 :. It comes if we visit this: http://www.otekno.biz/kn/code/functions.php?task=sync Thank you very much in advance. -- Enjoy our free facilities: http:/

RE: How To Duplicate Number of Hits from Prod Sever to NEW QA server?

2010-08-27 Thread Jerry Schwartz
>-Original Message- >From: Nunzio Daveri [mailto:nunziodav...@yahoo.com] >Sent: Friday, August 27, 2010 10:19 AM >To: mysql@lists.mysql.com >Subject: How To Duplicate Number of Hits from Prod Sever to NEW QA server? > >Hello, I have been asked to "replay" the

How To Duplicate Number of Hits from Prod Sever to NEW QA server?

2010-08-27 Thread Nunzio Daveri
quot; the log against the 5.5 server so as to "duplicate" real time traffic and not just replay the logs? Is there a tool or a shell script? I know there are built in "benchmarking" tools but I am trying to tell mgmt that 5.1.4x was lets say 60% percentage busy (cpu/mem/io)

Re: sql to duplicate records with modified value

2010-04-27 Thread Voytek Eymont
se! > > let me know how you go - hope you are keeping well! > > ray > > At 03:17 PM 27/03/2010, Voytek Eymont wrote: > >> I have Postfix virtual mailboxes in MySQL table like below: >> >> >> I'd like to duplicate all records whilst MODIFYING two fi

Re: sql to duplicate records with modified value

2010-03-27 Thread Ray Cauchi
you are keeping well! ray At 03:17 PM 27/03/2010, Voytek Eymont wrote: I have Postfix virtual mailboxes in MySQL table like below: I'd like to duplicate all records whilst MODIFYING two fields like so: current record has format like: user 'usern...@domain.tld' maildir 'domain.

Re: sql to duplicate records with modified value

2010-03-27 Thread Voytek Eymont
> Voytek Eymont wrote: > Are you hoping to do all that you want - copy rows, update rows and > create new rows - in a single SQL statement? Because if that's what you > want, I don't think it's possible. Unless someone has come up with some > new tricks, you can't insert a new record and update

Re: sql to duplicate records with modified value

2010-03-27 Thread Rhino
Voytek Eymont wrote: I have Postfix virtual mailboxes in MySQL table like below: I'd like to duplicate all records whilst MODIFYING two fields like so: current record has format like: user 'usern...@domain.tld' maildir 'domain.tld/usern...@domain.tld/' add

sql to duplicate records with modified value

2010-03-26 Thread Voytek Eymont
I have Postfix virtual mailboxes in MySQL table like below: I'd like to duplicate all records whilst MODIFYING two fields like so: current record has format like: user 'usern...@domain.tld' maildir 'domain.tld/usern...@domain.tld/' add new record that has: user
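
Copying every row while rewriting two of its fields is an INSERT ... SELECT whose select list transforms the columns. The exact substitution the poster needs is cut off in the archive, so the REPLACE() targets below are illustrative only:

```sql
-- Insert a modified copy of each mailbox row; the REPLACE() arguments
-- are placeholders for whatever rewrite the poster actually wants.
INSERT INTO mailbox (user, maildir)
SELECT REPLACE(user,    'domain.tld', 'newdomain.tld'),
       REPLACE(maildir, 'domain.tld', 'newdomain.tld')
FROM mailbox
WHERE user LIKE '%@domain.tld';
```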

Re: Duplicate entries despite group by

2010-02-21 Thread Carsten Pedersen
However, after running for a few hours, the query fails with the following error: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry 'new_order-248642-order_line-13126643' for key 'group_key' How is this possible? There were no concurre

Duplicate entries despite group by

2010-02-20 Thread Yang Zhang
ils with the following error: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry 'new_order-248642-order_line-13126643' for key 'group_key' How is this possible? There were no concurrently running queries inserting into 'graph'. I'm u

Re: Possible to find this duplicate?

2010-02-13 Thread Mark Goodge
're both from the same author, and both in chapter 1 of the book. It should not return ID 4, because that's in a different chapter. Note that J. and John have to be considered the same. For my purposes, it's sufficient to look at the first word, Smith, and consi

Possible to find this duplicate?

2010-02-13 Thread Brian Dunning
chapter 1 of the book. It should not return ID 4, because that's in a different chapter. Note that J. and John have to be considered the same. For my purposes, it's sufficient to look at the first word, Smith, and consider that a duplicate. ++--+-+ | ID | Au
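
Grouping on just the first word of the author field, as the poster suggests, can be done with SUBSTRING_INDEX. The table name and exact columns are guesses from the truncated listing:

```sql
-- Treat 'Smith, J.' and 'Smith, John' as the same author by comparing
-- only the first word of the Author column, within each chapter.
SELECT SUBSTRING_INDEX(Author, ' ', 1) AS surname,
       Chapter,
       COUNT(*) AS cnt
FROM quotes
GROUP BY surname, Chapter
HAVING cnt > 1;
```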

Re: Duplicate Entry, But Table Empty!

2009-12-14 Thread Mattia Merzi
2009/12/13 Victor Subervi : [...] > Please advise. review your sql: you are inserting into tem126072414516 and selecting from tem126072385457 ( Asterisk in Pinter Tibor's mail means "bold" ) Greetings, Mattia Merzi. -- MySQL General Mailing List For list archives: http://lists.mysql.com

Re: Duplicate Entry, But Table Empty!

2009-12-14 Thread Johan De Meersman
Gods. What is this, a creche ? *plonk* On Sun, Dec 13, 2009 at 6:44 PM, Victor Subervi wrote: > On Sun, Dec 13, 2009 at 12:21 PM, Pinter Tibor wrote: > > > Victor Subervi wrote: > > > >> Hi; > >> > >> mysql> insert into *tem126072414516* (ProdID, Quantity) values ("2", > "2"); > >> mysql> sele

Re: Duplicate Entry, But Table Empty!

2009-12-13 Thread Victor Subervi
On Sun, Dec 13, 2009 at 12:21 PM, Pinter Tibor wrote: > Victor Subervi wrote: > >> Hi; >> >> mysql> insert into *tem126072414516* (ProdID, Quantity) values ("2", "2"); >> mysql> select * from *tem126072385457*; >> > > mysql> insert into *tem126072414516* (ProdID, Quantity) values ("2", "2"); ERRO

Re: Duplicate Entry, But Table Empty!

2009-12-13 Thread Pinter Tibor
Victor Subervi wrote: Hi; mysql> insert into *tem126072414516* (ProdID, Quantity) values ("2", "2"); mysql> select * from *tem126072385457*; t -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com/mysql?unsub=arch...@jab.org

Duplicate Entry, But Table Empty!

2009-12-13 Thread Victor Subervi
Hi; mysql> insert into tem126072414516 (ProdID, Quantity) values ("2", "2"); ERROR 1062 (23000): Duplicate entry '2' for key 2 mysql> select * from tem126072385457; Empty set (0.00 s

INSERT ... SELECT ON DUPLICATE

2009-09-24 Thread dbrb2002-sql
Does anyone know if I can add a hint SQL_BUFFER_RESULT to INSERT .. SELECT ON DUPLICATE ex.. INSERT INTO foo SELECT SQL_BUFFER_RESULT* FROM bar ON DUPLICATE KEY UPDATE foo.X=.. Both my tables foo and bar are InnoDB; but the idea is to release the lock on bar as soon as possible by moving the

Re: Data import problem with duplicate datasets

2009-09-23 Thread Claudio Nanni
A very first thought I got is disable the constraint before import and re-enable after that. One way could be to set the foreign key checks to false or alter the constraint and remove the 'cascade delete' part. It's just a quick brain storm, please verify the goodness of it, I still need to get my

Data import problem with duplicate datasets

2009-09-23 Thread spikerlion
Hello, we have two tables associated with a foreign key constraint. Table A with the primary key and table B with an "on delete cascade" constraint. We want to delete datasets in Table B if the related dataset in Table A is deleted - that works. Now the Problem: There is a weekly import defined

Re: Removing Duplicate Records

2009-07-14 Thread Matt Neimeyer
Ah... Yes. Good point. I like this because I was planning on keeping the output somewhere for a while. (In case we need an "accounting" at some point) So it will be easy enough to dump what's being deleted to the screen while we loop over our candidates. Thanks! On Tue, Jul 14, 2009 at 10:16 AM,

Re: Removing Duplicate Records

2009-07-14 Thread Brent Baisley
That's assuming that there is a unique identifier field, like an auto increment field. Although that could be added after the fact. Also, you need to run the query multiple times until it returns no affected records. So if there are 4 copies of a record, it would need to be run 3 times to get rid o

Re: Removing Duplicate Records

2009-07-14 Thread Marcus Bointon
You can combine the two queries you have in option 3 (you'll need to change field names, but you should get the idea), something like this: DELETE table1 FROM table1, (SELECT MAX(id) AS dupid, COUNT(id) AS dupcnt FROM table1 WHERE field1 IS NOT NULL GROUP BY link_id HAVING dupcnt>1) AS dups W

RE: Removing Duplicate Records

2009-07-14 Thread Nathan Sullivan
@lists.mysql.com Subject: Removing Duplicate Records In our database we have an Organizations table and a Contacts table, and a linking table that associates Contacts with Organizations. Occassionally we manually add to the linking table with information gleaned from outside data sources. This is common

Removing Duplicate Records

2009-07-14 Thread Matt Neimeyer
duplicate linkages, but it's FAR from an everyday occurance. I have three options for dealing with the resulting duplicates and I would appreciate some advice on which option might be best. 1. Attack the horrific spaghetti code that determines the Org and Contact ids and then does the manua

Re: Duplicate key name when importing mysql dump file

2009-06-19 Thread ars k
Jason Novotny >wrote: > > > Hi, > > > > I'm trying to import a dumpfile like so: > > > > cat aac.sql | mysql -u root AAC > > > > It all runs fine until I get something like: > > > > ERROR 1061 (42000) at line 5671: Duplicate key name

Re: Duplicate key name when importing mysql dump file

2009-06-19 Thread Isart Montane
> > It all runs fine until I get something like: > > ERROR 1061 (42000) at line 5671: Duplicate key name 'FK_mediaZip_to_zipSet' > > > Is there a way I can tell it to ignore or replace the key? > > Thanks, Jason > > -- > MySQL General Mailing List > For

Duplicate key name when importing mysql dump file

2009-06-17 Thread Jason Novotny
Hi, I'm trying to import a dumpfile like so: cat aac.sql | mysql -u root AAC It all runs fine until I get something like: ERROR 1061 (42000) at line 5671: Duplicate key name 'FK_mediaZip_to_zipSet' Is there a way I can tell it to ignore or replace the key? Thanks,
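
ERROR 1061 means the target schema already contains an index with that name, so the dump's key definition collides. Dropping the pre-existing index first (or editing the dump) clears it; the table name below is a placeholder, since it is not shown in the thread:

```sql
-- Drop the pre-existing index so the dump's definition can be applied.
ALTER TABLE media_zip DROP INDEX FK_mediaZip_to_zipSet;
```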

Re: Error: Duplicate entry '0' for key 'PRIMARY'

2009-03-02 Thread sam rumaizan
Thank you all I solved the problem --- On Mon, 3/2/09, Darryle Steplight wrote: From: Darryle Steplight Subject: Re: Error: Duplicate entry '0' for key 'PRIMARY' To: samc...@yahoo.com Cc: mysql@lists.mysql.com, g...@primeexalia.com Date: Monday, March 2, 2009, 2:32 PM

Re: Error: Duplicate entry '0' for key 'PRIMARY'

2009-03-02 Thread Darryle Steplight
Are you trying to do an Insert On Duplicate Key? Do ou want to insert a new row if it doesn't already exist or update one if it does? On Mon, Mar 2, 2009 at 4:09 PM, sam rumaizan wrote: > Are you talking about Length/Values1 > > > > --- On Mon, 3/2/09, Gary Smith wrote:

Re: Error: Duplicate entry '0' for key 'PRIMARY'

2009-03-02 Thread sam rumaizan
Are you talking about Length/Values1 --- On Mon, 3/2/09, Gary Smith wrote: From: Gary Smith Subject: Re: Error: Duplicate entry '0' for key 'PRIMARY' To: samc...@yahoo.com, mysql@lists.mysql.com Date: Monday, March 2, 2009, 1:58 PM Easy. Ensure that all in the primary ke

Re: Error: Duplicate entry '0' for key 'PRIMARY'

2009-03-02 Thread sam rumaizan
How do I modify the column to add value? Can I do it with phpmyadmin? --- On Mon, 3/2/09, Gary Smith wrote: From: Gary Smith Subject: Re: Error: Duplicate entry '0' for key 'PRIMARY' To: samc...@yahoo.com, mysql@lists.mysql.com Date: Monday, March 2, 2009, 1:58 PM Easy

Error: Duplicate entry '0' for key 'PRIMARY'

2009-03-02 Thread sam rumaizan
Error: Duplicate entry '0' for key 'PRIMARY'  how can i fix it ?

Re: [MySQL] Re: REPOST: ON DUPLICATE failure

2009-01-22 Thread Ian Simpson
The reporting of two rows thing is to do with how MySQL handles INSERT ... ON DUPLICATE KEY UPDATE ... statements; it will report 1 row if it inserts, and 2 rows if it finds a duplicate key and has to update as well. http://dev.mysql.com/doc/refman/5.0/en/insert-on-duplicate.html Just after the
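
The 1-row/2-row behaviour described above is the documented affected-rows convention for INSERT ... ON DUPLICATE KEY UPDATE: 1 for a fresh insert, 2 when an existing row is updated (and 0 when the update leaves the values unchanged). A minimal illustration modelled on the query quoted later in this thread:

```sql
CREATE TABLE settings (
  name  VARCHAR(40)  NOT NULL PRIMARY KEY,
  value VARCHAR(255) NOT NULL
);

-- First run: no row with name='atest' exists, so it inserts (1 row
-- affected). Subsequent runs hit the duplicate key and update instead
-- (reported as 2 rows affected), leaving a single row in the table.
INSERT INTO settings SET name = 'atest', value = NOW()
ON DUPLICATE KEY UPDATE value = NOW();
```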

Re: [MySQL] Re: REPOST: ON DUPLICATE failure

2009-01-21 Thread Ashley M. Kirchner
Michael Dykman wrote: It worked fine as you wrote it on my v5.0.45, although it reported 2 rows affected on each subsequent run of the insert statement. I thought this odd as I only ran the same statement repeatedly leaving me with one row ever, but the value updated just fine. I noticed

Re: REPOST: ON DUPLICATE failure

2009-01-21 Thread Michael Dykman
-+-+-+---+ > | name | varchar(40) | NO | PRI | | | > | value | varchar(255) | NO | | | | > +---+--+--+-+-+---+ > > Query run on both systems: > INSERT INTO $TABLE SET N

REPOST: ON DUPLICATE failure

2009-01-21 Thread Ashley M. Kirchner
lt | Extra | +---+--+--+-+-+---+ | name | varchar(40) | NO | PRI | | | | value | varchar(255) | NO | | | | +---+--+--+-+-+---+ Query run on both systems: INSERT INTO $TABLE SET NAME='atest', value=now() ON DUPLICATE

ON DUPLICATE failure

2009-01-20 Thread Ashley M. Kirchner
| name | varchar(40) | NO | PRI | | | | value | varchar(255) | NO | | | | +---+--+--+-+-+---+ Query run on both systems: INSERT INTO $TABLE SET NAME='atest', value=now() ON DUPLICATE K

RE: How to remove the duplicate values in my table!

2008-11-20 Thread roger.maynard
@lists.mysql.com Subject: Re: How to remove the duplicate values in my table! On Nov 19, 2008, at 3:24 AM, jean claude babin wrote: > Hi, > > I found the bug in my servlet ,when I run my application it enter > one record > to the database without duplicate values.Now I want to clean

Re: How to remove the duplicate values in my table!

2008-11-19 Thread Brent Baisley
On Nov 19, 2008, at 3:24 AM, jean claude babin wrote: Hi, I found the bug in my servlet ,when I run my application it enter one record to the database without duplicate values.Now I want to clean my table by removing all duplicate rows .Any thoughts? I assume you have a unique record

How to remove the duplicate values in my table!

2008-11-19 Thread jean claude babin
Hi, I found the bug in my servlet ,when I run my application it enter one record to the database without duplicate values.Now I want to clean my table by removing all duplicate rows .Any thoughts?
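
When the table has no unique id to anchor a self-join delete, a common approach is to rebuild it through SELECT DISTINCT. A hedged sketch with placeholder table names:

```sql
-- Rebuild the table, keeping one copy of each distinct row.
CREATE TABLE replies_clean LIKE replies;
INSERT INTO replies_clean SELECT DISTINCT * FROM replies;
RENAME TABLE replies TO replies_old, replies_clean TO replies;
-- DROP TABLE replies_old;  -- once the result has been verified
```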

Re: Duplicate values inserted!

2008-11-17 Thread Carlos Proal
ql5.0 server.When I run my application, I got duplicate values everytime I enter a record. my java class is : import java.sql.Connection; import java.sql.DriverManager; import java.sql.Statement; public class HandlereplyModel { public String insert(String fname, String lname, String Email, String

Duplicate values inserted!

2008-11-17 Thread jean claude babin
Hello, I'm using Eclipse 3.3 and I use a model class HandlereplyModel.java to insert values to a mysql database using mysql5.0 server.When I run my application, I got duplicate values everytime I enter a record. my java class is : import java.sql.Connection; import java.sql.DriverMa

Duplicate column when creating table from table join

2008-11-05 Thread mos
x12 select * from tx1 left join tx2 on tx1.name1=tx2.name1; Error Code : 1060 Duplicate column name 'name1' I would have thought MySQL would have named the second tx12.name1 as "name1_1" so the column name is unique. But instead I get an error. Is there a way around this b
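
The workaround for error 1060 is to alias one of the identically-named columns in the select list instead of relying on `*`. A sketch using the thread's own table names (assuming only `name1` collides):

```sql
-- Alias the second name1 so the new table's column names are unique.
CREATE TABLE tx12
SELECT tx1.*, tx2.name1 AS name1_2
FROM tx1
LEFT JOIN tx2 ON tx1.name1 = tx2.name1;
```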

Re: removing duplicate entries

2008-08-12 Thread Brent Baisley
| PRI | | | > | AMOUNTINPENCE | bigint(20) | YES | | NULL| | > +---++--+-+-+---+ > > ACCOUNTPAYMENTACTION shares the primary key with ACCOUNTACTION > > I need to remove duplicate entries that occured at a specific time in > ACC

Re: removing duplicate entries

2008-08-11 Thread Moon's Father
> >on A1.ID = U1.ID >where A1.ACTIONDATE like '2008-08-01 02:00%' >and U1.ID is NULL >) as D > ); > > Thanks for the pointers ;-) > > > > > -Original Message- > From: Magnus Smith [mailto:[EMAIL PROTECTED] > Sen

RE: removing duplicate entries

2008-08-07 Thread Magnus Smith
10:35 To: Ananda Kumar Cc: mysql@lists.mysql.com Subject: RE: removing duplicate entries Yes I can see you are correct. I tried setting up a little test case myself. CREATE TABLE ACCOUNTACTION ( ID INT NOT NULL PRIMARY KEY, ACTIONDATE DATETIME, ACCOUNT_ID INT NOT NULL );

Re: removing duplicate entries

2008-08-07 Thread Rob Wultsch
'), > ('010', '2008-08-01 03:00:00', '105'), > ('011', '2008-08-01 02:00:00', '106'); > > INSERT INTO ACCOUNTPAYMENTACTION (ID, AMOUNT) > VALUES('001', '1000'), > ('002', '1000'), > ('

RE: removing duplicate entries

2008-08-07 Thread Magnus Smith
'2008-08-01 02:00:00', '105'), ('009', '2008-08-01 03:00:00', '104'), ('010', '2008-08-01 03:00:00', '105'), ('011', '2008-08-01 02:00:00', '106'); INSERT INTO ACCOUNTPAYMENTACTION (ID, AMOUNT) VAL

Re: removing duplicate entries

2008-08-06 Thread Ananda Kumar
ows returned, are the one you want to keep are indeed duplicates On 8/6/08, Magnus Smith <[EMAIL PROTECTED]> wrote: > > When I try the first suggestion (i) then I get all the 1682 duplicate > rows. The thing is that I need to keep the originals which are the ones > with the lo

RE: removing duplicate entries

2008-08-06 Thread Magnus Smith
When I try the first suggestion (i) then I get all the 1682 duplicate rows. The thing is that I need to keep the originals which are the ones with the lowest ACCOUNTACTION.ID value. The second suggestion (ii) gives me 563 rows that are the duplicates with the lowest ACCOUNTACTION.ID which are
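
Keeping the row with the lowest ID in each duplicate group and deleting the rest is typically done with a MIN(ID) derived table joined back to the base table. A sketch against the thread's ACCOUNTACTION table; the columns defining a "duplicate" are assumed from the listing:

```sql
-- Delete every duplicate except the earliest (lowest ID) row in each
-- (ACTIONDATE, ACCOUNT_ID) group.
DELETE a FROM ACCOUNTACTION a
JOIN (SELECT MIN(ID) AS keep_id, ACTIONDATE, ACCOUNT_ID
      FROM ACCOUNTACTION
      GROUP BY ACTIONDATE, ACCOUNT_ID
      HAVING COUNT(*) > 1) k
  ON a.ACTIONDATE = k.ACTIONDATE
 AND a.ACCOUNT_ID = k.ACCOUNT_ID
 AND a.ID <> k.keep_id;
```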

Re: removing duplicate entries

2008-08-06 Thread Ananda Kumar
-+-+-+---+ > | ID| bigint(20) | NO | PRI | | | > | AMOUNTINPENCE | bigint(20) | YES | | NULL| | > +---++--+-+-+---+ > > ACCOUNTPAYMENTACTION shares the primary key with ACCOUNTACTION > > I need to rem

removing duplicate entries

2008-08-06 Thread Magnus Smith
| bigint(20) | NO | PRI | | | | AMOUNTINPENCE | bigint(20) | YES | | NULL| | +---++--+-+-+---+ ACCOUNTPAYMENTACTION shares the primary key with ACCOUNTACTION I need to remove duplicate entries that occured at a specific time in
