Re: Optimize fails due to duplicate rows error but no duplicates found

2018-02-23 Thread shawn l.green
. *From:* shawn l.green <shawn.l.gr...@oracle.com> *Sent:* 13 February 2018 09:51:33 PM *To:* mysql@lists.mysql.com *Subject:* Re: Optimize fails due to duplicate rows error but no duplicates found Hello Machiel, On 2/13/2018 3:02 AM, Machiel Richards

Re: Optimize fails due to duplicate rows error but no duplicates found

2018-02-13 Thread Machiel Richards
mysql@lists.mysql.com Subject: Re: Optimize fails due to duplicate rows error but no duplicates found Hello Machiel, On 2/13/2018 3:02 AM, Machiel Richards wrote: > Good day guys, > > > I am hoping this mail finds you well. > > > I am at a bit of a loss here... >

Re: Optimize fails due to duplicate rows error but no duplicates found

2018-02-13 Thread shawn l.green
. However, after running for over an hour , the optimize fails stating there is a duplicate entry in the table. We have now spent 2 days using various methods but we are unable to find any duplicates in the primary key and also nothing on the unique key fields. Any idea on why optimize

Optimize fails due to duplicate rows error but no duplicates found

2018-02-13 Thread Machiel Richards
there is a duplicate entry in the table. We have now spent 2 days using various methods but we are unable to find any duplicates in the primary key and also nothing on the unique key fields. Any idea on why optimize would still be failing ? Regards
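
The standard way to hunt for collisions on a primary or unique key, as the replies in this thread go on to suggest, is a GROUP BY with a HAVING filter. A minimal sketch, using SQLite in place of MySQL and made-up table/column names:

```python
import sqlite3

# Minimal sketch: find rows that collide on a would-be unique key.
# Table and column names are hypothetical; SQLite stands in for MySQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [(1, "a@x"), (2, "b@x"), (3, "a@x")])

# Any key value appearing more than once is a duplicate candidate.
dups = conn.execute(
    "SELECT email, COUNT(*) AS n FROM t GROUP BY email HAVING n > 1"
).fetchall()
print(dups)  # [('a@x', 2)]
```

When a query like this finds nothing yet OPTIMIZE still reports a duplicate, the usual suspects are index corruption or collation-dependent comparisons, which the thread explores.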

Re: duplicate rows in spite of multi-column unique constraint

2015-03-24 Thread Johan De Meersman
Hornung chris.horn...@klaviyo.com To: MySql mysql@lists.mysql.com Sent: Monday, 23 March, 2015 18:20:36 Subject: duplicate rows in spite of multi-column unique constraint Hello, I'm come across a situation where a table in our production DB has a relatively small number of duplicative rows

Re: duplicate rows in spite of multi-column unique constraint

2015-03-24 Thread Chris Hornung
a trailing space or similar 'invisible' character that makes it not identical. - Original Message - From: Chris Hornung chris.horn...@klaviyo.com To: MySql mysql@lists.mysql.com Sent: Monday, 23 March, 2015 18:20:36 Subject: duplicate rows in spite of multi-column unique constraint Hello, I'm

Re: duplicate rows in spite of multi-column unique constraint

2015-03-24 Thread shawn l.green
Hi Chris, On 3/24/2015 10:07 AM, Chris Hornung wrote: Thanks for the suggestions regarding non-printing characters, definitely makes sense as a likely culprit! However, the data really does seem to be identical in this case: mysql select id, customer_id, concat('-', group_id, '-') from

duplicate rows in spite of multi-column unique constraint

2015-03-23 Thread Chris Hornung
KEY `app_customergroupmembership_customer_id_31afe160_uniq` (`customer_id`,`group_id`), KEY `app_customergroupmembership_group_id_18aedd38e3f8a4a0` (`group_id`,`created`) ) ENGINE=InnoDB AUTO_INCREMENT=21951158253 DEFAULT CHARSET=utf8 COLLATE=utf8_bin Despite that, records with duplicate
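
Shawn's reply in this thread uses the concat('-', group_id, '-') trick: wrap the value in visible sentinels so trailing or otherwise invisible characters show up. A sketch of the idea, with SQLite standing in for MySQL and invented values (note that SQLite compares bytes exactly, whereas MySQL's pre-8.0 PAD SPACE collations treat trailing spaces as insignificant in comparisons, which changes when such rows are even insertable):

```python
import sqlite3

# Sketch of the delimiter trick from the thread: wrap a value in sentinels
# so invisible trailing characters become visible. Values are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE m (customer_id INTEGER, group_id TEXT, "
             "UNIQUE (customer_id, group_id))")
conn.execute("INSERT INTO m VALUES (1, '42')")
conn.execute("INSERT INTO m VALUES (1, '42 ')")  # trailing space: distinct bytes

rows = conn.execute(
    "SELECT '-' || group_id || '-' FROM m ORDER BY rowid").fetchall()
print(rows)  # the second value betrays the trailing space
```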

indexing on column having duplicate values

2014-05-28 Thread Rajeev Prasad
error, about duplicate entries and they are being discarded. All entries are important and I have to find a way to locate records based on this key field. thx for help. Rajeev -- MySQL General Mailing List For list archives: http://lists.mysql.com/mysql To unsubscribe:http://lists.mysql.com

Re: indexing on column having duplicate values

2014-05-28 Thread Reindl Harald
how can i do this? right now, i am getting error, about duplicate entries and they are being discarded. All entries are important and I have to find a way to locate records based on this key field. who said that a key needs to be unique? just get phpMyAdmin to learn such things there you see

Re: indexing on column having duplicate values

2014-05-28 Thread Reindl Harald
a big table with lot of records, to expedite searching i wanted to index on a key field (which is numeric value). BUT, there will be records which will have same value for the key field (other columns will be different). so how can i do this? right now, i am getting error, about duplicate

Re: indexing on column having duplicate values

2014-05-28 Thread Rajeev Prasad
for the key field (other columns will be different). So how can I do this? Right now, I am getting an error about duplicate entries and they are being discarded. All entries are important and I have to find a way to locate records based on this key field. who said that a key needs to be unique? just
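
Reindl's point in this exchange is that an index used for lookup speed does not have to be declared UNIQUE; a plain index accepts repeated key values. A sketch with SQLite in place of MySQL, names invented:

```python
import sqlite3

# A plain (non-UNIQUE) index speeds lookups but allows repeated key values,
# which is what the thread's poster needed. Names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (keyfield INTEGER, payload TEXT)")
conn.execute("CREATE INDEX idx_keyfield ON records (keyfield)")  # no UNIQUE
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 [(7, 'a'), (7, 'b'), (9, 'c')])  # duplicates accepted

n = conn.execute(
    "SELECT COUNT(*) FROM records WHERE keyfield = 7").fetchone()[0]
print(n)  # 2
```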

Re: indexing on column having duplicate values

2014-05-28 Thread Reindl Harald
On 28.05.2014 22:39, Rajeev Prasad wrote: (re-sending, i got err from yahoo) your previous message made it off-list to me *don't use reply-all on mailing lists*

Re: Join query returning duplicate entries

2013-04-04 Thread Johan De Meersman
- Original Message - From: Lucky Wijaya luckyx_cool_...@yahoo.com To: mysql@lists.mysql.com Sent: Thursday, 4 April, 2013 10:51:50 AM Subject: Re: Join query returning duplicate entries Hi, sorry i tried to help but i hardly understand the use of join in your query since

Re: Join query returning duplicate entries

2013-04-04 Thread Lucky Wijaya
duplicate entries Hi list,         i wrote the following query and it is returning duplicate entries as shown below, can any one suggest me how to avoid this duplicate entries, with out using distinct. Query: select p.date,p.coacode,p.type,p.crdr,p.quantity,p.amount from ac_financialpostings p

Re: Join query returning duplicate entries

2013-04-04 Thread shawn green
Hello Trimurthy, On 4/4/2013 3:21 AM, Trimurthy wrote: Hi list, i wrote the following query and it is returning duplicate entries as shown below, can any one suggest me how to avoid this duplicate entries, with out using distinct. Query: select p.date,p.coacode,p.type,p.crdr

Re: slave replication with lots of 'duplicate entry' errors

2013-02-15 Thread Manuel Arostegui
2013/2/14 Robert Citek robert.ci...@gmail.com According to the client, nothing is writing to the slave and everything is being logged at the master. I have not had the opportunity to independently verified any of this, yet. I do know that the slave is not in read-only mode, but rather we

RE: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Rick James
, 2013 2:59 PM To: Rick James Cc: mysql Subject: Re: slave replication with lots of 'duplicate entry' errors On Thu, Feb 14, 2013 at 5:46 PM, Rick James rja...@yahoo-inc.com wrote: Is it in read only mode? Furthermore, are all users logging in as non-SUPER users? Note: root bypasses

RE: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Rick James
replication with lots of 'duplicate entry' errors 2013/2/13 Robert Citek robert.ci...@gmail.com On Wed, Feb 13, 2013 at 8:59 AM, Robert Citek robert.ci...@gmail.com wrote: Any other possibilities? Do other scenarios become likely if there are two or more tables? Of those

Re: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Singer Wang
; And if you're using the right IP, there's no reason to have duplicate entries unless someone is writing directly into the slave. Manuel. According to the client, nothing is writing to the slave and everything is being logged at the master. I have not had the opportunity to independently

Re: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Robert Citek
On Thu, Feb 14, 2013 at 5:46 PM, Rick James rja...@yahoo-inc.com wrote: Is it in read only mode? Furthermore, are all users logging in as non-SUPER users? Note: root bypasses the readonly flag! No. The user that is commonly used does have Super privileges. I am not sure why, but it does.

Re: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Robert Citek
like): -- Position to start replication or point-in-time recovery from -- CHANGE MASTER TO MASTER_LOG_FILE='mysql-bin.000974', MASTER_LOG_POS=240814775; And if you're using the right IP, there's no reason to have duplicate entries unless someone is writing directly into the slave. Manuel

Re: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Robert Citek
Yes. Except for a handful of static MyISAM tables. But the tables that are experiencing the issues are all InnoDB and large (a dozen or so fields, but lots of records.) Regards, - Robert On Thu, Feb 14, 2013 at 5:59 PM, Singer Wang w...@singerwang.com wrote: Are you using all InnoDB? S --

Re: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Manuel Arostegui
MASTER TO MASTER_LOG_FILE='mysql-bin.000974', MASTER_LOG_POS=240814775; And if you're using the right IP, there's no reason to have duplicate entries unless someone is writing directly into the slave. Manuel.

Re: slave replication with lots of 'duplicate entry' errors

2013-02-14 Thread Robert Citek
Agreed. Will do that along with several other possible changes. But for the moment, I'm still gathering information and coming up with plausible models. Will also be turning on general mysql logging on both Master and Slave, at least briefly, to see what statements are being run on both.

Re: slave replication with lots of 'duplicate entry' errors

2013-02-13 Thread Robert Citek
On Wed, Feb 13, 2013 at 8:59 AM, Robert Citek robert.ci...@gmail.com wrote: Any other possibilities? Do other scenarios become likely if there are two or more tables? Of those, which are the most likely? [from off-list responder]: Other possibility: The replication is reading from master

RE: console input - finding duplicate entries

2012-06-15 Thread Daevid Vincent
-Original Message- From: Gary Aitken [mailto:my...@dreamchaser.org] Sent: Thursday, June 14, 2012 2:58 PM I can get the table loaded by specifying REPLACE INTO TABLE, but that still leaves me with not knowing where the duplicate records are. To find duplicate entries select

Re: JOIN giving duplicate records

2012-04-04 Thread Halász Sándor
; 2012/04/03 18:18 +0100, Tompkins Neil Before sending the table definition, and queries etc, can anyone advise why my query with four INNER JOIN might be give me back duplicate results e.g 100,UK,12121 100,UK,12121 Basically the query the statement AND (hotel_facilities.hotelfacilitytype_id

JOIN giving duplicate records

2012-04-03 Thread Tompkins Neil
Hi Before sending the table definition, queries, etc., can anyone advise why my query with four INNER JOINs might be giving me back duplicate results, e.g. 100,UK,12121 100,UK,12121 Basically the query has the statement AND (hotel_facilities.hotelfacilitytype_id = 47
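
A join "duplicating" rows like this usually means one of the joined tables matches several rows per key, so the one side is repeated once per match. A hedged sketch (SQLite standing in for MySQL, table names loosely following the thread) showing the multiplication and an EXISTS rewrite that avoids it without DISTINCT:

```python
import sqlite3

# One-to-many join: the "one" side repeats once per matching row.
# Filtering with EXISTS avoids the multiplication without DISTINCT.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hotels (id INTEGER, country TEXT)")
conn.execute("CREATE TABLE hotel_facilities "
             "(hotel_id INTEGER, hotelfacilitytype_id INTEGER)")
conn.execute("INSERT INTO hotels VALUES (100, 'UK')")
conn.executemany("INSERT INTO hotel_facilities VALUES (?, ?)",
                 [(100, 47), (100, 47)])  # two matching facility rows

joined = conn.execute(
    "SELECT h.id, h.country FROM hotels h "
    "JOIN hotel_facilities f ON f.hotel_id = h.id "
    "WHERE f.hotelfacilitytype_id = 47").fetchall()
print(joined)  # one output row per matching facility row

filtered = conn.execute(
    "SELECT h.id, h.country FROM hotels h WHERE EXISTS "
    "(SELECT 1 FROM hotel_facilities f "
    " WHERE f.hotel_id = h.id AND f.hotelfacilitytype_id = 47)").fetchall()
print(filtered)  # the hotel appears once
```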

How to get the first data from a multiple or duplicate records

2011-10-18 Thread Gian Karlo C
Hello everyone, I would like to ask for ideas and help on how to achieve my concern. Below is my SQL statement. I'm joining 2 tables to get my results. Here's the sample results of what I'm getting. Name | Desc | Issue | ATime | Back | TotalTime | Ack | Res 123 | test | error | 2011-10-18 17:09:26

Re: myisamchk error (duplicate key records)

2011-09-19 Thread Johan De Meersman
- Original Message - From: Hank hes...@gmail.com While running a -rq on a large table, I got the following error: myisamchk: warning: Duplicate key for record at 54381140 against record at 54380810 How do I find which records are duplicated (without doing the typical self

Re: myisamchk error (duplicate key records)

2011-09-19 Thread Hank
On Mon, Sep 19, 2011 at 7:19 AM, Johan De Meersman vegiv...@tuxera.bewrote: - Original Message - From: Hank hes...@gmail.com While running a -rq on a large table, I got the following error: myisamchk: warning: Duplicate key for record at 54381140 against record

Re: myisamchk error (duplicate key records)

2011-09-19 Thread Johan De Meersman
- Original Message - From: Hank hes...@gmail.com Exactly - I can't create an index on the table until I remove the duplicate records. I was under the impression you were seeing this during a myisamchk run - which indicates you should *already* have a key on that field. Or am I

Re: myisamchk error (duplicate key records)

2011-09-19 Thread Hank
Exactly - I can't create an index on the table until I remove the duplicate records. I was under the impression you were seeing this during a myisamchk run - which indicates you should *already* have a key on that field. Or am I interpreting that wrong? I'm trying to rebuild an index

Re: myisamchk error (duplicate key records)

2011-09-19 Thread Johan De Meersman
- Original Message - From: Hank hes...@gmail.com I'm trying to rebuild an index after disabling all keys using myisamchk and adding all 144 million records, so there is no current index on the table. Ahhh... I didn't realise that. But in order to create the index, mysql has to

myisamchk error (duplicate key records)

2011-09-18 Thread Hank
While running a -rq on a large table, I got the following error: myisamchk: warning: Duplicate key for record at 54381140 against record at 54380810 How do I find which records are duplicated (without doing the typical self-join or having cnt(*)1 query)? This table has 144 million rows, so

running a duplicate database

2011-09-09 Thread Dave Dyer
Is there a halfway house between a single database and a full master-slave setup? I have a database with one piggish table, and I'd like to direct queries that search the pig to a duplicate database, where it won't affect all the routine traffic. I could definitely do this by setting up

Re: running a duplicate database

2011-09-09 Thread Reindl Harald
On 09.09.2011 11:09, Dave Dyer wrote: Is there a halfway house between a single database and a full master-slave setup? I have a database with one piggish table, and I'd like to direct queries that search the pig to a duplicate database, where it won't affect all the routine traffic

Re: running a duplicate database

2011-09-09 Thread Rik Wasmus
On 09.09.2011 11:09, Dave Dyer wrote: Is there a halfway house between a single database and a full master-slave setup? I have a database with one piggish table, and I'd like to direct queries that search the pig to a duplicate database, where it won't affect all the routine

Deleting the duplicate values in a column

2011-05-09 Thread abhishek jain
hi, If we have the following MySQL table: Name - ids A 1 B 1 C 2 D 3 I want to remove all duplicate occurrences and have a result like Name - ids C 2 D 3 How can I do that with a query in MySQL? Please help. -- Thanks and kind Regards

Re: Deleting the duplicate values in a column

2011-05-09 Thread Aveek Misra
SELECT * from table group by id having count = 1; On May 9, 2011, at 5:45 PM, abhishek jain wrote: hi, If we have a following mysql table: Name - ids A 1 B 1 C 2 D 3 I want to remove all duplicate occurances and have a result like Name - ids C
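
Spelled out, the reply's `group by id having count = 1` is `GROUP BY ids HAVING COUNT(*) = 1`: it keeps only ids that occur exactly once, which matches the asker's desired result. A sketch using the thread's sample data, with SQLite standing in for MySQL:

```python
import sqlite3

# Keep only rows whose ids value occurs once (HAVING COUNT(*) = 1),
# using the sample data from the question. SQLite stands in for MySQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT, ids INTEGER)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [('A', 1), ('B', 1), ('C', 2), ('D', 3)])

kept = conn.execute(
    "SELECT name, ids FROM t "
    "WHERE ids IN (SELECT ids FROM t GROUP BY ids HAVING COUNT(*) = 1) "
    "ORDER BY name").fetchall()
print(kept)  # [('C', 2), ('D', 3)]

# To delete in place rather than just select:
conn.execute("DELETE FROM t WHERE ids IN "
             "(SELECT ids FROM t GROUP BY ids HAVING COUNT(*) > 1)")
remaining = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(remaining)  # 2
```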

ERROR 1062 (23000): Duplicate entry '31592' for key 'PRIMARY'

2011-04-28 Thread Adarsh Sharma
`user_id` (`user_id`) ) ENGINE=MyISAM AUTO_INCREMENT=31592 DEFAULT CHARSET=latin1 Today don't know why below error occurs when i am going insert some data in it : mysql insert into login(user_id,log_status) values(2,1); ERROR 1062 (23000): Duplicate entry '31592' for key 'PRIMARY' I check

Re: ERROR 1062 (23000): Duplicate entry '31592' for key 'PRIMARY'

2011-04-28 Thread Adarsh Sharma
`) ) ENGINE=MyISAM AUTO_INCREMENT=31592 DEFAULT CHARSET=latin1 Today don't know why below error occurs when i am going insert some data in it : mysql insert into login(user_id,log_status) values(2,1); ERROR 1062 (23000): Duplicate entry '31592' for key 'PRIMARY' I check the latest entries

Re: ERROR 1062 (23000): Duplicate entry '31592' for key 'PRIMARY'

2011-04-28 Thread misiaq
Corrupted table and / or index. A number of reasons could cause this issue: http://dev.mysql.com/doc/refman/5.1/en/corrupted-myisam-tables.html Regards, m Adarsh Sharma adarsh.sha...@orkash.com writes: Thanks , but there is no trigger on tables. Even I solved the problem after googling a

Duplicate entry '2' for key 1

2010-11-09 Thread Ilham Firdaus
Dear friends. Anybody would be so nice to explain about meaning of this error message: Duplicate entry '2' for key 1 :. It comes if we visit this: http://www.otekno.biz/kn/code/functions.php?task=sync Thank you very much in advance. -- Enjoy our free facilities: http://www.otekno.biz

Re: Duplicate entry '2' for key 1

2010-11-09 Thread Michael Dykman
. - michael dykman On Tue, Nov 9, 2010 at 3:36 PM, Ilham Firdaus il...@otekno.biz wrote: Dear friends. Anybody would be so nice to explain about meaning of this error message: Duplicate entry '2' for key 1 :. It comes if we visit this: http://www.otekno.biz/kn/code/functions.php?task=sync Thank

How To Duplicate Number of Hits from Prod Sever to NEW QA server?

2010-08-27 Thread Nunzio Daveri
server so as to duplicate real time traffic and not just replay the logs? Is there a tool or a shell script? I know there are built in benchmarking tools but I am trying to tell mgmt that 5.1.4x was lets say 60% percentage busy (cpu/mem/io) with traffic hitting it on Monday, the SAME amount

RE: How To Duplicate Number of Hits from Prod Sever to NEW QA server?

2010-08-27 Thread Jerry Schwartz
-Original Message- From: Nunzio Daveri [mailto:nunziodav...@yahoo.com] Sent: Friday, August 27, 2010 10:19 AM To: mysql@lists.mysql.com Subject: How To Duplicate Number of Hits from Prod Sever to NEW QA server? Hello, I have been asked to replay the traffic load we have on one of our

Re: sql to duplicate records with modified value

2010-04-27 Thread Voytek Eymont
/03/2010, Voytek Eymont wrote: I have Postfix virtual mailboxes in MySQL table like below: I'd like to duplicate all records whilst MODIFYING two fields like so: current record has format like: user 'usern...@domain.tld' maildir 'domain.tld/usern...@domain.tld/' add new record that has

Re: sql to duplicate records with modified value

2010-03-27 Thread Rhino
Voytek Eymont wrote: I have Postfix virtual mailboxes in MySQL table like below: I'd like to duplicate all records whilst MODIFYING two fields like so: current record has format like: user 'usern...@domain.tld' maildir 'domain.tld/usern...@domain.tld/' add new record that has: user

Re: sql to duplicate records with modified value

2010-03-27 Thread Voytek Eymont
quote who=Rhino Voytek Eymont wrote: Are you hoping to do all that you want - copy rows, update rows and create new rows - in a single SQL statement? Because if that's what you want, I don't think it's possible. Unless someone has come up with some new tricks, you can't insert a new record

Re: sql to duplicate records with modified value

2010-03-27 Thread Ray Cauchi
At 03:17 PM 27/03/2010, Voytek Eymont wrote: I have Postfix virtual mailboxes in MySQL table like below: I'd like to duplicate all records whilst MODIFYING two fields like so: current record has format like: user 'usern...@domain.tld' maildir 'domain.tld/usern...@domain.tld/' add new record that has

sql to duplicate records with modified value

2010-03-26 Thread Voytek Eymont
I have Postfix virtual mailboxes in MySQL table like below: I'd like to duplicate all records whilst MODIFYING two fields like so: current record has format like: user 'usern...@domain.tld' maildir 'domain.tld/usern...@domain.tld/' add new record that has: user 'username+s...@domain.tld
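
Copying every row while rewriting a field, as this thread asks, is an INSERT ... SELECT with a string expression over the source column. A sketch with SQLite standing in for MySQL; the mailbox values are invented, since the addresses in the post are truncated:

```python
import sqlite3

# Duplicate all rows while modifying one field: INSERT ... SELECT with a
# string expression. The mailbox values here are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mailbox (user TEXT, maildir TEXT)")
conn.execute("INSERT INTO mailbox VALUES "
             "('user@example.com', 'example.com/user@example.com/')")

conn.execute(
    "INSERT INTO mailbox (user, maildir) "
    "SELECT REPLACE(user, '@', '+tag@'), maildir FROM mailbox")

users = [r[0] for r in
         conn.execute("SELECT user FROM mailbox ORDER BY user")]
print(users)  # original row plus a rewritten copy
```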

Re: Duplicate entries despite group by

2010-02-21 Thread Carsten Pedersen
for a few hours, the query fails with the following error: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry 'new_order-248642-order_line-13126643' for key 'group_key' How is this possible? There were no concurrently running queries inserting into 'graph'. I'm

Duplicate entries despite group by

2010-02-20 Thread Yang Zhang
: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry 'new_order-248642-order_line-13126643' for key 'group_key' How is this possible? There were no concurrently running queries inserting into 'graph'. I'm using mysql-5.4.3; is this a beta bug/anyone else happen to know

Possible to find this duplicate?

2010-02-13 Thread Brian Dunning
of the book. It should not return ID 4, because that's in a different chapter. Note that J. and John have to be considered the same. For my purposes, it's sufficient to look at the first word, Smith, and consider that a duplicate. ++--+-+ | ID | Author | Chapter

Re: Possible to find this duplicate?

2010-02-13 Thread Mark Goodge
from the same author, and both in chapter 1 of the book. It should not return ID 4, because that's in a different chapter. Note that J. and John have to be considered the same. For my purposes, it's sufficient to look at the first word, Smith, and consider that a duplicate

Re: Duplicate Entry, But Table Empty!

2009-12-14 Thread Johan De Meersman
Gods. What is this, a creche ? *plonk* On Sun, Dec 13, 2009 at 6:44 PM, Victor Subervi victorsube...@gmail.comwrote: On Sun, Dec 13, 2009 at 12:21 PM, Pinter Tibor tib...@tibyke.hu wrote: Victor Subervi wrote: Hi; mysql insert into *tem126072414516* (ProdID, Quantity) values (2,

Re: Duplicate Entry, But Table Empty!

2009-12-14 Thread Mattia Merzi
2009/12/13 Victor Subervi victorsube...@gmail.com: [...] Please advise. review your sql: you are inserting into tem126072414516 and selecting from tem126072385457 ( Asterisk in Pinter Tibor's mail means bold ) Greetings, Mattia Merzi.

Duplicate Entry, But Table Empty!

2009-12-13 Thread Victor Subervi
Hi; mysql insert into tem126072414516 (ProdID, Quantity) values (2, 2); ERROR 1062 (23000): Duplicate entry '2' for key 2 mysql select * from tem126072385457; Empty set (0.00 sec) mysql describe tem126072385457; +--+-+--+-+-++ | Field

Re: Duplicate Entry, But Table Empty!

2009-12-13 Thread Pinter Tibor
Victor Subervi wrote: Hi; mysql insert into *tem126072414516* (ProdID, Quantity) values (2, 2); mysql select * from *tem126072385457*; t

Re: Duplicate Entry, But Table Empty!

2009-12-13 Thread Victor Subervi
On Sun, Dec 13, 2009 at 12:21 PM, Pinter Tibor tib...@tibyke.hu wrote: Victor Subervi wrote: Hi; mysql insert into *tem126072414516* (ProdID, Quantity) values (2, 2); mysql select * from *tem126072385457*; mysql insert into *tem126072414516* (ProdID, Quantity) values (2, 2); ERROR 1064

INSERT ... SELECT ON DUPLICATE

2009-09-24 Thread dbrb2002-sql
Does anyone know if I can add a hint SQL_BUFFER_RESULT to INSERT .. SELECT ON DUPLICATE ex.. INSERT INTO foo SELECT SQL_BUFFER_RESULT* FROM bar ON DUPLICATE KEY UPDATE foo.X=.. Both my tables foo and bar are InnoDB; but the idea is to release the lock on bar as soon as possible by moving
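
The statement in question is MySQL's INSERT ... SELECT ... ON DUPLICATE KEY UPDATE upsert. To illustrate just the upsert pattern (not the SQL_BUFFER_RESULT hint, which is MySQL-specific), here is the SQLite analogue, ON CONFLICT ... DO UPDATE, with made-up tables foo and bar; SQLite requires a WHERE clause before ON CONFLICT in INSERT ... SELECT to avoid a parsing ambiguity, and needs version 3.24 or later:

```python
import sqlite3

# SQLite's upsert analogue of MySQL's INSERT ... SELECT ... ON DUPLICATE KEY
# UPDATE. Tables foo and bar are invented; requires SQLite >= 3.24.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE foo (k INTEGER PRIMARY KEY, x INTEGER)")
conn.execute("CREATE TABLE bar (k INTEGER, x INTEGER)")
conn.execute("INSERT INTO foo VALUES (1, 10)")
conn.executemany("INSERT INTO bar VALUES (?, ?)", [(1, 99), (2, 20)])

# "WHERE true" disambiguates ON CONFLICT after a SELECT source in SQLite.
conn.execute(
    "INSERT INTO foo (k, x) SELECT k, x FROM bar WHERE true "
    "ON CONFLICT(k) DO UPDATE SET x = excluded.x")

foo = conn.execute("SELECT k, x FROM foo ORDER BY k").fetchall()
print(foo)  # k=1 updated in place, k=2 inserted
```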

Data import problem with duplicate datasets

2009-09-23 Thread spikerlion
Hello, we have two tables associated with a foreign key constraint. Table A with the primary key and table B with an on delete cascade constraint. We want to delete datasets in Table B if the related dataset in Table A is deleted - that works. Now the Problem: There is a weekly import defined

Re: Data import problem with duplicate datasets

2009-09-23 Thread Claudio Nanni
A very first thought I got is disable the constraint before import and re-enable after that. One way could be to set the foreign key checks to false or alter the constraint and remove the 'cascade delete' part. It's just a quick brain storm, please verify the goodness of it, I still need to get my

Removing Duplicate Records

2009-07-14 Thread Matt Neimeyer
with duplicate linkages, but it's FAR from an everyday occurrence. I have three options for dealing with the resulting duplicates and I would appreciate some advice on which option might be best. 1. Attack the horrific spaghetti code that determines the Org and Contact ids and then does the manual add

RE: Removing Duplicate Records

2009-07-14 Thread Nathan Sullivan
@lists.mysql.com Subject: Removing Duplicate Records In our database we have an Organizations table and a Contacts table, and a linking table that associates Contacts with Organizations. Occasionally we manually add to the linking table with information gleaned from outside data sources. This is common

Re: Removing Duplicate Records

2009-07-14 Thread Marcus Bointon
You can combine the two queries you have in option 3 (you'll need to change field names, but you should get the idea), something like this: DELETE table1 FROM table1, (SELECT MAX(id) AS dupid, COUNT(id) AS dupcnt FROM table1 WHERE field1 IS NOT NULL GROUP BY link_id HAVING dupcnt1) AS dups

Re: Removing Duplicate Records

2009-07-14 Thread Brent Baisley
That's assuming that there is a unique identifier field, like an auto increment field. Although that could be added after the fact. Also, you need to run the query multiple times until it returns no affected records. So if there are 4 copies of a record, it would need to be run 3 times to get rid
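
Brent's caveat, sketched out: a DELETE that drops only MAX(id) per duplicate group removes one copy per pass, so with four copies of a record the statement must run three times before nothing is left to delete. A hedged sketch with SQLite in place of MySQL and an invented schema:

```python
import sqlite3

# A DELETE keyed on MAX(id) per duplicate group removes one copy per pass,
# so loop until no rows are affected. Schema and dedup key are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, link_id INTEGER)")
conn.executemany("INSERT INTO t (link_id) VALUES (?)",
                 [(5,), (5,), (5,), (5,), (6,)])  # four copies of link_id 5

passes = 0
while True:
    cur = conn.execute(
        "DELETE FROM t WHERE id IN "
        "(SELECT MAX(id) FROM t GROUP BY link_id HAVING COUNT(*) > 1)")
    conn.commit()
    if cur.rowcount == 0:
        break  # nothing left to delete: every group is down to one row
    passes += 1

print(passes)  # 3 passes to reduce four copies to one
rows = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(rows)   # 2 rows remain: the lowest id of each link_id
```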

Re: Removing Duplicate Records

2009-07-14 Thread Matt Neimeyer
Ah... Yes. Good point. I like this because I was planning on keeping the output somewhere for a while. (In case we need an accounting at some point) So it will be easy enough to dump what's being deleted to the screen while we loop over our candidates. Thanks! On Tue, Jul 14, 2009 at 10:16 AM,

Re: Duplicate key name when importing mysql dump file

2009-06-19 Thread Isart Montane
It all runs fine until I get something like: ERROR 1061 (42000) at line 5671: Duplicate key name 'FK_mediaZip_to_zipSet' Is there a way I can tell it to ignore or replace the key? Thanks, Jason

Re: Duplicate key name when importing mysql dump file

2009-06-19 Thread ars k
wrote: Hi, I'm trying to import a dumpfile like so: cat aac.sql | mysql -u root AAC It all runs fine until I get something like: ERROR 1061 (42000) at line 5671: Duplicate key name 'FK_mediaZip_to_zipSet' Is there a way I can tell it to ignore or replace the key

Duplicate key name when importing mysql dump file

2009-06-17 Thread Jason Novotny
Hi, I'm trying to import a dumpfile like so: cat aac.sql | mysql -u root AAC It all runs fine until I get something like: ERROR 1061 (42000) at line 5671: Duplicate key name 'FK_mediaZip_to_zipSet' Is there a way I can tell it to ignore or replace the key? Thanks, Jason

Error: Duplicate entry '0' for key 'PRIMARY'

2009-03-02 Thread sam rumaizan
Error: Duplicate entry '0' for key 'PRIMARY'. How can I fix it?

Re: Error: Duplicate entry '0' for key 'PRIMARY'

2009-03-02 Thread sam rumaizan
How do I modify the column to add value? Can I do it with phpmyadmin? --- On Mon, 3/2/09, Gary Smith g...@primeexalia.com wrote: From: Gary Smith g...@primeexalia.com Subject: Re: Error: Duplicate entry '0' for key 'PRIMARY' To: samc...@yahoo.com, mysql@lists.mysql.com Date: Monday, March 2

Re: Error: Duplicate entry '0' for key 'PRIMARY'

2009-03-02 Thread sam rumaizan
Are you talking about Length/Values1 --- On Mon, 3/2/09, Gary Smith g...@primeexalia.com wrote: From: Gary Smith g...@primeexalia.com Subject: Re: Error: Duplicate entry '0' for key 'PRIMARY' To: samc...@yahoo.com, mysql@lists.mysql.com Date: Monday, March 2, 2009, 1:58 PM Easy. Ensure

Re: Error: Duplicate entry '0' for key 'PRIMARY'

2009-03-02 Thread Darryle Steplight
Are you trying to do an Insert On Duplicate Key? Do ou want to insert a new row if it doesn't already exist or update one if it does? On Mon, Mar 2, 2009 at 4:09 PM, sam rumaizan samc...@yahoo.com wrote: Are you talking about Length/Values1 --- On Mon, 3/2/09, Gary Smith g...@primeexalia.com

Re: Error: Duplicate entry '0' for key 'PRIMARY'

2009-03-02 Thread sam rumaizan
Thank you all I solved the problem --- On Mon, 3/2/09, Darryle Steplight dstepli...@gmail.com wrote: From: Darryle Steplight dstepli...@gmail.com Subject: Re: Error: Duplicate entry '0' for key 'PRIMARY' To: samc...@yahoo.com Cc: mysql@lists.mysql.com, g...@primeexalia.com Date: Monday, March

Re: [MySQL] Re: REPOST: ON DUPLICATE failure

2009-01-22 Thread Ian Simpson
The reporting of two rows thing is to do with how MySQL handles INSERT ... ON DUPLICATE KEY UPDATE ... statements; it will report 1 row if it inserts, and 2 rows if it finds a duplicate key and has to update as well. http://dev.mysql.com/doc/refman/5.0/en/insert-on-duplicate.html Just after

REPOST: ON DUPLICATE failure

2009-01-21 Thread Ashley M. Kirchner
| +---+--+--+-+-+---+ | name | varchar(40) | NO | PRI | | | | value | varchar(255) | NO | | | | +---+--+--+-+-+---+ Query run on both systems: INSERT INTO $TABLE SET NAME='atest', value=now() ON DUPLICATE KEY UPDATE value=now

Re: REPOST: ON DUPLICATE failure

2009-01-21 Thread Michael Dykman
| | | | value | varchar(255) | NO | | | | +---+--+--+-+-+---+ Query run on both systems: INSERT INTO $TABLE SET NAME='atest', value=now() ON DUPLICATE KEY UPDATE value=now(); On SERVER 1 it fails

Re: [MySQL] Re: REPOST: ON DUPLICATE failure

2009-01-21 Thread Ashley M. Kirchner
Michael Dykman wrote: It worked fine as you wrote it on my v5.0.45, although it reported 2 rows affected on each subsequent run of the insert statement. I thought this odd as I only ran the same statement repeatedly leaving me with one row ever, but the value updated just fine. I noticed

ON DUPLICATE failure

2009-01-20 Thread Ashley M. Kirchner
| varchar(40) | NO | PRI | | | | value | varchar(255) | NO | | | | +---+--+--+-+-+---+ Query run on both systems: INSERT INTO $TABLE SET NAME='atest', value=now() ON DUPLICATE KEY UPDATE value=now

RE: How to remove the duplicate values in my table!

2008-11-20 Thread roger.maynard
@lists.mysql.com Subject: Re: How to remove the duplicate values in my table! On Nov 19, 2008, at 3:24 AM, jean claude babin wrote: Hi, I found the bug in my servlet ,when I run my application it enter one record to the database without duplicate values.Now I want to clean my table by removing

How to remove the duplicate values in my table!

2008-11-19 Thread jean claude babin
Hi, I found the bug in my servlet; when I run my application it enters one record into the database without duplicate values. Now I want to clean my table by removing all duplicate rows. Any thoughts?

Re: How to remove the duplicate values in my table!

2008-11-19 Thread Brent Baisley
On Nov 19, 2008, at 3:24 AM, jean claude babin wrote: Hi, I found the bug in my servlet ,when I run my application it enter one record to the database without duplicate values.Now I want to clean my table by removing all duplicate rows .Any thoughts? I assume you have a unique record

Duplicate values inserted!

2008-11-17 Thread jean claude babin
Hello, I'm using Eclipse 3.3 and I use a model class HandlereplyModel.java to insert values into a MySQL database using mysql5.0 server. When I run my application, I get duplicate values every time I enter a record. My Java class is: import java.sql.Connection; import java.sql.DriverManager

Re: Duplicate values inserted!

2008-11-17 Thread Carlos Proal
server.When I run my application, I got duplicate values everytime I enter a record. my java class is : import java.sql.Connection; import java.sql.DriverManager; import java.sql.Statement; public class HandlereplyModel { public String insert(String fname, String lname, String Email, String phone

Duplicate column when creating table from table join

2008-11-05 Thread mos
* from tx1 left join tx2 on tx1.name1=tx2.name1; Error Code : 1060 Duplicate column name 'name1' I would have thought MySQL would have named the second tx12.name1 as name1_1 so the column name is unique. But instead I get an error. Is there a way around this because the column list in both tables

Re: removing duplicate entries

2008-08-12 Thread Brent Baisley
| | +---++--+-+-+---+ ACCOUNTPAYMENTACTION shares the primary key with ACCOUNTACTION I need to remove duplicate entries that occured at a specific time in ACCOUNTACTION I then plan to remove the rows in ACCOUNTPAYMENTACTION that are no longer referenced in ACCOUNTACTION by using an outer join I

Re: removing duplicate entries

2008-08-11 Thread Moon's Father
); Thanks for the pointers ;-) -Original Message- From: Magnus Smith [mailto:[EMAIL PROTECTED] Sent: 07 August 2008 10:35 To: Ananda Kumar Cc: mysql@lists.mysql.com Subject: RE: removing duplicate entries Yes I can see you are correct. I tried setting up a little test case myself

RE: removing duplicate entries

2008-08-07 Thread Magnus Smith
', '1000'), ('003', '1000'), ('004', '1000'), ('005', '1000'), ('006', '1000'), ('007', '1000'), ('008', '1000'), ('009', '1000'), ('010', '1000'), ('011', '1000'); I got the following query that seems to work on my test case. I create a union of everything that is not a duplicate and then take

Re: removing duplicate entries

2008-08-07 Thread Rob Wultsch
on my test case. I create a union of everything that is not a duplicate and then take the ones that are not in this to be the duplicates select ACCOUNTACTION.ID, ACCOUNTACTION.ACCOUNT_ID from ACCOUNTACTION where ACCOUNTACTION.ACTIONDATE like '2008-08-01 02:00%' and (ACCOUNTACTION.ID

RE: removing duplicate entries

2008-08-07 Thread Magnus Smith
Subject: RE: removing duplicate entries Yes I can see you are correct. I tried setting up a little test case myself. CREATE TABLE ACCOUNTACTION ( ID INT NOT NULL PRIMARY KEY, ACTIONDATE DATETIME, ACCOUNT_ID INT NOT NULL ); CREATE TABLE ACCOUNTPAYMENTACTION

removing duplicate entries

2008-08-06 Thread Magnus Smith
| bigint(20) | NO | PRI | | | | AMOUNTINPENCE | bigint(20) | YES | | NULL| | +---++--+-+-+---+ ACCOUNTPAYMENTACTION shares the primary key with ACCOUNTACTION I need to remove duplicate entries that occured at a specific time

Re: removing duplicate entries

2008-08-06 Thread Ananda Kumar
| | +---++--+-+-+---+ ACCOUNTPAYMENTACTION shares the primary key with ACCOUNTACTION I need to remove duplicate entries that occured at a specific time in ACCOUNTACTION I then plan to remove the rows in ACCOUNTPAYMENTACTION that are no longer referenced in ACCOUNTACTION by using an outer join I can select

RE: removing duplicate entries

2008-08-06 Thread Magnus Smith
When I try the first suggestion (i) then I get all the 1682 duplicate rows. The thing is that I need to keep the originals which are the ones with the lowest ACCOUNTACTION.ID value. The second suggestion (ii) gives me 563 rows that are the duplicates with the lowest ACCOUNTACTION.ID which

Re: removing duplicate entries

2008-08-06 Thread Ananda Kumar
returned, are the one you want to keep are indeed duplicates On 8/6/08, Magnus Smith [EMAIL PROTECTED] wrote: When I try the first suggestion (i) then I get all the 1682 duplicate rows. The thing is that I need to keep the originals which are the ones with the lowest ACCOUNTACTION.ID http
