https://bugzilla.wikimedia.org/show_bug.cgi?id=48260

--- Comment #9 from Daniel Kinzler <daniel.kinz...@wikimedia.de> ---
(In reply to comment #8)
> I wrote all 30,536,568 links in the 2013-05-27 database dump to a file, and
> then used sort(1) and uniq(1) to find all duplicate links.

To my knowledge, there can be no dupes in the wb_items_by_site table, because
there is a primary key covering the relevant fields.
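
For reference, a direct duplicate check on the table itself would look roughly
like the sketch below. The column names (ips_site_id, ips_site_page) follow the
usual Wikibase schema; verify them against the actual table before running it:

  -- Sketch: look for (site, title) pairs stored more than once in
  -- wb_items_by_site. With the unique index on these columns intact,
  -- this should return no rows at all.
  SELECT ips_site_id, ips_site_page, COUNT(*) AS cnt
  FROM wb_items_by_site
  GROUP BY ips_site_id, ips_site_page
  HAVING COUNT(*) > 1;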

Can you show exactly what you did? What exactly did the query look like? Can you
give some example duplicates?

As far as I can see, the problem described in this report occurs when there are
things *missing* from wb_items_by_site, and thus conflicts fail to be detected.
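
To illustrate, the conflict check essentially has to find a row like the one
below before it can report a conflict; the site id, page title, and item id are
made-up placeholder values. If the corresponding row is missing from
wb_items_by_site, no conflict is detected even though the sitelink is already
used on another item:

  -- Sketch of the lookup the conflict check depends on: an existing row
  -- for the same (site, title) that belongs to a *different* item.
  -- 'dewiki', 'Some page' and 12345 are example values only.
  SELECT ips_item_id
  FROM wb_items_by_site
  WHERE ips_site_id = 'dewiki'
    AND ips_site_page = 'Some page'
    AND ips_item_id <> 12345;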
