On Mon, Sep 19, 2011 at 10:39 PM, Daniel Friesen
<li...@nadir-seen-fire.com> wrote:
> On 11-09-19 06:39 PM, Anthony wrote:
>> On Mon, Sep 19, 2011 at 3:57 PM, Brion Vibber <br...@pobox.com> wrote:
>>> That's probably the simplest solution; adding a new empty table will be very
>>> quick. It may make it slower to use the field though, depending on what all
>>> uses/exposes it.
>> Isn't adding a new column with all NULL values quick too?
> Apparently in InnoDB a table ALTER requires making an entire copy of
> the table. In other words, to add a new column, every box running the
> ALTER needs to be able to hold the entire Wikipedia revision table twice.

Ah, okay.  I remember that's what happened in MyISAM, but I figured
they had fixed that in InnoDB.
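
For the archives, the tradeoff looks roughly like this (the rev_sha1
names here are made up for illustration, not the actual proposal):

  -- The ALTER route: InnoDB rebuilds the whole table, so every box
  -- running it needs roughly twice the revision table in free disk:
  ALTER TABLE revision ADD COLUMN rev_sha1 VARBINARY(40) DEFAULT NULL;

  -- The separate-table route: creating an empty table is near-instant,
  -- and checksums can be backfilled gradually afterwards:
  CREATE TABLE rev_sha1 (
    rs_rev_id INT UNSIGNED NOT NULL PRIMARY KEY,
    rs_sha1   VARBINARY(40) NOT NULL
  ) ENGINE=InnoDB;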

On Mon, Sep 19, 2011 at 3:57 PM, Brion Vibber <br...@pobox.com> wrote:
> During stub dump generation for instance this would need to add a left outer
> join on the other table, and add things to the dump output (and also needs
> an update to the XML schema for the dump format). This would then need to be
> preserved through subsequent dump passes as well.

Doesn't the stub dump generation computer have its own database?  I
still don't see the point of putting all this extra work on the master
database just to maintain a function-based index that is only used for
dumps.
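
If I follow, the stub query would pick up something like this (again
assuming a hypothetical separate rev_sha1 table as sketched above):

  SELECT rev_id, rev_page, rev_timestamp, rs_sha1
  FROM revision
  LEFT OUTER JOIN rev_sha1 ON rs_rev_id = rev_id;

That join touches every revision row, which is exactly the kind of
load I'd rather keep off the masters.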

The dump generation computer should have its own database.  For most
of the tables/dumps (probably all but the full-text ones), you could
even use SQLite and offer that as a download option for those who
aren't stuck in 1998.
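
To make that concrete (file and column names invented for the
example), a downloader could then query the dump directly instead of
parsing XML:

  -- $ sqlite3 enwiki-stubs.sqlite
  SELECT rev_id, rev_timestamp
  FROM revision
  WHERE rev_page = 12345
  ORDER BY rev_timestamp;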

On Tue, Sep 20, 2011 at 3:23 AM, Daniel Friesen
<li...@nadir-seen-fire.com> wrote:
> On 11-09-19 11:43 PM, Domas Mituzas wrote:
>> so, what are the use cases, and how does one index for them? is it a
>> global hash check, per page? etc.

> One use case I know of is this bug:
> https://bugzilla.wikimedia.org/show_bug.cgi?id=2939

Calculating and storing checksums on every revision in the database is
way overkill for that.
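
If the point is just flagging identical revisions within a single
page's history, the hashes could be computed on demand instead, e.g.
(ignoring text compression and external storage, which would break
this on the live cluster):

  SELECT rev_id, SHA1(old_text) AS text_hash
  FROM revision
  JOIN text ON old_id = rev_text_id
  WHERE rev_page = 12345;

No schema change, and no backfill over every revision ever made.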
