>>> Although in this case I would just run mysqldump with 
>>> --skip-extended-insert so that it doesn't create such long lines.
>> Yes, I tried that. But there are tables like l10ncache or objectcache that 
>> store serialized objects which produce long lines even in that case. Still, 
>> I think that should be a recommendation for creating the dumps.

>Do you really want to dump those caching tables?
Good question... I don't think it's generally necessary. Still, I think there 
are two reasons why we should allow for dumps that contain these tables:
* It's easy to use plain old MySQL dumps. The easier we make it for developers 
to test their code, the better. An alternative would be to provide a dumper 
script that strips these tables, or skips them in the first place. I could 
look into that if people here think that's the better way to go.
* Maybe someone wants to write a regression test that reproduces a specific 
caching issue. I agree this is unlikely; unit tests would most probably be a 
better fit there. But then again, the mechanism I am working on might also 
serve as a basis for unit test resources.
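For what it's worth, a dumper invocation along these lines could skip the caching tables up front while also avoiding over-long extended-insert lines (just a sketch; the database name "wikidb" is a placeholder, and the table names are the ones mentioned above):

```shell
# Sketch: dump without the caching tables and without long multi-row
# INSERT lines. "wikidb" is a placeholder database name.
mysqldump --skip-extended-insert \
  --ignore-table=wikidb.l10ncache \
  --ignore-table=wikidb.objectcache \
  wikidb > dump.sql
```

--ignore-table takes one db.table pair per occurrence, so each excluded table needs its own flag.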
The point is (and I missed this in my previous post) that even if I skip the 
caching tables on import, there is almost certainly one other table that can 
exceed the 1024 byte limit, and that is "text".
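To check whether a given dump would actually trip that limit, one could scan it for over-long lines, for instance (a sketch; the sample file and the 1024-byte threshold are taken from the discussion above):

```shell
# Sketch: find lines in a SQL dump that exceed a 1024-byte limit.
# Create a tiny sample "dump" (one short line, one 1100-byte line) for
# illustration, since we have no real dump at hand here.
printf 'short line\n%01100d\n' 0 > sample-dump.sql

# Report the line number and length of every over-long line.
awk 'length($0) > 1024 { printf "line %d: %d bytes\n", NR, length($0) }' sample-dump.sql
# prints: line 2: 1100 bytes
```

Running this over a real dump of the "text" table would show quickly whether --skip-extended-insert alone is enough.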

Cheers,
Markus

_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l