Dan Nessett wrote:
> After reflection, here are some other problems.
> 
> + Some tests assume the existence of data in the db. For example, the 
> PagedTiffHandler tests assume the image Multipage.tiff is already loaded. 
> However, this requires an entry in the image table. You could modify the 
> test to clone the existing image table, but that means you have problems 
> with:
> 
> + Some tests assume certain data is *not* in the db. PagedTiffHandler has 
> tests that upload images. These cannot already be in the images table. 
> So, you can't simply clone the images table.

Interesting. I haven't seen PagedTiffHandler tests.
What the normal parser tests do is "upload" existing images and add the
needed articles to the empty tables.
Previously, the image table entries were added directly via SQL. I
changed that in r70917 to use recordUpload2() instead.
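
For reference, the parser-test setup now does roughly the following (a
simplified sketch from memory, so the exact arguments and property values
differ from the real code, and the numbers are placeholders):

  // Runs inside the test setup, with MediaWiki already loaded.
  // Register a test image through the normal upload path instead of
  // inserting rows into the image table by hand.
  $title = Title::makeTitle( NS_FILE, 'Foobar.jpg' );
  $file  = wfLocalFile( $title );

  // Placeholder file properties; the real setup hard-codes sensible ones.
  $props = array(
      'size'       => 12345,
      'width'      => 1941,
      'height'     => 220,
      'bits'       => 24,
      'media_type' => MEDIATYPE_BITMAP,
      'mime'       => 'image/jpeg',
      'metadata'   => serialize( array() ),
      'sha1'       => sha1( 'Foobar.jpg' ), // placeholder value
      'fileExists' => true,
  );

  // recordUpload2() creates the image table row and the file description
  // page, the same way a real upload would.
  $file->recordUpload2( '', 'Upload of some lame file', 'Some lame file',
      $props, '20100101000000' );

An extension like PagedTiffHandler could do the same in its test setup, so
Multipage.tiff doesn't have to pre-exist in the database.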


> All of this suggests to me that a better strategy is:
> 
> + When the test run begins, clone a db associated with the test suite.

Having another database would be the optimal solution, but it's not
always possible.

OTOH, MaxSem replied to my concern in r58669: Oracle table names are
limited to 32 characters. MySQL's limit is 64 characters,* which gives
more margin. Our longest table name is msg_resource_links, at 18 characters.
If we choose a prefix like mwtest_, we could use another underscore plus
6 random digits to identify the instance and still stay Oracle compliant.

* http://dev.mysql.com/doc/refman/5.1/en/identifiers.html
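
Just to illustrate the arithmetic (the exact prefix format is only a
suggestion):

  // 'mwtest_' (7) + 6 random digits + '_' (1) = 14 characters of prefix.
  $prefix  = 'mwtest_' . mt_rand( 100000, 999999 ) . '_';
  $longest = 'msg_resource_links';       // 18 characters, our longest table
  echo strlen( $prefix . $longest );     // 32, right at the Oracle limit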

> + Switch the wiki to use this db and return a cookie or some other state 
> information that identifies this test run configuration.

I think you mean for remote requests, not just for internal queries.
Where do you expect to store that data?
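
If I understand the idea, it would amount to something like this in
LocalSettings.php (a purely hypothetical sketch; the cookie name and the
validation are invented):

  // Pick the cloned tables for this test run based on a cookie that the
  // test harness set when the run was created. Entirely hypothetical.
  if ( isset( $_COOKIE['mwTestRunId'] )
      && preg_match( '/^\d{6}$/', $_COOKIE['mwTestRunId'] )
  ) {
      $wgDBprefix = 'mwtest_' . $_COOKIE['mwTestRunId'] . '_';
  }

But the harness would still need to track which runs exist and clean them
up afterwards, which is the data I'm asking about.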

