2010/5/25 Aryeh Gregor <simetrical+wikil...@gmail.com>:
> Having Wikimedia servers send HTTP requests to each other instead of
> just doing database queries does not sound like a great idea to me.
> You're hitting several extra servers for no reason, including extra
> requests to an application server. On top of that, you're caching
> stuff in the database which is already *in* the database! FileRepo
> does this the Right Way, and you should definitely look at how that
> works. It uses polymorphism to use the database if possible, else the
> API.
>
> However, someone like Tim Starling should be consulted for a
> definitive performance assessment; don't rely on my word alone.

This is true if, indeed, all parsing is done on the distant wiki.
However, if parsing is done on the home wiki, you're not simply
requesting data that's ready-baked in the DB, and API calls make sense.
I'm also not convinced this would be a huge performance problem,
because it would only happen on parse (thanks to the parser cache),
but like you I trust Tim's verdict more than mine. Contrary to what
Platonides suggested, you cannot use FauxRequest to do cross-wiki API
requests.
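For readers unfamiliar with the FileRepo pattern Aryeh mentions, here is a
minimal sketch of the "database if possible, else API" polymorphism. This is
not MediaWiki's actual code (which is PHP); the class and method names below
are illustrative assumptions only.

```python
# Sketch of the polymorphic repo pattern: callers code against one
# interface, and the cheap direct-DB path is chosen when available.
from abc import ABC, abstractmethod

class WikiRepo(ABC):
    """Common interface; callers don't care how the data is fetched."""
    @abstractmethod
    def get_page_text(self, title: str) -> str: ...

class LocalDBRepo(WikiRepo):
    """Used when the distant wiki's database is directly reachable,
    e.g. both wikis live in the same server farm."""
    def __init__(self, db):
        self.db = db

    def get_page_text(self, title: str) -> str:
        # Direct database query: no extra HTTP hop, no app server hit.
        return self.db.select_text(title)

class ForeignAPIRepo(WikiRepo):
    """Fallback: fetch the same data over the remote wiki's web API."""
    def __init__(self, api_url: str):
        self.api_url = api_url

    def get_page_text(self, title: str) -> str:
        # In real code this would be an HTTP request to self.api_url.
        raise NotImplementedError("HTTP fetch omitted in this sketch")

def make_repo(db=None, api_url=None) -> WikiRepo:
    """Pick the cheap path when available, else fall back to the API."""
    return LocalDBRepo(db) if db is not None else ForeignAPIRepo(api_url)
```

The point of the pattern is that the calling code never branches on
transport: it asks a `WikiRepo` for a page, and configuration decides
whether that means a SQL query or an API request.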
To the point of whether parsing on the distant wiki makes more sense: I
guess there are points to be made both ways. I originally subscribed to
the idea of parsing on the home wiki so that expanding the same template
with the same arguments would always result in the same (preprocessed)
wikitext, but I do see how parsing on the local wiki would help for
things like {{SITENAME}} and {{CONTENTLANG}}.

Roan Kattouw (Catrope)

_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l