On Thu, Feb 27, 2014 at 02:49:28PM -0800, Gergo Tisza wrote:
> As I understand, Mark has some scripts running from a cronjob to aggregate
> data from the event logs (which cannot be public for privacy reasons) into
> CSV files, and those CSV files are fed to Limn. Building those CSVs for
> every wiki and making them downloadable should be simple, I imagine?
My favorite answer: "Yes". Once I have a first-pass script that does this for
one wiki, it should be "trivial" to replace the places that name a specific
wiki with variables, substitute those variables to generate one script per
wiki, and then run all the generated scripts.

But this is a wishlist item that I'm not super interested in worrying about
right now - I want to build the metrics we need, not the ones we might need
later. I'll leave it open to do later. :)

You can see the scripts I'm running at Gitorious [0]. There are some leftovers
from my first shitty implementation and the cronjobs aren't there, but you can
get a good picture.

[0] https://gitorious.org/analytics/multimedia

-- 
Mark Holmquist
Software Engineer, Multimedia
Wikimedia Foundation
[email protected]
https://wikimediafoundation.org/wiki/User:MHolmquist
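[Editor's note: the "substitute a wiki name into a template, then run each
generated script" step described above could be sketched roughly as below.
The template file, the {{WIKI}} placeholder, the wiki list, and the fake CSV
output are all hypothetical stand-ins, not the actual layout of the
analytics/multimedia repo.]

```shell
#!/bin/sh
# Sketch: expand one template script into per-wiki scripts, then run them.
set -e

workdir=$(mktemp -d)

# Stand-in template; the real event-log aggregation logic would go here.
cat > "$workdir/report-template.sh" <<'EOF'
#!/bin/sh
# Pretend aggregation: emit a per-wiki CSV next to this script.
echo "date,uploads" > "$(dirname "$0")/{{WIKI}}.csv"
EOF

# Substitute the placeholder once per wiki, then run each generated script.
for wiki in enwiki dewiki commonswiki; do
    sed "s/{{WIKI}}/$wiki/g" "$workdir/report-template.sh" \
        > "$workdir/report-$wiki.sh"
    chmod +x "$workdir/report-$wiki.sh"
    "$workdir/report-$wiki.sh"
done
```

In a cronjob setup, the generation step and the run step could also be split,
so the per-wiki scripts are regenerated only when the template changes.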
_______________________________________________
Multimedia mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/multimedia
