I think (don't quote me on this) the NT schema contains wiki data. I'll
see if I can stitch together an aggregate dataset for you, if you think
it'd help. Would per-day granularity be enough?


On 11 June 2014 06:54, Federico Leva (Nemo) <[email protected]> wrote:

> Federico Leva (Nemo), 01/06/2014 14:11:
>
>> Currently I'm in need of per-wiki and/or
>> diachronic/over-time plots for it.wiki:
>> <https://meta.wikimedia.org/wiki/Research:The_sudden_decline_of_Italian_Wikipedia>
>>
>>
>> If nobody has ideas I guess I'll just go for the 800 lb gorilla approach
>> and submit a patch to have performance graphs for the top 10 wikis as we
>> currently do for edits (cf. https://bugzilla.wikimedia.org/56039 )? In
>> the meanwhile: https://gerrit.wikimedia.org/r/#/c/136631/
>>
>
> Those two are closed, but I can't just steal the editswiki approach because
> it's based on the series' values; absent better ideas, I'll need to hardcode
> the wikis' names.
>
> Not for that goal, but mostDeviant can be of use too:
> <https://graphite.readthedocs.org/en/latest/functions.html#graphite.render.functions.mostDeviant>
> For instance, a graph of the sites/kinds of assets whose payload varied the
> most over the last week: http://ur1.ca/hi4oa. Could something like this be of
> use anywhere?
>
>
> Nemo
>
> _______________________________________________
> Analytics mailing list
> [email protected]
> https://lists.wikimedia.org/mailman/listinfo/analytics
>
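For anyone unfamiliar with mostDeviant: it ranks a set of series by standard
deviation and keeps the top n, which is why it surfaces "whatever varied the
most last week" without hardcoding names. A minimal Python sketch of that same
selection logic, on made-up per-wiki numbers (this is just the ranking idea,
not Graphite's actual implementation):

```python
from statistics import pstdev

# Hypothetical per-wiki time series (e.g. daily payload); values are invented.
series = {
    "itwiki": [10, 12, 50, 9, 48],      # swings a lot
    "enwiki": [100, 101, 99, 100, 100],  # nearly flat
    "dewiki": [30, 31, 30, 29, 31],      # small wobble
}

def most_deviant(series, n):
    """Return the names of the n series with the largest standard
    deviation, mirroring the ranking mostDeviant applies."""
    return sorted(series, key=lambda name: pstdev(series[name]), reverse=True)[:n]

print(most_deviant(series, 1))
```

The appeal for the top-10-wikis graphs is exactly this: the selection follows
the data, so the target list never needs updating by hand.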



-- 
Oliver Keyes
Research Analyst
Wikimedia Foundation
