Yep, thanks to your email, I realized a "mini dump" extracted randomly
is enough for my purposes.
Thanks Ilmari!

P.


On Tue, Jul 20, 2010 at 1:25 PM, Ilmari Karonen <nos...@vyznev.net> wrote:
> On 07/20/2010 09:51 AM, paolo massa wrote:
>> Thanks Gregor and yes, you are right.
>> I didn't think about your suggestion before, sorry.
>> The fact is that I wrote a script running on
>> pages-meta-current.xml because it is much smaller and more
>> manageable but, you are right: I can use the revision of the page
>> I'm interested in, which is in pages-meta-history.xml.
>
> If you're only interested in a small number of pages, you can get an
> up-to-date "mini dump" through Special:Export.  See
> <http://meta.wikimedia.org/wiki/Help:Export> and
> <http://www.mediawiki.org/wiki/Export> for details.
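> For instance, something along these lines should work (an untested
> sketch in Python 3 using only the standard library; the wiki hostname
> and page title are just placeholders):
>
>     import urllib.request
>
>     title = "Main_Page"  # page you want to export
>     url = "https://en.wikipedia.org/wiki/Special:Export/" + title
>     # Wikimedia sites expect a descriptive User-Agent header
>     req = urllib.request.Request(url, headers={"User-Agent": "export-sketch/0.1"})
>     with urllib.request.urlopen(req) as response:
>         xml_dump = response.read().decode("utf-8")
>     # The result is plain XML in the same schema as the regular dumps
>     print(xml_dump[:500])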
>
> Alternatively, you can also fetch page histories through the API:
> <http://www.mediawiki.org/wiki/API:Query_-_Properties#revisions_.2F_rv>
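> A minimal sketch of that approach, too (again untested Python 3 with the
> standard library; see the API documentation linked above for the full
> list of rvprop values and the per-request limits):
>
>     import json
>     import urllib.parse
>     import urllib.request
>
>     params = {
>         "action": "query",
>         "prop": "revisions",
>         "titles": "Main_Page",              # page of interest
>         "rvprop": "ids|timestamp|user|comment",
>         "rvlimit": "50",                    # up to 500 per request for normal users
>         "format": "json",
>     }
>     url = "https://en.wikipedia.org/w/api.php?" + urllib.parse.urlencode(params)
>     req = urllib.request.Request(url, headers={"User-Agent": "history-sketch/0.1"})
>     with urllib.request.urlopen(req) as response:
>         data = json.loads(response.read().decode("utf-8"))
>
>     # Walk the returned pages and print one line per revision
>     for page in data["query"]["pages"].values():
>         for rev in page.get("revisions", []):
>             print(rev["revid"], rev["timestamp"], rev.get("user", ""))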
>
> --
> Ilmari Karonen
>



-- 
Paolo Massa
Email: paolo AT gnuband DOT org
Blog: http://gnuband.org

_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
