I can run a database report on Monday. But keep in mind that the wiki isn't
static, and what you want changes at a very rapid rate.
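In the meantime, a rough sketch of how you could filter the all-titles dump yourself. This assumes each line is tab-separated as "<namespace>\t<title>", with namespace 6 being the File namespace on Commons; check your dump's actual layout before relying on it, as some all-titles variants are titles-only:

```python
import gzip

def svg_titles(lines):
    """Yield 'File:'-prefixed titles of SVG files from dump lines.

    Assumed format: '<namespace>\t<title>' per line, namespace 6 = File.
    """
    for line in lines:
        parts = line.rstrip("\n").split("\t")
        if len(parts) == 2 and parts[0] == "6" and parts[1].lower().endswith(".svg"):
            yield "File:" + parts[1]

def svg_titles_from_dump(path):
    # The published dump files are usually gzip-compressed.
    with gzip.open(path, "rt", encoding="utf-8") as f:
        yield from svg_titles(f)
```

Note that this only reflects the state of the wiki at the time the dump was taken; deletions and renames after that point won't show up, which is the staleness problem mentioned above.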


On Friday, June 19, 2015, D. Hansen <sammelacco...@tageskurier.de> wrote:

> Hi!
> I have tried to get a list of all .svg files on Wikimedia Commons.
>
> Of course I could just parse through Commons, but
> * if there were any way to provide a dump with the names of the actually
> existing .svg files, that would be a tremendous help for me, *
> and in my estimation it would reduce the download size, CPU burden, and,
> most importantly, the HD burden by about 70% to 80% compared to browsing
> and parsing through Commons. (Whereas the CPU usage and heat problems
> caused by the HD burden on my notebook would be much more adverse than the
> burden on Wikipedia's servers, I assume. I have already lost two HDs over
> the years when downloading larger amounts of files in one go.)
>
> Though I have asked at various places, so far I haven't found a good
> solution.
> One suggestion was to download commonswiki-20150417-all-titles, which I did.
>
> But this file contains deleted names and renamed names, and names that
> partly have "File:" at the start and some that don't have "File:" or a
> similar indicator.
> A small sample resulted in 5 correct names, and around 7 deleted and 7
> renamed names. I have asked at various places, and one person in
> particular tried to help me, but even he couldn't solve this. Beyond that
> I didn't get much feedback.
>
> Is it possible to get such a dump? Or to get another dump that I could
> use to update and crosscheck the all-titles file?
>
> Greetings
> D. Hansen
>
> _______________________________________________
> Xmldatadumps-l mailing list
> Xmldatadumps-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l
>