Hi Platonides,
I succeeded in downloading the file. (After some crashes.)
As I'm sure you will need the disk space back, I want to use the checksum
you provided, so I can tell you that the data actually arrived.
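For the check itself, hashing the download in chunks keeps memory use flat
even at this size; a minimal Python sketch (the file name and the expected
value below are placeholders, not your actual checksum):

    import hashlib

    def file_md5(path, chunk_size=1 << 20):
        # MD5 of a large file, read in 1 MiB chunks so RAM use stays flat.
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    expected = "0123456789abcdef0123456789abcdef"  # placeholder value
    if file_md5("commons-media.tar") == expected:
        print("checksums match, the data arrived intact")
    else:
        print("checksum mismatch, the transfer is corrupt")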
As I'm fed up with using Windows for large files, I'll use an Ubuntu
version from a US
Hi Platonides,
First let me thank you very much!
99 GByte, how could that be?
On https://dumps.wikimedia.org/commonswiki/20160305/
there are dumps like
commonswiki-20160305-pages-articles-multistream.xml.bz2
with 6.7 GByte, and that file should contain
"Articles, templates, media/file descriptions, and primary meta-pages".
As I am planning to download other file types from Commons and other
wikis, and as I can't ask you to do it for me like 100 times,
is it possible to access the database myself?
Regarding my last request, for the SVG filenames, Platonides suggested:
> You should be able to extract such list with a query suc
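Presumably such a query would go against the image table, where SVG
uploads carry the MIME type image/svg+xml. A sketch of how it might look
from Tool Labs with the database replicas (the host, database name, and
replica.my.cnf are the conventions there; the details are my assumptions,
not what Platonides actually wrote):

    import os
    import pymysql  # assumes access to the Tool Labs database replicas

    conn = pymysql.connect(
        host="commonswiki.labsdb",   # replica host naming on Tool Labs
        db="commonswiki_p",          # public replica of the Commons database
        read_default_file=os.path.expanduser("~/replica.my.cnf"),
        charset="utf8",
    )

    # One row per current file; filter to SVGs via the stored MIME type.
    sql = """
        SELECT img_name
        FROM image
        WHERE img_major_mime = 'image' AND img_minor_mime = 'svg+xml'
    """
    with conn.cursor() as cur:
        cur.execute(sql)
        for (name,) in cur:
            print(name.decode("utf-8") if isinstance(name, bytes) else name)
    conn.close()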
Platonides wrote:
On 02/03/16 22:05, D. Hansen wrote:
Hi!
Last year I was very kindly provided with a list of all SVG files on
Commons, that is, their then *real* http(s) paths. (Either by John
(phoenixoverr...@gmail.com) or by Ariel T. Glenn (agl...@wikimedia.org).)
Could I get a current version of this dump, please? (With the r
ried to help me, but even he couldn't solve this.
Besides that, I didn't get much feedback.
Is it possible to get such a dump? Or to get another dump that I could
use to update and crosscheck the all-titles file?
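If a current title list such as all-titles is available, the *real* https
paths can at least be derived locally: Commons stores a file under
/wikipedia/commons/<a>/<ab>/<name>, where <a> and <ab> are the first one
and two hex digits of the MD5 of the filename, with spaces turned into
underscores. A sketch:

    import hashlib
    from urllib.parse import quote

    def commons_url(title):
        # 'title' is the filename without the "File:" prefix.
        name = title.replace(" ", "_")
        digest = hashlib.md5(name.encode("utf-8")).hexdigest()
        return ("https://upload.wikimedia.org/wikipedia/commons/"
                "%s/%s/%s" % (digest[0], digest[:2], quote(name)))

    # Prints the derived upload.wikimedia.org URL for this example file:
    print(commons_url("Example.svg"))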
Greetings
D. Hansen