[Wikitech-l] Wikipedia database

2010-11-25 Thread Petromir Dzhunev
Hi everyone, would you consider putting coordinates into the page table for each page (for the pages that have coordinates, of course)? Is it possible? The reason I'm asking is that we want to know which Wikipedia pages are marked on Google Maps. Best Regards, Petromir
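
As background for the question: the core page table has no coordinate columns, so in practice coordinates live in the wikitext as a {{coord}} template. A minimal Python sketch along those lines, assuming the common decimal {{coord|lat|lon|...}} form; the page title and regex are placeholders for illustration, not part of any proposed schema change:

    # Sketch: fetch a page's wikitext via the API and pull out the first
    # {{coord|lat|lon|...}} pair. The regex only covers the decimal form.
    import json
    import re
    import urllib.parse
    import urllib.request

    API = "https://en.wikipedia.org/w/api.php"

    def page_coordinates(title):
        params = urllib.parse.urlencode({
            "action": "query",
            "prop": "revisions",
            "rvprop": "content",
            "titles": title,
            "format": "json",
        })
        req = urllib.request.Request(API + "?" + params,
                                     headers={"User-Agent": "coord-sketch/0.1"})
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        page = next(iter(data["query"]["pages"].values()))
        wikitext = page["revisions"][0]["*"]
        m = re.search(r"\{\{[Cc]oord\|\s*([-\d.]+)\s*\|\s*([-\d.]+)", wikitext)
        return (float(m.group(1)), float(m.group(2))) if m else None

    print(page_coordinates("Eiffel Tower"))  # placeholder title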

Re: [Wikitech-l] Wikipedia database

2010-11-25 Thread Q
On 11/25/2010 2:14 AM, Petromir Dzhunev wrote: Hi everyone, would you consider putting coordinates into the page table for each page (for the pages that have coordinates, of course)? Is it possible? The reason I'm asking is that we want to know which Wikipedia pages are marked on Google

Re: [Wikitech-l] Commons ZIP file upload for admins

2010-11-25 Thread David Gerard
On 25 November 2010 07:58, Bryan Tong Minh bryan.tongm...@gmail.com wrote: I think you are taking the wrong approach here, although I agree with MZMcBride's reply to your mail: From a social and technical perspective, this proposal is horribly hackish. [...] Given the current parameters, this

[Wikitech-l] mwdumper results and performance

2010-11-25 Thread Billy Chan
Hi everybody, I used mwdumper to import the latest current XML dump, enwiki-20101011-pages-meta-current.xml.bz2, into my MediaWiki. Everything seemed fine; however, I found only 6,669,091 pages in the database, while mwdumper stopped working and exited at the number 21,894,705. I am not sure if I
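
One way to pin down where the discrepancy comes from, sketched here as a rough Python one-off rather than anything mwdumper itself provides: stream the dump, count the <page> elements directly, and compare that total with SELECT COUNT(*) FROM page in the imported database. The file name is taken from the message above.

    # Count <page> elements in the dump; compare with SELECT COUNT(*) FROM page.
    import bz2
    import xml.etree.ElementTree as ET

    DUMP = "enwiki-20101011-pages-meta-current.xml.bz2"  # file name from the message

    count = 0
    with bz2.open(DUMP, "rb") as stream:
        for _event, elem in ET.iterparse(stream, events=("end",)):
            if elem.tag.endswith("}page"):   # ignore the export schema namespace
                count += 1
                elem.clear()                 # keep memory flat on a huge dump
    print("pages in dump:", count)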

Re: [Wikitech-l] Commons ZIP file upload for admins

2010-11-25 Thread Platonides
Erik Moeller wrote: [Kicking this thread back to life, full-quoting below only for quick reference.] I've collected some additional notes on this here: http://commons.wikimedia.org/wiki/Commons:Restricted_uploads Would appreciate feedback; will circulate it further in the Commons community.

[Wikitech-l] alternative way to get wikipedia dump while server is down

2010-11-25 Thread Oliver Schmidt
Hello all, is there any alternative way to get hold of a Wikipedia dump, preferably the last complete one, which was supposed to be found at this address: http://download.wikimedia.org/enwiki/20100130/ ? I would need that dump ASAP for my research. Thank you for any help! Best regards

Re: [Wikitech-l] Commons ZIP file upload for admins

2010-11-25 Thread bawolff
Message: 5 Date: Wed, 24 Nov 2010 15:46:24 -0800 From: Erik Moeller e...@wikimedia.org Subject: Re: [Wikitech-l] Commons ZIP file upload for admins To: Wikimedia developers wikitech-l@lists.wikimedia.org Message-ID: aanlktimd7kxngs4azgpanr_84ok_th9t1dsanc7st...@mail.gmail.com

Re: [Wikitech-l] [Mediawiki-api] Issue with loading specific page

2010-11-25 Thread Bryan Tong Minh
I am forwarding your request to wikitech-l, in the hope that there are more people there who can comment on this issue. For those who did not follow the entire thread: the user does not send an Accept-Encoding: gzip header, but nevertheless gets a gzipped response. On Thu, Nov 25, 2010 at
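
For anyone hitting the same symptom before the server side is sorted out, a small client-side workaround sketch (an assumption about a workable mitigation, not a statement about where the bug lies): check the Content-Encoding of the response and decompress it if the API returned gzip despite no Accept-Encoding header being sent. The URL and query are placeholders.

    # Defensive client: handle a gzipped body even when none was requested.
    import gzip
    import json
    import urllib.request

    url = ("https://en.wikipedia.org/w/api.php"
           "?action=query&titles=Main%20Page&format=json")  # placeholder query
    req = urllib.request.Request(url, headers={"User-Agent": "gzip-check-sketch/0.1"})

    with urllib.request.urlopen(req) as resp:
        body = resp.read()
        if resp.headers.get("Content-Encoding", "").lower() == "gzip":
            body = gzip.decompress(body)     # server compressed it anyway

    data = json.loads(body)
    print(list(data["query"]["pages"].keys()))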