Ariel,

Thank you for giving some insight into what has been going on behind
the scenes. I have a few questions that will hopefully get some
answers for those of us eager to help out in any way we can.

What are the planned code changes to speed the process up? Can we help
this volunteer with
Thanks for the update, Russell!
On Feb 23, 2009, at 10:04 AM, Russell Blau wrote:
> "Russell Blau" wrote in message
> news:gnuacf$hf...@ger.gmane.org...
>>
>> I have to second this. I tried to report this outage several times
>> last week - on IRC, on this mailing list, and on Bugzilla. Al
seems a lot of parties still do so. After looking at robots.txt,
I have to assume that is how Google et al. are able to keep up to date.
Are there private data feeds? I noticed a wg_enwiki dump listed.
Christian
On Jan 28, 2009, at 10:47 AM, Christian Storm wrote:
> That would be great.
That would be great. I second this notion wholeheartedly.
On Jan 28, 2009, at 7:34 AM, Russell Blau wrote:
> "Brion Vibber" wrote in message
> news:497f9c35.9050...@wikimedia.org...
>> On 1/27/09 2:55 PM, Robert Rohde wrote:
>>> On Tue, Jan 27, 2009 at 2:42 PM, Brion Vibber
>>> wrote:
>> On 1/4/09 6:20 AM, yegg at alum.mit.edu wrote:
>> The current enwiki database dump
>> (http://download.wikimedia.org/enwiki/20081008/) has been
>> crawling along since 10/15/2008.
> The current dump system is not sustainable on very large wikis and
> is being replaced. You'll hear about it