Re: [Wikitech-l] Dump throughput

2009-05-14 Thread Alex
Brion Vibber wrote: I believe that refers to yesterday's replication lag on the machine running watchlist queries; the abstract dump process that was hitting that particular server was aborted yesterday. Is Yahoo still using those? Looking at the last successful one for enwiki, it looks ...

Re: [Wikitech-l] Dump throughput

2009-05-13 Thread Anthony
On Wed, May 13, 2009 at 10:13 AM, Daniel Kinzler dan...@brightbyte.de wrote: Now, to be even more useful, database dumps should be produced on *regular* intervals. That way, we can compare various measures such as article growth, link counts or usage of certain words, without having to ...
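
To make the comparison problem concrete: with dumps landing at irregular times, any per-month measure has to be interpolated to a common date before wikis or months can be compared. A minimal sketch of that step, assuming linear growth between two dumps (the function name and the article counts here are hypothetical, not taken from any dump tool):

    from datetime import datetime

    def at_date(t0, v0, t1, v1, target):
        # Linearly estimate a metric at `target` from values v0 and v1
        # measured at dump times t0 and t1.
        frac = (target - t0).total_seconds() / (t1 - t0).total_seconds()
        return v0 + frac * (v1 - v0)

    # Two irregularly timed dumps, hypothetical article counts,
    # normalized to a fixed reference date:
    count = at_date(datetime(2009, 3, 17), 2830000,
                    datetime(2009, 5, 2), 2880000,
                    datetime(2009, 5, 1))

With dumps produced at regular intervals, this normalization step disappears and the counts line up directly.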

Re: [Wikitech-l] Dump throughput

2009-05-13 Thread Daniel Kinzler
Brion Vibber wrote: On a related note: I noticed that the meta-info dumps like stub-meta-history.xml.gz etc. appear to be generated from the full history dump - and thus fail if the full history dump fails, and get delayed if the full history dump gets delayed. Quite the opposite; ...
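
For anyone poking at the relationship between the two files: the stub dumps carry the same page/revision skeleton as the full-history dump, minus the revision text, so they can be streamed cheaply. A minimal sketch of counting pages and revisions in one (assuming a typical gzipped stub file name; the export XML namespace varies by dump version, so this matches on the local tag name only):

    import gzip
    import xml.etree.ElementTree as ET

    def count_stub(path):
        # Stream the stub dump; iterparse yields each element as it closes,
        # so the whole file never has to fit in memory.
        pages = revisions = 0
        with gzip.open(path, 'rb') as f:
            for _, elem in ET.iterparse(f):
                tag = elem.tag.rsplit('}', 1)[-1]  # drop the XML namespace
                if tag == 'revision':
                    revisions += 1
                elif tag == 'page':
                    pages += 1
                    elem.clear()  # free the finished subtree
        return pages, revisions

    print(count_stub('enwiki-latest-stub-meta-history.xml.gz'))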

Re: [Wikitech-l] Dump throughput

2009-05-12 Thread Aryeh Gregor
On Mon, May 11, 2009 at 3:27 PM, Brian brian.min...@colorado.edu wrote: In my opinion, fragmentation of conversations onto ever more mailing lists discourages contribution. I have to agree that I don't think the dump discussion traffic seemed large enough to warrant a whole new mailing list.

Re: [Wikitech-l] Dump throughput

2009-05-12 Thread Tomasz Finc
Aryeh Gregor wrote: On Mon, May 11, 2009 at 3:27 PM, Brian brian.min...@colorado.edu wrote: In my opinion, fragmentation of conversations onto ever more mailing lists discourages contribution. I have to agree that I don't think the dump discussion traffic seemed large enough to warrant a ...

Re: [Wikitech-l] Dump throughput

2009-05-11 Thread Brian
In my opinion, fragmentation of conversations onto ever more mailing lists discourages contribution. On Mon, May 11, 2009 at 1:04 PM, Tomasz Finc tf...@wikimedia.org wrote: Andreas Meier wrote: Tomasz Finc wrote: Tomasz Finc wrote: Russell Blau wrote: Erik Zachte ...

Re: [Wikitech-l] Dump throughput

2009-05-11 Thread Tomasz Finc
Andreas Meier wrote: Tomasz Finc wrote: Tomasz Finc wrote: Russell Blau wrote: Erik Zachte erikzac...@infodisiac.com wrote in message news:002d01c9cd8d$3355beb0$9a013c...@com... Tomasz, the amount of dump power that you managed to activate is impressive. 136 dumps yesterday, today ...

Re: [Wikitech-l] Dump throughput

2009-05-09 Thread Andreas Meier
Tomasz Finc wrote: Tomasz Finc wrote: Russell Blau wrote: Erik Zachte erikzac...@infodisiac.com wrote in message news:002d01c9cd8d$3355beb0$9a013c...@com... Tomasz, the amount of dump power that you managed to activate is impressive. 136 dumps yesterday, today already 110 :-) Out of 760 ...

Re: [Wikitech-l] Dump throughput

2009-05-08 Thread Lars Aronsson
Tomasz Finc wrote: Commons finished just fine, along with every single one of the other small and mid-size wikis waiting to be picked up. Now we're just left with the big wikis to finish. The new dump processes started on May 1 and sped up to twelve processes on May 4. As of yesterday ...

[Wikitech-l] Dump throughput

2009-05-08 Thread Erik Zachte
Lars wrote: Now, to be even more useful, database dumps should be produced on *regular* intervals. That way, we can compare various measures such as article growth, link counts or usage of certain words, without having to introduce the exact dump time in the count. That would complicate ...

Re: [Wikitech-l] Dump throughput

2009-05-07 Thread Tomasz Finc
Tomasz Finc wrote: Russell Blau wrote: Erik Zachte erikzac...@infodisiac.com wrote in message news:002d01c9cd8d$3355beb0$9a013c...@com... Tomasz, the amount of dump power that you managed to activate is impressive. 136 dumps yesterday, today already 110 :-) Out of 760 total. Of course ...

Re: [Wikitech-l] Dump throughput

2009-05-07 Thread Russell Blau
Tomasz Finc tf...@wikimedia.org wrote in message news:4a032be3.60...@wikimedia.org... Commons finished just fine, along with every single one of the other small and mid-size wikis waiting to be picked up. Now we're just left with the big wikis to finish. This is probably a stupid ...

[Wikitech-l] Dump throughput

2009-05-05 Thread Erik Zachte
Tomasz, the amount of dump power that you managed to activate is impressive. 136 dumps yesterday, today already 110 :-) Out of 760 total. Of course there are small and large dumps, but this is very encouraging. Erik Zachte
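
For scale: 136 plus 110 makes 246 dumps over those two days, just under a third of the 760, so a full pass would take roughly six days if that rate held; in practice the handful of very large wikis would dominate the remaining time, which is presumably the point of the caveat about small and large dumps.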