On Tue, Jul 21, 2009 at 7:17 PM, Chengbin Zheng<chengbinzh...@gmail.com> wrote:
...
>
> No, I know what parsing means. Even if it takes 2 days to parse them,
> wouldn't it be faster than to actually create a static HTML dump the
> traditional way?
>
> If it is not, then what is the difficulty of making static HTML dumps? It
> can't be bandwidth, storage, or speed.
>

Wikimedia works with limited resources: manpower, hardware, etc.

Things get done when the resources are available, human and otherwise.
It's not only you; lots of people want to download Wikipedia
(sometimes in a periodic fashion).

There is a log somewhere with the daily work of the Wikipedia admins. ( - :
http://wikitech.wikimedia.org/view/Server_admin_log

Some of these entries are even quite fun, like:
02:11 b****: CPAN sux
01:47 d******: I FOUND HOW TO REVIVE APACHES
(names obscured to protect the innocent).

-- 
ℱin del ℳensaje.
