On 10/13/05, Ken D'Ambrosio <[EMAIL PROTECTED]> wrote:
> I love Wikipedia.  A lot.  Except for the fact that it's slow as dirt.  So
> I decided to run in locally, and downloaded their dump (a mere 16.7 GB),
> untarred it... and don't know where to go from here.

Running it locally might not improve the speed that much, since it may
be the application itself that crawls.  I'm running a local MediaWiki
on my site[1] and it is not especially fast.  I used eZ Publish once,
and it was so gawd-awful slow that I could not deliver it as a
'product' even on a reasonable hosting environment.   My experience
with Wikipedia is that it is slower than I would like.  But then again
the content is soooo worth the wait.

Still, installing it locally might get you some performance boost, and
at a minimum you've got a fun project to tinker with.  (Do they offer
a way to mirror the dump so that you can keep it up to date?)

You do have the MediaWiki[2] code, right?  That's the application that
runs Wikipedia.  If all you have is a database dump, then you'll need
to install MediaWiki first and then load the data into it to get
started.
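Roughly, the import goes something like this.  This is just a sketch
from memory, not a recipe: it assumes you have MySQL running, that the
dump is the pages-articles XML file (not an SQL dump), and that you've
unpacked MediaWiki under /var/www/mediawiki -- the database name,
paths, and filenames below are all placeholders you'd adjust:

```shell
# Assumptions: MySQL is running, MediaWiki is unpacked at
# /var/www/mediawiki, and you've already run its web installer
# so LocalSettings.php exists.  All names/paths are examples.

# Create an empty database for the wiki (the installer can also do this):
mysql -u root -p -e "CREATE DATABASE wikidb;"

# Import the XML dump with MediaWiki's bundled maintenance script.
# For a full Wikipedia dump this can take a very long time.
cd /var/www/mediawiki
php maintenance/importDump.php < /path/to/pages-articles.xml

# Rebuild derived tables afterwards so recent changes etc. show up:
php maintenance/rebuildrecentchanges.php
```

If what you actually downloaded is an SQL dump rather than XML, you'd
feed it straight to mysql instead of importDump.php.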

hth,  I'm offline tonight, but if you need more help at some point: IM freephile

[1] http://nbptgnus.org
[2] http://www.mediawiki.org


> I was kind of hoping
> for, at least, a README or HOWTO or wiki entry on how to, or SOMETHING...
> but I can't find diddly.
>
> They talk about a MySQL back-end, but don't I need the database they're
> using, as well as the dump of all their data?  Or...?
>
> Any pointers on this would be helpful.
>
> Thanks,
>
> -Ken
>
_______________________________________________
gnhlug-discuss mailing list
gnhlug-discuss@mail.gnhlug.org
http://mail.gnhlug.org/mailman/listinfo/gnhlug-discuss
