Dear Mohamad,
thanks for compiling this comprehensive list.
You might want to add JWPL:
http://code.google.com/p/jwpl/
and WikipediaMiner:
http://wikipedia-miner.sourceforge.net/
-Torsten
And maybe Wiktionary parser and visual interface :)
http://code.google.com/p/wikokit/
Best regards,
Andrew Krizhanovsky
On Wed, Apr 20, 2011 at 12:15 PM, Torsten Zesch
ze...@tk.informatik.tu-darmstadt.de wrote:
Dear Jodi and all!
I hope that you are fine.
Here is a wiki page listing suggestions on how to conduct research in a
way that respects Wikimedia community principles:
http://meta.wikimedia.org/wiki/Notes_on_good_practices_on_Wikipedia_research
Hoping it is useful! Have a nice day, Mayo
We wrote a bunch of Python scripts for parsing Wikipedia dumps with
different goals.
You can get them at https://github.com/phauly/wiki-network/
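For anyone new to working with Wikipedia dumps, here is a minimal, stand-alone sketch (not taken from the wiki-network repository) of the usual approach: stream-parsing the XML with the Python standard library so the multi-gigabyte files never have to fit in memory. The sample XML, the `iter_titles` helper, and the simplified tag names are illustrative assumptions; real dumps are bzip2-compressed and use a versioned MediaWiki XML namespace.

```python
# Sketch: stream-parse a Wikipedia-style XML dump, extracting page titles.
import io
import xml.etree.ElementTree as ET

# Tiny stand-in for a real dump file (illustrative only; real dumps are
# multi-GB and carry an xmlns on the <mediawiki> root element).
SAMPLE = b"""<mediawiki>
  <page><title>Alan Turing</title><ns>0</ns></page>
  <page><title>User talk:Example</title><ns>3</ns></page>
</mediawiki>"""

def iter_titles(fileobj):
    """Yield page titles without loading the whole dump into memory."""
    for event, elem in ET.iterparse(fileobj, events=("end",)):
        # endswith() also matches namespaced tags like {ns}page
        if elem.tag.endswith("page"):
            yield elem.findtext("title")
            elem.clear()  # free the element's subtree as we go

titles = list(iter_titles(io.BytesIO(SAMPLE)))
print(titles)
```

The same pattern (filtering on the `<ns>` element instead of the title) is how User Talk pages, namespace 3, would be selected when building talk-page networks.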
We also released some datasets of networks extracted from User Talk pages.
See http://www.gnuband.org/2011/04/19/wikipedia_datasets_released/
Enjoy! ;)
Not directly related to Wikipedia, but relevant to wikis in general: WikiTeam[1] and their
dumps[2] of wikis. Thanks to these dumps, you can compare your research
results about the Wikipedia community with those of other wiki communities around the world.
[1] http://code.google.com/p/wikiteam/
[2]
Hi everyone,
Thank you all for your replies, we really appreciate your cooperation. Below is
a summary of the tools and data sets recommended by Torsten, Andrew, paolo, and
emijrp. We would also like to know if there is any existing Wikipedia page that
includes such a list so we can add to