Hi,

I'm Dario Garcia-Gasulla, an AI researcher at Barcelona Tech (UPC).

I'm currently doing research on very large directed graphs, and I am
using one of your datasets for testing. Concretely, I am using the
"Wikipedia Pagelinks" dataset as available on the DBpedia website.

Unfortunately the description of the dataset is not very detailed:


      Wikipedia Pagelinks

      Dataset containing internal links between DBpedia instances.
      The dataset was created from the internal links between Wikipedia
      articles. The dataset might be useful for structural analysis, data
      mining or for ranking DBpedia instances using Page Rank or similar
      algorithms.

I wonder if you could give me more information on how the dataset was
built and what it contains.
I understand Wikipedia has about 4M articles and 31M pages in total,
while this dataset has 17M instances and 130M links (I couldn't find
the corresponding number of links for Wikipedia).
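
For reference, here is roughly how I derived those counts from the dump.
This is only a minimal Python sketch; the file name page_links_en.nt and
the simple one-triple-per-line parsing are assumptions on my part, not
something the dataset documentation specifies:

    # Minimal sketch: count distinct instances and links in the
    # pagelinks N-Triples dump. Assumes one triple per line of the
    # form "<subject> <predicate> <object> ." with IRI objects
    # (file name is hypothetical).
    nodes = set()
    links = 0
    with open("page_links_en.nt", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blank lines and comments
            parts = line.split(" ")
            if len(parts) < 4:
                continue  # not a well-formed triple line
            subj, obj = parts[0], parts[2]
            links += 1
            nodes.add(subj)
            nodes.add(obj)
    print(len(nodes), "instances,", links, "links")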

What is the relation between the two? Could someone briefly explain the
nature of the Pagelinks dataset and how it differs from Wikipedia itself?

Thank you for your time,
Dario.