Hi Hugh,
After checking "Semantic Web Crawling", the issue was solved.
Best,
Parsa
Hi Hugh,
Well, after following the steps in the tutorial, I got a couple of
thousand pages downloaded in my DAV user's rdf_sink folder, but still
nothing in the RDF Quad Store. FYI, I logged in as 'dba', chose 'dav' as
the crawling user, and tried both /dav/rdf_sink and /dba/rdf_sink. I just
Hi Parsa,
What is the URL being crawled? Have you tried using the sample URL in the
documentation at:
http://docs.openlinksw.com/virtuoso/htmlconductorbar.html#contentcrawlerrdf
This works for me, so I would expect it to work for you also.
The "dav_delete" function can be used to r
I'm using the Crawler with "Ask for RDF", "Store content locally", and "Store
metadata" all enabled. But after the crawl finishes, the data appears only under my
user's DAV path (e.g. /DAV/home/dba/rdf_sink/) and not in the Quad Store.
"select count(*) where {?s ?p ?o}" returns exactly the same number bef
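For reference, the before/after comparison above can be run from Virtuoso's isql client using its SPARQL-in-SQL syntax. This is only a sketch against a default setup; the graph IRI in the second query is a placeholder, not something from this thread:

```sql
-- Count all triples across every graph (run from isql; the leading
-- SPARQL keyword switches Virtuoso from SQL to SPARQL mode)
SPARQL SELECT COUNT(*) WHERE { ?s ?p ?o };

-- Count only the triples in the graph the crawler targets.
-- The IRI below is a hypothetical placeholder -- substitute the graph
-- configured for your crawl job.
SPARQL SELECT COUNT(*) FROM <http://example.com/crawled> WHERE { ?s ?p ?o };
```

If the second count stays at zero after a crawl, that would suggest the files reached the rdf_sink collection but were never loaded into the quad store.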