On the consistency issues -- we do have a mechanism for logging 
inconsistencies according to rules defined in a custom ruleset. However, it 
only logs the inconsistencies to the standard output (or to some predefined 
Java PrintStream); it currently doesn't prevent them from entering the 
repository. If such a mechanism would work for you, let me know (you will 
need a 3.2.x BigOWLIM for this to work).
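For reference, such a consistency check in a custom .pie ruleset looks 
roughly like a rule without a conclusion -- if the premises match, the data 
is reported as inconsistent. This is a sketch from memory; the check name 
and the ex: classes are purely illustrative:

```
Rules
{
  Consistency: person_is_not_a_department
    x <rdf:type> <ex:Person>
    x <rdf:type> <ex:Department>
}
```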

About the Lucene integration and the logging customization -- we're on it 
and we haven't forgotten you :) Both are still in progress.

As for the OWLIM-as-a-server setup -- we usually use the Sesame server as 
well. One possible way of speeding up communication with this server is to 
use Sesame's own binary tuple result format (TupleQueryResultFormat.BINARY) 
as the preferred tuple query result format of the HTTPRepository. This 
significantly reduces the overhead of serializing/parsing the SPARQL query 
results.
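For example, a minimal sketch using the Sesame 2 API (the server URL and 
repository ID are placeholders -- adjust them to your setup):

```java
import org.openrdf.query.resultio.TupleQueryResultFormat;
import org.openrdf.repository.RepositoryConnection;
import org.openrdf.repository.http.HTTPRepository;

public class BinaryResultsExample {
    public static void main(String[] args) throws Exception {
        // Placeholder server URL and repository ID.
        HTTPRepository repo = new HTTPRepository(
                "http://localhost:8080/openrdf-sesame", "myRepo");

        // Ask the server to serialize tuple query results in Sesame's
        // binary format instead of the default SPARQL/XML, which is
        // considerably cheaper to serialize and parse.
        repo.setPreferredTupleQueryResultFormat(TupleQueryResultFormat.BINARY);
        repo.initialize();

        RepositoryConnection con = repo.getConnection();
        try {
            // Queries issued through this connection now transfer their
            // results in the binary format.
        } finally {
            con.close();
            repo.shutDown();
        }
    }
}
```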
Hope this helps!


Cheers,
Ivan


On Wednesday 03 February 2010 12:49:28 Peter Kostelnik, PhD. wrote:
> hi, Ivan,
>
> all right, thanks a lot..
>
> anyway, I've got some other questions :)
>
> .. the rules defined in .pie rule sets can infer new assertions without
> any need for metadata/schema .. is there some validation mechanism that
> can be used to keep the data consistent with the schema? .. right now we
> can assert anything (whether we have the schema or not, OWLIM does not
> care)
>
> and .. some additional stuff (just pinging, as usual :) ):
> how is the integration of Lucene into BigOWLIM going?
> and, is there some way to redirect/customize the logging of BigOWLIM to
> a file or .. somewhere other than standard output/error?
>
> and .. just being curious .. do you have any idea how to make
> communication with a remote BigOWLIM faster (we are running it on a
> Sesame server)? ..
>
> thanks in advance, cheers,
>                               Peter K.
>
> > Hey Peter,
> >
> > On your questions:
> >
> > 1. Your rules should reside within a single .pie file or a single
> > built-in ruleset. If you need to extend owl-max or any other built-in
> > ruleset, just copy the respective .pie file and extend/modify it as you
> > wish.
> >
> > 2. Once loaded with some ruleset, your data will get enriched with the
> > inferred closure of the loaded dataset (according to that ruleset).
> > Changing the ruleset afterwards will only affect future inferences
> > (this is a side effect of the forward-chaining strategy used by OWLIM).
> > We are thinking about building a mechanism able to re-infer the RDF
> > closure at a later stage, but this is still being planned. Until we
> > have it implemented, changing the ruleset will require reloading the
> > repository data.
> >
> >
> > Cheers,
> > Ivan
> >
> > On Wednesday 03 February 2010 12:25:25 Peter Kostelnik, PhD. wrote:
> >> hi there,
> >>
> >> I'm experimenting with custom .pie files in BigOWLIM and, amazingly, a
> >> few questions have arisen :)
> >>
> >> so ..
> >>
> >> 1. is it possible to use more than one rule file? .. I mean, for
> >> example, to use the owl-max rules and then, additionally, some custom
> >> rules in a separate file (not keeping everything in one .pie file)
> >>
> >> 2. as there is no documentation available (or is there?), I've tried
> >> this setup: (a) loaded data into a repository initialized with a
> >> custom rule set and shut it down; (b) initialized the repository with
> >> an extended rule set without loading the data again .. in this case,
> >> the rules were compiled, but the inferences defined by the extended
> >> rules were not asserted .. so:
> >> is there a way to extend the rule set without reloading the whole
> >> repository?
> >>
> >> thanks in advance,
> >>                           Peter K.
> >>
> >> _______________________________________________
> >> OWLIM-discussion mailing list
> >> [email protected]
> >> http://ontotext.com/mailman/listinfo/owlim-discussion
>

