I know, but I'm pretty sure any Fuseki call will also have to load the
whole dataset into Pellet - that is mandatory for Pellet, as it applies
the tableau algorithm. So if your Fuseki is backed by TDB, this step is
needed anyway. But maybe I'm wrong - I just can't imagine that the
whole TDB database is kept in memory all the time, and whenever you
edit data, Pellet would have to recompute the inferences.
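For what it's worth, the dump-and-reason round trip might look roughly like this - the dataset name, file names, and exact CLI flags are assumptions, so check them against your Fuseki and Pellet/Openllet versions:

```shell
# Dump the default graph from Fuseki via the Graph Store Protocol
# ("ds" is a placeholder dataset name - adjust to your setup):
curl -H "Accept: application/rdf+xml" \
  "http://localhost:3030/ds/data?default" > data.owl

# Alternatively, dump the TDB store directly with Jena's tdbdump
# (Fuseki must not be writing to it at the same time):
tdbdump --loc=/path/to/DB > data.nq

# Check consistency with the Pellet CLI:
pellet consistency data.owl

# Ask Pellet to explain inferences/inconsistencies:
pellet explain data.owl
```

Openllet ships an equivalent CLI, so the same subcommands should apply there, modulo the launcher script name.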

But yes, Dave and Andy will know better than me and should give you a
better answer for sure.

On 11.06.20 11:05, Mikael Pesonen wrote:
>
> Pellet and Openllet are familiar tools. They just require dumping the
> data from Jena for reasoning, so it's one extra step...
>
> Thanks for the suggestion!
>
> On 11.6.2020 9:41, Lorenz Buehmann wrote:
>> I'm not aware of such a Jena built in tool.
>>
>> You could dump your DB and use the Pellet reasoner via its CLI - it
>> also supports explanations - something for which you need Pellet or
>> the OWL API anyway, no? You should also be aware that the OWL DL
>> reasoners need the whole ontology in memory, so it might be slow
>> and/or memory-intensive depending on the complexity of your schema
>> and the size of your data.
>>
>> On 10.06.20 16:40, Mikael Pesonen wrote:
>>> Hi,
>>>
>>> we are managing RDF data using the Fuseki web API. Is it possible to
>>> do OWL DL + SWRL reasoning with Jena without programming in Java? So
>>> is there a command line tool available (probably a JAR)? For
>>> reasoning we would do data/model consistency checking and entailment
>>> checking with explanations (optional).
>>>
>>> Thanks!
>>>
