Andy,

Thanks for the clear answer. I'll give DatasetGraphFactory.GraphMaker a try.
Assuming that works (I'm sure it will), a configuration option becomes lower
priority for me, but for ease of use it would still be worth raising an issue.
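
For the archive, here is roughly the shape I have in mind. It is a minimal,
untested sketch: it assumes a recent Jena in which
DatasetGraphFactory.GraphMaker receives the graph name as a Node, and the
class and field names are only illustrative.

    import org.apache.jena.graph.Graph;
    import org.apache.jena.graph.Node;
    import org.apache.jena.rdf.model.InfModel;
    import org.apache.jena.rdf.model.ModelFactory;
    import org.apache.jena.reasoner.Reasoner;
    import org.apache.jena.reasoner.ReasonerRegistry;
    import org.apache.jena.sparql.core.DatasetGraphFactory;
    import org.apache.jena.sparql.graph.GraphFactory;

    public class InferenceGraphMaker {
        // Backs every newly created named graph with an OWL Micro
        // inference graph over a fresh in-memory base graph. The graph
        // name is unused here, but real storage could key off it.
        public static final DatasetGraphFactory.GraphMaker OWL_MICRO_MAKER =
            (Node name) -> {
                Graph base = GraphFactory.createDefaultGraph();
                Reasoner reasoner = ReasonerRegistry.getOWLMicroReasoner();
                InfModel inf = ModelFactory.createInfModel(
                        reasoner, ModelFactory.createModelForGraph(base));
                return inf.getGraph(); // graph view including entailments
            };
    }

How the maker gets attached to a dataset seems to vary by Jena version
(DatasetGraphMapLink is where a GraphMaker usually lives, but constructor
visibility has changed across releases), so I'll check against the release
we're running.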

Thanks,

Ian


From: Andy Seaborne <[email protected]>
Date: Monday, August 11, 2025 at 10:33 AM
To: [email protected] <[email protected]>
Subject: Re: Configuring Fuseki for inference

Hi Ian,

Interesting usage, and I think I've heard about a similar one before.

There is also a way in code to customize the creation of graphs in a
dataset (DatasetGraphFactory.GraphMaker), but there isn't a way to
configure that from a Fuseki configuration file, unfortunately.

Sorry about that - do you want to raise an issue?

     Andy
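
In the meantime, one way to get this behaviour without a configuration file
is to build the dataset in code and serve it with embedded Fuseki
(jena-fuseki-main). A minimal, untested sketch; the graph URI is made up for
illustration, and the explicit addGraph call is the manual, per-graph
stand-in for what a GraphMaker would do automatically for graphs created
later:

    import org.apache.jena.fuseki.main.FusekiServer;
    import org.apache.jena.graph.Node;
    import org.apache.jena.graph.NodeFactory;
    import org.apache.jena.rdf.model.ModelFactory;
    import org.apache.jena.reasoner.ReasonerRegistry;
    import org.apache.jena.sparql.core.DatasetGraph;
    import org.apache.jena.sparql.core.DatasetGraphFactory;
    import org.apache.jena.sparql.graph.GraphFactory;

    public class InferenceFusekiSketch {
        public static void main(String[] args) {
            // General-purpose in-memory dataset; graphs added with
            // addGraph() are held by reference, so the inference graph
            // stays live rather than being copied triple-by-triple.
            DatasetGraph dsg = DatasetGraphFactory.createGeneral();

            // One named graph backed by OWL Micro inference
            // (the graph URI is illustrative).
            Node name = NodeFactory.createURI("http://example.org/models/model-1");
            dsg.addGraph(name,
                ModelFactory.createInfModel(
                        ReasonerRegistry.getOWLMicroReasoner(),
                        ModelFactory.createModelForGraph(
                                GraphFactory.createDefaultGraph()))
                    .getGraph());

            // Serve it over SPARQL on the usual port.
            FusekiServer.create()
                    .port(3030)
                    .add("/ds", dsg)
                    .build()
                    .start();
        }
    }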



On 07/08/2025 19:07, Emmons, Ian D wrote:
> Andy,
>
> Thanks for your reply.
>
> We use named graphs to store behavior models (pattern-of-life together with
> other related information) that lie at the heart of a large system of
> systems. Each model is stored in its own named graph, with the default graph
> used as a catalogue of the models with pointers to the named graphs.
>
> Each graph contains the ontology separately, though it is the same ontology
> across all of them.
>
> An inference graph containing all the named graphs is not an option, because
> the models have a lifecycle like this:
>
>    *   Initial development
>    *   A series of approvals
>    *   Insertion into operations
>    *   Branching of the operational model to produce a new development-phase
>        model
>    *   Editing the branched model to make necessary changes
>    *   A new approval cycle
>    *   Retirement of the original operational model and promotion of the
>        branch into operations
>    *   Rinse, lather, repeat as necessary
>
> Because two branches of the same model differ, they may contradict each
> other, so a combined inference graph would produce nonsense. In essence, the
> named graphs are serving as inference isolation boundaries.
>
> The inference level we use is Micro-OWL.
>
> Thanks,
>
> Ian
>
>
> From: Andy Seaborne <[email protected]>
> Date: Thursday, August 7, 2025 at 11:39 AM
> To: [email protected] <[email protected]>
> Subject: Re: Configuring Fuseki for inference
>
> Hi Ian,
>
> There may be ways of doing some cases, such as one inference graph
> combining all the named graphs?
>
> How do you use named graphs?
>
>     Are they for data management reasons?
>
>     Does one of them have the schema/vocabulary/ontology?
>
> What inference level are you using?
>
>       Andy
>
>
> On 05/08/2025 18:49, Emmons, Ian D wrote:
>> Fuseki Users,
>>
>> I have been working to convert a project to use Fuseki as its semantic
>> graph store, and for the most part it works well. However, I have hit a
>> roadblock.
>>
>> I have followed the instructions to configure my default graph for
>> inference, and this does work. However, it appears that I must configure
>> each graph for inference separately and, in particular, that there is no
>> way to configure Fuseki so that all graphs do inference, including graphs
>> created in the future.
>>
>> Is there a way to do this? I regard this as an important capability:
>> inference is the key feature that makes a semantic graph store unique,
>> and if it weren't for inference, there are many other databases I could
>> be using.
>>
>> Thanks,
>>
>> Ian
>>
>> ==================
>> Ian Emmons
>> RTX BBN Technologies
