Hi Ian,

On 03/09/2025 18:21, Emmons, Ian D wrote:
I’ve returned to this task, and I have two questions:

First, DatasetGraphFactory.GraphMaker builds a graph, but inference is 
configured at the model level. Can I use a DatasetGraphFactory.GraphMaker to 
create an inference-enabled graph?

Yes - create the model, and use Model.getGraph()

The inference engines work on graphs - Model/InfModel doesn't add any inference machinery.

Model.getGraph() returns the graph (an InfGraph, a subclass of Graph).

Or InfModel.getInfGraph(), which returns the same object as getGraph().
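For example, the two answers combine into a GraphMaker whose graphs are inference graphs (an untested sketch; it assumes the OWL Micro reasoner mentioned later in this thread, but any Reasoner works the same way):

---
import org.apache.jena.rdf.model.InfModel;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.reasoner.Reasoner;
import org.apache.jena.reasoner.ReasonerRegistry;
import org.apache.jena.sparql.core.DatasetGraphFactory;
import org.apache.jena.sparql.graph.GraphFactory;

// A GraphMaker whose graphs are inference graphs.
DatasetGraphFactory.GraphMaker infGraphMaker = (name) -> {
    Reasoner reasoner = ReasonerRegistry.getOWLMicroReasoner();
    Model base = ModelFactory.createModelForGraph(GraphFactory.createTxnGraph());
    InfModel infModel = ModelFactory.createInfModel(reasoner, base);
    return infModel.getGraph();   // an InfGraph, a subclass of Graph
};
---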

Second, assuming I have a DatasetGraphFactory.GraphMaker working, what do I do 
with it? I don’t see any methods on Dataset, DatasetGraph, DataService, 
FusekiService, or any of the associated builders or factories that allows me to 
register a GraphMaker for use by the system.

The constructor to DatasetGraphMapLink isn't visible (that can be done in the next release).

It is possible to (mis)use DatasetGraphMapLink.cloneStructure:

---
import org.apache.jena.graph.Node;
import org.apache.jena.graph.NodeFactory;
import org.apache.jena.sparql.core.DatasetGraph;
import org.apache.jena.sparql.core.DatasetGraphFactory;
import org.apache.jena.sparql.core.DatasetGraphMapLink;
import org.apache.jena.sparql.graph.GraphFactory;
import org.apache.jena.sparql.util.NodeConst;

// Called whenever the dataset needs a new graph.
DatasetGraphFactory.GraphMaker graphMaker = (name) -> {
    System.out.println("Name: " + name);
    return GraphFactory.createTxnGraph();
};

// Create a dataset with the default graph.
DatasetGraph dsg0 = DatasetGraphFactory.wrap(graphMaker.create(null));

// Cause a DatasetGraphMapLink to be set up with the GraphMaker.
DatasetGraph dsg = DatasetGraphMapLink.cloneStructure(dsg0, graphMaker);

Node x = NodeFactory.createURI("http://example/");
dsg.add(x, x, x, x);
dsg.add(x, x, x, NodeConst.emptyString);
---
The first quad added causes a GraphMaker call.
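Such a dataset can then be served in code (a sketch using embedded Fuseki from fuseki-main; the port and the service name /ds are illustrative):

---
import org.apache.jena.fuseki.main.FusekiServer;

// Publish the GraphMaker-backed dataset via embedded Fuseki.
FusekiServer server = FusekiServer.create()
        .port(3330)
        .add("/ds", dsg)
        .build();
server.start();
---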

    Andy


Thanks in advance,

Ian


From: Emmons, Ian D <[email protected]>
Date: Monday, August 11, 2025 at 6:15 PM
To: [email protected] <[email protected]>
Subject: Re: Configuring Fuseki for inference

Andy,

Thanks for the clear answer — I’ll give the DatasetGraphFactory.GraphMaker a try. Assuming that works (I’m sure it will), having a configuration option becomes lower priority for me, but for ease of use it would be good to raise an issue.

Thanks,

Ian

From: Andy Seaborne <[email protected]>
Date: Monday, August 11, 2025 at 10:33 AM
To: [email protected] <[email protected]>
Subject: Re: Configuring Fuseki for inference

Hi Ian,

Interesting usage and I think I've heard about a similar one before.

There is also a code way to customize the creation of graphs in a dataset (DatasetGraphFactory.GraphMaker) but there isn't a way to configure that from a Fuseki configuration file, unfortunately.

Sorry about that - do you want to raise an issue?

      Andy

On 07/08/2025 19:07, Emmons, Ian D wrote:

Andy,

Thanks for your reply.

We use named graphs to store behavior models (pattern-of-life together with other related information) that lie at the heart of a large system of systems. Each model is stored in its own named graph, with the default graph used as a catalogue of the models with pointers to the named graphs.

Each graph contains the ontology separately, though it is the same ontology across all of them.

An inference graph containing all the named graphs is not an option, because the models have a lifecycle like this:

    *   Initial development
    *   A series of approvals
    *   Insertion into operations
    *   Branching of the operational model to produce a new development-phase model
    *   Editing the branched model to make necessary changes
    *   A new approval cycle
    *   Retirement of the original operational model and promotion of the branch into operations
    *   Rinse, lather, repeat as necessary

Because two branches of the same model differ, they may contradict each other, so a combined inference graph would produce nonsense. In essence, the named graphs are serving as inference isolation boundaries.

The inference level we use is Micro-OWL.

Thanks,

Ian

From: Andy Seaborne <[email protected]>
Date: Thursday, August 7, 2025 at 11:39 AM
To: [email protected] <[email protected]>
Subject: Re: Configuring Fuseki for inference

Hi Ian,

There may be ways of doing some cases such as one inference graph combining all the named graphs?

How do you use named graphs?
     Are they for data management reasons?
     Does one of them have the schema/vocabulary/ontology?

What inference level are you using?

       Andy

On 05/08/2025 18:49, Emmons, Ian D wrote:

Fuseki Users,

I have been working to convert a project to use Fuseki as its semantic graph store, and for the most part it works well. However, I have hit a roadblock.

I have followed the instructions to configure my default graph for inference, and this does work. However, it appears that I must configure each graph for inference separately, and, in particular, that there is no way to configure Fuseki so that all graphs do inference, including graphs created in the future.

Is there a way to do this? I regard this as an important capability because that’s the key feature of a semantic graph that makes it unique — if it weren’t for inference, there are so many other databases I could be using.

Thanks,

Ian

==================
Ian Emmons
RTX BBN Technologies