Hello Andy and Jena,
Thanks for your kind reply.
OK, I will try it first.
Another question: are there any small examples that show how to
implement StorageRDF, or any other interface for external storage?
I also want to try to store and query the data in an RDBMS backend or a
KV-based storage.
If there are any examples I can follow, I can take baby steps and try to
make it work.
TDB/TDB2 may be a good example, but it looks a little hard for me.
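To make the question concrete, here is the rough shape I have in mind for
the triple-table part. It is only an in-memory stand-in for the external
KV store, with made-up names, and it does not follow the real StorageRDF
method signatures:

    import java.util.Set;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.stream.Stream;

    import org.apache.jena.graph.Node;
    import org.apache.jena.graph.Triple;

    /** In-memory stand-in for a KV-backed triple table (placeholder, not StorageRDF). */
    public class KvTripleTable {
        private final Set<Triple> triples = ConcurrentHashMap.newKeySet();

        public void add(Triple t)    { triples.add(t); }
        public void delete(Triple t) { triples.remove(t); }

        /** Pattern match: any of s/p/o may be null or Node.ANY (wildcard). */
        public Stream<Triple> find(Node s, Node p, Node o) {
            return triples.stream()
                          .filter(t -> matches(s, t.getSubject())
                                    && matches(p, t.getPredicate())
                                    && matches(o, t.getObject()));
        }

        private static boolean matches(Node pattern, Node value) {
            return pattern == null || pattern == Node.ANY || pattern.equals(value);
        }
    }

What I am missing is how to plug something like this into StorageRDF and
DatasetGraphStorage properly, so any pointer or small example would help.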
> On Jan 13, 2022, at 2:58 AM, Andy Seaborne <[email protected]> wrote:
>
> It's hard to understand cold, but this looks odd to me:
>
>
> In App.java:
>
> DgraphDB db = new DgraphDB(dsg, new DatasetPrefixesDgraphDB());
> Dataset dataset = DatasetFactory.wrap(db);
> Model myMod = dataset.getDefaultModel();
> ...
> Dataset ds = DatasetFactory.create(myMod);
> ds.asDatasetGraph().setDefaultGraph(infgraph);
> ---------
>
> If using StorageRDF, then I'd expect the dataset to be built with DatasetGraphStorage
>
> (see TDB2)
>
>
> setDefaultGraph, slightly contrary to its name, performs a copy. It is a bit
> of a legacy hangover.
>
> I'd expect:
> StorageDgraphDB Dg_dsg =
>     new StorageDgraphDB(txnSystem, tripleTable, quadTable);
> DatasetGraph dsg1 =
>     new DatasetGraphStorage(Dg_dsg,
>                             new DatasetPrefixesDgraphDB(),
>                             txnSystem);
>
> And inference:
> Graph g = dsg1.getDefaultGraph();
> InfGraph infgraph = reasoner.bind(g);
> infgraph.setDerivationLogging(true);
>
> create a layer:
>
> DatasetGraph dsg2 = DatasetGraphFactory.wrap(infgraph);
>
> FusekiServer server = FusekiServer.create()
>     .add("/ds", dsg2)
>     .build();
> server.start();
>
> (untested)
>
> Andy
>
>
> On 12/01/2022 14:39, brain wrote:
>> Hi Andy and Jena,
>> So glad to see you here.
>> I just uploaded my code to GitHub:
>> https://github.com/analyticservicedev/dgraph-jena
>> It’s short and a little dirty.
>> I have `class DgraphTripleTable implements TripleStore`,
>> and do CRUD with add(), delete(), find() or findXxx() methods.
>> Then I use DgraphTripleTable in `class StorageDgraphDB implements
>> StorageRDF`.
>> And then I wrap the StorageDgraphDB with ds = DatasetFactory.wrap(db);
>> Finally, I have FusekiServer.create().add("/ds", ds).port(6384).build().start()
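>> Putting those steps together, the wiring is roughly:
>>     DgraphDB db = new DgraphDB(dsg, new DatasetPrefixesDgraphDB());
>>     Dataset ds = DatasetFactory.wrap(db);
>>     FusekiServer.create().add("/ds", ds).port(6384).build().start();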
>> I have a test file in src/test/java/com/jena/app/Inf.java
>> There are three methods named main, tdbmain and memoryMain,
>> for the different storage backends: Dgraph, TDB, Memory.
>> Could you check my code and give me some advice to help me make it run?
>> Thank you very much.
>>> On Jan 12, 2022, at 9:45 PM, Andy Seaborne <[email protected]> wrote:
>>>
>>>
>>> On 12/01/2022 09:10, brain wrote:
>>>> Hello,
>>>> I need some help.
>>>> With this guide,
>>>> https://jena.apache.org/documentation/fuseki2/fuseki-embedded.html,
>>>> I created an embedded Fuseki server to provide a SPARQL service.
>>>> I made an implementation of the `org.apache.jena.dboe.storage.StorageRDF`
>>>> interface in Java, so I can store RDF triples with my own storage
>>>> engine (a distributed database). And it works: I can query the RDF with
>>>> SPARQL.
>>>> However, I have some problems.
>>>> When I try to change my model to an `InfGraph`, the reasoner doesn't
>>>> work. There must be some bug in my code, but I can't find it.
>>>> Is there any guide or anything else to help me fix the bug?
>>>> Our data is stored in a distributed database. We want to do SPARQL query
>>>> and inference.
>>>> Thanks
>>>
>>> Hi there,
>>>
>>> Could you give some details of your setup?
>>>
>>> + How do you query with RDFS?
>>>
>>> + What level of inferencing are you setting for the InfGraph?
>>>
>>> + Are you using an assembler or setting up the InfGraph with code?
>>>
>>> If it is RDFS you are wanting, there's a different approach that might work
>>> better for you:
>>>
>>> https://jena.apache.org/documentation/rdfs/
>>>
>>> This is fixed-schema, data-centric (so it is not full RDFS reasoning -
>>> there are no axiomatic triples, and it assumes that vocabulary like
>>> subproperty or subclass isn't being subproperty'ed.)
>>>
>>> But it keeps no in-memory state from the data itself, so it scales and you
>>> can directly update the data and see new inferred triples.
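>>>
>>> In outline it is something like this (untested, and from memory of that
>>> page, so check the exact API there):
>>>
>>>     // The base data - in-memory here; it would be your StorageRDF-backed dataset.
>>>     DatasetGraph data = DatasetGraphFactory.createTxnMem();
>>>     // The fixed RDFS vocabulary (subClassOf / subPropertyOf / domain / range).
>>>     Graph vocab = RDFDataMgr.loadGraph("vocabulary.ttl");
>>>     // Queries against this dataset see the RDFS-inferred triples as well.
>>>     DatasetGraph dsgRDFS = RDFSFactory.datasetRDFS(data, vocab);
>>>
>>>     FusekiServer.create().add("/ds", dsgRDFS).build().start();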
>>>
>>> Andy
>>>
>