Thanks Dave!!
Please just tell me whether what I have understood from your reply is correct or not.

From the solution you provided, you mean to:
First, create a model by reading it from a file with inferencing enabled.
Second, store the inferred model in the dataset while creating the
dataset.

Finally, read the model back from the dataset in OWL_MEM mode, which
results in an OntModel containing the inferences, but without inferencing
enabled.
Right?

If this is so, then yes, this can be useful to me. But I guess you missed a
point in my question: I want to query the dataset directly (e.g.
QueryExecution qexec = QueryExecutionFactory.create(query, dataset); )
and not the OntModel read from the dataset.
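
To make sure we are talking about the same thing, here is a rough sketch of
the whole pipeline as I understand it (the paths and the example query are
the ones from my earlier code; the ResultSetFormatter call at the end is
just for illustration):

    OntModel om = ModelFactory.createOntologyModel(
            OntModelSpec.OWL_MEM_MICRO_RULE_INF );
    FileManager.get().readModel( om, "./OWLs/MyOWLfile.owl" );

    Dataset dataset = TDBFactory.createDataset("./MyDatabases/OntologyTDB") ;
    dataset.begin( ReadWrite.WRITE );
    try {
        // the inferred closure is materialised as plain triples
        dataset.getDefaultModel().add( om );
        dataset.commit();
    } finally {
        dataset.end();
    }

    // query the dataset directly -- the deductions are now ordinary
    // stored triples, so no inference model should be needed
    String q = "SELECT ?s WHERE { ?s "
            + "<http://www.w3.org/2000/01/rdf-schema#subClassOf> "
            + "<http://www.owl-ontologies.com/unnamed.owl#ABCD> }";
    QueryExecution qexec =
            QueryExecutionFactory.create( QueryFactory.create(q), dataset );
    ResultSetFormatter.out( qexec.execSelect() );

If that is right, the direct query at the end should also return the
inferred subclasses, since they were materialised into the store.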

Thanks once again!!


On Thu, Oct 17, 2013 at 1:18 PM, Dave Reynolds <[email protected]> wrote:

> For largely static data, the best way to mix inference and persistent
> storage in Jena is to load your data into memory, do the inference, and
> store the results to the TDB store. Then when you want to use it, open the
> TDB store as a plain OntModel (or Model) with no inference.
>
> A brute force version of this would be something like:
>
>         OntModel om = ModelFactory.createOntologyModel(
>                 OntModelSpec.OWL_MEM_MICRO_RULE_INF );
>         FileManager.get().readModel( om, "myfile" );
>
>         Dataset dataset = TDBFactory.createDataset("mydir") ;
>         dataset.begin( ReadWrite.WRITE );
>         try {
>             dataset.getDefaultModel().add( om );
>             dataset.commit();  // commit, otherwise end() aborts the write
>         } finally {
>             dataset.end();
>         }
>
> Then when a future program wants to access the data:
>
>     OntModel om = ModelFactory.createOntologyModel(
>          OntModelSpec.OWL_MEM, tdbdataset.getDefaultModel() );
>
> In some applications you only need to support a limited range of query
> patterns, in which case you can replace the add(om) stage by a more
> selective add of just the results you are going to be interested in.
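>
> For example, to keep just the class hierarchy, something like this (a
> sketch; RDFS.subClassOf stands in for whichever properties your queries
> actually touch):
>
>         dataset.begin( ReadWrite.WRITE );
>         try {
>             Model target = dataset.getDefaultModel();
>             // copy only the asserted + inferred subClassOf statements
>             target.add( om.listStatements(null, RDFS.subClassOf,
>                                           (RDFNode) null) );
>             dataset.commit();
>         } finally {
>             dataset.end();
>         }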
>
> Dave
>
>
>
> On 17/10/13 08:24, Dibyanshu Jaiswal wrote:
>
>> Hi!
>> Yes, you are right! It's true that the inferencing is done by the OntModel
>> and not by the database. I tried to set the default model of the dataset
>> using dataset.setDefaultModel(om), where om is an OntModel with
>> OntModelSpec.OWL_MEM_RULE_INF, but in that case the program throws an
>> error.
>>
>> Can you please elaborate a bit on the solution you mention, writing the
>> whole OntModel to a graph? The TDB store I use is initialized (created)
>> from an OWL file read locally, which is the conceptual model for our
>> application. I don't want to modify that file, but at the same time I
>> want to use inferencing.
>>
>>
>>
>>
>> On Fri, Oct 11, 2013 at 6:22 PM, Andy Seaborne <[email protected]> wrote:
>>
>>> Hi there,
>>>
>>> If you query the dataset directly, do you see any triples?  You will see
>>> different results if you query via the inference model.  The inference is
>>> not done in the database but in the inference engine associated with the
>>> OntModel.  The database does not contain the deductions.
>>>
>>> You can store the inferred results to a database by writing the whole
>>> OntModel to a graph in the database.
>>>
>>>          Andy
>>>
>>>
>>> On 10/10/13 13:21, Dibyanshu Jaiswal wrote:
>>>
>>>> Here is sample code for the problem stated above:
>>>>
>>>> import com.hp.hpl.jena.ontology.OntModel;
>>>> import com.hp.hpl.jena.ontology.OntModelSpec;
>>>> import com.hp.hpl.jena.query.*;
>>>> import com.hp.hpl.jena.rdf.model.Model;
>>>> import com.hp.hpl.jena.rdf.model.ModelFactory;
>>>> import com.hp.hpl.jena.tdb.TDBFactory;
>>>>
>>>> public class ReadTDB {
>>>>
>>>>     public static void main(String[] args) {
>>>>
>>>>         // open the TDB dataset (created beforehand)
>>>>         String directory = "./MyDatabases/OntologyTDB";
>>>>         Dataset dataset = TDBFactory.createDataset(directory);
>>>>
>>>>         // read the default Model from the TDB store
>>>>         Model tdb = dataset.getDefaultModel();
>>>>
>>>>         // wrap the Model in an OntModel (no inference)
>>>>         OntModel m = ModelFactory.createOntologyModel(
>>>>                 OntModelSpec.OWL_MEM, tdb );
>>>>
>>>>         String sparqlQueryString = "SELECT ?s WHERE { ?s "
>>>>                 + "<http://www.w3.org/2000/01/rdf-schema#subClassOf> "
>>>>                 + "<http://www.owl-ontologies.com/unnamed.owl#ABCD> }";
>>>>
>>>>         Query query = QueryFactory.create(sparqlQueryString);
>>>>         QueryExecution qexec =
>>>>                 QueryExecutionFactory.create(query, m);  // LINE OF CONSIDERATION
>>>>
>>>>         ResultSet results = qexec.execSelect();
>>>>         ResultSetFormatter.out(results);
>>>>
>>>>         tdb.close();
>>>>         dataset.close();
>>>>     }
>>>> }
>>>>
>>>> As per the above code snippet, at the line marked "LINE OF
>>>> CONSIDERATION": when I pass the OntModel m as the parameter, the results
>>>> are in accordance with the inference mechanisms (such as transitive
>>>> relations), but if I change the parameter to the dataset, i.e.
>>>>
>>>>     QueryExecution qexec = QueryExecutionFactory.create(query, dataset);
>>>>
>>>> and then execute the query, the results are not the same.
>>>> From my observation, a query made directly against the Dataset/TDB store
>>>> does not get the inference results provided by the OntModel, even when
>>>> the TDB store is created as follows:
>>>>
>>>> public static OntModel createTDBFromOWL() {
>>>>
>>>>     Dataset dataset =
>>>>             TDBFactory.createDataset("./MyDatabases/OntologyTDB");
>>>>     Model m = dataset.getDefaultModel();
>>>>     OntModel om = ModelFactory.createOntologyModel(
>>>>             OntModelSpec.OWL_MEM_RULE_INF, m );
>>>>     FileManager.get().readModel( om, "./OWLs/MyOWLfile.owl" );
>>>>     return om;
>>>> }
>>>>
>>>> Is there some way to create a Dataset object which is inference-enabled,
>>>> similar to the creation of an OntModel like:
>>>>
>>>>     OntModel om = ModelFactory.createOntologyModel(
>>>>             OntModelSpec.OWL_MEM_RULE_INF, m );
>>>>
>>>> so that the dataset supports inferencing mechanisms?
>>>>
>>>>
>>>> On Thu, Oct 10, 2013 at 3:51 PM, Andy Seaborne <[email protected]> wrote:
>>>>
>>>>> On 10/10/13 10:12, Dibyanshu Jaiswal wrote:
>>>>>
>>>>>> Hi!
>>>>>>
>>>>>> I am new to semantic web technologies and have started with RDF/OWL
>>>>>> for building web applications.
>>>>>> Currently I have a requirement to access an ontology (an OntModel with
>>>>>> OntModelSpec.OWL_MEM_RULE_INF) from an OWL file. I am also able to
>>>>>> store it in a local TDB store, all done with Jena 2.11.0. Thanks for
>>>>>> the nice API and tutorial provided for the same.
>>>>>> I need to fire SPARQL queries on the model to get some useful results,
>>>>>> but once the TDB store is created and I query it, the results are not
>>>>>> as expected.
>>>>>>
>>>>>> My SPARQL query, when made directly against the TDB Dataset, does not
>>>>>> return the expected results (i.e. results including inferred triples),
>>>>>> whereas if the same query is fired on the OntModel itself (loaded from
>>>>>> the TDB store with OntModelSpec.OWL_MEM_RULE_INF), the results are as
>>>>>> expected.
>>>>>>
>>>>>>
>>>>> Generally, showing the details of what you are doing makes it easier
>>>>> for people to provide answers.  The details matter :-)
>>>>>
>>>>>
>>>>>
>>>>>> How do I solve the problem of making queries directly to the Dataset
>>>>>> and not to the OntModel with inference rules enabled?
>>>>>>
>>>>>>
>>>>> Inference is a characteristic of the model (in RDF, inference happens
>>>>> within models/graphs, not between graphs).
>>>>>
>>>>> You need to create an ont model backed by a graph from the TDB store.
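>>>>> For example (a sketch, reusing the dataset variable from your code):
>>>>>
>>>>>     OntModel om = ModelFactory.createOntologyModel(
>>>>>             OntModelSpec.OWL_MEM_RULE_INF, dataset.getDefaultModel() );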
>>>>>
>>>>>           Andy
>>>>>
>>>>>
>>>>>> Please help!!
>>>>>> Thanks in advance!!


-- 
*Dibyanshu Jaiswal*
Mb: +91 9038304989
Mb: +91 9674272265
