On Wed, 2011-02-23 at 01:32 -0800, Amir Hossein Jadidinejad wrote:
>   
> Hi,
> I'm going to create an OWL ontology from 50,000 entities. This is the
> function responsible for creating or returning a concept:
>     private OntClass getOrCreateClass(String cid) throws Exception {
>         String rui = ns + getCUI(cid);
>         OntClass concept = model.getOntClass(rui);
>         if (concept != null)
>             return concept;
>         else {
>             OntClass new_concept = model.createClass(rui);
>             // add comments
> 
>             String[] en_descs = getEnglishDescriptions(cid);
>             for (String en : en_descs)
>                 new_concept.addComment(en, "en");
> 
>             String[] fa_descs = getPersianDescriptions(cid);
>             for (String fa : fa_descs)
>                 new_concept.addComment(fa, "fa");
> 
>             // add synonyms
>             String[] syns = getSynonyms(cid).split("\\s*;\\s*");
>             for (String syn : syns) {
>                 new_concept.addProperty(hasSynonym, syn);
>             }
> 
>             // add English title
>             String en_title = getTitleEN(cid);
>             new_concept.addProperty(hasEnglishTitle, en_title);
> 
>             // add Persian title
>             String fa_title = getTitleFA(cid);
>             new_concept.addProperty(hasPersianTitle, fa_title);
> 
>             return new_concept;
>         }
>     }
> 
> Would you please give me some information about the performance of
> "model.getOntClass(rui)"?

It is quite a trivial function; the only work involved (other than
creating a few small Java objects) is a test to see whether there is a
statement in the model with the given URI as its subject.

> It seems that as more concepts are processed, the performance of the
> program declines noticeably.

You haven't shown or explained enough of your code for us to know what
is going on (in particular, how was the model created?), but here's a
guess ...

... one possibility is that your OntModel is inference based, either
deliberately or because you specified no OntModelSpec, in which case the
default is RDFS reasoning. The trouble is that if you change an
inference model then the next query after that change has to perform
some inference. If you have any (explicit or implicit) deletes going on
(there are none in that code, but you haven't shown us the calling code)
then the inference effectively starts over from scratch.

So if you are calling your getOrCreateClass in a tight loop then each
call to getOntClass will ask the inference machinery to catch up on the
batch of additions from the last time round the loop.

If that's the case then the answer is easy: use a no-inference OntModel,
at least for this part of the code.
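For example, something along these lines should avoid the inference
overhead entirely (a minimal sketch using current Apache Jena package
names rather than the older com.hp.hpl.jena ones; the namespace URI is
made up for illustration):

```java
import org.apache.jena.ontology.OntClass;
import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.rdf.model.ModelFactory;

public class NoInferenceExample {
    public static void main(String[] args) {
        // OWL_MEM is a plain in-memory model with no reasoner attached,
        // so getOntClass() is a straight lookup and is not affected by
        // a batch of pending additions from earlier loop iterations.
        OntModel model = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM);

        String uri = "http://example.org/ns#Concept1"; // hypothetical namespace
        model.createClass(uri);

        // With no inference, this is cheap even after many additions.
        OntClass c = model.getOntClass(uri);
        System.out.println(c.getURI());
    }
}
```

You could build the bulk of the ontology in the plain OWL_MEM model and,
if you need RDFS or OWL entailments later, wrap or re-read the finished
model with an inference-enabled OntModelSpec once, after the loading loop
is done.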

If that's not the problem then tell us more about how the OntModel is
being created and what else might be going on with the model between
calls to the above code.

Dave


