I got that announcement as well. Last time I used the BigOWLIM lib, it could not coexist with the SwiftOWLIM lib. I would rather wait until the new BigOWLIM comes out, or until Damyan addresses the problems I encountered with his last patch for BigOWLIM.
Damyan Ognyanoff wrote:
>
> Ontotext is happy to announce the next release of SwiftOWLIM - v2.8.4
>
> http://www.ontotext.com/owlim/
>
> Major changes in version 2.8.4, as compared to version 2.8.3:
>
> * *Custom inference:* the TRREE rule compiler became part of the
>   distribution, which allows the usage of custom rule-sets for
>   inference (see section 6.3 of the System Documentation for more
>   on it). This way one can specify the semantics which best fits the
>   concrete application in terms of expressivity and performance;
> * *Command line parameters:* some of the parameters of SwiftOWLIM
>   can now be passed through the command line. In previous versions,
>   these could only be specified as SAIL parameters in the
>   system.conf file of Sesame, or programmatically.
> * *Minor fixes in the owl-max rule-set:* these cover some extra
>   cases of A-Box reasoning and eliminate most of the cases in which
>   B-Nodes were generated.
> * *Linux shell scripts:* Linux scripts have been added to the
>   distribution, which allow for controlling (start/stop) a
>   standalone instance of SwiftOWLIM and running tests. Such scripts
>   were available only for Windows in previous versions.
>
> The next BETA version of BigOWLIM, which incorporates the same new
> features and improvements along with other fixes, will be available soon.
>
> Damyan Ognyanoff,
> Ontotext Lab.
> Sirma Group Corp.
> ------------------------------------------------------------------------
>
> _______________________________________________
> OWLIM-discussion mailing list
> [email protected]
> http://ontotext.com/mailman/listinfo/owlim-discussion_ontotext.com
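For anyone curious about the custom inference feature mentioned above: TRREE rule files declare namespace prefixes, optional axiomatic triples, and entailment rules whose premises sit above a separator line and whose conclusions sit below it. The sketch below is only my rough recollection of the format, not taken from the 2.8.4 release; the rule name, prefixes, and exact punctuation are assumptions, so check section 6.3 of the System Documentation for the authoritative syntax before using it.

```
Prefices
{
    rdf : http://www.w3.org/1999/02/22-rdf-syntax-ns#
    owl : http://www.w3.org/2002/07/owl#
}

Axioms
{
    // axiomatic triples asserted at start-up (assumed example)
    <owl:TransitiveProperty> <rdf:type> <owl:Class>
}

Rules
{
// hypothetical rule: derive transitivity for properties
// typed as owl:TransitiveProperty
Id: transitive_property
    p <rdf:type> <owl:TransitiveProperty>
    x p y
    y p z
    -------------------------------
    x p z
}
```

Lower-case tokens (x, y, z, p) act as variables, while bracketed prefixed names are concrete URIs; this is what lets one trade expressivity for performance by stripping the rule-set down to only the rules an application actually needs.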
