I'm considering Jess for a software project that reasons about large-ish data sets of small objects. Perhaps people could comment on whether they think Jess is a suitable engine.

From what I know of CLIPS, a rule-based system would be a very good way to transform the data sets, given the schema of the data and the processes I'm undertaking. Until I run the software I won't know the exact amount of data that will be produced (that's half the point of the exercise!), but I estimate it will be on the order of a hundred thousand (or more) small facts and a small number of rules. I'm not carrying out any massively complicated inference, and the combinatorics of the rules themselves aren't explosive. But might the sheer size of the data be prohibitive?

So my question is: what's the ballpark figure for the amount of data that Jess can store? Does it cache to disc? My data set will possibly fit in memory all at once, but I wouldn't want to rely on that.
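For what it's worth, here's the back-of-envelope arithmetic behind my worry. The per-fact heap cost is purely my own guess, not a figure from the Jess documentation; I'd welcome corrections:

```python
# Rough estimate of heap needed to hold the whole fact base in RAM.
# ASSUMPTION (mine, not Jess's): each small fact costs somewhere around
# 0.5-2 KB of heap once slot values, working-memory bookkeeping, and
# Rete network join records are counted.

def estimate_heap_mb(num_facts, bytes_per_fact):
    """Rough heap footprint in megabytes for num_facts working-memory facts."""
    return num_facts * bytes_per_fact / (1024 * 1024)

# 100,000 facts at a pessimistic 2 KB each:
print(round(estimate_heap_mb(100_000, 2048)))  # about 195 (MB)

# ...and at an optimistic 512 bytes each:
print(round(estimate_heap_mb(100_000, 512)))   # about 49 (MB)
```

So even the pessimistic end looks like it could fit in a modern JVM heap, but if the fact count climbs toward a million the picture changes, which is why I'm asking about caching to disc.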

Thanks!

Joe

--------------------------------------------------------------------
To unsubscribe, send the words 'unsubscribe jess-users [EMAIL PROTECTED]'
in the BODY of a message to [EMAIL PROTECTED], NOT to the list
(use your own address!) List problems? Notify [EMAIL PROTECTED]
--------------------------------------------------------------------