Oh, how silly of me, I'm sorry. I mixed up some mails! I hope you at least
got a grin out of this.
Chantal
On Monday, 21 January 2002 08:36, you wrote:
> hello Harun,
>
> if you're searching often, maybe you'd like to index all the files. Try
> out Lucene (Jakarta Project). It lets you
> search on the article body.
> For example: all the articles whose body contains the word 'Hello', or the
> phrase 'Hello Mr. President!'
>
>
> Note-1:
>
> The XML file may reside either at the operating-system level or in an
> XML-supporting database.
>
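As a toy illustration of the term and phrase queries mentioned above: a single-word query matches any token of the body, while a phrase query must match a consecutive run of tokens. This is only a sketch of the idea; Lucene answers such queries against its inverted index rather than scanning text, and the class and method names here are invented for illustration.

```java
import java.util.Arrays;
import java.util.List;

// Sketch: term search vs. phrase search over an article body.
public class BodySearchSketch {

    // Lower-cased, punctuation-stripped tokens of the body.
    static List<String> tokenize(String body) {
        return Arrays.asList(
            body.toLowerCase().replaceAll("[^a-z0-9 ]", " ").trim().split("\\s+"));
    }

    // Does the body contain the single term anywhere?
    static boolean containsTerm(String body, String term) {
        return tokenize(body).contains(term.toLowerCase());
    }

    // Does the body contain the exact phrase as consecutive tokens?
    static boolean containsPhrase(String body, String phrase) {
        List<String> tokens = tokenize(body);
        List<String> query = tokenize(phrase);
        for (int i = 0; i + query.size() <= tokens.size(); i++) {
            if (tokens.subList(i, i + query.size()).equals(query)) return true;
        }
        return false;
    }
}
```

Note that the phrase match requires the words in order and adjacent, which is what distinguishes it from matching each word independently.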
hello all,
I am still trying to find the best way to index a really large amount of data.
At the moment I am trying to index each of the 29 text files in a separate
thread, each with its own IndexWriter and its own directory where the index
is placed. there are always six threads working the same
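The pattern described here (one worker per input, each with its own private index, capped at six concurrent threads) can be sketched as follows. This is a minimal stand-in, not the poster's code: a word-count map plays the role of a Lucene IndexWriter writing to its own directory.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch: index each text on its own task, with its own index object,
// so nothing writable is shared between threads.
public class PerThreadIndexSketch {

    // Toy "index" for one text: term -> count (stand-in for an IndexWriter).
    static Map<String, Integer> indexOne(String text) {
        Map<String, Integer> counts = new HashMap<>();
        for (String w : text.toLowerCase().split("\\s+")) {
            if (!w.isEmpty()) counts.merge(w, 1, Integer::sum);
        }
        return counts;
    }

    // Run at most `threads` indexing jobs at once; one result per input.
    static List<Map<String, Integer>> indexAll(List<String> texts, int threads)
            throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        List<Future<Map<String, Integer>>> futures = new ArrayList<>();
        for (String t : texts) futures.add(pool.submit(() -> indexOne(t)));
        List<Map<String, Integer>> indexes = new ArrayList<>();
        for (Future<Map<String, Integer>> f : futures) indexes.add(f.get());
        pool.shutdown();
        return indexes;
    }
}
```

Because each task owns its index, no locking is needed; the per-file indexes could later be merged in a single step.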
hi Ian, hi Winton, hi all,
sorry, I meant a heap size of 100 MB. I'm starting Java with -Xmx100m; I'm
not setting -Xms.
From what I know now, I had a bug in my own code. Still, I don't understand
where these OutOfMemoryErrors came from. I will try to index again in one
thread without RAMDirectory
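For reference, -Xmx sets the JVM's maximum heap size and -Xms its initial heap size. A minimal launch line might look like this (the class name and the -Xms value are illustrative, not taken from the original mail):

```shell
# Max heap 100 MB as above; initial heap here is only an example value.
java -Xms32m -Xmx100m MyIndexer
```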
hi to all,
please help! I think I mixed my brain up already with this stuff...
I'm trying to index about 29 text files, where the biggest one is ~700 MB
and the smallest ~300 MB. I once managed to build the whole index, with
mergeFactor = 10 and maxMergeDocs=1. This took more than 35 hours I
dear all,
we have a linguistics project running here and we want to use Lucene for the
information retrieval. rather than just searching for specific terms, we want
to build frequency lists and detect co-occurrences of terms.
what we need is some kind of the following functionality (I will give
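The two requests above (frequency lists, term co-occurrence) can be sketched in plain Java as below. This re-scans the raw texts for clarity; a real solution would read the same numbers out of Lucene's index instead, and all names here are invented for illustration.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.TreeMap;

// Sketch: corpus-wide term frequencies and document-level co-occurrence.
public class CorpusStatsSketch {

    static List<String> tokenize(String text) {
        List<String> out = new ArrayList<>();
        for (String w : text.toLowerCase().split("[^a-z0-9]+")) {
            if (!w.isEmpty()) out.add(w);
        }
        return out;
    }

    // Frequency of every term across the whole corpus (sorted by term).
    static Map<String, Integer> frequencies(List<String> docs) {
        Map<String, Integer> freq = new TreeMap<>();
        for (String d : docs) {
            for (String w : tokenize(d)) freq.merge(w, 1, Integer::sum);
        }
        return freq;
    }

    // In how many documents do the two terms occur together?
    static int cooccurrences(List<String> docs, String a, String b) {
        int n = 0;
        for (String d : docs) {
            Set<String> terms = new HashSet<>(tokenize(d));
            if (terms.contains(a) && terms.contains(b)) n++;
        }
        return n;
    }
}
```

Here "co-occurrence" means appearing in the same document; a linguistics project would likely tighten this to a window of N tokens, which only changes the containment test.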