Hi testn,

Optimizing the index does give a performance improvement.

Now I separate the index store on a daily basis, i.e. a new index store is
created for every day (sep-08-2007, sep-09-2007, and so on), which keeps
each index store small. Could you give me an idea how to open the per-day
folders for every search?
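One way to approach that (a minimal sketch; the `DayFolders` class, the `lastDays` helper and the explicit English locale are my own illustration, not from your setup) is to compute the per-day folder names for the window you need, then open one IndexReader per folder and combine them, e.g. with Lucene's MultiReader or MultiSearcher, for a single search:

```java
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.List;
import java.util.Locale;

public class DayFolders {
    // Build the per-day index folder names for the last `days` days,
    // matching the "sep-08-2007" naming above. Each folder could then be
    // opened with IndexReader.open(...) and the readers combined into a
    // MultiReader (or the searchers into a MultiSearcher) for one search.
    public static List<String> lastDays(Calendar today, int days) {
        SimpleDateFormat fmt = new SimpleDateFormat("MMM-dd-yyyy", Locale.ENGLISH);
        List<String> folders = new ArrayList<String>();
        Calendar c = (Calendar) today.clone();
        for (int i = 0; i < days; i++) {
            folders.add(fmt.format(c.getTime()).toLowerCase());
            c.add(Calendar.DAY_OF_MONTH, -1);  // walk backwards one day
        }
        return folders;
    }

    public static void main(String[] args) {
        Calendar sep10 = Calendar.getInstance();
        sep10.set(2007, Calendar.SEPTEMBER, 10);
        // prints [sep-10-2007, sep-09-2007, sep-08-2007]
        System.out.println(lastDays(sep10, 3));
    }
}
```

Each name would be resolved against the index root directory; the combined reader is best rebuilt only when the day rolls over, not on every search, so the open cost is paid once per day rather than per query.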

The query I use here is:

9840836588 AND dateSc:[070901 TO 070910] 

07 ----> year (2007)
09 ----> month (September)
01 ----> day
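As a sanity check on that encoding: the range query only behaves correctly because zero-padded yyMMdd terms sort lexicographically in the same order as the dates. A small sketch (the `DateTerms` class and `toTerm` helper are mine, not from the application):

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Locale;

public class DateTerms {
    // Encode a date as the zero-padded yyMMdd term used in the dateSc field.
    // Range queries like dateSc:[070901 TO 070910] work only because these
    // strings compare lexicographically in date order.
    public static String toTerm(Calendar c) {
        return new SimpleDateFormat("yyMMdd", Locale.ENGLISH).format(c.getTime());
    }

    public static void main(String[] args) {
        Calendar c = Calendar.getInstance();
        c.set(2007, Calendar.SEPTEMBER, 1);
        String lo = toTerm(c);
        c.set(2007, Calendar.SEPTEMBER, 10);
        String hi = toTerm(c);
        System.out.println(lo + " .. " + hi);      // 070901 .. 070910
        System.out.println(lo.compareTo(hi) < 0);  // true: padding keeps the order
    }
}
```

If the terms were not zero-padded (e.g. "0791" vs "07910"), string order and date order would diverge and the range would silently return wrong results.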

I restrict searches to a 15-day window, so at most 15 days of records can
be searched in my application at a time, and up to 10 users search every
store concurrently. Is there any better way to improve search performance,
both to avoid the memory problem and to speed up the search?
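On the sharing side, one pattern that tends to help with both memory and speed (a sketch only: `SearcherHolder` is a hypothetical class of mine, and `String` stands in for whatever shared Lucene searcher object you use) is to keep a single searcher in an atomic reference that all 10 users read, and swap it in only when the daily window rolls over, closing the old one afterwards:

```java
import java.util.concurrent.atomic.AtomicReference;

// Minimal sketch of sharing one searcher across concurrent users and
// swapping it atomically when the index set changes. The type parameter T
// would be the real searcher type (e.g. Lucene's IndexSearcher).
public class SearcherHolder<T> {
    private final AtomicReference<T> current;

    public SearcherHolder(T initial) {
        current = new AtomicReference<T>(initial);
    }

    // All users read the same instance; no per-search open cost.
    public T get() { return current.get(); }

    // Called once per rollover: install the fresh searcher and return the
    // old one so the caller can close it (freeing its memory).
    public T swap(T fresh) { return current.getAndSet(fresh); }

    public static void main(String[] args) {
        SearcherHolder<String> h = new SearcherHolder<String>("sep-01..sep-15");
        String old = h.swap("sep-02..sep-16");  // daily rollover
        System.out.println(old + " -> " + h.get());
    }
}
```

One caveat with a real IndexSearcher: closing the old instance while an in-flight search is still using it will fail, so in practice you would delay the close until outstanding searches finish (e.g. with simple reference counting).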






testn wrote:
> 
> So did you see any improvement in performance?
> 
> Sebastin wrote:
>> 
>> It finally works. I use Lucene 2.2 in my application. Thanks, testn and
>> Mike.
>> 
>> Michael McCandless-2 wrote:
>>> 
>>> 
>>> It sounds like there may be a Lucene version mismatch?  When Luke was
>>> used it was likely based on Lucene 2.2, but it sounds like an older
>>> version of Lucene is now being used to open the index?
>>> 
>>> Mike
>>> 
>>> "testn" <[EMAIL PROTECTED]> wrote:
>>>> 
>>>> Should the file be "segments_8" and "segments.gen"? Why is it
>>>> "Segment"? The case is different.
>>>> 
>>>> 
>>>> Sebastin wrote:
>>>> > 
>>>> > java.io.IOException: File Not Found - Segments is the error message
>>>> > 
>>>> > testn wrote:
>>>> >> 
>>>> >> What is the error message? Probably Mike, Erick or Yonik can help
>>>> >> you better on this, since I'm no one in the index area.
>>>> >> 
>>>> >> Sebastin wrote:
>>>> >>> 
>>>> >>> Hi testn,
>>>> >>>              1. I optimized the large 10 GB indexes using Luke. It
>>>> >>> optimized all the content into a single CFS file and generated the
>>>> >>> segments.gen and segments_8 files, but when I search, it shows an
>>>> >>> error that the segments file is not there. Could you help me with
>>>> >>> this?
>>>> >>> 
>>>> >>> testn wrote:
>>>> >>>> 
>>>> >>>> 1. You can close the searcher once you're done. If you want to
>>>> >>>> reopen the index, you can close and reopen only the 3 updated
>>>> >>>> readers, keep the 2 old IndexReaders, and reuse them. It should
>>>> >>>> reduce the time to reopen.
>>>> >>>> 2. Make sure that you optimize it every once in a while.
>>>> >>>> 3. You might consider separating indices into separate storage and
>>>> >>>> using a ParallelReader.
>>>> >>>> 
>>>> >>>> 
>>>> >>>> 
>>>> >>>> Sebastin wrote:
>>>> >>>>> 
>>>> >>>>> The problems in my application are as follows:
>>>> >>>>>                  1. I am not able to see updated records in my
>>>> >>>>> index store, because I instantiate the IndexReader and
>>>> >>>>> IndexSearcher classes only once, on the first search; further
>>>> >>>>> searches reuse the same IndexReaders (5 directories) and
>>>> >>>>> IndexSearcher with different queries.
>>>> >>>>> 
>>>> >>>>>                 2. My search is very, very slow. The first 2
>>>> >>>>> directories, 10 GB each, hold old index records and are never
>>>> >>>>> updated; the remaining 3 directories are updated every second.
>>>> >>>>> 
>>>> >>>>>                 3. I am indexing 20 million records per day, so
>>>> >>>>> the index store keeps growing and search gets slower and slower.
>>>> >>>>> 
>>>> >>>>>                4. I am using a searcherOne class as a global
>>>> >>>>> application helper (APPLICATION scope). It holds one IndexReader
>>>> >>>>> and one IndexSearcher via get/set methods after the first search,
>>>> >>>>> and they are reused for all other searches.
>>>> >>>>> 
>>>> >>>>>               5. I am using Lucene 2.2.0 in a web application
>>>> >>>>> that indexes 15 fields per document (5 indexed, 10 stored). I am
>>>> >>>>> not using any sort in my query. A single query fetches at most
>>>> >>>>> 600 records from the index store (5 directories).
>>>> >>>>>                 
>>>> >>>>> 
>>>> >>>>> hossman wrote:
>>>> >>>>>> 
>>>> >>>>>> 
>>>> >>>>>> : I set IndexSearcher as the application Object after the first
>>>> >>>>>> : search.
>>>> >>>>>>     ...
>>>> >>>>>> : how can i reconstruct the new IndexSearcher for every hour to
>>>> >>>>>> : see the updated records.
>>>> >>>>>> 
>>>> >>>>>> i'm confused ... my understanding based on the comments you made
>>>> >>>>>> below (in an earlier message) was that you already *were*
>>>> >>>>>> constructing a new IndexSearcher once an hour -- but every time
>>>> >>>>>> you do that, your memory usage grows, and sometimes you got OOM
>>>> >>>>>> Errors.
>>>> >>>>>> 
>>>> >>>>>> if that's not what you said, then i think you need to explain,
>>>> >>>>>> in detail, in one message, exactly what your problem is.  And
>>>> >>>>>> don't assume we understand anything -- tell us *EVERYTHING*
>>>> >>>>>> (like, for example, what the word "crore" means, how
>>>> >>>>>> "searcherOne" is implemented, and the answer to the specific
>>>> >>>>>> question i asked in my last message: does your application
>>>> >>>>>> contain, anywhere in it, any code that will close anything
>>>> >>>>>> (IndexSearchers or IndexReaders)?
>>>> >>>>>> 
>>>> >>>>>> 
>>>> >>>>>> : > : I use StandardAnalyzer. The records daily range from 5
>>>> >>>>>> : > : crore to 6 crore. I am updating my index every second. I
>>>> >>>>>> : > : instantiate the IndexSearcher object one time for all the
>>>> >>>>>> : > : searches. Can I see the updated records in the index store
>>>> >>>>>> : > : for an hour by reinstantiating the IndexSearcher object?
>>>> >>>>>> : > : But the problem is, when I reinstantiate the
>>>> >>>>>> : > : IndexSearcher, my RAM usage grows. Is there any
>>>> >>>>>> 
>>>> >>>>>> 
>>>> >>>>>> : > IndexSearcher are you explicitly closing both the old
>>>> >>>>>> : > IndexSearcher as well as all 4 of those old IndexReaders and
>>>> >>>>>> : > the MultiReader?
>>>> >>>>>> 
>>>> >>>>>> 
>>>> >>>>>> 
>>>> >>>>>> 
>>>> >>>>>> -Hoss
>>>> >>>>>> 
>>>> >>>>>> 
>>>> >>>>>>
>>>> ---------------------------------------------------------------------
>>>> >>>>>> To unsubscribe, e-mail: [EMAIL PROTECTED]
>>>> >>>>>> For additional commands, e-mail:
>>>> [EMAIL PROTECTED]
>>>> >>>>>> 
>>>> >>>>>> 
>>>> >>>>>> 
>>>> >>>>> 
>>>> >>>>> 
>>>> >>>> 
>>>> >>>> 
>>>> >>> 
>>>> >>> 
>>>> >> 
>>>> >> 
>>>> > 
>>>> > 
>>>> 
>>>> -- 
>>>> View this message in context:
>>>> http://www.nabble.com/Java-Heap-Space--Out-Of-Memory-Error-tf4376803.html#a12655880
>>>> Sent from the Lucene - Java Users mailing list archive at Nabble.com.
>>>> 
>>>> 
>>>> 
>>> 
>>> 
>>> 
>>> 
>> 
>> 
> 
> 

-- 
View this message in context: 
http://www.nabble.com/Java-Heap-Space--Out-Of-Memory-Error-tf4376803.html#a12687410
Sent from the Lucene - Java Users mailing list archive at Nabble.com.


