You might consider adding documents to your index
incrementally; I'm not sure why adding to an existing
index would cause problems, since you can always add
more documents.  You can optimize later to merge
everything back into a single segment.
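
For what it's worth, an incremental add plus a later
optimize looks roughly like this (just a sketch against
the classic IndexWriter API; the index path, analyzer,
and field name are placeholders):

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;

public class AddToIndex {
    public static void main(String[] args) throws Exception {
        // 'false' opens the existing index instead of
        // creating a new one
        IndexWriter writer = new IndexWriter(
            "/path/to/index", new StandardAnalyzer(), false);

        Document doc = new Document();
        doc.add(new Field("contents", "new document text",
                Field.Store.YES, Field.Index.TOKENIZED));
        writer.addDocument(doc);

        // merge all segments back into one; doing this
        // occasionally, not per batch, is enough
        writer.optimize();
        writer.close();
    }
}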

Querying is a different story: if you are using the
Sort API, you will need enough memory to hold the sort
values for all of your documents at once.  If you sort
on a string field, or anything other than an int or
float, this can require a lot of memory.
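
To make the memory point concrete, sorted search looks
something like this (a sketch; the "date" and "title"
fields are made up).  Sorting on an int field caches
one int per document, while sorting on a string field
also caches the field's terms, which is where the
memory goes:

import org.apache.lucene.search.Hits;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.Sort;
import org.apache.lucene.search.SortField;

public class SortedSearch {
    static Hits searchByDate(IndexSearcher searcher,
            Query query) throws Exception {
        // one int per document in the field cache
        Sort byDate =
            new Sort(new SortField("date", SortField.INT));

        // a string sort would also cache the field's terms:
        // new Sort(new SortField("title", SortField.STRING));

        return searcher.search(query, byDate);
    }
}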

I've used indices much bigger than 5 million docs /
3.5 GB with less than 4 GB of RAM and had no problems.

Greg


--- Leon Chaddock <[EMAIL PROTECTED]> wrote:

> Hi,
> we are having tremendous problems building a large
> Lucene index and querying it.
> 
> The programmers are telling me that when the index
> reaches 3.5 GB or 5 million docs, it can no longer
> grow any larger.
> 
> To work around this they have built index files in
> multiple directories. Now apparently my 4 GB of
> memory is not enough to query them.
> 
> Does this seem right to people, or does anyone have
> experience with largish-scale projects?
> 
> I am completely tearing my hair out here and don't
> know what to do.
> 
> Thanks
> 

