On Fri, Oct 22, 2010 at 11:18 AM, Lance Norskog <goks...@gmail.com> wrote:
> There is an API now for dynamically loading, unloading, creating and
> deleting cores.
> Restarting a Solr instance with thousands of cores will take, I don't know, hours.
>
Is this in the trunk? Any docs available?
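For reference, core administration is exposed through the CoreAdmin
handler, and SolrJ wraps it in CoreAdminRequest. A minimal sketch of
creating and unloading a core, assuming a stock Solr on localhost:8983
and an instance directory named "core_user42" already on disk (both
hypothetical):

  import org.apache.solr.client.solrj.SolrServer;
  import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
  import org.apache.solr.client.solrj.request.CoreAdminRequest;

  public class CoreAdminSketch {
      public static void main(String[] args) throws Exception {
          // Point at the Solr root, not an individual core, so the
          // request reaches the CoreAdmin handler.
          SolrServer server =
              new CommonsHttpSolrServer("http://localhost:8983/solr");

          // Register a new core from an existing instance directory.
          CoreAdminRequest.createCore("core_user42", "core_user42", server);

          // Unload it again when it is no longer needed.
          CoreAdminRequest.unloadCore("core_user42", server);
      }
  }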
> On Thu, Oct 21, 2010 at 10:44 PM, Tharindu Mathew <mcclou...@gmail.com> wrote:
>> Hi Mike,
>>
>> I've also considered using separate cores in a multi-tenant
>> application, i.e. a separate core for each tenant/domain. But cores
>> do not suit that purpose.
>>
>> If you check the documentation, no real API support exists for doing
>> this dynamically through SolrJ. All the use cases I found only had
>> users configuring cores statically and then using them, with maybe 2
>> or 3 cores. Please correct me if I'm wrong, Solr folks.
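For what it's worth, the static setup mentioned above is just a
solr.xml that lists the cores by hand; a rough sketch of that legacy
format, with made-up core names and instance directories:

  <solr persistent="true">
    <cores adminPath="/admin/cores">
      <core name="tenant0" instanceDir="tenant0" />
      <core name="tenant1" instanceDir="tenant1" />
    </cores>
  </solr>

Every core added this way needs its own entry here plus its own conf/
and data/ directories on disk.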
>>
>> So you're better off using a single index with a user ID field, and
>> applying a query filter on that user ID when fetching data.
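A minimal SolrJ sketch of that single-index approach, assuming a
schema with a user_id field and a local Solr instance (both of which
are assumptions here):

  import org.apache.solr.client.solrj.SolrQuery;
  import org.apache.solr.client.solrj.SolrServer;
  import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
  import org.apache.solr.client.solrj.response.QueryResponse;

  public class FilterByUserSketch {
      public static void main(String[] args) throws Exception {
          SolrServer server =
              new CommonsHttpSolrServer("http://localhost:8983/solr");

          // One shared index; the tenant/user is just a filter query
          // added to every search.
          SolrQuery query = new SolrQuery("title:lucene");
          query.addFilterQuery("user_id:42");  // hypothetical field name

          QueryResponse response = server.query(query);
          System.out.println("hits: " + response.getResults().getNumFound());
      }
  }

Keeping the user restriction in a filter query rather than the main
query string also lets Solr cache it separately in its filter cache.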
>>
>> On Fri, Oct 22, 2010 at 1:12 AM, Jonathan Rochkind <rochk...@jhu.edu> wrote:
>>> No, it does not seem reasonable. Why do you think you need a separate
>>> core for every user?
>>> mike anderson wrote:
>>>>
>>>> I'm exploring the possibility of using cores as a solution to "bookmark
>>>> folders" in my solr application. This would mean I'll need tens of
>>>> thousands
>>>> of cores... does this seem reasonable? I have plenty of CPUs available for
>>>> scaling, but I wonder about the memory overhead of adding cores (aside
>>>> from
>>>> needing to fit the new index in memory).
>>>>
>>>> Thoughts?
>>>>
>>>> -mike
>>>>
>>>>
>>>
>>
>>
>>
>> --
>> Regards,
>>
>> Tharindu
>>
>
>
>
> --
> Lance Norskog
> goks...@gmail.com
>



-- 
Regards,

Tharindu
