Re: multicore vs multi collection

2013-03-28 Thread Jack Krupansky

Unable? In what way?

Did you look at the Solr "example"?

Did you look at solr.xml?

Did you see the &lt;core&gt; element? (There needs to be one per core/collection.)

Did you see the "multicore" directory in the example?

Did you look at the solr.xml file in multicore?

Did you see how there are separate directories for each collection/core in 
multicore?


Did you see how there is a &lt;core&gt; element in solr.xml in multicore, one for 
each collection directory (instance)?
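[For reference, such a solr.xml looks roughly like the following. The core names match the shipped multicore example; treat the rest as a sketch, not the exact file contents.]

```xml
<?xml version="1.0" encoding="UTF-8" ?>
<!-- Lives in the Solr home directory, e.g. multicore/solr.xml -->
<solr persistent="true">
  <!-- One <core> element per core/collection; instanceDir points at the
       directory holding that core's conf/ (solrconfig.xml, schema.xml). -->
  <cores adminPath="/admin/cores">
    <core name="core0" instanceDir="core0" />
    <core name="core1" instanceDir="core1" />
  </cores>
</solr>
```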


Did you try setting up your own test directory parallel to multicore in 
example?


Did you read the README.txt files in the Solr example directories?

Did you see the command to start Solr with a specific Solr "home" 
directory?


   java -Dsolr.solr.home=multicore -jar start.jar

Did you try that for your own test solr home directory created above?

So... what exactly was the problem you were encountering? Be specific.

My guess is that you simply need to re-read the README.txt files more 
carefully in the Solr "example" directories.


If you have questions about what the README.txt files say, please ask them, 
but please be specific.


-- Jack Krupansky

-Original Message- 
From: hupadhyay

Sent: Thursday, March 28, 2013 5:35 AM
To: solr-user@lucene.apache.org
Subject: Re: multicore vs multi collection

Does that mean I can create multiple collections with different
configurations?
Can you please outline the basic steps to create multiple collections,
because I am not able to create them on Solr 4.0.



--
View this message in context: 
http://lucene.472066.n3.nabble.com/multicore-vs-multi-collection-tp4051352p4052002.html
Sent from the Solr - User mailing list archive at Nabble.com. 



Re: multicore vs multi collection

2013-03-28 Thread hupadhyay
Does that mean I can create multiple collections with different
configurations?
Can you please outline the basic steps to create multiple collections,
because I am not able to create them on Solr 4.0.



--
View this message in context: 
http://lucene.472066.n3.nabble.com/multicore-vs-multi-collection-tp4051352p4052002.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: multicore vs multi collection

2013-03-26 Thread Furkan KAMACI
Also from there http://wiki.apache.org/solr/SolrCloud:

*Q:* What is the difference between a Collection and a SolrCore?

*A:* In classic single-node Solr, a SolrCore is basically equivalent to a
Collection. It presents one logical index. In SolrCloud, the SolrCores on
multiple nodes form a Collection. This is still just one logical index, but
multiple SolrCores host different 'shards' of the full collection. So a
SolrCore encapsulates a single physical index on an instance. A Collection
is a combination of all of the SolrCores that together provide a logical
index that is distributed across many nodes.
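[On the follow-up question quoted below — creating multiple collections with different schemas — in SolrCloud each collection can point at its own named config set in ZooKeeper. A rough sketch for Solr 4.x, assuming the embedded ZooKeeper at localhost:9983 and the cloud-scripts that ship in the example directory; directory and config names here are illustrative:]

```shell
# Upload two independent config sets (each with its own schema.xml)
# into ZooKeeper under different names.
cloud-scripts/zkcli.sh -zkhost localhost:9983 -cmd upconfig \
  -confdir ./configs/products/conf -confname productsConf
cloud-scripts/zkcli.sh -zkhost localhost:9983 -cmd upconfig \
  -confdir ./configs/users/conf -confname usersConf

# Create one collection per config set via the Collections API.
curl "http://localhost:8983/solr/admin/collections?action=CREATE&name=products&numShards=1&collection.configName=productsConf"
curl "http://localhost:8983/solr/admin/collections?action=CREATE&name=users&numShards=1&collection.configName=usersConf"
```

[Outside of SolrCloud, the equivalent is one &lt;core&gt; entry per collection in solr.xml, each with its own instanceDir and conf/ directory.]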

2013/3/26 J Mohamed Zahoor 

> Thanks.
>
> This makes it clearer than the wiki did.
>
> How do you create multiple collections which can have different schemas?
>
> ./zahoor
>
> On 26-Mar-2013, at 3:52 PM, Furkan KAMACI  wrote:
>
> > Did you check that document:
> >
> > http://wiki.apache.org/solr/SolrCloud#A_little_about_SolrCores_and_Collections
> > It says:
> > On a single instance, Solr has something called a SolrCore that is
> > essentially a single index. If you want multiple indexes, you create
> > multiple SolrCores. With SolrCloud, a single index can span multiple
> > Solr instances. This means that a single index can be made up of
> > multiple SolrCores on different machines. We call all of these
> > SolrCores that make up one logical index a collection. A collection is
> > essentially a single index that spans many SolrCores, both for index
> > scaling as well as redundancy. If you wanted to move your 2-SolrCore
> > Solr setup to SolrCloud, you would have 2 collections, each made up of
> > multiple individual SolrCores.
> >
> >
> > 2013/3/26 J Mohamed Zahoor 
> >
> >> Hi
> >>
> >> I am kind of confused between multicore and multi-collection.
> >> The docs don't seem to clarify this... can someone enlighten me on the
> >> difference between a core and a collection?
> >> Are they the same?
> >>
> >> ./zahoor
>
>


Re: multicore vs multi collection

2013-03-26 Thread J Mohamed Zahoor
Thanks.

This makes it clearer than the wiki did.

How do you create multiple collections which can have different schemas?

./zahoor

On 26-Mar-2013, at 3:52 PM, Furkan KAMACI  wrote:

> Did you check that document:
> http://wiki.apache.org/solr/SolrCloud#A_little_about_SolrCores_and_Collections
> It says:
> On a single instance, Solr has something called a SolrCore that is
> essentially a single index. If you want multiple indexes, you create
> multiple SolrCores. With SolrCloud, a single index can span multiple Solr
> instances. This means that a single index can be made up of multiple
> SolrCores on different machines. We call all of these SolrCores that make
> up one logical index a collection. A collection is essentially a single
> index that spans many SolrCores, both for index scaling as well as
> redundancy. If you wanted to move your 2-SolrCore Solr setup to SolrCloud,
> you would have 2 collections, each made up of multiple individual
> SolrCores.
> 
> 
> 2013/3/26 J Mohamed Zahoor 
> 
>> Hi
>> 
>> I am kind of confused between multicore and multi-collection.
>> The docs don't seem to clarify this... can someone enlighten me on the
>> difference between a core and a collection?
>> Are they the same?
>> 
>> ./zahoor



Re: multicore vs multi collection

2013-03-26 Thread Furkan KAMACI
Did you check that document:
http://wiki.apache.org/solr/SolrCloud#A_little_about_SolrCores_and_Collections
It says:
On a single instance, Solr has something called a SolrCore that is
essentially a single index. If you want multiple indexes, you create
multiple SolrCores. With SolrCloud, a single index can span multiple Solr
instances. This means that a single index can be made up of multiple
SolrCores on different machines. We call all of these SolrCores that make
up one logical index a collection. A collection is essentially a single
index that spans many SolrCores, both for index scaling as well as
redundancy. If you wanted to move your 2-SolrCore Solr setup to SolrCloud,
you would have 2 collections, each made up of multiple individual
SolrCores.


2013/3/26 J Mohamed Zahoor 

> Hi
>
> I am kind of confused between multicore and multi-collection.
> The docs don't seem to clarify this... can someone enlighten me on the
> difference between a core and a collection?
> Are they the same?
>
> ./zahoor


Re: Multicore Master - Slave - solr 3.6.1

2013-02-27 Thread Michael Della Bitta
On Wed, Feb 27, 2013 at 7:01 AM, Sujatha Arun  wrote:
> 1) Added the properties as name value pairs in the solr.xml  - *But these
> values are lost on Server Restart*

This is how you do it in my experience. Just make sure
persistent="true" is set, and don't edit the file while the server is
running...
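[A sketch of what this looks like in a Solr 3.6-era solr.xml; the property names and values below are purely illustrative, not from the original thread. With persistent="true", changes made through the CoreAdmin API are written back to solr.xml and survive a restart.]

```xml
<!-- solr.xml (Solr 3.6.x style). persistent="true" makes Solr rewrite this
     file when cores or properties change, so values survive a restart. -->
<solr persistent="true">
  <cores adminPath="/admin/cores">
    <core name="core0" instanceDir="core0">
      <!-- name/value pairs, referenced as ${master.url} etc. in config -->
      <property name="master.url" value="http://master:8983/solr/core0" />
      <property name="enable.master" value="false" />
    </core>
  </cores>
</solr>
```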


Michael Della Bitta


Appinions
18 East 41st Street, 2nd Floor
New York, NY 10017-6271

www.appinions.com

Where Influence Isn’t a Game


Re: Multicore search with ManifoldCF security not working

2013-02-05 Thread Ahmet Arslan

Hello,

Ah, so you are using Nabble. Please follow the instructions described here:
http://manifoldcf.apache.org/en_US/mail.html

Subscribe to the 'ManifoldCF User Mailing List' and send your question there.

Ahmet
--- On Mon, 1/28/13, eShard  wrote:

> From: eShard 
> Subject: Re: Multicore search with ManifoldCF security not working
> To: solr-user@lucene.apache.org
> Date: Monday, January 28, 2013, 8:26 PM
> I'm sorry, I don't know what you
> mean.
> I clicked on the hidden email link, filled out the form and
> when I hit
> submit; 
> I got this error:
> Domain starts with dot
> Please fix the error and try again.
> 
> Who exactly am I sending this to and how do I get the form
> to work?
> 
> 
> 
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Multicore-search-with-ManifoldCF-security-not-working-tp4036776p4036829.html
> Sent from the Solr - User mailing list archive at
> Nabble.com.
> 


Re: Multicore search with ManifoldCF security not working

2013-01-28 Thread eShard
I'm sorry, I don't know what you mean.
I clicked on the hidden email link, filled out the form and when I hit
submit; 
I got this error:
Domain starts with dot
Please fix the error and try again.

Who exactly am I sending this to and how do I get the form to work?



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Multicore-search-with-ManifoldCF-security-not-working-tp4036776p4036829.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multicore search with ManifoldCF security not working

2013-01-28 Thread Ahmet Arslan
Hello,

Can you post this question to u...@manifoldcf.apache.org too?



--- On Mon, 1/28/13, eShard  wrote:

> From: eShard 
> Subject: Multicore search with ManifoldCF security not working
> To: solr-user@lucene.apache.org
> Date: Monday, January 28, 2013, 6:16 PM
> Good morning,
> I used this post here to join to search 2 different cores
> and return one
> data set.
> http://stackoverflow.com/questions/2139030/search-multiple-solr-cores-and-return-one-result-set
> The good news is that it worked!
> The bad news is that one of the cores is Opentext and the
> ManifoldCF
> security check isn't firing!
> So users could see documents that they aren't supposed to.
> The opentext security works if I call the core handler
> individually. it
> fails for the merged result.
> I need to find a way to get the AuthenticatedUserName
> parameter to the
> opentext core.
> Here's my /query handler for the merged result:
>
> <requestHandler name="/query" class="solr.SearchHandler">
>   <lst name="defaults">
>     <str name="q">*:*</str>
>     <str name="fl">id, attr_general_name, attr_general_owner,
>       attr_general_creator, attr_general_modifier, attr_general_description,
>       attr_general_creationdate, attr_general_modifydate, solr.title,
>       content, category, link, pubdateiso</str>
>     <str name="shards">localhost:8080/solr/opentext/,localhost:8080/solr/Profiles/</str>
>   </lst>
>   <arr name="last-components">
>     <str>manifoldCFSecurity</str>
>   </arr>
> </requestHandler>
>
> As you can see, I tried calling manifoldCFSecurity first and
> it didn't work. 
> I was thinking perhaps I can call the shards directly in the
> URL and put the
> AuthenticatedUserName on the opentext shard but I'm getting
> pulled in
> different directions currently.
> 
> Can anyone point me in the right direction?
> Thanks,
> 
> 
> 
> 
> 
> 
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Multicore-search-with-ManifoldCF-security-not-working-tp4036776.html
> Sent from the Solr - User mailing list archive at
> Nabble.com.
>


Re: Multicore configuration

2013-01-15 Thread Bruno Dusausoy

Dariusz Borowski wrote:

Hi Bruno,

Maybe this helps. I wrote something about it:
http://www.coderthing.com/solr-with-multicore-and-database-hook-part-1/


Hi Dariusz,

Thanks for the link.
I've found my terrible mistake: solr.xml was not in the solr.home dir
but in the solr.home/conf dir, so Solr didn't pick it up :-/

It works perfectly now.

Sorry for the noise.

Regards.
--
Bruno Dusausoy
Software Engineer
YP5 Software
--
Pensez environnement : limitez l'impression de ce mail.
Please don't print this e-mail unless you really need to.


Re: Multicore configuration

2013-01-15 Thread Upayavira
You should put your solr.xml into your 'cores' directory, and set
-Dsolr.solr.home=cores

That should get you going. 'cores' *is* your Solr Home. Otherwise, your
instanceDir entries in your current solr.xml will need correct paths to
../cores/procedure/ etc.
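[Concretely, either of these two arrangements should work; the directory names come from Bruno's layout below, the rest is an illustrative sketch.]

```shell
# Option 1: make 'cores' the Solr home itself.
#   cores/solr.xml
#   cores/dossier/conf/{solrconfig.xml,schema.xml,...}
#   cores/procedure/conf/{solrconfig.xml,schema.xml,...}
java -Dsolr.solr.home=cores -jar start.jar   # or the servlet-container equivalent

# Option 2: keep solr.xml where it is and point each instanceDir at the
# core directories explicitly in solr.xml:
#   <core name="dossier"   instanceDir="../cores/dossier" />
#   <core name="procedure" instanceDir="../cores/procedure" />
```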

Upayavira

On Tue, Jan 15, 2013, at 08:52 AM, Bruno Dusausoy wrote:
> Hi,
> 
> I'd like to use two separate indexes (Solr 3.6.1).
> I've read several wiki pages and looked at the multicore example bundled
> with the distribution, but it seems I'm missing something.
> 
> 
> I have this hierarchy :
> solr-home/
> |
> -- conf
>|
> -- solr.xml
> -- solrconfig.xml (if I don't put it, solr complains)
> -- schema.xml (idem)
> -- ...
> |
> -- cores
>|
>-- dossier
>  |
>   -- conf
> |
>  -- dataconfig.xml
>  -- schema.xml
>  -- solrconfig.xml
>  |
>   -- data
>|
>-- procedure
>  |
>   -- conf
> |
>  -- dataconfig.xml
>  -- schema.xml
>  -- solrconfig.xml
>  |
>   -- data
> 
> Here's the content of my solr.xml file :
> http://paste.debian.net/224818/
> 
> And I launch my servlet container with 
> -Dsolr.solr.home=my-directory/solr-home.
> 
> I've put nearly nothing in my solr-home/conf/schema.xml so Solr 
> complains, but that's not the point.
> 
> When I go to the admin of core "dossier",
> http://localhost:8080/solr/dossier/admin, the container says it doesn't 
> exist.
> But when I go to http://localhost:8080/solr/admin it finds it, which 
> makes me guess that Solr is still in "single core" mode.
> 
> What am I missing ?
> 
> Regards.
> -- 
> Bruno Dusausoy
> Software Engineer
> YP5 Software
> --
> Pensez environnement : limitez l'impression de ce mail.
> Please don't print this e-mail unless you really need to.


Re: Multicore configuration

2013-01-15 Thread Dariusz Borowski
Hi Bruno,

Maybe this helps. I wrote something about it:
http://www.coderthing.com/solr-with-multicore-and-database-hook-part-1/

Dariusz



On Tue, Jan 15, 2013 at 9:52 AM, Bruno Dusausoy  wrote:

> Hi,
>
> I'd like to use two separate indexes (Solr 3.6.1).
> I've read several wiki pages and looked at the multicore example bundled
> with the distribution, but it seems I'm missing something.
>
>
> I have this hierarchy :
> solr-home/
> |
> -- conf
>   |
>-- solr.xml
>-- solrconfig.xml (if I don't put it, solr complains)
>-- schema.xml (idem)
>-- ...
> |
> -- cores
>   |
>   -- dossier
> |
>  -- conf
>|
> -- dataconfig.xml
> -- schema.xml
> -- solrconfig.xml
> |
>  -- data
>   |
>   -- procedure
> |
>  -- conf
>|
> -- dataconfig.xml
> -- schema.xml
> -- solrconfig.xml
> |
>  -- data
>
> Here's the content of my solr.xml file :
> http://paste.debian.net/224818/
>
> And I launch my servlet container with
> -Dsolr.solr.home=my-directory/solr-home.
>
> I've put nearly nothing in my solr-home/conf/schema.xml so Solr complains,
> but that's not the point.
>
> When I go to the admin of core "dossier",
> http://localhost:8080/solr/dossier/admin, the container says it doesn't
> exist.
> But when I go to http://localhost:8080/solr/admin it finds it, which
> makes me guess that Solr is still in "single core" mode.
>
> What am I missing ?
>
> Regards.
> --
> Bruno Dusausoy
> Software Engineer
> YP5 Software
> --
> Pensez environnement : limitez l'impression de ce mail.
> Please don't print this e-mail unless you really need to.
>


Re: Multicore setup is ignored when deploying solr.war on Tomcat 5/6/7

2012-10-20 Thread Rogerio Pereira
Here's the catalina.out contents:

Out 20, 2012 12:55:58 PM org.apache.solr.core.SolrResourceLoader
locateSolrHome
INFO: using system property solr.solr.home: /home/rogerio/Dados/salutisvitae
Out 20, 2012 12:55:58 PM org.apache.solr.core.SolrResourceLoader 
INFO: new SolrResourceLoader for deduced Solr Home:
'/home/rogerio/Dados/salutisvitae/'
Out 20, 2012 12:55:58 PM org.apache.solr.servlet.SolrDispatchFilter init
INFO: SolrDispatchFilter.init()
Out 20, 2012 12:55:58 PM org.apache.solr.core.SolrResourceLoader
locateSolrHome
INFO: No /solr/home in JNDI
Out 20, 2012 12:55:58 PM org.apache.solr.core.SolrResourceLoader
locateSolrHome
INFO: using system property solr.solr.home: /home/rogerio/Dados/salutisvitae
Out 20, 2012 12:55:58 PM org.apache.solr.core.CoreContainer$Initializer
initialize
INFO: looking for solr.xml: /home/rogerio/Dados/salutisvitae/solr.xml
Out 20, 2012 12:55:58 PM org.apache.solr.core.CoreContainer 
INFO: New CoreContainer 1806276996

/home/rogerio/Dados/salutisvitae really exists and has two core dirs,
collection1 and collection2, but only collection1 is initialized as we can
see below:

INFO: unique key field: id
Out 20, 2012 12:56:29 PM org.apache.solr.core.SolrCore 
INFO: [collection1] Opening new SolrCore at
/home/rogerio/Dados/salutisvitae/collection1/,
dataDir=/home/rogerio/Dados/salutisvitae/collection1/data/
Out 20, 2012 12:56:29 PM org.apache.solr.core.SolrCore 
INFO: JMX monitoring not detected for core: collection1
Out 20, 2012 12:56:29 PM org.apache.solr.core.SolrCore getNewIndexDir
WARNING: New index directory detected: old=null
new=/home/rogerio/Dados/salutisvitae/collection1/data/index/
Out 20, 2012 12:56:29 PM org.apache.solr.core.CachingDirectoryFactory get
INFO: return new directory for
/home/rogerio/Dados/salutisvitae/collection1/data/index forceNew:false

No more cores are initialized after collection1.

Note: I'm just making a simple copy of the multicore example
to /home/rogerio/Dados/salutisvitae, renaming core1 to collection1,
copying collection1 to collection2, and making the configuration changes in
solrconfig.xml. To set the path above I'm using the solr.solr.home
system property, with the Solr admin deployed on Tomcat from solr.war.

I'm getting the same strange behavior on both Xubuntu 10.04 and Ubuntu 12.10.

2012/10/16 Chris Hostetter 

> : To answer your question, I tried both -Dsolr.solr.home and solr/home JNDI
> : variable, in both cases I got the same result.
> :
> : I checked the logs several times, solr always only loads up the
> collection1,
>
> That doesn't really answer any of the questions i was asking you.
>
> *Before* solr logs anything about loading collection1, it will log
> information about how/where it is locating the solr home dir and
> solr.xml
>
> : if you look at the logging when solr first starts up, you should see
> : several messages about how/where it's trying to locate the Solr Home Dir
> : ... please double check that it's finding the one you intended.
> :
> : Please give us more details about those log messages related to the solr
> : home dir, as well as how you are trying to set it, and what your
> directory
> : structure looks like in tomcat.
>
> For example, this is what Solr logs if it can't detect either the system
> property, or JNDI, and is assuming it should use "./solr" ...
>
> Oct 16, 2012 8:48:52 AM org.apache.solr.core.SolrResourceLoader
> locateSolrHome
> INFO: JNDI not configured for solr (NoInitialContextEx)
> Oct 16, 2012 8:48:52 AM org.apache.solr.core.SolrResourceLoader
> locateSolrHome
> INFO: solr home defaulted to 'solr/' (could not find system property or
> JNDI)
> Oct 16, 2012 8:48:52 AM org.apache.solr.core.SolrResourceLoader 
> INFO: new SolrResourceLoader for deduced Solr Home: 'solr/'
> Oct 16, 2012 8:48:53 AM org.apache.solr.servlet.SolrDispatchFilter init
> INFO: SolrDispatchFilter.init()
> Oct 16, 2012 8:48:53 AM org.apache.solr.core.SolrResourceLoader
> locateSolrHome
> INFO: JNDI not configured for solr (NoInitialContextEx)
> Oct 16, 2012 8:48:53 AM org.apache.solr.core.SolrResourceLoader
> locateSolrHome
> INFO: solr home defaulted to 'solr/' (could not find system property or
> JNDI)
> Oct 16, 2012 8:48:53 AM org.apache.solr.core.CoreContainer$Initializer
> initialize
> INFO: looking for solr.xml:
> /home/hossman/lucene/dev/solr/example/solr/solr.xml
>
> What do your startup logs look like as far as finding the solr home dir?
>
> because my suspicion is that the reason it's not loading your
> multicore setup, or complaining about malformed xml in your solr.xml
> file, is because it's not finding the directory you want at all.
>
>
>
> -Hoss
>



-- 
Regards,

Rogério Pereira Araújo

Blogs: http://faces.eti.br, http://ararog.blogspot.com
Twitter: http://twitter.com/ararog
Skype: rogerio.araujo
MSN: ara...@hotmail.com
Gtalk/FaceTime: rogerio.ara...@gmail.com

(0xx62) 8240 7212
(0xx62) 3920 2666


Re: Multicore setup is ignored when deploying solr.war on Tomcat 5/6/7

2012-10-16 Thread Chris Hostetter
: To answer your question, I tried both -Dsolr.solr.home and solr/home JNDI
: variable, in both cases I got the same result.
: 
: I checked the logs several times, solr always only loads up the collection1,

That doesn't really answer any of the questions i was asking you.

*Before* solr logs anything about loading collection1, it will log 
information about how/where it is locating the solr home dir and 
solr.xml

: if you look at the logging when solr first starts up, you should see
: several messages about how/where it's trying to locate the Solr Home Dir
: ... please double check that it's finding the one you intended.
: 
: Please give us more details about those log messages related to the solr
: home dir, as well as how you are trying to set it, and what your directory
: structure looks like in tomcat.

For example, this is what Solr logs if it can't detect either the system 
property, or JNDI, and is assuming it should use "./solr" ...

Oct 16, 2012 8:48:52 AM org.apache.solr.core.SolrResourceLoader locateSolrHome
INFO: JNDI not configured for solr (NoInitialContextEx)
Oct 16, 2012 8:48:52 AM org.apache.solr.core.SolrResourceLoader locateSolrHome
INFO: solr home defaulted to 'solr/' (could not find system property or JNDI)
Oct 16, 2012 8:48:52 AM org.apache.solr.core.SolrResourceLoader 
INFO: new SolrResourceLoader for deduced Solr Home: 'solr/'
Oct 16, 2012 8:48:53 AM org.apache.solr.servlet.SolrDispatchFilter init
INFO: SolrDispatchFilter.init()
Oct 16, 2012 8:48:53 AM org.apache.solr.core.SolrResourceLoader locateSolrHome
INFO: JNDI not configured for solr (NoInitialContextEx)
Oct 16, 2012 8:48:53 AM org.apache.solr.core.SolrResourceLoader locateSolrHome
INFO: solr home defaulted to 'solr/' (could not find system property or JNDI)
Oct 16, 2012 8:48:53 AM org.apache.solr.core.CoreContainer$Initializer 
initialize
INFO: looking for solr.xml: /home/hossman/lucene/dev/solr/example/solr/solr.xml

What do your startup logs look like as far as finding the solr home dir?

because my suspicion is that the reason it's not loading your 
multicore setup, or complaining about malformed xml in your solr.xml 
file, is because it's not finding the directory you want at all.



-Hoss


Re: Multicore setup is ignored when deploying solr.war on Tomcat 5/6/7

2012-10-16 Thread Rogério Pereira Araújo

Hi Chris,

To answer your question, I tried both -Dsolr.solr.home and the solr/home JNDI 
variable; in both cases I got the same result.


I checked the logs several times; Solr always loads only collection1. If I 
rename the cores in solr.xml to anything else, or add more cores, 
nothing happens.


Even if I put some garbage in solr.xml by removing closing tags, no 
exception is generated.


I'm running Tomcat 7 and Solr 4 on Xubuntu 10.04, but I don't think the OS 
is the problem, I'll do the same test on other OSes.


-Mensagem Original- 
From: Chris Hostetter

Sent: Monday, October 15, 2012 5:38 PM
To: solr-user@lucene.apache.org ; rogerio.ara...@gmail.com
Subject: Re: Multicore setup is ignored when deploying solr.war on Tomcat 
5/6/7



: on Tomcat I setup the system property pointing to solr/home path,
: unfortunatelly when I start tomcat the solr.xml is ignored and only the

Please elaborate on how exactly you pointed tomcat at your solr/home.

you mentioned "system property", but when using system properties to set
the Solr Home you want to set "solr.solr.home" .. "solr/home" is the JNDI
variable name used as an alternative.

if you look at the logging when solr first starts up, you should see
several messages about how/where it's trying to locate the Solr Home Dir
... please double check that it's finding the one you intended.

Please give us more details about those log messages related to the solr
home dir, as well as how you are trying to set it, and what your directory
structure looks like in tomcat.

If you haven't seen it yet...

https://wiki.apache.org/solr/SolrTomcat



-Hoss 



Re: Multicore setup is ignored when deploying solr.war on Tomcat 5/6/7

2012-10-15 Thread Chris Hostetter

: on Tomcat I setup the system property pointing to solr/home path,
: unfortunatelly when I start tomcat the solr.xml is ignored and only the

Please elaborate on how exactly you pointed tomcat at your solr/home.

you mentioned "system property", but when using system properties to set 
the Solr Home you want to set "solr.solr.home" .. "solr/home" is the JNDI 
variable name used as an alternative.

if you look at the logging when solr first starts up, you should see 
several messages about how/where it's trying to locate the Solr Home Dir 
... please double check that it's finding the one you intended.

Please give us more details about those log messages related to the solr 
home dir, as well as how you are trying to set it, and what your directory 
structure looks like in tomcat.

If you haven't seen it yet...

https://wiki.apache.org/solr/SolrTomcat
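[For the JNDI route on Tomcat, a context fragment along these lines is the usual approach; the paths below are illustrative.]

```xml
<!-- $CATALINA_HOME/conf/Catalina/localhost/solr.xml -->
<Context docBase="/opt/solr/solr.war" debug="0" crossContext="true">
  <!-- JNDI alternative to the -Dsolr.solr.home system property -->
  <Environment name="solr/home" type="java.lang.String"
               value="/opt/solr/home" override="true" />
</Context>
```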



-Hoss


Re: Multicore setup is ignored when deploying solr.war on Tomcat 5/6/7

2012-10-15 Thread Rogério Pereira Araújo

Hi Vadim,

In fact Tomcat is running in another, non-standard path; there's no old 
version deployed on Tomcat, I double-checked it.


Let me try in another environment.

-Mensagem Original- 
From: Vadim Kisselmann

Sent: Monday, October 15, 2012 6:01 AM
To: solr-user@lucene.apache.org ; rogerio.ara...@gmail.com
Subject: Re: Multicore setup is ignored when deploying solr.war on Tomcat 
5/6/7


Hi Rogerio,
I can imagine what it is. Tomcat extracts the war files into
/var/lib/tomcatXX/webapps.
If you already ran an older Solr version on your server, the old
extracted Solr war could still be there (keyword: Tomcat cache).
Delete the /var/lib/tomcatXX/webapps/solr folder and restart Tomcat,
which should then deploy your new war file.
Best regards
Vadim



2012/10/14 Rogerio Pereira :

I'll try to be more specific, Jack.

I just downloaded apache-solr-4.0.0.zip. From this archive I took the
core1 and core2 folders from the multicore example and renamed them to
collection1 and collection2; I also made all the necessary changes in
solr.xml, solrconfig.xml, and schema.xml on these two cores to reflect the
new names.

After this step I tried to deploy the war file on Tomcat, pointing to
the directory (solr/home) where these two cores are located; solr.xml
is there, with collection1 and collection2 properly configured.

The question is, no matter what is contained in solr.xml, this file isn't
read at Tomcat startup. I tried to cause a parser error in solr.xml by
removing closing tags, but even with this change I can't get so much as a
parser error.

I hope to be clear now.


2012/10/14 Jack Krupansky 


I can't quite parse "the same multicore deployment as we have on apache
solr 4.0 distribution archive". Could you rephrase and be more specific.
What "archive"?

Were you already using 4.0-ALPHA or BETA (or some snapshot of 4.0) or are
you moving from pre-4.0 to 4.0? The directory structure did change in 
4.0.

Look at the example/solr directory.

-- Jack Krupansky

-Original Message- From: Rogerio Pereira
Sent: Sunday, October 14, 2012 10:01 AM
To: solr-user@lucene.apache.org
Subject: Multicore setup is ignored when deploying solr.war on Tomcat 
5/6/7



Hi,

I tried to perform the same multicore deployment as we have in the Apache
Solr 4.0 distribution archive. I created a directory for solr/home with
solr.xml inside and two subdirectories, collection1 and collection2; these
two cores are properly configured with a conf folder and solrconfig.xml and
schema.xml. On Tomcat I set the system property pointing to the solr/home
path; unfortunately, when I start Tomcat the solr.xml is ignored and only
the default collection1 is loaded.

As a test, I made changes on solr.xml to cause parser errors, and guess
what? These errors aren't reported on tomcat startup.

The same thing doesn't happen with the multicore example that comes in the
distribution archive; now I'm trying to figure out what black magic is
happening.

Let me do the same kind of deployment on Windows and Mac OS X; if the
problem persists, I'll update this thread.

Regards,

Rogério





--
Regards,

Rogério Pereira Araújo

Blogs: http://faces.eti.br, http://ararog.blogspot.com
Twitter: http://twitter.com/ararog
Skype: rogerio.araujo
MSN: ara...@hotmail.com
Gtalk/FaceTime: rogerio.ara...@gmail.com

(0xx62) 8240 7212
(0xx62) 3920 2666 




Re: Multicore setup is ignored when deploying solr.war on Tomcat 5/6/7

2012-10-15 Thread Vadim Kisselmann
Hi Rogerio,
I can imagine what it is. Tomcat extracts the war files into
/var/lib/tomcatXX/webapps.
If you already ran an older Solr version on your server, the old
extracted Solr war could still be there (keyword: Tomcat cache).
Delete the /var/lib/tomcatXX/webapps/solr folder and restart Tomcat,
which should then deploy your new war file.
Best regards
Vadim
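[Vadim's suggestion as concrete commands; the tomcat7 service name and paths are illustrative, adjust for your install.]

```shell
# Stop Tomcat, remove the stale exploded webapp and its work cache,
# then restart so the new solr.war gets re-extracted.
sudo service tomcat7 stop
sudo rm -rf /var/lib/tomcat7/webapps/solr
sudo rm -rf /var/lib/tomcat7/work/Catalina/localhost/solr
sudo service tomcat7 start
```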



2012/10/14 Rogerio Pereira :
> I'll try to be more specific, Jack.
>
> I just downloaded apache-solr-4.0.0.zip. From this archive I took the
> core1 and core2 folders from the multicore example and renamed them to
> collection1 and collection2; I also made all the necessary changes in
> solr.xml, solrconfig.xml, and schema.xml on these two cores to reflect the
> new names.
>
> After this step I tried to deploy the war file on Tomcat, pointing to
> the directory (solr/home) where these two cores are located; solr.xml
> is there, with collection1 and collection2 properly configured.
>
> The question is, no matter what is contained in solr.xml, this file isn't
> read at Tomcat startup. I tried to cause a parser error in solr.xml by
> removing closing tags, but even with this change I can't get so much as a
> parser error.
>
> I hope to be clear now.
>
>
> 2012/10/14 Jack Krupansky 
>
>> I can't quite parse "the same multicore deployment as we have on apache
>> solr 4.0 distribution archive". Could you rephrase and be more specific.
>> What "archive"?
>>
>> Were you already using 4.0-ALPHA or BETA (or some snapshot of 4.0) or are
>> you moving from pre-4.0 to 4.0? The directory structure did change in 4.0.
>> Look at the example/solr directory.
>>
>> -- Jack Krupansky
>>
>> -Original Message- From: Rogerio Pereira
>> Sent: Sunday, October 14, 2012 10:01 AM
>> To: solr-user@lucene.apache.org
>> Subject: Multicore setup is ignored when deploying solr.war on Tomcat 5/6/7
>>
>>
>> Hi,
>>
>> I tried to perform the same multicore deployment as we have in the Apache
>> Solr 4.0 distribution archive. I created a directory for solr/home with
>> solr.xml inside and two subdirectories, collection1 and collection2; these
>> two cores are properly configured with a conf folder and solrconfig.xml and
>> schema.xml. On Tomcat I set the system property pointing to the solr/home
>> path; unfortunately, when I start Tomcat the solr.xml is ignored and only
>> the default collection1 is loaded.
>>
>> As a test, I made changes on solr.xml to cause parser errors, and guess
>> what? These errors aren't reported on tomcat startup.
>>
>> The same thing doesn't happen with the multicore example that comes in the
>> distribution archive; now I'm trying to figure out what black magic is
>> happening.
>>
>> Let me do the same kind of deployment on Windows and Mac OS X; if the
>> problem persists, I'll update this thread.
>>
>> Regards,
>>
>> Rogério
>>
>
>
>
> --
> Regards,
>
> Rogério Pereira Araújo
>
> Blogs: http://faces.eti.br, http://ararog.blogspot.com
> Twitter: http://twitter.com/ararog
> Skype: rogerio.araujo
> MSN: ara...@hotmail.com
> Gtalk/FaceTime: rogerio.ara...@gmail.com
>
> (0xx62) 8240 7212
> (0xx62) 3920 2666


Re: Multicore setup is ignored when deploying solr.war on Tomcat 5/6/7

2012-10-14 Thread Rogerio Pereira
I'll try to be more specific, Jack.

I just downloaded apache-solr-4.0.0.zip. From this archive I took the
core1 and core2 folders from the multicore example and renamed them to
collection1 and collection2; I also made all the necessary changes in
solr.xml, solrconfig.xml, and schema.xml on these two cores to reflect the
new names.

After this step I tried to deploy the war file on Tomcat, pointing to
the directory (solr/home) where these two cores are located; solr.xml
is there, with collection1 and collection2 properly configured.

The question is, no matter what is contained in solr.xml, this file isn't
read at Tomcat startup. I tried to cause a parser error in solr.xml by
removing closing tags, but even with this change I can't get so much as a
parser error.

I hope to be clear now.


2012/10/14 Jack Krupansky 

> I can't quite parse "the same multicore deployment as we have on apache
> solr 4.0 distribution archive". Could you rephrase and be more specific?
> What "archive"?
>
> Were you already using 4.0-ALPHA or BETA (or some snapshot of 4.0) or are
> you moving from pre-4.0 to 4.0? The directory structure did change in 4.0.
> Look at the example/solr directory.
>
> -- Jack Krupansky
>
> -Original Message- From: Rogerio Pereira
> Sent: Sunday, October 14, 2012 10:01 AM
> To: solr-user@lucene.apache.org
> Subject: Multicore setup is ignored when deploying solr.war on Tomcat 5/6/7
>
>
> Hi,
>
> I tried to perform the same multicore deployment as in the Apache Solr
> 4.0 distribution archive: I created a directory for solr/home with solr.xml
> inside and two subdirectories, collection1 and collection2. These two cores
> are properly configured with a conf folder and solrconfig.xml and schema.xml.
> On Tomcat I set the system property pointing to the solr/home path, but
> unfortunately when I start Tomcat, solr.xml is ignored and only the
> default collection1 is loaded.
>
> As a test, I made changes to solr.xml to cause parser errors, and guess
> what? These errors aren't reported on Tomcat startup.
>
> The same thing doesn't happen with the multicore example that comes in the
> distribution archive, so now I'm trying to figure out what black magic is
> happening.
>
> Let me do the same kind of deployment on Windows and Mac OS X; if the
> problem persists, I'll update this thread.
>
> Regards,
>
> Rogério
>



-- 
Regards,

Rogério Pereira Araújo

Blogs: http://faces.eti.br, http://ararog.blogspot.com
Twitter: http://twitter.com/ararog
Skype: rogerio.araujo
MSN: ara...@hotmail.com
Gtalk/FaceTime: rogerio.ara...@gmail.com

(0xx62) 8240 7212
(0xx62) 3920 2666
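
For reference, a minimal solr.xml for the two-core layout described above might look like this — a sketch only, with the collection names and paths assumed from the directory layout described, not taken from the actual file:

```xml
<?xml version="1.0" encoding="UTF-8" ?>
<!-- Placed directly in the solr/home directory; Solr 4.0 reads it at
     startup to discover which cores to load. -->
<solr persistent="true">
  <cores adminPath="/admin/cores">
    <!-- instanceDir is relative to solr/home; each directory must contain
         conf/ with solrconfig.xml and schema.xml -->
    <core name="collection1" instanceDir="collection1" />
    <core name="collection2" instanceDir="collection2" />
  </cores>
</solr>
```

If only the default core appears, it usually means the solr.solr.home system property never reached the webapp, so Solr fell back to its built-in default and this file was never opened.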


Re: Multicore setup is ignored when deploying solr.war on Tomcat 5/6/7

2012-10-14 Thread Jack Krupansky
I can't quite parse "the same multicore deployment as we have on apache solr
4.0 distribution archive". Could you rephrase and be more specific? What
"archive"?


Were you already using 4.0-ALPHA or BETA (or some snapshot of 4.0) or are 
you moving from pre-4.0 to 4.0? The directory structure did change in 4.0. 
Look at the example/solr directory.


-- Jack Krupansky

-Original Message- 
From: Rogerio Pereira

Sent: Sunday, October 14, 2012 10:01 AM
To: solr-user@lucene.apache.org
Subject: Multicore setup is ignored when deploying solr.war on Tomcat 5/6/7

Hi,

I tried to perform the same multicore deployment as in the Apache Solr
4.0 distribution archive: I created a directory for solr/home with solr.xml
inside and two subdirectories, collection1 and collection2. These two cores
are properly configured with a conf folder and solrconfig.xml and schema.xml.
On Tomcat I set the system property pointing to the solr/home path, but
unfortunately when I start Tomcat, solr.xml is ignored and only the
default collection1 is loaded.

As a test, I made changes to solr.xml to cause parser errors, and guess
what? These errors aren't reported on Tomcat startup.

The same thing doesn't happen with the multicore example that comes in the
distribution archive, so now I'm trying to figure out what black magic is
happening.

Let me do the same kind of deployment on Windows and Mac OS X; if the
problem persists, I'll update this thread.

Regards,

Rogério 



Re: Multicore admin problem in Websphere

2012-07-23 Thread kmsenthil
Hi,

I am currently looking for some information on how to host multiple Solr
indexes on WebSphere. I already have this working on Tomcat.

Do you have any documentation on how to set it up on WebSphere?

Thanks
Senthil



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Multicore-admin-problem-in-Websphere-tp764471p3996691.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multicore master-slave replication in Solr Cloud

2012-06-19 Thread Mark Miller

On Jun 19, 2012, at 9:59 AM, fabio curti wrote:

> Hi,
> I tried to set up the multicore master-slave replication in Solr Cloud
> described in this post
> http://pulkitsinghal.blogspot.it/2011/09/multicore-master-slave-replication-in.html
> but I get the following problem
> 
> SEVERE: Error while trying to recover.
> org.apache.solr.client.solrj.SolrServerException: Server at
> http://myserver:8983/solr was not found (404).
> at
> org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:372)
> at
> org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:182)
> at
> org.apache.solr.cloud.RecoveryStrategy.sendPrepRecoveryCmd(RecoveryStrategy.java:192)
> at
> org.apache.solr.cloud.RecoveryStrategy.doRecovery(RecoveryStrategy.java:303)
> at org.apache.solr.cloud.RecoveryStrategy.run(RecoveryStrategy.java:213)
> Jun 19, 2012 3:17:49 PM org.apache.solr.cloud.RecoveryStrategy doRecovery
> SEVERE: Recovery failed - trying again...
> 
> The infrastructure will look like:
> 
>   - Solr-Instance-A
>  - master1 (indexes changes for shard1)
>  - slave1-master2 (replicates changes from shard2)
>  - slave2-master2 (replicates changes from shard2)
>   - Solr-Instance-B
>  - master2 (indexes changes for shard2)
>  - slave1-master1 (replicates changes from shard1)
>  - slave2-master1 (replicates changes from shard1)
> 
> 
> Any idea?


You don't want to explicitly set up master-slave replication when using
SolrCloud. Just define an empty replication handler (and make sure you have
the other required config) and the rest is automatic.

http://wiki.apache.org/solr/SolrCloud#Required_Config

- Mark Miller
lucidimagination.com
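
The "empty replication handler" mentioned above is simply the handler declared with no master or slave sections — roughly the following in each core's solrconfig.xml (a sketch based on the wiki page linked above):

```xml
<!-- SolrCloud drives replication itself during peer recovery; the handler
     just needs to be registered. No master/slave/pollInterval sections. -->
<requestHandler name="/replication" class="solr.ReplicationHandler" />
```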


Re: Multicore Issue - Server Restart

2012-05-30 Thread Sujatha Arun
solr 1.3

Regards
Sujatha

On Wed, May 30, 2012 at 8:26 PM, Siva Kommuri  wrote:

> Hi Sujatha,
>
> Which version of Solr are you using?
>
> Best Wishes,
> Siva
>
> On Wed, May 30, 2012 at 12:22 AM, Sujatha Arun 
> wrote:
>
> > Yes ,that is correct.
> >
> > Regards
> > Sujatha
> >
> > On Tue, May 29, 2012 at 7:23 PM, lboutros  wrote:
> >
> > > Hi Sujatha,
> > >
> > > Does each webapp have its own Solr home?
> > >
> > > Ludovic.
> > >
> > > -
> > > Jouve
> > > France.
> > > --
> > > View this message in context:
> > >
> >
> http://lucene.472066.n3.nabble.com/Multicore-Issue-Server-Restart-tp3986516p3986602.html
> > > Sent from the Solr - User mailing list archive at Nabble.com.
> > >
> >
>


Re: Multicore Issue - Server Restart

2012-05-30 Thread Siva Kommuri
Hi Sujatha,

Which version of Solr are you using?

Best Wishes,
Siva

On Wed, May 30, 2012 at 12:22 AM, Sujatha Arun  wrote:

> Yes ,that is correct.
>
> Regards
> Sujatha
>
> On Tue, May 29, 2012 at 7:23 PM, lboutros  wrote:
>
> > Hi Sujatha,
> >
> > Does each webapp have its own Solr home?
> >
> > Ludovic.
> >
> > -
> > Jouve
> > France.
> > --
> > View this message in context:
> >
> http://lucene.472066.n3.nabble.com/Multicore-Issue-Server-Restart-tp3986516p3986602.html
> > Sent from the Solr - User mailing list archive at Nabble.com.
> >
>


Re: Multicore Issue - Server Restart

2012-05-30 Thread Sujatha Arun
Yes ,that is correct.

Regards
Sujatha

On Tue, May 29, 2012 at 7:23 PM, lboutros  wrote:

> Hi Sujatha,
>
> Does each webapp have its own Solr home?
>
> Ludovic.
>
> -
> Jouve
> France.
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/Multicore-Issue-Server-Restart-tp3986516p3986602.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>


Re: Multicore Issue - Server Restart

2012-05-29 Thread lboutros
Hi Sujatha,

Does each webapp have its own Solr home?

Ludovic.

-
Jouve
France.
--
View this message in context: 
http://lucene.472066.n3.nabble.com/Multicore-Issue-Server-Restart-tp3986516p3986602.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multicore solr

2012-05-23 Thread Amit Jha
Can anyone please help me with this?

Rgds
AJ

On 23-May-2012, at 14:37, Jens Grivolla  wrote:

> So are you even doing text search in Solr at all, or just using it as a
> key-value store?
> 
> If the latter, do you have your schema configured so
> that only the search_id field is indexed (with a keyword tokenizer) and 
> everything else only stored? Also, are you sure that Solr is the best option 
> as a key-value store?
> 
> Jens
> 
> On 05/23/2012 04:34 AM, Amit Jha wrote:
>> Hi,
>> 
>> Thanks for your advice. It is basically a meta-search application.
>> Users can perform a search on N number of data sources at a time. We
>> broadcast a parallel search to each selected data source and write the
>> data to Solr using a custom-built API (the API and Solr are deployed on
>> separate machines; the API's job is to perform the parallel search and
>> write the data to Solr). The API notifies the application that some
>> results are available, then the application fires a search query to
>> display the results (the query would be q=unique_search_id). Meanwhile
>> the API keeps writing data to Solr, and the user can fire a search to
>> Solr to view all results.
>>
>> In the current scenario we are using a single Solr server and performing
>> real-time indexing and search. Performing these operations on a single
>> Solr instance makes the process slow as the index size increases.
>>
>> So we are planning to use multicore Solr, and each user will have its
>> own core. All cores will have the same schema.
>>
>> Please suggest if this approach has any issues.
>> 
>> Rgds AJ
>> 
>> On 22-May-2012, at 20:14, Sohail Aboobaker
>> wrote:
>> 
>>> It would help if you provided your use case. What are you indexing
>>> for each user, and why would you need a separate core for each
>>> user? How do you decide the schema for each user? It might be
>>> better to describe your use case and desired results. People on the
>>> list will be able to advise on the best approach.
>>> 
>>> Sohail
>> 
> 
> 


Re: Multicore solr

2012-05-23 Thread Shanu Jha
Jens,

Yes we are doing text search.

My question to all is: is the approach of creating a core for each user a
good idea?

AJ

On Wed, May 23, 2012 at 2:37 PM, Jens Grivolla  wrote:

> So are you even doing text search in Solr at all, or just using it as a
> key-value store?
>
> If the latter, do you have your schema configured so
> that only the search_id field is indexed (with a keyword tokenizer) and
> everything else only stored? Also, are you sure that Solr is the best
> option as a key-value store?
>
> Jens
>
>
> On 05/23/2012 04:34 AM, Amit Jha wrote:
>
>> Hi,
>>
>> Thanks for your advice. It is basically a meta-search application.
>> Users can perform a search on N number of data sources at a time. We
>> broadcast a parallel search to each selected data source and write the
>> data to Solr using a custom-built API (the API and Solr are deployed on
>> separate machines; the API's job is to perform the parallel search and
>> write the data to Solr). The API notifies the application that some
>> results are available, then the application fires a search query to
>> display the results (the query would be q=unique_search_id). Meanwhile
>> the API keeps writing data to Solr, and the user can fire a search to
>> Solr to view all results.
>>
>> In the current scenario we are using a single Solr server and performing
>> real-time indexing and search. Performing these operations on a single
>> Solr instance makes the process slow as the index size increases.
>>
>> So we are planning to use multicore Solr, and each user will have its
>> own core. All cores will have the same schema.
>>
>> Please suggest if this approach has any issues.
>>
>> Rgds AJ
>>
>> On 22-May-2012, at 20:14, Sohail Aboobaker
>> wrote:
>>
>>> It would help if you provided your use case. What are you indexing
>>> for each user, and why would you need a separate core for each
>>> user? How do you decide the schema for each user? It might be
>>> better to describe your use case and desired results. People on the
>>> list will be able to advise on the best approach.
>>>
>>> Sohail
>>>
>>
>>
>
>


Re: Multicore solr

2012-05-23 Thread Jens Grivolla

So are you even doing text search in Solr at all, or just using it as a
key-value store?

If the latter, do you have your schema configured so
that only the search_id field is indexed (with a keyword tokenizer) and 
everything else only stored? Also, are you sure that Solr is the best 
option as a key-value store?


Jens

On 05/23/2012 04:34 AM, Amit Jha wrote:

Hi,

Thanks for your advice. It is basically a meta-search application.
Users can perform a search on N number of data sources at a time. We
broadcast a parallel search to each selected data source and write the
data to Solr using a custom-built API (the API and Solr are deployed on
separate machines; the API's job is to perform the parallel search and
write the data to Solr). The API notifies the application that some
results are available, then the application fires a search query to
display the results (the query would be q=unique_search_id). Meanwhile
the API keeps writing data to Solr, and the user can fire a search to
Solr to view all results.

In the current scenario we are using a single Solr server and performing
real-time indexing and search. Performing these operations on a single
Solr instance makes the process slow as the index size increases.

So we are planning to use multicore Solr, and each user will have its
own core. All cores will have the same schema.

Please suggest if this approach has any issues.

Rgds AJ

On 22-May-2012, at 20:14, Sohail Aboobaker
wrote:


It would help if you provided your use case. What are you indexing
for each user, and why would you need a separate core for each
user? How do you decide the schema for each user? It might be
better to describe your use case and desired results. People on the
list will be able to advise on the best approach.

Sohail


Re: Multicore solr

2012-05-23 Thread Shanu Jha
Awaiting suggestions.

On Wed, May 23, 2012 at 8:04 AM, Amit Jha  wrote:

> Hi,
>
> Thanks for your advice.
> It is basically a meta-search application. Users can perform a search on N
> number of data sources at a time. We broadcast a parallel search to each
> selected data source and write the data to Solr using a custom-built API
> (the API and Solr are deployed on separate machines; the API's job is to
> perform the parallel search and write the data to Solr). The API notifies
> the application that some results are available, then the application fires
> a search query to display the results (the query would be
> q=unique_search_id). Meanwhile the API keeps writing data to Solr, and the
> user can fire a search to Solr to view all results.
>
> In the current scenario we are using a single Solr server and performing
> real-time indexing and search. Performing these operations on a single Solr
> instance makes the process slow as the index size increases.
>
> So we are planning to use multicore Solr, and each user will have its own
> core. All cores will have the same schema.
>
> Please suggest if this approach has any issues.
>
> Rgds
> AJ
>
> On 22-May-2012, at 20:14, Sohail Aboobaker  wrote:
>
> > It would help if you provided your use case. What are you indexing for
> > each user, and why would you need a separate core for each user? How do
> > you decide the schema for each user? It might be better to describe your
> > use case and desired results. People on the list will be able to advise
> > on the best approach.
> >
> > Sohail
>


Re: Multicore solr

2012-05-22 Thread Amit Jha
Hi,

Thanks for your advice.
It is basically a meta-search application. Users can perform a search on N
number of data sources at a time. We broadcast a parallel search to each
selected data source and write the data to Solr using a custom-built API (the
API and Solr are deployed on separate machines; the API's job is to perform the
parallel search and write the data to Solr). The API notifies the application
that some results are available, then the application fires a search query to
display the results (the query would be q=unique_search_id). Meanwhile the API
keeps writing data to Solr, and the user can fire a search to Solr to view all
results.

In the current scenario we are using a single Solr server and performing
real-time indexing and search. Performing these operations on a single Solr
instance makes the process slow as the index size increases.

So we are planning to use multicore Solr, and each user will have its own core.
All cores will have the same schema.

Please suggest if this approach has any issues.

Rgds
AJ

On 22-May-2012, at 20:14, Sohail Aboobaker  wrote:

> It would help if you provided your use case. What are you indexing for each
> user, and why would you need a separate core for each user? How do you
> decide the schema for each user? It might be better to describe your use
> case and desired results. People on the list will be able to advise on the
> best approach.
> 
> Sohail


Re: Multicore solr

2012-05-22 Thread Sohail Aboobaker
It would help if you provided your use case. What are you indexing for each
user, and why would you need a separate core for each user? How do you
decide the schema for each user? It might be better to describe your use
case and desired results. People on the list will be able to advise on the
best approach.

Sohail


Re: Multicore Solr

2012-05-22 Thread Shanu Jha
Hi,

Could you please tell me what you mean by filtering data by user? I would
like to know whether there is a real problem with creating a core per user,
i.e. resource utilization, CPU usage, etc.

AJ

On Tue, May 22, 2012 at 4:39 PM, findbestopensource <
findbestopensou...@gmail.com> wrote:

> Having a core per user is not a good idea. The count is too high. Keep
> everything in a single core. You could filter the data based on user name
> or user id.
>
> Regards
> Aditya
> www.findbestopensource.com
>
>
>
> On Tue, May 22, 2012 at 2:29 PM, Shanu Jha  wrote:
>
> > Hi all,
> >
> > greetings from my end. This is my first post on this mailing list. I have
> > few questions on multicore solr. For background we want to create a core
> > for each user logged in to our application. In that case it may be 50,
> 100,
> > 1000, N-numbers. Each core will be used to write and search index in real
> > time.
> >
> > 1. Is this a good idea to go with?
> > 2. What are the pros and cons of this approach?
> >
> > Awaiting your response.
> >
> > Regards
> > AJ
> >
>


Re: Multicore Solr

2012-05-22 Thread findbestopensource
Having a core per user is not a good idea. The count is too high. Keep
everything in a single core. You could filter the data based on user name or
user id.

Regards
Aditya
www.findbestopensource.com



On Tue, May 22, 2012 at 2:29 PM, Shanu Jha  wrote:

> Hi all,
>
> greetings from my end. This is my first post on this mailing list. I have
> few questions on multicore solr. For background we want to create a core
> for each user logged in to our application. In that case it may be 50, 100,
> 1000, N-numbers. Each core will be used to write and search index in real
> time.
>
> 1. Is this a good idea to go with?
> 2. What are the pros and cons of this approach?
>
> Awaiting your response.
>
> Regards
> AJ
>
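
Filtering the single shared core by user, as suggested above, only needs a user-id field in the schema; the field name here is an assumption for illustration:

```xml
<!-- In schema.xml of the single shared core: every document carries the
     owning user's id, kept as an untokenized string so exact matching works. -->
<field name="userid" type="string" indexed="true" stored="true" required="true" />
```

Each search then appends a filter query such as &fq=userid:42 alongside the main q parameter; filter queries are cached separately from the main query, so the per-user restriction stays cheap.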


Re: Multicore clustering setup problem

2011-07-01 Thread Stanislaw Osinski
Hi Walter,

> That makes sense, but this has always been a multi-core setup, so the paths
> have not changed, and the clustering component worked fine for core0. The
> only thing new is I have fine-tuned core1 (to begin implementing it).
> Previously the solrconfig.xml file was very basic. I replaced it with
> core0's solrconfig.xml and made very minor changes to it (unrelated to
> clustering) - it's a nearly identical solrconfig.xml file so I'm surprised
> it doesn't work for core1.
>

I'd probably need to take a look at the whole Solr dir you're working with,
clearly there's something wrong with the classpath of core1.

> Again, I'm wondering whether, since both cores have the clustering
> component, it should have a shared configuration in a different file used
> by both cores(?). Perhaps the duplicate clusteringComponent configuration
> for both cores is the problem?
>

I'm not an expert on Solr's internals related to core management, but I once
did configure two cores with search results clustering, where the clustering
configuration and <lib> entries were specified for each core separately, so
this is unlikely to be a problem. Another approach would be to put all the
JARs required for clustering in a common directory and point Solr to that lib
using the sharedLib attribute in the <solr> tag:
http://wiki.apache.org/solr/CoreAdmin#solr. But it really should work both
ways.

If you can somehow e-mail (off-list) the listing of your Solr directory and
contents of your configuration XMLs, I may be able to trace the problem for
you.

Cheers,

Staszek
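
A sketch of the sharedLib approach described above (the directory name and core names are assumptions):

```xml
<!-- In solr.xml: sharedLib is resolved relative to solr/home, and every JAR
     in it is visible to all cores, so the clustering JARs can live in one
     place instead of being configured per core. -->
<solr persistent="true" sharedLib="lib">
  <cores adminPath="/admin/cores">
    <core name="core0" instanceDir="core0" />
    <core name="core1" instanceDir="core1" />
  </cores>
</solr>
```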


Re: Multicore clustering setup problem

2011-06-30 Thread Walter Closenfleight
Staszek,

That makes sense, but this has always been a multi-core setup, so the paths
have not changed, and the clustering component worked fine for core0. The
only thing new is I have fine-tuned core1 (to begin implementing it).
Previously the solrconfig.xml file was very basic. I replaced it with
core0's solrconfig.xml and made very minor changes to it (unrelated to
clustering) - it's a nearly identical solrconfig.xml file so I'm surprised
it doesn't work for core1.

In other words, the <lib> paths here are the same for core0 and core1:
  <lib dir="..." />
  <lib dir="..." />
  <lib dir="..." />
  <lib dir="..." />
Again, I'm wondering whether, since both cores have the clustering
component, it should have a shared configuration in a different file used
by both cores(?). Perhaps the duplicate clusteringComponent configuration
for both cores is the problem?

Thanks for looking at this!

On Thu, Jun 30, 2011 at 1:29 PM, Stanislaw Osinski <
stanislaw.osin...@carrotsearch.com> wrote:

> It looks like the whole clustering component JAR is not in the classpath. I
> remember that I once dealt with a similar issue in Solr 1.4 and the cause
> was the relative path of the <lib> tag being resolved against the core's
> instanceDir, which made the path incorrect when directly copying and
> pasting from the single-core configuration. Try correcting the relative
> <lib> paths or replacing them with absolute ones; it should solve the
> problem.
>
> Cheers,
>
> Staszek
>


Re: Multicore clustering setup problem

2011-06-30 Thread Stanislaw Osinski
It looks like the whole clustering component JAR is not in the classpath. I
remember that I once dealt with a similar issue in Solr 1.4 and the cause
was the relative path of the <lib> tag being resolved against the core's
instanceDir, which made the path incorrect when directly copying and pasting
from the single-core configuration. Try correcting the relative <lib> paths
or replacing them with absolute ones; it should solve the problem.

Cheers,

Staszek


Re: Multicore clustering setup problem

2011-06-30 Thread Walter Closenfleight
Sure, thanks for having a look!

By the way, if I attempt to hit a solr URL, I get this error, followed by
the stacktrace. If I set abortOnConfigurationError to false (I've found you
must put the setting in both solr.xml and solrconfig.xml for both cores
otherwise you keep getting the error), then the main URL to solr (
http://localhost/solr) lists just the first core.

HTTP Status 500 - Severe errors in solr configuration. Check your log files
for more detailed information on what may be wrong. If you want solr to
continue after configuration errors, change:
<abortOnConfigurationError>false</abortOnConfigurationError> in solr.xml
-
org.apache.solr.common.SolrException: Error loading class
'org.apache.solr.handler.clustering.ClusteringComponent' at
org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:375)
at

*Tomcat Log:*

INFO: [core1] Added SolrEventListener:
org.apache.solr.core.QuerySenderListener{queries=[{q=solr
rocks,start=0,rows=10}, {q=static firstSearcher warming query from
solrconfig.xml}]}
Jun 30, 2011 8:51:23 AM org.apache.solr.request.XSLTResponseWriter init
INFO: xsltCacheLifetimeSeconds=5
Jun 30, 2011 8:51:23 AM org.apache.solr.common.SolrException log
SEVERE: org.apache.solr.common.SolrException: Error loading class
'org.apache.solr.handler.clustering.ClusteringComponent'
 at
org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:375)
 at org.apache.solr.core.SolrCore.createInstance(SolrCore.java:413)
 at org.apache.solr.core.SolrCore.createInitInstance(SolrCore.java:435)
 at org.apache.solr.core.SolrCore.initPlugins(SolrCore.java:1498)
 at org.apache.solr.core.SolrCore.initPlugins(SolrCore.java:1492)
 at org.apache.solr.core.SolrCore.initPlugins(SolrCore.java:1525)
 at org.apache.solr.core.SolrCore.loadSearchComponents(SolrCore.java:833)
 at org.apache.solr.core.SolrCore.<init>(SolrCore.java:551)
 at org.apache.solr.core.CoreContainer.create(CoreContainer.java:428)
 at org.apache.solr.core.CoreContainer.load(CoreContainer.java:278)
 at
org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:117)
 at
org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:83)
 at
org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:275)
 at
org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:397)
 at
org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:108)
 at
org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:3800)
 at
org.apache.catalina.core.StandardContext.start(StandardContext.java:4450)
 at
org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
 at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
 at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:526)
 at
org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:630)
 at
org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:556)
 at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:491)
 at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1206)
 at
org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:314)
 at
org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
 at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
 at org.apache.catalina.core.StandardHost.start(StandardHost.java:722)
 at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
 at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:443)
 at org.apache.catalina.core.StandardService.start(StandardService.java:516)
 at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
 at org.apache.catalina.startup.Catalina.start(Catalina.java:583)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
 at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
 at java.lang.reflect.Method.invoke(Method.java:597)
 at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:288)
 at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:413)
Caused by: java.lang.ClassNotFoundException:
org.apache.solr.handler.clustering.ClusteringComponent
 at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
 at java.security.AccessController.doPrivileged(Native Method)
 at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
 at java.net.FactoryURLClassLoader.loadClass(URLClassLoader.java:592)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
 at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
 at java.lang.Class.forName0(Native Method)
 at java.lang.Class.forName(Class.java:247)
 at
org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:359)
 ...

Re: Multicore clustering setup problem

2011-06-29 Thread Stanislaw Osinski
Hi,

Can you post the full strack trace? I'd need to know if it's
really org.apache.solr.handler.clustering.ClusteringComponent that's missing
or some other class ClusteringComponent depends on.

Cheers,

Staszek

On Thu, Jun 30, 2011 at 04:19, Walter Closenfleight <
walter.p.closenflei...@gmail.com> wrote:

> I had set up the clusteringComponent in solrconfig.xml for my first core.
> It
> has been working fine and now I want to get my next core working. I set up
> the second core with the clustering component so that I could use it, use
> solritas properly, etc. but Solr did not like the solrconfig.xml changes
> for
> the second core. I'm getting this error when Solr is started or when I hit
> a
> Solr related URL:
>
> SEVERE: org.apache.solr.common.SolrException: Error loading class
> 'org.apache.solr.handler.clustering.ClusteringComponent'
>
> Should the clusteringComponent be set up in a shared configuration file
> somehow or is there something else I am doing wrong?
>
> Thanks in advance!
>


Re: multicore file creation order

2011-06-27 Thread Stefan Matheis

Jérôme,

The complete directory structure, including the required files, has to be
created first, manually. The admin/cores call will only "activate" the core
for Solr; that's it :)


Regards
Stefan

On 27.06.2011 12:20, Jérôme Étévé wrote:

Hi,

When one issues a command like
admin/cores?action=CREATE&core=blabla&instanceDir=...&dataDir=../../foobar
, what gets created first on disk?

Is it the new solr.xml or the new data directory?

Cheers,

Jerome.
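
In other words, the instanceDir (and its conf/ directory with solrconfig.xml and schema.xml) must already exist on disk before the call; with a persistent solr.xml, a successful CREATE then records the core roughly like this (the instanceDir value is an assumption, since it was elided in the command above):

```xml
<!-- The entry CoreAdmin appends to solr.xml after
     admin/cores?action=CREATE&... succeeds -->
<core name="blabla" instanceDir="blabla" dataDir="../../foobar" />
```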



Re: multicore and replication cause OOM

2011-06-27 Thread Shalin Shekhar Mangar
On Sun, Jun 26, 2011 at 5:37 AM, Esteban Donato
 wrote:
> thanks Shalin.  One more question:  is there any way to avoid multiple
> cores replicating at the same time?  Like synchronizing the
> ReplicationHandler somehow?
>

Yes, just specify different poll intervals for each core. The
ReplicationHandler always tries to schedule pulls by rounding to the
nearest interval e.g. specifying an interval as "00:05:00" will cause
pulls to happen at 5 minutes past the hour, then 10 minutes past the
hour and so on.

-- 
Regards,
Shalin Shekhar Mangar.
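
Concretely, staggering the cores means giving each slave core's replication handler a different pollInterval in its own solrconfig.xml — a sketch, with a placeholder masterUrl:

```xml
<!-- Core A's solrconfig.xml: pull from the master every 5 minutes -->
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="slave">
    <str name="masterUrl">http://master:8983/solr/coreA/replication</str>
    <str name="pollInterval">00:05:00</str>
  </lst>
</requestHandler>
<!-- Give core B 00:07:00, core C 00:11:00, and so on, so the pulls (and the
     searcher warm-ups that follow them) rarely coincide -->
```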


Re: multicore and replication cause OOM

2011-06-25 Thread Esteban Donato
Thanks Shalin. One more question: is there any way to avoid multiple
cores replicating at the same time? Like synchronizing the
ReplicationHandler somehow?

On Fri, Jun 24, 2011 at 6:55 AM, Shalin Shekhar Mangar
 wrote:
> On Fri, Jun 24, 2011 at 1:41 PM, Esteban Donato
>  wrote:
>> I have a Solr with 7 cores (~150MB each).  All cores replicate at the
>> same time from a Solr master instance.  Every time the replication
>> happens I get an OOM after experiencing long response times.  This
>> Solr used to have 4 cores before and I've never got an OOM with that
>> configuration (replication occurs on daily basis).
>>
>> My question is: could the new 3 cores be the cause of OOM?  Does Solr
>> require considerable extra heap for performing the replication?.
>
> Yes and no. Replication itself does not consume a lot of heap (I guess
> about a couple of MBs per ongoing replication). However, when the
> searchers are re-opened on the newly installed index, auto warming can
> cause memory usage to double for a core.
>
>> Should I avoid replicating all the cores at the same time?
>
> You should try that especially if you are so constrained for heap space.
>
>> I'm using Solr 1.4 with the following mem configuration: -Xms512m
>> -Xmx512m -XX:NewSize=128M -XX:MaxNewSize=128M
>
> That seems to be a small amount of RAM for indexing/querying seven
> 150MB indexes in parallel.
>
> --
> Regards,
> Shalin Shekhar Mangar.
>


Re: multicore and replication cause OOM

2011-06-24 Thread Shalin Shekhar Mangar
On Fri, Jun 24, 2011 at 1:41 PM, Esteban Donato
 wrote:
> I have a Solr with 7 cores (~150MB each).  All cores replicate at the
> same time from a Solr master instance.  Every time the replication
> happens I get an OOM after experiencing long response times.  This
> Solr used to have 4 cores before and I've never got an OOM with that
> configuration (replication occurs on daily basis).
>
> My question is: could the new 3 cores be the cause of OOM?  Does Solr
> require considerable extra heap for performing the replication?.

Yes and no. Replication itself does not consume a lot of heap (I guess
about a couple of MBs per ongoing replication). However, when the
searchers are re-opened on the newly installed index, auto warming can
cause memory usage to double for a core.

> Should I avoid replicating all the cores at the same time?

You should try that especially if you are so constrained for heap space.

> I'm using Solr 1.4 with the following mem configuration: -Xms512m
> -Xmx512m -XX:NewSize=128M -XX:MaxNewSize=128M

That seems to be a small amount of RAM for indexing/querying seven
150MB indexes in parallel.

--
Regards,
Shalin Shekhar Mangar.
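
Shalin's suggestion to avoid replicating all cores at once can be implemented by giving each core its own poll interval. A hedged sketch (core names, master URL, and intervals are illustrative, not from this thread) using per-core properties in solr.xml that each core's solrconfig.xml then picks up:

```xml
<!-- solr.xml: stagger replication with one property per core -->
<cores adminPath="/admin/cores">
  <core name="core1" instanceDir="core1">
    <property name="pollInterval" value="00:10:00"/>
  </core>
  <core name="core2" instanceDir="core2">
    <property name="pollInterval" value="00:40:00"/>
  </core>
</cores>

<!-- solrconfig.xml (shared by all cores): -->
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="slave">
    <str name="masterUrl">http://master:8983/solr/${solr.core.name}/replication</str>
    <str name="pollInterval">${pollInterval:00:30:00}</str>
  </lst>
</requestHandler>
```

With offsets like these the cores poll at different times, so at most one replication (and one round of auto-warming) is in flight at once.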


Re: Multicore

2011-03-16 Thread Markus Jelsma
What Solr version are you using? That filter is not available in pre-3.1 releases.

On Wednesday 16 March 2011 13:55:21 Brian Lamb wrote:
> Hi all,
> 
> I am setting up multicore and the schema.xml file in the core0 folder says
> not to use that one because it's very stripped down. So I copied the schema
> from example/solr/conf, but now I am getting a bunch of class-not-found
> exceptions:
> 
> SEVERE: org.apache.solr.common.SolrException: Error loading class
> 'solr.KeywordMarkerFilterFactory'
> 
> For example.
> 
> I also copied over the solrconfig.xml from example/solr/conf and changed
> all the <lib dir="xxx"/> paths to go up one directory
> (<lib dir="../xxx"/> instead). I've found that when I use my solrconfig file
> with the stripped-down schema.xml file, it runs correctly. But when I use
> the full schema.xml file, I get those errors.
> 
> Now this says to me I am not loading a library or two somewhere but I've
> looked through the configuration files and cannot see any other place other
> than solrconfig.xml where that would be set so what am I doing incorrectly?
> 
> Thanks,
> 
> Brian Lamb

-- 
Markus Jelsma - CTO - Openindex
http://www.linkedin.com/in/markus17
050-8536620 / 06-50258350
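
On Solr 1.4, one workaround (a sketch; the type name and filter choices are illustrative, not the example schema verbatim) is to strip the 3.1-only filters such as KeywordMarkerFilterFactory from the analyzer chain and keep only 1.4-era classes:

```xml
<!-- schema.xml: a 1.4-compatible text type, no KeywordMarkerFilterFactory -->
<fieldType name="text" class="solr.TextField" positionIncrementGap="100">
  <analyzer>
    <tokenizer class="solr.WhitespaceTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
    <filter class="solr.SnowballPorterFilterFactory" language="English"/>
  </analyzer>
</fieldType>
```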


Re: Multicore boosting to only 1 core

2011-02-15 Thread mike anderson
Could you make an additional date field, call it date_boost, that gets
populated in all of the cores EXCEPT the one with the newest articles, and
then boost on this field? Then when you move articles from the 'newest' core
to the rest of the cores you copy over the date to the date_boost field. (I
haven't used boosting before so I don't know what happens if you try to
boost a field that's empty)

This would boost documents in each index (locally, as desired). Keep in mind
when you get your results back from a distributed shard query that the IDF
is not distributed so your scores aren't reliable for sorting.

-mike


On Tue, Feb 15, 2011 at 1:19 PM, Jonathan Rochkind  wrote:

> No. In fact, there's no way to search over multi-cores at once in Solr at
> all, even before you get to your boosting question. Your different cores are
> entirely different Solr indexes, Solr has no built-in way to combine
> searches across multiple Solr instances.
>
> [Well, sort of it can, with sharding. But sharding is unlikely to be a
> solution to your problem either, UNLESS your problem is that your solr index
> is so big you want to split it across multiple machines for performance.
>  That is the problem sharding is meant to solve. People trying to use it to
> solve other problems run into trouble.]
>
>
> On 2/14/2011 1:59 PM, Tanner Postert wrote:
>
>> I have a multicore system and I am looking to boost results by date, but
>> only for 1 core. Is this at all possible?
>>
>> Basically one of the core's content is very new, and changes all the time,
>> and if I boost everything by date, that core's content will almost always
>> be
>> at the top of the results, so I only want to do the date boosting to the
>> cores that have older content so that their more recent results get
>> boosted
>> over the older content.
>>
>
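
mike's date_boost idea pairs with a function-query boost at query time. A minimal sketch of building such a request (the recip constants, field and shard names are assumptions for illustration, not from the thread):

```python
from urllib.parse import urlencode

# recip(ms(NOW,date_boost), 3.16e-11, 1, 1) yields ~1.0 for brand-new
# documents and decays toward 0 as date_boost ages
# (3.16e-11 is roughly 1 / milliseconds-per-year).
params = {
    "q": "solr",
    "defType": "dismax",
    "qf": "title body",
    "bf": "recip(ms(NOW,date_boost),3.16e-11,1,1)",
    "shards": "host1:8983/solr/old1,host2:8983/solr/old2",
}
query_string = urlencode(params)
print(query_string)
```

Leaving date_boost unpopulated in the "newest" core means those documents simply get no recency boost there, which is the selective behavior Tanner asked about.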


Re: Multicore boosting to only 1 core

2011-02-15 Thread Jonathan Rochkind
No. In fact, there's no way to search over multi-cores at once in Solr 
at all, even before you get to your boosting question. Your different 
cores are entirely different Solr indexes, Solr has no built-in way to 
combine searches across multiple Solr instances.


[Well, sort of it can, with sharding. But sharding is unlikely to be a 
solution to your problem either, UNLESS your problem is that your solr 
index is so big you want to split it across multiple machines for 
performance.  That is the problem sharding is meant to solve. People 
trying to use it to solve other problems run into trouble.]



On 2/14/2011 1:59 PM, Tanner Postert wrote:

I have a multicore system and I am looking to boost results by date, but
only for 1 core. Is this at all possible?

Basically one of the core's content is very new, and changes all the time,
and if I boost everything by date, that core's content will almost always be
at the top of the results, so I only want to do the date boosting to the
cores that have older content so that their more recent results get boosted
over the older content.


Re: Multicore Relaod Theoretical Question

2011-01-24 Thread Em

Thanks Alexander, what a valuable resource :).

- Em
-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/Multicore-Relaod-Theoretical-Question-tp2293999p2321335.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multicore Relaod Theoretical Question

2011-01-24 Thread Alexander Kanarsky
Em,

that's correct. You can use 'lsof' to see file handles still in use.
See 
http://0xfe.blogspot.com/2006/03/troubleshooting-unix-systems-with-lsof.html,
"Recipe #11".

-Alexander

On Sun, Jan 23, 2011 at 1:52 AM, Em  wrote:
>
> Hi Alexander,
>
> thank you for your response.
>
> You said that the old index files were still in use. That means Linux does
> not *really* delete them until Solr frees its locks from it, which happens
> while reloading?
>
>
>
> Thank you for sharing your experiences!
>
> Kind regards,
> Em
>
>
> Alexander Kanarsky wrote:
>>
>> Em,
>>
>> yes, you can replace the index (get the new one into a separate folder
>> like index.new and then rename it to the index folder) outside the
>> Solr, then just do the http call to reload the core.
>>
>> Note that the old index files may still be in use (continue to serve
>> the queries while reloading), even if the old index folder is deleted
>> - that is on Linux filesystems, not sure about NTFS.
>> That means the space on disk will be freed only when the old files are
>> not referenced by Solr searcher any longer.
>>
>> -Alexander
>>
>> On Sat, Jan 22, 2011 at 1:51 PM, Em  wrote:
>>>
>>> Hi Erick,
>>>
>>> thanks for your response.
>>>
>>> Yes, it's really not that easy.
>>>
>>> However, the target is to avoid any kind of master-slave-setup.
>>>
>>> The most recent idea i got is to create a new core with a data-dir
>>> pointing
>>> to an already existing directory with a fully optimized index.
>>>
>>> Regards,
>>> Em
>>> --
>>> View this message in context:
>>> http://lucene.472066.n3.nabble.com/Multicore-Relaod-Theoretical-Question-tp2293999p2310709.html
>>> Sent from the Solr - User mailing list archive at Nabble.com.
>>>
>>
>>
>
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Multicore-Relaod-Theoretical-Question-tp2293999p2312778.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>


Re: Multicore Relaod Theoretical Question

2011-01-23 Thread Em

Hi Alexander,

thank you for your response.

You said that the old index files were still in use. That means Linux does
not *really* delete them until Solr frees its locks from it, which happens
while reloading? 


 
Thank you for sharing your experiences!

Kind regards,
Em


Alexander Kanarsky wrote:
> 
> Em,
> 
> yes, you can replace the index (get the new one into a separate folder
> like index.new and then rename it to the index folder) outside the
> Solr, then just do the http call to reload the core.
> 
> Note that the old index files may still be in use (continue to serve
> the queries while reloading), even if the old index folder is deleted
> - that is on Linux filesystems, not sure about NTFS.
> That means the space on disk will be freed only when the old files are
> not referenced by Solr searcher any longer.
> 
> -Alexander
> 
> On Sat, Jan 22, 2011 at 1:51 PM, Em  wrote:
>>
>> Hi Erick,
>>
>> thanks for your response.
>>
>> Yes, it's really not that easy.
>>
>> However, the target is to avoid any kind of master-slave-setup.
>>
>> The most recent idea i got is to create a new core with a data-dir
>> pointing
>> to an already existing directory with a fully optimized index.
>>
>> Regards,
>> Em
>> --
>> View this message in context:
>> http://lucene.472066.n3.nabble.com/Multicore-Relaod-Theoretical-Question-tp2293999p2310709.html
>> Sent from the Solr - User mailing list archive at Nabble.com.
>>
> 
> 

-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/Multicore-Relaod-Theoretical-Question-tp2293999p2312778.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multicore Relaod Theoretical Question

2011-01-22 Thread Alexander Kanarsky
Em,

yes, you can replace the index (get the new one into a separate folder
like index.new and then rename it to the index folder) outside the
Solr, then just do the http call to reload the core.

Note that the old index files may still be in use (continue to serve
the queries while reloading), even if the old index folder is deleted
- that is on Linux filesystems, not sure about NTFS.
That means the space on disk will be freed only when the old files are
not referenced by Solr searcher any longer.

-Alexander

On Sat, Jan 22, 2011 at 1:51 PM, Em  wrote:
>
> Hi Erick,
>
> thanks for your response.
>
> Yes, it's really not that easy.
>
> However, the target is to avoid any kind of master-slave-setup.
>
> The most recent idea i got is to create a new core with a data-dir pointing
> to an already existing directory with a fully optimized index.
>
> Regards,
> Em
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Multicore-Relaod-Theoretical-Question-tp2293999p2310709.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>
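
The workflow Alexander describes (replace the index directory, then reload, then swap) maps onto two CoreAdmin calls. A sketch that builds the URLs; the host, port, and core names are assumptions, and the actual HTTP fetch is left to the caller:

```python
from urllib.parse import urlencode

SOLR = "http://localhost:8983/solr"  # assumed host/port

def core_admin_url(action, **params):
    """Build a CoreAdmin URL such as .../admin/cores?action=RELOAD&core=inactive."""
    query = urlencode(dict(action=action, **params))
    return f"{SOLR}/admin/cores?{query}"

# Step 1 (outside Solr): move the prebuilt index into the inactive
# core's data dir, e.g.  mv index index.old && mv index.new index
# Step 2: reload the inactive core so it opens the new index:
reload_url = core_admin_url("RELOAD", core="inactive")
# Step 3: swap it with the live core once the new searcher is ready:
swap_url = core_admin_url("SWAP", core="active", other="inactive")

print(reload_url)
print(swap_url)
```

Fetching these URLs (e.g. with curl or urllib) against a running Solr performs the reload and swap.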


Re: Multicore Relaod Theoretical Question

2011-01-22 Thread Em

Hi Erick,

thanks for your response.

Yes, it's really not that easy.

However, the target is to avoid any kind of master-slave-setup.

The most recent idea i got is to create a new core with a data-dir pointing
to an already existing directory with a fully optimized index.

Regards,
Em
-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/Multicore-Relaod-Theoretical-Question-tp2293999p2310709.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multicore Relaod Theoretical Question

2011-01-22 Thread Erick Erickson
This seems far too complex to me. Why not just optimize on the master
and let replication do all the rest for you?

Best
Erick

On Fri, Jan 21, 2011 at 1:07 PM, Em  wrote:

>
> Hi,
>
> are there no experiences or thoughts?
> How would you solve this at Lucene-Level?
>
> Regards
>
>
> Em wrote:
> >
> > Hello list,
> >
> > I got a theoretical question about a Multicore-Situation:
> >
> > I got two cores: active, inactive
> >
> > The active core serves all the queries.
> >
> > The inactive core is the tricky thing:
> > I create an optimized index outside the environment and want to insert
> > that optimized index 1 to 1 into the inactive core, which means replacing
> > everything inside the index-directory.
> > After this is done, I would like to reload the inactive core, so that it
> > is ready for a core-swap and ready for serving queries on top of the new
> > inserted optimized index.
> >
> > Is it possible to handle such a situation?
> >
> > Thank you.
> >
> >
>
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/Multicore-Relaod-Theoretical-Question-tp2293999p2303585.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>


Re: Multicore Relaod Theoretical Question

2011-01-21 Thread Em

Hi,

are there no experiences or thoughts?
How would you solve this at Lucene-Level?

Regards


Em wrote:
> 
> Hello list,
> 
> I got a theoretical question about a Multicore-Situation:
> 
> I got two cores: active, inactive
> 
> The active core serves all the queries.
> 
> The inactive core is the tricky thing:
> I create an optimized index outside the environment and want to insert
> that optimized index 1 to 1 into the inactive core, which means replacing
> everything inside the index-directory.
> After this is done, I would like to reload the inactive core, so that it
> is ready for a core-swap and ready for serving queries on top of the new
> inserted optimized index.
> 
> Is it possible to handle such a situation?
> 
> Thank you.
> 
> 

-- 
View this message in context: 
http://lucene.472066.n3.nabble.com/Multicore-Relaod-Theoretical-Question-tp2293999p2303585.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multicore Search "Map size must not be negative"

2011-01-20 Thread Markus Jelsma
That looks like this issue:
https://issues.apache.org/jira/browse/SOLR-2278

On Thursday 20 January 2011 13:02:41 Jörg Agatz wrote:
> Hallo..
> 
> I have created a multicore setup and want to search in more than one core.
> 
> Now i have done:
> 
> http://192.168.105.59:8080/solr/mail/select?wt=phps&q=*:*&shards=192.168.105.59:8080/solr/mail,192.168.105.59:8080/solr/mail11
> 
> But Error...
> 
> HTTP Status 500 - Map size must not be negative
> java.lang.IllegalArgumentException: Map size must not be negative
>   at org.apache.solr.request.PHPSerializedWriter.writeMapOpener(PHPSerializedResponseWriter.java:224)
>   at org.apache.solr.request.JSONWriter.writeSolrDocument(JSONResponseWriter.java:398)
>   at org.apache.solr.request.JSONWriter.writeSolrDocumentList(JSONResponseWriter.java:553)
>   at org.apache.solr.request.TextResponseWriter.writeVal(TextResponseWriter.java:148)
>   at org.apache.solr.request.JSONWriter.writeNamedListAsMapMangled(JSONResponseWriter.java:154)
>   at org.apache.solr.request.PHPSerializedWriter.writeNamedList(PHPSerializedResponseWriter.java:100)
>   at org.apache.solr.request.PHPSerializedWriter.writeResponse(PHPSerializedResponseWriter.java:95)
>   at org.apache.solr.request.PHPSerializedResponseWriter.write(PHPSerializedResponseWriter.java:69)
>   at org.apache.solr.servlet.SolrDispatchFilter.writeResponse(SolrDispatchFilter.java:325)
>   at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:254)
>   at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
>   at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
>   at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
>   at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
>   at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
>   at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
>   at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
>   at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
>   at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:859)
>   at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
>   at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
>   at java.lang.Thread.run(Thread.java:636)
> 
> When I search
> http://192.168.105.59:8080/solr/mail/select?wt=php&q=*:*&shards=192.168.105.59:8080/solr/mail,192.168.105.59:8080/solr/mail11
> 
> it works, but I need wt=phps, which is important!
> 
> I don't understand the problem!
> 
> 
> Jörg

-- 
Markus Jelsma - CTO - Openindex
http://www.linkedin.com/in/markus17
050-8536620 / 06-50258350


RE: multicore controlled by properties

2011-01-09 Thread Ephraim Ofir
I use a script to generate the appropriate solr.xml for each host according to 
a config file.  You could also prepare separate files and create a soft link 
from solr.xml to the appropriate one on each host.

Ephraim Ofir

-Original Message-
From: Lance Norskog [mailto:goks...@gmail.com] 
Sent: Sunday, January 09, 2011 6:03 AM
To: solr-user@lucene.apache.org; Zach Friedland
Subject: Re: multicore controlled by properties

The config files support XInclude. Some sites use this to include a
local configuration that affects your single global file.

On Sat, Jan 8, 2011 at 10:53 AM, Zach Friedland  wrote:
> We have a large number of solr cores that are used by different groups for
> different purposes.  To make the source control simple, we keep a single
> 'multicore' directory and solr.xml references all cores.  We deploy the same
> configuration to all servers (shared NFS mount), and then only populate the
> indexes of the cores that we want running on that server.  However, it still
> seems wasteful to have the cores running where we know they won't be used.  
> What
> I'd like to be able to do is define properties that will allow me to enable 
> and
> disable cores via JVM params on startup.  I was hoping to use the 'enable'
> parameter that is supported elsewhere in solr, but it didn't seem to be
> respected in solr.xml.  Here's the syntax I tried in my solr.xml file:
>
>   <cores ...>
>     <core ... enable="${solr.enable.core.businessUnit1:true}"/>
>     <core ... enable="${solr.enable.core.businessUnit2:true}"/>
>     <core ... enable="${solr.enable.core.businessUnit3:true}"/>
>     <core ... enable="${solr.enable.core.businessUnit4:true}"/>
>     <core ... enable="${solr.enable.core.businessUnit5:true}"/>
>   </cores>
>
> Another idea is that I have solr1.xml, solr2.xml, solr3.xml, solr4.xml (etc);
> and then have some property that tells the JVM which solr.xml version to load
> (and each xml file would have only the cores that that instance needs).  But I
> couldn't find any property that controls which xml file is loaded for
> multicore.  Is the code hard-coded to look for solr.xml?
>
> Thanks
>
>
>
>
>



-- 
Lance Norskog
goks...@gmail.com
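
Ephraim's per-host generation approach can be as small as a template script. A hedged sketch; the host names, core names, and config shape are invented for illustration:

```python
# Per-host lists of enabled cores (stand-in for Ephraim's config file).
ENABLED_CORES = {
    "search01": ["businessUnit1", "businessUnit2"],
    "search02": ["businessUnit3"],
}

def render_solr_xml(host):
    """Render a solr.xml containing only the cores enabled for this host."""
    core_lines = "\n".join(
        '    <core name="%s" instanceDir="%s"/>' % (c, c)
        for c in ENABLED_CORES[host])
    return ('<solr persistent="true">\n'
            '  <cores adminPath="/admin/cores">\n'
            '%s\n'
            '  </cores>\n'
            '</solr>\n' % core_lines)

xml = render_solr_xml("search01")
print(xml)
```

Run at deploy time (or from a startup hook), this writes a host-specific solr.xml, so disabled cores are never loaded at all.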


Re: multicore controlled by properties

2011-01-08 Thread Lance Norskog
The config files support XInclude. Some sites use this to include a
local configuration that affects your single global file.

On Sat, Jan 8, 2011 at 10:53 AM, Zach Friedland  wrote:
> We have a large number of solr cores that are used by different groups for
> different purposes.  To make the source control simple, we keep a single
> 'multicore' directory and solr.xml references all cores.  We deploy the same
> configuration to all servers (shared NFS mount), and then only populate the
> indexes of the cores that we want running on that server.  However, it still
> seems wasteful to have the cores running where we know they won't be used.  
> What
> I'd like to be able to do is define properties that will allow me to enable 
> and
> disable cores via JVM params on startup.  I was hoping to use the 'enable'
> parameter that is supported elsewhere in solr, but it didn't seem to be
> respected in solr.xml.  Here's the syntax I tried in my solr.xml file:
>
>   <cores ...>
>     <core ... enable="${solr.enable.core.businessUnit1:true}"/>
>     <core ... enable="${solr.enable.core.businessUnit2:true}"/>
>     <core ... enable="${solr.enable.core.businessUnit3:true}"/>
>     <core ... enable="${solr.enable.core.businessUnit4:true}"/>
>     <core ... enable="${solr.enable.core.businessUnit5:true}"/>
>   </cores>
>
> Another idea is that I have solr1.xml, solr2.xml, solr3.xml, solr4.xml (etc);
> and then have some property that tells the JVM which solr.xml version to load
> (and each xml file would have only the cores that that instance needs).  But I
> couldn't find any property that controls which xml file is loaded for
> multicore.  Is the code hard-coded to look for solr.xml?
>
> Thanks
>
>
>
>
>



-- 
Lance Norskog
goks...@gmail.com
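
The XInclude approach Lance mentions looks roughly like this in solrconfig.xml. A sketch only; the fragment file name is illustrative, and XInclude support depends on your XML parser and Solr version:

```xml
<config xmlns:xi="http://www.w3.org/2001/XInclude">
  <!-- pull in an optional per-host fragment -->
  <xi:include href="local-overrides.xml">
    <xi:fallback/>
  </xi:include>
</config>
```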


Re: Multicore Search broken

2010-12-17 Thread Lance Norskog
All of the cores have to have the same schema. And, they should not
have any documents in common.


On Thu, Dec 16, 2010 at 8:36 AM, Jörg Agatz  wrote:
> I have tried some things; now I have news.
>
> When I search:
> http://localhost:8080/solr/mail/select?q=*:*&shards=localhost:8080/solr/mail,localhost:8080/solr/mail
> it works, so it looks like it is not a problem with Java or something
> like that.
>
> I have an idea: is it possible that it's the differences in the configs?
>
> Please, if you have an idea, tell me...
>



-- 
Lance Norskog
goks...@gmail.com


Re: Multicore Search broken

2010-12-16 Thread Jörg Agatz
I have tried some things; now I have news.

When I search:
http://localhost:8080/solr/mail/select?q=*:*&shards=localhost:8080/solr/mail,localhost:8080/solr/mail
it works, so it looks like it is not a problem with Java or something
like that.

I have an idea: is it possible that it's the differences in the configs?

Please, if you have an idea, tell me...


Re: Multicore and Replication (scripts vs. java, spellchecker)

2010-12-11 Thread Martin Grotzke
On Sat, Dec 11, 2010 at 12:38 AM, Chris Hostetter
 wrote:
>
> : #SOLR-433 "MultiCore and SpellChecker replication" [1]. Based on the
> : status of this feature request I'd asume that the normal procedure of
> : keeping the spellchecker index up2date would be running a cron job on
> : each node/slave that updates the spellchecker.
> : Is that right?
>
> i'm not 100% certain, but i suspect a lot of people just build the
> spellcheck dictionaries on the slave machines (redundently) using
> buildOnCommit
>
> http://wiki.apache.org/solr/SpellCheckComponent#Building_on_Commits

Ok, also a good option. Though, for us this is not that perfect
because we have 4 different spellcheckers configured so that this
would eat some cpu that we'd prefer to have left for searching.
I think what would be desirable (in our case) is s.th. like rebuilding
the spellchecker based on a cron expression, so that we could recreate
it e.g. every night at 1 am.

When thinking about creating s.th. like this, do you have some advice
where I could have a look at in solr? Is there already some
"framework" for running regular tasks, or should I pull up my own
Timer/TimerTask etc. and create it from scratch?

Cheers,
Martin


>
>
>
>
>
>
> -Hoss
>



-- 
Martin Grotzke
http://www.javakaffee.de/blog/
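
Until something like a built-in scheduler exists, the usual workaround for Martin's case is an external cron entry on each slave that triggers a dictionary build. The handler name /spell and the dictionary name are assumptions; adjust to your spellcheck configuration (one entry per configured dictionary):

```
# crontab on each slave: rebuild the spellcheck index nightly at 1 am
0 1 * * * curl -s "http://localhost:8983/solr/spell?q=x&spellcheck=true&spellcheck.build=true&spellcheck.dictionary=default" > /dev/null
```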


Re: Multicore and Replication (scripts vs. java, spellchecker)

2010-12-10 Thread Chris Hostetter

: #SOLR-433 "MultiCore and SpellChecker replication" [1]. Based on the
: status of this feature request I'd asume that the normal procedure of
: keeping the spellchecker index up2date would be running a cron job on
: each node/slave that updates the spellchecker.
: Is that right?

i'm not 100% certain, but i suspect a lot of people just build the 
spellcheck dictionaries on the slave machines (redundently) using 
buildOnCommit

http://wiki.apache.org/solr/SpellCheckComponent#Building_on_Commits






-Hoss


Re: Multicore and Replication (scripts vs. java, spellchecker)

2010-12-10 Thread Martin Grotzke
Hi,

that there's no feedback indicates that our plans/preferences are
fine. Otherwise it's now a good opportunity to feed back :-)

Cheers,
Martin


On Wed, Dec 8, 2010 at 2:48 PM, Martin Grotzke
 wrote:
> Hi,
>
> we're just planning to move from our replicated single index setup to
> a replicated setup with multiple cores.
> We're going to start with 2 cores, but the number of cores may
> change/increase over time.
>
> Our replication is still based on scripts/rsync, and I'm wondering if
> it's worth moving to java based replication.
> AFAICS the main advantage is simplicity, as with scripts based
> replication our operations team would have to maintain rsync daemons /
> cron jobs for each core.
> Therefore my own preference would be to drop scripts and chose the
> java based replication.
>
> I'd just wanted to ask for experiences with the one or another in a
> multicore setup. What do you say?
>
> Another question is regarding spellchecker replication. I know there's
> #SOLR-433 "MultiCore and SpellChecker replication" [1]. Based on the
> status of this feature request I'd asume that the normal procedure of
> keeping the spellchecker index up2date would be running a cron job on
> each node/slave that updates the spellchecker.
> Is that right?
>
> And a final one: are there other things we should be aware of / keep
> in mind when planning the migration to multiple cores?
> (Ok, I'm risking to get "ask specific questions!" as an answer, but
> perhaps s.o. has interesting, related stories to tell  :-))
>
> Thanx in advance,
> cheers,
> Martin
>
>
> [1] https://issues.apache.org/jira/browse/SOLR-433
>



-- 
Martin Grotzke
http://www.javakaffee.de/blog/


Re: MultiCore config less stable than SingleCore?

2010-12-07 Thread Erick Erickson
Could you tell us what version of Solr you're running?
And what OS you're concerned about?
And what file system you're operating on?
And anything else you can think of that'd help us help you?

Best
Erick

On Tue, Dec 7, 2010 at 4:56 AM, Jan Simon Winkelmann <
jansimon.winkelm...@newsfactory.de> wrote:

> Hi,
>
> i have recently moved Solr at one of our customers to a MultiCore
> environment running 2 indexes. Since then, we seem to be having problems
> with locks not being removed properly, .lock files keep sticking around in
> the index directory.
> Hence, any updates to the index keep returning 500 errors with the
> following stack trace:
>
> Error 500 Lock obtain timed out: NativeFSLock@
> /data/jetty/solr/index1/data/index/lucene-96165c19c16f26b93de3954f6891-write.lock
>
> org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out:
> NativeFSLock@
> /data/jetty/solr/index1/data/index/lucene-96165c19c16f26b93de3954f6891-write.lock
>at org.apache.lucene.store.Lock.obtain(Lock.java:85)
>at org.apache.lucene.index.IndexWriter.init(IndexWriter.java:1545)
>at
> org.apache.lucene.index.IndexWriter.(IndexWriter.java:1402)
>at
> org.apache.solr.update.SolrIndexWriter.(SolrIndexWriter.java:190)
>at
> org.apache.solr.update.UpdateHandler.createMainIndexWriter(UpdateHandler.java:98)
>at
> org.apache.solr.update.DirectUpdateHandler2.openWriter(DirectUpdateHandler2.java:173)
>at
> org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:220)
>at
> org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:61)
>at
> org.apache.solr.handler.XMLLoader.processUpdate(XMLLoader.java:139)
>at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:69)
>at
> org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:54)
>at
> org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
>at org.apache.solr.core.SolrCore.execute(SolrCore.java:1316)
>at
> org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:338)
>at
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:241)
>at
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1187)
>at
> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:425)
>at
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:119)
>at
> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:457)
>at
> org.eclipse.jetty.server.session.SessionHandler.handle(SessionHandler.java:182)
>at
> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:933)
>at
> org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:362)
>at
> org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:867)
>at
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:117)
>at
> org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:245)
>at
> org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:126)
>at
> org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:113)
>at org.eclipse.jetty.server.Server.handle(Server.java:334)
>at
> org.eclipse.jetty.server.HttpConnection.handleRequest(HttpConnection.java:559)
>at
> org.eclipse.jetty.server.HttpConnection$RequestHandler.content(HttpConnection.java:1007)
>at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:747)
>at
> org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:209)
>at
> org.eclipse.jetty.server.HttpConnection.handle(HttpConnection.java:406)
>at
> org.eclipse.jetty.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:462)
>at
> org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:436)
>at java.lang.Thread.run(Thread.java:662)
>
> All our other installations with a similar SingleCore config are running
> very smoothly.
> Does anyone have an idea what the problem is? Could I have missed something
> when configuring the MultiCore environment?
>
> Regards,
> Jan
>
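
When stale write locks keep surviving, two solrconfig.xml settings are worth reviewing. This is a hedged sketch, not a diagnosis of Jan's setup: unlockOnStartup forcibly clears a leftover write lock and should only be enabled when no other process can hold the lock legitimately.

```xml
<indexDefaults>
  <!-- native locks are released by the OS if the JVM dies -->
  <lockType>native</lockType>
</indexDefaults>
<mainIndex>
  <!-- clear a leftover write lock when the core starts -->
  <unlockOnStartup>true</unlockOnStartup>
</mainIndex>
```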


Re: multicore defaultCoreName not working

2010-10-13 Thread Ron Chan
that explains it then, using 1.4.1 

thanks for that 
Ron 


- Original Message - 
From: "Ephraim Ofir"  
To: solr-user@lucene.apache.org 
Sent: Wednesday, 13 October, 2010 2:11:49 PM 
Subject: RE: multicore defaultCoreName not working 

Which version of solr are you using? 
I believe this is only available on trunk, not even on 1.4.1 (SOLR-1722). Also, 
watch out for SOLR-2127 bug, haven't gotten around to creating a patch yet... 

Ephraim Ofir 


-Original Message- 
From: Ron Chan [mailto:rc...@i-tao.com] 
Sent: Wednesday, October 13, 2010 9:20 AM 
To: solr-user@lucene.apache.org 
Subject: multicore defaultCoreName not working 

Hello 

I have this in my solr.xml 

<solr ...>
  <cores ... defaultCoreName="live">
    <core name="live" .../>
    <core name="staging" .../>
  </cores>
</solr>


admin is working and the individual cores are working through 

http://localhost:8080/solr/live/select/?q=abc 
and 
http://localhost:8080/solr/staging/select/?q=abc 

returning the correct results from the right core 

however, I wanted to keep the existing single core URLs and thought that the 
defaultCoreName attribute does this 

i.e. 
http://localhost:8080/solr/select/?q=abc 

should give me the "live" core 

but it gives me "Missing core name in path" 

Is there anything else I need to do? 

Thanks 
Ron 


RE: multicore defaultCoreName not working

2010-10-13 Thread Ephraim Ofir
Which version of solr are you using?
I believe this is only available on trunk, not even on 1.4.1 (SOLR-1722).  
Also, watch out for SOLR-2127 bug, haven't gotten around to creating a patch 
yet...

Ephraim Ofir


-Original Message-
From: Ron Chan [mailto:rc...@i-tao.com] 
Sent: Wednesday, October 13, 2010 9:20 AM
To: solr-user@lucene.apache.org
Subject: multicore defaultCoreName not working

Hello 

I have this in my solr.xml

<solr ...>
  <cores ... defaultCoreName="live">
    <core name="live" .../>
    <core name="staging" .../>
  </cores>
</solr>


admin is working and the individual cores are working through

http://localhost:8080/solr/live/select/?q=abc
and
http://localhost:8080/solr/staging/select/?q=abc

returning the correct results from the right core

however, I wanted to keep the existing single core URLs and thought that the 
defaultCoreName attribute does this

i.e.
http://localhost:8080/solr/select/?q=abc

should give me the "live" core

but it gives me "Missing core name in path"

Is there anything else I need to do?

Thanks
Ron


Re: multicore replication slave

2010-10-12 Thread Christopher Bottaro
Answered my own question.  Instead of naming each core in the
replication handler, you use a variable instead:


  
http://solr.mydomain.com:8983/solr/${solr.core.name}/replication
00:00:60
  


That will get all of your cores replicating.

-- C

On Mon, Oct 11, 2010 at 6:25 PM, Christopher Bottaro
 wrote:
> Hello,
>
> I can't get my multicore slave to replicate from the master.
>
> The master is set up properly, and the following URLs return the expected
> "OK / No command" status response:
> http://solr.mydomain.com:8983/solr/core1/replication
> http://solr.mydomain.com:8983/solr/core2/replication
> http://solr.mydomain.com:8983/solr/core3/replication
>
> The following pastie shows how my slave is setup:
> http://pastie.org/1214209
>
> But it's not working (i.e. I see no replication attempts in the slave's log).
>
> Any ideas?
>
> Thanks for the help.
>


Re: multicore Vs multiple solr webapps

2010-05-27 Thread Ryan McKinley
The two approaches solve different needs.  In 'multicore' you have a
single webapp with multiple indexes.  This means they are all running
in the same JVM.  This may be an advantage or a disadvantage depending
on what you are doing.

ryan



On Thu, May 27, 2010 at 10:44 AM, Antonello Mangone
 wrote:
> Hi to all, I have a question for you ...
> Can someone explain to me the differences between a single multicore Solr
> application and multiple Solr webapps?
> Thank you all in advance
>


Re: multicore Vs multiple solr webapps

2010-05-27 Thread David Stuart
A correction, as per a different thread: the next version of Solr will
be 3.1, following the merge with the Lucene TLP.


David Stuart

On 27 May 2010, at 15:44, Antonello Mangone wrote:



Hi to all, I have a question for you ...
Can someone explain to me the differences between a single multicore Solr
application and multiple Solr webapps?
Thank you all in advance


Re: multicore Vs multiple solr webapps

2010-05-27 Thread David Stuart

Hi Antonello,

In multicore you get richer functionality, including core discovery,
core config reload, alias, core
swap and (soon to be) core create. Under a single webapp you get
control over memory allocation, threads, etc. Personally I would choose
multicore, and I believe in Solr 1.5 they are going with a default
multicore setup with a single core.


David Stuart

On 27 May 2010, at 15:44, Antonello Mangone wrote:



Hi to all, I have a question for you ...
Can someone explain to me the differences between a single multicore Solr
application and multiple Solr webapps?
Thank you all in advance


Re: Multicore and TermVectors

2010-04-05 Thread Lance Norskog
There is no query parameter. The query parser throws an NPE if there
is no query parameter:

http://issues.apache.org/jira/browse/SOLR-435

It does not look like term vectors are processed in distributed search anyway.

On Mon, Apr 5, 2010 at 4:45 PM, Chris Hostetter
 wrote:
>
> : Subject: Multicore and TermVectors
>
> It doesn't sound like Multicore is your issue ... it seems like what you
> mean is that you are using distributed search with TermVectors, and that
> is causing a problem.  Can you please clarify exactly what you mean ...
> describe your exact setup (ie: how many machines, how many solr ports
> running on each of those machines, what the solr.xml looks like on each of
> those ports, how many SolrCores running in each of those ports, what
> the solrconfig.xml looks like for each of those instances, which instances
> coordinate distributed searches of which shards, what urls your client
> hits, what URLs get hit on each of your shards (according to the logs) as
> a result, etc...
>
> details, details, details.
>
>
> -Hoss
>
>



-- 
Lance Norskog
goks...@gmail.com


Re: Multicore and TermVectors

2010-04-05 Thread Chris Hostetter

: Subject: Multicore and TermVectors

It doesn't sound like Multicore is your issue ... it seems like what you 
mean is that you are using distributed search with TermVectors, and that 
is causing a problem.  Can you please clarify exactly what you mean ... 
describe your exact setup (ie: how many machines, how many solr ports 
running on each of those machines, what the solr.xml looks like on each of 
those ports, how many SolrCores running in each of those ports, what 
the solrconfig.xml looks like for each of those instances, which instances 
coordinate distributed searches of which shards, what urls your client 
hits, what URLs get hit on each of your shards (according to the logs) as 
a result, etc... 

details, details, details.


-Hoss



Re: multicore embedded swap / reload etc.

2010-03-29 Thread Lance Norskog
The code snippet you give shows how to access existing cores that are
registered in the top-level solr.xml file. The wiki page tells how
these cores are configured.

The wiki page also discusses dynamic operations on multiple cores.
SolrJ should be able to do these as well (but I am not a SolrJ
expert).

On Fri, Mar 26, 2010 at 12:39 PM, Nagelberg, Kallin
 wrote:
> Thanks everyone,
> I was following the solrj wiki which says:
>
>
> """
> If you want to use MultiCore features, then you should use this:
>
>
>    File home = new File( "/path/to/solr/home" );
>    File f = new File( home, "solr.xml" );
>    CoreContainer container = new CoreContainer();
>    container.load( "/path/to/solr/home", f );
>
>    EmbeddedSolrServer server = new EmbeddedSolrServer( container, "core name 
> as defined in solr.xml" );
>    ...
> """
>
> I'm just a little confused with the disconnect between that and what I see 
> about managing multiple cores here: http://wiki.apache.org/solr/CoreAdmin . 
> If someone could provide some high-level directions it would be greatly 
> appreciated.
>
> Thanks,
> -Kallin Nagelberg
>
>
> -----Original Message-
> From: Mark Miller [mailto:markrmil...@gmail.com]
> Sent: Friday, March 26, 2010 7:54 AM
> To: solr-user@lucene.apache.org
> Subject: Re: multicore embedded swap / reload etc.
>
> Embedded supports MultiCore  - it's the direct core connection thing
> that supports one.
>
> - Mark
>
> http://www.lucidimagination.com (mobile)
>
> On Mar 26, 2010, at 7:38 AM, Erik Hatcher 
> wrote:
>
>> But wait... embedded Solr doesn't support multicore, does it?  Just
>> off memory, I think it's fixed to a single core.
>>
>>    Erik
>>
>> On Mar 25, 2010, at 10:31 PM, Lance Norskog wrote:
>>
>>> All operations through the SolrJ work exactly the same against the
>>> Solr web app and embedded Solr. You code the calls to update cores
>>> with the same SolrJ APIs either way.
>>>
>>> On Wed, Mar 24, 2010 at 2:19 PM, Nagelberg, Kallin
>>>  wrote:
>>>> Hi,
>>>>
>>>> I've got a situation where I need to reindex a core once a day. To
>>>> do this I was thinking of having two cores, one 'live' and one
>>>> 'staging'. The app is always serving 'live', but when the daily
>>>> index happens it goes into 'staging', then staging is swapped into
>>>> 'live'. I can see how to do this sort of thing over http, but I'm
>>>> using an embedded solr setup via solrJ. Any suggestions on how to
>>>> proceed? I could just have two solrServer's built from different
>>>> coreContainers, and then swap the references when I'm ready, but I
>>>> wonder if there is a better approach. Maybe grab a hold of the
>>>> CoreAdminHandler?
>>>>
>>>> Thanks,
>>>> Kallin Nagelberg
>>>>
>>>
>>>
>>>
>>> --
>>> Lance Norskog
>>> goks...@gmail.com
>>
>



-- 
Lance Norskog
goks...@gmail.com


Re: Multicore process

2010-03-28 Thread Blargy


Mark Miller-3 wrote:
> 
> Hmmm...but isn't your slave on a different machine? Every install is
> going to need a solr.xml, no way around that..
> 

Of course it's on another machine. I was just hoping to have only 1 version
of solr.xml checked into our source control, and to change which
configuration to use by passing some sort of Java property on the command
line. Like I said, it's no real problem.. I'm just getting picky now ;) I'll
just have to make sure that during the deploy the correct configuration
gets copied to home/solr.xml

Thanks again!


-- 
View this message in context: 
http://n3.nabble.com/Multicore-process-tp681929p682225.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multicore process

2010-03-28 Thread Mark Miller

On 03/28/2010 05:43 PM, Blargy wrote:

Thanks, that makes perfect sense for solrconfig.xml; however, I don't see that
sort of functionality for solr.xml.

I'm guessing I'll need to manage 2 different versions of solr.xml

Version 1 master

   
 
 
   


Version 2 slave

   
 
   


And my app will always be pointing to http://slave-host:8983/solr/items

This isn't the biggest deal, but if there is a better/alternative way I would
love to know.
   


Hmmm...but isn't your slave on a different machine? Every install is 
going to need a solr.xml, no way around that (other than removing the 
solr.xml and doing all multicore stuff programmaticly :) ).



Mark, I see you work for LucidImagination. Does the Lucid Solr distribution
happen to come with the SOLR-236 patch (field collapsing)? I know it has some
extras thrown in there, but I'm not quite sure of their exact nature. I'm
already using the LucidKStemmer ;)
   


No, no field collapsing in the Lucid dist - it will make it into Solr 
eventually, though.


--
- Mark

http://www.lucidimagination.com





Re: Multicore process

2010-03-28 Thread Blargy

Thanks, that makes perfect sense for solrconfig.xml; however, I don't see that
sort of functionality for solr.xml.

I'm guessing I'll need to manage 2 different versions of solr.xml

Version 1 master

  


  


Version 2 slave

  

  


And my app will always be pointing to http://slave-host:8983/solr/items

This isn't the biggest deal, but if there is a better/alternative way I would
love to know.
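The two stripped configs would look something like the following sketch (core names, instanceDir, and dataDir values are assumptions, inferred from this thread's /solr/items URL and its core0/core1 discussion):

```xml
<!-- Version 1 (master): two cores to swap between -->
<solr persistent="true">
  <cores adminPath="/admin/cores">
    <core name="items-live" instanceDir="items/" dataDir="data/core0"/>
    <core name="items-offline" instanceDir="items/" dataDir="data/core1"/>
  </cores>
</solr>

<!-- Version 2 (slave): a single serving core -->
<solr persistent="true">
  <cores adminPath="/admin/cores">
    <core name="items" instanceDir="items/"/>
  </cores>
</solr>
```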

Mark, I see you work for LucidImagination. Does the Lucid Solr distribution
happen to come with the SOLR-236 patch (field collapsing)? I know it has some
extras thrown in there, but I'm not quite sure of their exact nature. I'm
already using the LucidKStemmer ;)
-- 
View this message in context: 
http://n3.nabble.com/Multicore-process-tp681929p682205.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multicore process

2010-03-28 Thread Mark Miller

On 03/28/2010 05:14 PM, Blargy wrote:

Nice. Almost there...

So it appears then that I will need two different solr.xml configurations.
One for the master defining core0 and core1 and one for the slave with the
default configuration. Is there anyway to specify master/slave specific
settings in solr.xml or will I have to have 2 different versions?

Not as big of a deal but in the future when I have more than 1 type of
document (currently "items") how would I configure solrconfig.xml for
replication? For example I have this as of now:


  http://localhost:8983/solr/items-live/replication


Which is fine... but what happens when I have another object, say "users"?


  http://localhost:8983/solr/users-live/replication


I guess when it comes down to that I will have to have 2 different versions
of solrconfig.xml too?

ps. I can't thank you enough for your time
   
Right -  two different solrconfig.xml's, or use XInclude to factor out 
the common parts into a third single file, and the two can just have the 
unique pieces in them.


http://wiki.apache.org/solr/SolrConfigXml?highlight=%28xinclude%29#XInclude
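A minimal sketch of that XInclude layout (file names here are assumptions): each role-specific solrconfig.xml pulls in the shared pieces and adds only its own replication section.

```xml
<config xmlns:xi="http://www.w3.org/2001/XInclude">
  <!-- shared request handlers, caches, etc. live in one common file -->
  <xi:include href="solrconfig-common.xml"/>
  <!-- master-only piece; the slave's file would carry its slave section instead -->
  <requestHandler name="/replication" class="solr.ReplicationHandler">
    <lst name="master">
      <str name="replicateAfter">commit</str>
      <str name="confFiles">schema.xml,stopwords.txt</str>
    </lst>
  </requestHandler>
</config>
```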

--
- Mark

http://www.lucidimagination.com





Re: Multicore process

2010-03-28 Thread Blargy

Nice. Almost there...

So it appears then that I will need two different solr.xml configurations.
One for the master defining core0 and core1 and one for the slave with the
default configuration. Is there anyway to specify master/slave specific
settings in solr.xml or will I have to have 2 different versions?

Not as big of a deal but in the future when I have more than 1 type of
document (currently "items") how would I configure solrconfig.xml for
replication? For example I have this as of now:


 http://localhost:8983/solr/items-live/replication


Which is fine... but what happens when I have another object, say "users"?


 http://localhost:8983/solr/users-live/replication


I guess when it comes down to that I will have to have 2 different versions
of solrconfig.xml too?

ps. I can't thank you enough for your time
-- 
View this message in context: 
http://n3.nabble.com/Multicore-process-tp681929p682176.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multicore process

2010-03-28 Thread Mark Miller

On 03/28/2010 04:49 PM, Blargy wrote:

I just thought about this...

I'm guessing my slaves should always be replicating from the "live" master
core: (http://localhost:8983/solr/items-live/replication).

So my master solr will have a directory structure like this:

home/items/data/core0/index
home/items/data/core1/index

and at any point the "live" core could be physically located at core0 or
core1

Whereas my slave solr will have a directory structure like this:
home/items/data/index

Is this close?



   


Yes, exactly.

--
- Mark

http://www.lucidimagination.com





Re: Multicore process

2010-03-28 Thread Blargy

I just thought about this...

I'm guessing my slaves should always be replicating from the "live" master
core: (http://localhost:8983/solr/items-live/replication). 

So my master solr will have a directory structure like this:

home/items/data/core0/index
home/items/data/core1/index

and at any point the "live" core could be physically located at core0 or
core1

Whereas my slave solr will have a directory structure like this:
home/items/data/index

Is this close?



-- 
View this message in context: 
http://n3.nabble.com/Multicore-process-tp681929p682149.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multicore process

2010-03-28 Thread Blargy

Ok great... it's starting to make sense. Now I'm just a little confused about
replication.

So I had previously had my slave configuration as follows

 

  commit
  startup
  schema.xml,stopwords.txt


  
   
http://${replication.host}:8983/solr/${solr.core.instanceDir}replication
  
  ${replication.interval}

  

But I'm assuming I'll need to change this now? I really only want my "live"
data to be replicated, so how can I configure this? There is no real need for
the slaves to replicate the "offline" data.

FYI my dir structure looks like this:

home/items/data/core0/index
home/items/data/core1/index
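A slave-side sketch for that: point masterUrl at the live access name only, so the offline core's data is never pulled (handler name, property placeholders, and the items-live URL mirror the configs quoted in this thread; exact values are assumptions):

```xml
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="slave">
    <!-- always pull from the core currently answering to the "live" name -->
    <str name="masterUrl">http://${replication.host}:8983/solr/items-live/replication</str>
    <str name="pollInterval">${replication.interval}</str>
  </lst>
</requestHandler>
```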

-- 
View this message in context: 
http://n3.nabble.com/Multicore-process-tp681929p682141.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multicore process

2010-03-28 Thread Mark Miller
Right - I'd just have the data dirs be generic (like core0, core1, as you 
have in example 2); the names will be live and offline and flip 
back and forth between the core0 and core1 dirs.



On 03/28/2010 04:06 PM, Blargy wrote:

Mark, first off thanks for the response. I'm glad someone is around today ;)

So this is what I have so far:


   
 
 
   


So my directory structure is:

home/items/data/live/index
home/items/data/offline/index

So after playing around I see that swap literally just swaps the dataDir in
solr.xml. I have persistent = true so it saves which core is pointing to
which dataDir. Where I think I am a little confused is the naming
convention I used above. In this type of setup there is no such thing as a
live or offline dataDir, as at any point they can be one or the other... the
core name is what really matters. So I'm guessing this naming convention
makes a little more sense


   
 
 
   


Since the actual dataDir name really doesn't mean anything. Is this the
correct reasoning?
   



--
- Mark

http://www.lucidimagination.com





Re: Multicore process

2010-03-28 Thread Blargy

Mark, first off thanks for the response. I'm glad someone is around today ;)

So this is what I have so far:


  


  


So my directory structure is:

home/items/data/live/index
home/items/data/offline/index

So after playing around I see that swap literally just swaps the dataDir in
solr.xml. I have persistent = true so it saves which core is pointing to
which dataDir. Where I think I am a little confused is the naming
convention I used above. In this type of setup there is no such thing as a
live or offline dataDir, as at any point they can be one or the other... the
core name is what really matters. So I'm guessing this naming convention
makes a little more sense


  


  


Since the actual dataDir name really doesn't mean anything. Is this the
correct reasoning?
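The two stripped solr.xml variants discussed here would read roughly as follows (instanceDir is an assumption):

```xml
<!-- First attempt: dataDir named after the role, which stops being true after a swap -->
<solr persistent="true">
  <cores adminPath="/admin/cores">
    <core name="live" instanceDir="items/" dataDir="data/live"/>
    <core name="offline" instanceDir="items/" dataDir="data/offline"/>
  </cores>
</solr>

<!-- Revised: generic dataDir names; only the core name carries the live/offline meaning -->
<solr persistent="true">
  <cores adminPath="/admin/cores">
    <core name="live" instanceDir="items/" dataDir="data/core0"/>
    <core name="offline" instanceDir="items/" dataDir="data/core1"/>
  </cores>
</solr>
```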
-- 
View this message in context: 
http://n3.nabble.com/Multicore-process-tp681929p682088.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: Multicore process

2010-03-28 Thread Mark Miller

On 03/28/2010 02:58 PM, Blargy wrote:


 Also, how do I share the same schema and config files?


In solr.xml you can specify the schema and config file per core - just point 
each core at the same ones. If you are creating cores dynamically, you can 
still do this. You probably want to use the shareSchema option.


http://wiki.apache.org/solr/CoreAdmin
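A sketch of that setup (core names and paths are assumptions): both cores point at the same instanceDir, so they load the same schema.xml and solrconfig.xml, and shareSchema="true" on the cores element lets them share one parsed schema object in memory.

```xml
<solr persistent="true">
  <cores adminPath="/admin/cores" shareSchema="true">
    <core name="live" instanceDir="items/" dataDir="data/core0"/>
    <core name="offline" instanceDir="items/" dataDir="data/core1"/>
  </cores>
</solr>
```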

--
- Mark

http://www.lucidimagination.com





Re: Multicore process

2010-03-28 Thread Mark Miller

On 03/28/2010 02:54 PM, Blargy wrote:


 I was hoping someone could explain to me how your Solr multicore
 process currently operates.

 This is what I am thinking about and I was hoping I could get some
 ideas/suggestions.

 I have a master/slave setup where the master will be doing all the
 indexing via DIH. I'll be doing a full-import every day or two with
 delta-imports being run throughout the day. I want to be able to have
 an offline core that will be responsible for the
 full-importing, and when finished it will be swapped with the live
 core. While the full-import may take a few hours on the offline core,
 I'll have delta-imports running on the live core. All slaves will be
 replicating from the master live core. Any comments on this logic?


What's the purpose of the full import if you will also be doing delta 
imports? Won't the live core end up the same as the offline core that 
got the full import? I'm sure you have a reason, just not following...




 Ok, now to the implementation. I've been playing around with the core
 admin all day today but I'm still unsure of the best way to accomplish
 the above process. I'm guessing first I need to create a new core.
 Then I'll have to issue a DIH full-import against this new core. Then
 I'll run a swap command against the offline and live cores, which should
 switch the cores. This sounds about right, but then I'll have a core
 named live which will not actually be live anymore, right? Is there
 any way around this?


Hmm...this is not really true. The core that is accessed by hitting 
/live will always be the live core (though the underlying SolrCore 
object will change) if that is the access path you use for live traffic 
- see below.




 When setting up the new core what should I use for my instanceDir
 and dataDir? At first I had something like this

 home/items/data/live/index home/items/data/offline/index

 but I dont think this is right. Should I have something like this?

 home/items/data/index home/items-offline/data/index


Yes - like this - the index dir under the data dir. But you should only 
make the data dir - the core will create the index dir when it does not 
see it. You will have issues if you make an empty index dir yourself: 
seeing the dir, the core won't initialize it, and so the index will never 
get created inside it.




 When creating a new core from an existing core do the index files
 get copied?


I'm not sure what you mean here? I'm guessing the swap command as you 
reference above?


Swap will simply change what path references which core. So to start, 
localhost:8983/solr/live will hit one core, and 
localhost:8983/solr/offline will hit another core. You will direct all 
traffic to /live. Once you do the swap(live,offline), the live URL will 
actually hit the other core, and the offline URL will hit the previously 
live core. So there is no move or copy of files - it simply swaps which 
name accesses which core. Same thing if you are using solrj - it just 
changes which access name brings back a given underlying core.
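The swap itself is one CoreAdmin call (a sketch; host/port follow the URLs used in this thread):

```
http://localhost:8983/solr/admin/cores?action=SWAP&core=live&other=offline
```

After it returns, /solr/live serves what /solr/offline served a moment before, with no files moved or copied.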




 Can someone please explain to me this whole process. Thanks!






--
- Mark

http://www.lucidimagination.com





Re: Multicore process

2010-03-28 Thread Blargy

Also, how do I share the same schema and config files?
-- 
View this message in context: 
http://n3.nabble.com/Multicore-process-tp681929p681936.html
Sent from the Solr - User mailing list archive at Nabble.com.


RE: multicore embedded swap / reload etc.

2010-03-26 Thread Nagelberg, Kallin
Thanks everyone,
I was following the solrj wiki which says:


"""
If you want to use MultiCore features, then you should use this:


File home = new File( "/path/to/solr/home" );
File f = new File( home, "solr.xml" );
CoreContainer container = new CoreContainer();
container.load( "/path/to/solr/home", f );

EmbeddedSolrServer server = new EmbeddedSolrServer( container, "core name 
as defined in solr.xml" );
...
"""

I'm just a little confused with the disconnect between that and what I see 
about managing multiple cores here: http://wiki.apache.org/solr/CoreAdmin . If 
someone could provide some high-level directions it would be greatly 
appreciated.

Thanks,
-Kallin Nagelberg


-Original Message-
From: Mark Miller [mailto:markrmil...@gmail.com] 
Sent: Friday, March 26, 2010 7:54 AM
To: solr-user@lucene.apache.org
Subject: Re: multicore embedded swap / reload etc.

Embedded supports MultiCore  - it's the direct core connection thing  
that supports one.

- Mark

http://www.lucidimagination.com (mobile)

On Mar 26, 2010, at 7:38 AM, Erik Hatcher   
wrote:

> But wait... embedded Solr doesn't support multicore, does it?  Just  
> off memory, I think it's fixed to a single core.
>
>Erik
>
> On Mar 25, 2010, at 10:31 PM, Lance Norskog wrote:
>
>> All operations through the SolrJ work exactly the same against the
>> Solr web app and embedded Solr. You code the calls to update cores
>> with the same SolrJ APIs either way.
>>
>> On Wed, Mar 24, 2010 at 2:19 PM, Nagelberg, Kallin
>>  wrote:
>>> Hi,
>>>
>>> I've got a situation where I need to reindex a core once a day. To  
>>> do this I was thinking of having two cores, one 'live' and one  
>>> 'staging'. The app is always serving 'live', but when the daily  
>>> index happens it goes into 'staging', then staging is swapped into  
>>> 'live'. I can see how to do this sort of thing over http, but I'm  
>>> using an embedded solr setup via solrJ. Any suggestions on how to  
>>> proceed? I could just have two solrServer's built from different  
>>> coreContainers, and then swap the references when I'm ready, but I  
>>> wonder if there is a better approach. Maybe grab a hold of the  
>>> CoreAdminHandler?
>>>
>>> Thanks,
>>> Kallin Nagelberg
>>>
>>
>>
>>
>> -- 
>> Lance Norskog
>> goks...@gmail.com
>


Re: multicore embedded swap / reload etc.

2010-03-26 Thread Mark Miller
Embedded supports MultiCore  - it's the direct core connection thing  
that supports one.


- Mark

http://www.lucidimagination.com (mobile)

On Mar 26, 2010, at 7:38 AM, Erik Hatcher   
wrote:


But wait... embedded Solr doesn't support multicore, does it?  Just  
off memory, I think it's fixed to a single core.


   Erik

On Mar 25, 2010, at 10:31 PM, Lance Norskog wrote:


All operations through the SolrJ work exactly the same against the
Solr web app and embedded Solr. You code the calls to update cores
with the same SolrJ APIs either way.

On Wed, Mar 24, 2010 at 2:19 PM, Nagelberg, Kallin
 wrote:

Hi,

I've got a situation where I need to reindex a core once a day. To  
do this I was thinking of having two cores, one 'live' and one  
'staging'. The app is always serving 'live', but when the daily  
index happens it goes into 'staging', then staging is swapped into  
'live'. I can see how to do this sort of thing over http, but I'm  
using an embedded solr setup via solrJ. Any suggestions on how to  
proceed? I could just have two solrServer's built from different  
coreContainers, and then swap the references when I'm ready, but I  
wonder if there is a better approach. Maybe grab a hold of the  
CoreAdminHandler?


Thanks,
Kallin Nagelberg





--
Lance Norskog
goks...@gmail.com




Re: multicore embedded swap / reload etc.

2010-03-26 Thread Erik Hatcher
But wait... embedded Solr doesn't support multicore, does it?  Just  
off memory, I think it's fixed to a single core.


Erik

On Mar 25, 2010, at 10:31 PM, Lance Norskog wrote:


All operations through the SolrJ work exactly the same against the
Solr web app and embedded Solr. You code the calls to update cores
with the same SolrJ APIs either way.

On Wed, Mar 24, 2010 at 2:19 PM, Nagelberg, Kallin
 wrote:

Hi,

I've got a situation where I need to reindex a core once a day. To  
do this I was thinking of having two cores, one 'live' and one  
'staging'. The app is always serving 'live', but when the daily  
index happens it goes into 'staging', then staging is swapped into  
'live'. I can see how to do this sort of thing over http, but I'm  
using an embedded solr setup via solrJ. Any suggestions on how to  
proceed? I could just have two solrServer's built from different  
coreContainers, and then swap the references when I'm ready, but I  
wonder if there is a better approach. Maybe grab a hold of the  
CoreAdminHandler?


Thanks,
Kallin Nagelberg





--
Lance Norskog
goks...@gmail.com




Re: multicore embedded swap / reload etc.

2010-03-25 Thread Lance Norskog
All operations through SolrJ work exactly the same against the
Solr web app and embedded Solr. You code the calls to update cores
with the same SolrJ APIs either way.

On Wed, Mar 24, 2010 at 2:19 PM, Nagelberg, Kallin
 wrote:
> Hi,
>
> I've got a situation where I need to reindex a core once a day. To do this I 
> was thinking of having two cores, one 'live' and one 'staging'. The app is 
> always serving 'live', but when the daily index happens it goes into 
> 'staging', then staging is swapped into 'live'. I can see how to do this sort 
> of thing over http, but I'm using an embedded solr setup via solrJ. Any 
> suggestions on how to proceed? I could just have two solrServer's built from 
> different coreContainers, and then swap the references when I'm ready, but I 
> wonder if there is a better approach. Maybe grab a hold of the 
> CoreAdminHandler?
>
> Thanks,
> Kallin Nagelberg
>



-- 
Lance Norskog
goks...@gmail.com


Re: multiCore

2010-03-06 Thread Erick Erickson
I've seen similar errors happen if you delete the *contents* of your index
directory but not the directory itself.

Just to be sure, stop/restart your SOLR instance if you manually
delete your index.

But the error I've seen when doing the above usually doesn't
mention a specific character, so I'd guess your XML isn't correctly
formed.

HTH
Erick

On Sat, Mar 6, 2010 at 1:55 AM, Suram  wrote:

>
>
>
> Siddhant Goel wrote:
> >
> > Can you provide the error message that you got?
> >
> > On Sat, Mar 6, 2010 at 11:13 AM, Suram  wrote:
> >
> >>
> >> Hi,
> >>
> >>
> >>  how can I send the XML file to Solr after creating the multicore? I tried
> >> it and it refuses to accept it
> >> --
> >> View this message in context:
> >> http://old.nabble.com/multiCore-tp27802043p27802043.html
> >> Sent from the Solr - User mailing list archive at Nabble.com.
> >>
> >>
> >
> >
> > --
> > - Siddhant
> >
> >
>
> I execute the command like this:
>
> D:\solr\example\exampledocs>java -Ddata=args -Dcommit=yes 
> -Durl=http://localhost:8080/solr/core0/update -jar post.jar Example.xml
>
> Mar 6, 2010 12:20:36 PM org.apache.solr.common.SolrException log
> SEVERE: org.apache.solr.common.SolrException: Unexpected character 'E'
> (code
> 69) in prolog; expected '<'
>  at [row,col {unknown-source}]: [1,1]
>at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:72)
>at
>
> org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:54)
>at
>
> org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
>at org.apache.solr.core.SolrCore.execute(SolrCore.java:1316)
>at
>
> org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:338)
>at
>
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:241)
>at
>
> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:215)
>at
>
> org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)
>at
>
> org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
>at
>
> org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:172)
>at
>
> org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
>at
>
> org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:117)
>at
>
> org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:108)
>at
> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:174)
>at
> org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:873)
>at
>
> org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:665)
>at
>
> org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:528)
>at
>
> org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:81)
>at
>
> org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:689)
>at java.lang.Thread.run(Unknown Source)
> Caused by: com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected
> character 'E' (code 69) in prolog; expected '<'
>  at [row,col {unknown-source}]: [1,1]
>at
> com.ctc.wstx.sr.StreamScanner.throwUnexpectedChar(StreamScanner.java:648)
>at
>
> com.ctc.wstx.sr.BasicStreamReader.nextFromProlog(BasicStreamReader.java:2047)
>at
> com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1069)
>at
> org.apache.solr.handler.XMLLoader.processUpdate(XMLLoader.java:90)
>at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:69)
>... 19 more
>
> If I execute the command like this:
>
> java -jar post.jar Example.xml
>
> SimplePostTool: version 1.2
> SimplePostTool: WARNING: Make sure your XML documents are encoded in UTF-8,
> other encodings are not currently supported
> SimplePostTool: POSTing files to http://localhost:8080/solr/update..
> SimplePostTool: POSTing file Example.xml
> SimplePostTool: FATAL: Solr returned an error: Bad Request
>
>
> --
> View this message in context:
> http://old.nabble.com/multiCore-tp27802043p27802330.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>
>
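A note on the error above: the 'E' (code 69) is the first byte of the string Example.xml. With -Ddata=args, post.jar sends its command-line arguments as the literal request body rather than reading the named file, so Solr tries to parse the filename itself as XML. A quick sanity check of that byte, plus a hedged corrected command (drop -Ddata=args so the file contents are posted, and keep the per-core -Durl so the request reaches core0 rather than the core-less /solr/update):

```shell
# 'E' is ASCII 69 -- exactly the "code 69" the parser reported
printf 'Example.xml' | head -c1 | od -An -tu1 | tr -d ' '
# Corrected invocation (not run here; host/port as in the thread):
#   java -Dcommit=yes -Durl=http://localhost:8080/solr/core0/update -jar post.jar Example.xml
```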


Re: multiCore

2010-03-05 Thread Suram



Siddhant Goel wrote:
> 
> Can you provide the error message that you got?
> 
> On Sat, Mar 6, 2010 at 11:13 AM, Suram  wrote:
> 
>>
>> Hi,
>>
>>
> >>  how can I send the XML file to Solr after creating the multicore? I tried
> >> it and it refuses to accept it
>> --
>> View this message in context:
>> http://old.nabble.com/multiCore-tp27802043p27802043.html
>> Sent from the Solr - User mailing list archive at Nabble.com.
>>
>>
> 
> 
> -- 
> - Siddhant
> 
> 

I execute the command like this:

D:\solr\example\exampledocs>java -Ddata=args -Dcommit=yes 
-Durl=http://localhost:8080/solr/core0/update -jar post.jar Example.xml

Mar 6, 2010 12:20:36 PM org.apache.solr.common.SolrException log
SEVERE: org.apache.solr.common.SolrException: Unexpected character 'E' (code
69) in prolog; expected '<'
 at [row,col {unknown-source}]: [1,1]
at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:72)
at
org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:54)
at
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1316)
at
org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:338)
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:241)
at
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:215)
at
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)
at
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
at
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:172)
at
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at
org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:117)
at
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:108)
at
org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:174)
at
org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:873)
at
org.apache.coyote.http11.Http11BaseProtocol$Http11ConnectionHandler.processConnection(Http11BaseProtocol.java:665)
at
org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:528)
at
org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:81)
at
org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:689)
at java.lang.Thread.run(Unknown Source)
Caused by: com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected
character 'E' (code 69) in prolog; expected '<'
 at [row,col {unknown-source}]: [1,1]
at
com.ctc.wstx.sr.StreamScanner.throwUnexpectedChar(StreamScanner.java:648)
at
com.ctc.wstx.sr.BasicStreamReader.nextFromProlog(BasicStreamReader.java:2047)
at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1069)
at org.apache.solr.handler.XMLLoader.processUpdate(XMLLoader.java:90)
at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:69)
... 19 more

If I execute the command like this:

java -jar post.jar Example.xml

SimplePostTool: version 1.2
SimplePostTool: WARNING: Make sure your XML documents are encoded in UTF-8,
other encodings are not currently supported
SimplePostTool: POSTing files to http://localhost:8080/solr/update..
SimplePostTool: POSTing file Example.xml
SimplePostTool: FATAL: Solr returned an error: Bad Request


-- 
View this message in context: 
http://old.nabble.com/multiCore-tp27802043p27802330.html
Sent from the Solr - User mailing list archive at Nabble.com.
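
A likely explanation (an inference from the error, not something confirmed in the thread): with -Ddata=args, post.jar sends its command-line arguments themselves as the POST body, so Solr's XML parser receives the literal string "Example.xml" and chokes on the first character 'E' before any '<'. Using -Ddata=files (the default) posts the file's contents instead. The sketch below reproduces the same class of error with the JDK's own XML parser:

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.xml.sax.InputSource;

public class PrologErrorDemo {
    /** Returns true when the given text is rejected by an XML parser. */
    static boolean parseFails(String body) {
        try {
            DocumentBuilderFactory.newInstance().newDocumentBuilder()
                    .parse(new InputSource(new StringReader(body)));
            return false;
        } catch (Exception e) {
            // Malformed input raises a SAXParseException such as
            // "Content is not allowed in prolog" at line 1, column 1 --
            // the same complaint Solr's Woodstox parser logs above.
        }
        return true;
    }

    public static void main(String[] args) {
        // With -Ddata=args the POST body is the argument itself, so the
        // parser sees the filename "Example.xml", not the file's XML.
        System.out.println("\"Example.xml\" parses? " + !parseFails("Example.xml"));
        System.out.println("\"<add/>\" parses? " + !parseFails("<add/>"));
    }
}
```

If the file itself should be posted, switching the property to -Ddata=files (or dropping -Ddata entirely) should avoid the prolog error.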



Re: multiCore

2010-03-05 Thread Siddhant Goel
Can you provide the error message that you got?

On Sat, Mar 6, 2010 at 11:13 AM, Suram  wrote:

>
> Hi,
>
>
> how can I send the XML file to Solr after creating the multicore setup? I
> tried it, but it refuses to accept it.
> --
> View this message in context:
> http://old.nabble.com/multiCore-tp27802043p27802043.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>
>


-- 
- Siddhant


Re: multicore setup and security

2010-02-23 Thread Jorg Heymans
On Tue, Feb 23, 2010 at 10:09 AM, Shalin Shekhar Mangar <
shalinman...@gmail.com> wrote:

> On Mon, Feb 22, 2010 at 5:43 PM, Jorg Heymans  >wrote:
>
> >
> > What is the recommended pattern for securing a multicore solr instance,
> > accessed by different applications ? In our case, we need to prevent
> > application A from accessing the core of application B. Also, we need to
> > avoid the use of username/password authentication wherever possible. I
> have
> > read the wiki page on solr security and it talks about path based
> > authentication, but both DIGEST and BASIC auth are username/password
> based
> > so i'm looking for alternatives.
> >
> > One idea i had was to use https and create a x509 cert per application,
> > with
> > a different subject per application. Then on the solr server i would
> > somehow
> > need to extend the component that is responsible for delegating
> > /sorl/appA/*
> > to the appA request handlers (is there such thing even ?) and verify that
> > requests for /appA are done over https with a valid certificate that has
> > /appA as subject. Is this feasible ? Or maybe there is an easier way of
> > doing this ?
> >
> >
> I wouldn't go for a HTTPS based solution because HTTPS adds a huge
> overhead.
> Besides, you only need access control and not secure communication, right?
> Could a shared-secret approach work for your use-case? You can define a
> secret key per core and share it with the application supposed to use that
> core. Then you can write a Java Filter placed before SolrDispatchFilter
> which can look at the request path and verify access.
>

That would work equally well, I guess, and it's simpler to set up. Then again,
the security level of a shared-secret approach depends mostly on how secret
the shared secret really is; I'll consider that the trade-off, then.

Thanks
Jorg
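
To make the shared-secret idea concrete, here is a minimal sketch of the check such a Filter could perform before handing the request to SolrDispatchFilter. This is not from the thread: the class, the map layout, and the idea of carrying the secret in a request header are all assumptions, and the servlet wiring (implementing javax.servlet.Filter and registering it ahead of SolrDispatchFilter in web.xml) is omitted.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Map;

public class CoreAccessCheck {
    // One secret per core, shared out-of-band with the owning application.
    private final Map<String, String> secretsByCore;

    public CoreAccessCheck(Map<String, String> secretsByCore) {
        this.secretsByCore = secretsByCore;
    }

    /** Extracts the core name from a request path like "/solr/appA/select". */
    static String coreFromPath(String path) {
        String[] parts = path.split("/");
        return parts.length > 2 ? parts[2] : null;
    }

    /** True when the supplied secret matches the one registered for the core. */
    public boolean isAllowed(String requestPath, String suppliedSecret) {
        String core = coreFromPath(requestPath);
        String expected = (core == null) ? null : secretsByCore.get(core);
        if (expected == null || suppliedSecret == null) {
            return false;
        }
        // Constant-time comparison avoids leaking the secret via timing.
        return MessageDigest.isEqual(
                expected.getBytes(StandardCharsets.UTF_8),
                suppliedSecret.getBytes(StandardCharsets.UTF_8));
    }
}
```

A Filter would call isAllowed() with the servlet path and a header value (for example a hypothetical "X-Solr-Secret"), returning 403 on a mismatch; as Jorg notes, the whole scheme is only as strong as the secrecy of the shared key.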


Re: multicore setup and security

2010-02-23 Thread Shalin Shekhar Mangar
On Mon, Feb 22, 2010 at 5:43 PM, Jorg Heymans wrote:

>
> What is the recommended pattern for securing a multicore solr instance,
> accessed by different applications ? In our case, we need to prevent
> application A from accessing the core of application B. Also, we need to
> avoid the use of username/password authentication wherever possible. I have
> read the wiki page on solr security and it talks about path based
> authentication, but both DIGEST and BASIC auth are username/password based
> so i'm looking for alternatives.
>
> One idea i had was to use https and create a x509 cert per application,
> with
> a different subject per application. Then on the solr server i would
> somehow
> need to extend the component that is responsible for delegating
> /sorl/appA/*
> to the appA request handlers (is there such thing even ?) and verify that
> requests for /appA are done over https with a valid certificate that has
> /appA as subject. Is this feasible ? Or maybe there is an easier way of
> doing this ?
>
>
I wouldn't go for a HTTPS based solution because HTTPS adds a huge overhead.
Besides, you only need access control and not secure communication, right?
Could a shared-secret approach work for your use-case? You can define a
secret key per core and share it with the application supposed to use that
core. Then you can write a Java Filter placed before SolrDispatchFilter
which can look at the request path and verify access.

-- 
Regards,
Shalin Shekhar Mangar.


Re: Multicore Example

2010-02-19 Thread K Wong
The point these guys are making is that if another program is already using
the port Solr tries to bind to, the two will fight over exclusive use of
that port.

Both the netstat and lsof command work fine on my Mac (Leopard 10.5.8).

Trinity:~ kelvin$ which netstat
/usr/sbin/netstat
Trinity:~ kelvin$ which lsof
/usr/sbin/lsof
Trinity:~ kelvin$

If you use MacPorts, you can also find out port information using 'nmap'.

If something is already using the port Solr is trying to use then you
need to configure Solr to use a different port.

K



On Fri, Feb 19, 2010 at 12:51 PM, Lee Smith  wrote:
> Thanks Shawn
>
> I am actually running it on mac
>
> It does not like those unix commands ??
>
> Any further advice ?
>
> Lee
>
> On 19 Feb 2010, at 20:32, Shawn Heisey wrote:
>
>> Assuming you are on a unix variant with a working lsof, use this.  This 
>> probably won't work correctly on Solaris 10:
>>
>> lsof -nPi | grep 8983
>> lsof -nPi | grep 8080
>>
>> On Windows, you can do this in a command prompt.  It requires elevation on 
>> Vista or later.  The -b option was added in WinXP SP2 and Win2003 SP1, 
>> without it you can't see the program name that's got the port open:
>>
>> netstat -b > ports.txt
>> ports.txt
>>
>> Shawn
>>
>>
>> On 2/19/2010 1:01 PM, Lee Smith wrote:
>>> How can I find out ??
>>>
>>>
>>> On 19 Feb 2010, at 19:26, Dave Searle wrote:
>>>
>>>
 Do you have something else using port 8983 or 8080?

>>
>
>


Re: Multicore Example

2010-02-19 Thread Lee Smith
Thanks Shawn

I am actually running it on mac

It does not like those unix commands ??

Any further advice ?

Lee

On 19 Feb 2010, at 20:32, Shawn Heisey wrote:

> Assuming you are on a unix variant with a working lsof, use this.  This 
> probably won't work correctly on Solaris 10:
> 
> lsof -nPi | grep 8983
> lsof -nPi | grep 8080
> 
> On Windows, you can do this in a command prompt.  It requires elevation on 
> Vista or later.  The -b option was added in WinXP SP2 and Win2003 SP1, 
> without it you can't see the program name that's got the port open:
> 
> netstat -b > ports.txt
> ports.txt
> 
> Shawn
> 
> 
> On 2/19/2010 1:01 PM, Lee Smith wrote:
>> How can I find out ??
>> 
>> 
>> On 19 Feb 2010, at 19:26, Dave Searle wrote:
>> 
>>   
>>> Do you have something else using port 8983 or 8080?
>>> 
> 



Re: Multicore Example

2010-02-19 Thread Shawn Heisey
Assuming you are on a unix variant with a working lsof, use this.  This 
probably won't work correctly on Solaris 10:


lsof -nPi | grep 8983
lsof -nPi | grep 8080

On Windows, you can do this in a command prompt.  It requires elevation 
on Vista or later.  The -b option was added in WinXP SP2 and Win2003 
SP1, without it you can't see the program name that's got the port open:


netstat -b > ports.txt
ports.txt

Shawn


On 2/19/2010 1:01 PM, Lee Smith wrote:
> How can I find out ??
>
> On 19 Feb 2010, at 19:26, Dave Searle wrote:
>
>> Do you have something else using port 8983 or 8080?




Re: Multicore Example

2010-02-19 Thread Dave Searle
Are you on Windows? Try netstat -a

Sent from my iPhone

On 19 Feb 2010, at 20:02, "Lee Smith"  wrote:

> How can I find out ??
>
>
> On 19 Feb 2010, at 19:26, Dave Searle wrote:
>
>> Do you have something else using port 8983 or 8080?
>>
>> Sent from my iPhone
>>
>> On 19 Feb 2010, at 19:22, "Lee Smith"  wrote:
>>
>>> Hey All
>>>
>>> Trying to dip my feet into multicore and hoping someone can advise
>>> why the example is not working.
>>>
>>> Basically I have been working with the example single core fine so I
>>> have stopped the server and restarted with the new command line for
>>> multicore
>>>
>>> ie, java -Dsolr.solr.home=multicore -jar start.jar
>>>
>>> When it launches I get this error:
>>>
>>> 2010-02-19 11:13:39.740::WARN:  EXCEPTION
>>> java.net.BindException: Address already in use
>>>  at java.net.PlainSocketImpl.socketBind(Native Method)
>>>  at etc
>>>
>>> Any ideas what this can be because I have stopped the first one.
>>>
>>> Thank you if you can advise.
>>>
>>>
>


Re: Multicore Example

2010-02-19 Thread Lee Smith
How can I find out ??


On 19 Feb 2010, at 19:26, Dave Searle wrote:

> Do you have something else using port 8983 or 8080?
> 
> Sent from my iPhone
> 
> On 19 Feb 2010, at 19:22, "Lee Smith"  wrote:
> 
>> Hey All
>> 
>> Trying to dip my feet into multicore and hoping someone can advise  
>> why the example is not working.
>> 
>> Basically I have been working with the example single core fine so I  
>> have stopped the server and restarted with the new command line for  
>> multicore
>> 
>> ie, java -Dsolr.solr.home=multicore -jar start.jar
>> 
>> When it launches I get this error:
>> 
>> 2010-02-19 11:13:39.740::WARN:  EXCEPTION
>> java.net.BindException: Address already in use
>>   at java.net.PlainSocketImpl.socketBind(Native Method)
>>   at etc
>> 
>> Any ideas what this can be because I have stopped the first one.
>> 
>> Thank you if you can advise.
>> 
>> 



Re: Multicore Example

2010-02-19 Thread Dave Searle
Do you have something else using port 8983 or 8080?

Sent from my iPhone

On 19 Feb 2010, at 19:22, "Lee Smith"  wrote:

> Hey All
>
> Trying to dip my feet into multicore and hoping someone can advise  
> why the example is not working.
>
> Basically I have been working with the example single core fine so I  
> have stopped the server and restarted with the new command line for  
> multicore
>
> ie, java -Dsolr.solr.home=multicore -jar start.jar
>
> When it launches I get this error:
>
> 2010-02-19 11:13:39.740::WARN:  EXCEPTION
> java.net.BindException: Address already in use
>at java.net.PlainSocketImpl.socketBind(Native Method)
>at etc
>
> Any ideas what this can be because I have stopped the first one.
>
> Thank you if you can advise.
>
>


Re: Multicore Example

2010-02-19 Thread Pascal Dimassimo

Are you sure that you don't have any java processes that are still running?

Did you change the port or are you still using 8983?


Lee Smith-6 wrote:
> 
> Hey All
> 
> Trying to dip my feet into multicore and hoping someone can advise why the
> example is not working.
> 
> Basically I have been working with the example single core fine so I have
> stopped the server and restarted with the new command line for multicore
> 
> ie, java -Dsolr.solr.home=multicore -jar start.jar
> 
> When it launches I get this error:
> 
> 2010-02-19 11:13:39.740::WARN:  EXCEPTION
> java.net.BindException: Address already in use
>   at java.net.PlainSocketImpl.socketBind(Native Method)
>   at etc
> 
> Any ideas what this can be because I have stopped the first one.
> 
> Thank you if you can advise.
> 
> 
> 
> 

-- 
View this message in context: 
http://old.nabble.com/Multicore-Example-tp27659052p27659102.html
Sent from the Solr - User mailing list archive at Nabble.com.
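
The "Address already in use" BindException in this thread can also be diagnosed from Java itself: attempting to bind the port fails in exactly the way Jetty's startup does. A small sketch (class and method names are mine, not from the thread):

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortCheck {
    /**
     * True when nothing is currently bound to the port. A failed bind
     * raises the same java.net.BindException ("Address already in use")
     * that Jetty logs when a second Solr instance starts on 8983.
     */
    public static boolean isFree(int port) {
        try (ServerSocket s = new ServerSocket(port)) {
            return true;  // bind succeeded; socket is released on close
        } catch (IOException e) {
            return false; // something else already holds the port
        }
    }

    public static void main(String[] args) {
        // Check the two ports discussed in the thread.
        System.out.println("8983 free? " + isFree(8983));
        System.out.println("8080 free? " + isFree(8080));
    }
}
```

If isFree(8983) reports false, either stop the process holding the port (found via lsof/netstat as above) or change Solr's Jetty port, e.g. java -Djetty.port=8984 -Dsolr.solr.home=multicore -jar start.jar.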


