Synchronising two masters

2014-07-08 Thread Prasi S
Hi ,
Our Solr setup consists of 2 masters and 2 slaves. The slaves point to
either master through a load balancer and replicate the data.

Master1 (M1) is the primary indexer. I send data to M1. In case M1 fails, I
have a failover master, M2, which would then index the data. The problem
is, once M1 comes back up, how do I synchronize M1 and M2? SolrCloud
would be the option rather than going with this setup, but currently we
want to implement it in master-slave mode.

Any suggestions?
Thanks,
Prasi
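One way to bring a recovered master back in sync (an editor's sketch, not a solution given in the thread; host names, ports, and core names below are placeholders) is to trigger a one-shot pull on M1 from M2 via the replication handler, which both masters already expose for their slaves:

```xml
<!-- solrconfig.xml on each master: the standard replication handler -->
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="master">
    <str name="replicateAfter">commit</str>
    <str name="replicateAfter">startup</str>
  </lst>
</requestHandler>
```

A one-time resync could then be requested with
http://M1:8983/solr/core1/replication?command=fetchindex&masterUrl=http://M2:8983/solr/core1/replication
after which M1 can resume as the primary indexer.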


Re: Two solr instances access common index

2014-06-27 Thread Prasi S
My scenario was to have a single master and two slaves replicating from it.
In case the master fails, we had a failover master. But synchronizing both
masters was a problem. So we planned to have a common index shared by the 2
masters. Indexing would happen via Master1, and if it fails, via Master2.

This was not as simple as we thought: there were a lot of locking issues, so
I changed the lock type to none and it worked. But after some updates,
the index becomes unstable.

Thanks,
Prasi


On Fri, Jun 27, 2014 at 6:10 AM, Erick Erickson erickerick...@gmail.com
wrote:

 bq: Avoid all the ridiculous complexity of cloud

 And then re-introduce a single point of failure. Bad disk ==
 unfortunate consequences

 But frankly I don't see why you would ever _need_ to write from two
 Solr instances. Wouldn't simply having one writer (which you could
 change when you restarted) and multiple readers (which is OOB
 functionality now) work?

 And as far as offloading analysis, the preanalyzed field types seem
 interesting (although I haven't played with them)

 Erick

 On Thu, Jun 26, 2014 at 12:49 PM, Walter Underwood
 wun...@wunderwood.org wrote:
  Cool? More like generally useless.  --wunder
 
  On Jun 26, 2014, at 12:44 PM, Jack Krupansky j...@basetechnology.com
 wrote:
 
  Erick, I agree, but... wouldn't it be SO COOL if it did work! Avoid all
 the ridiculous complexity of cloud.
 
  Have a temporary lock to permit and exclude updates.
 
  -- Jack Krupansky
 
  -Original Message- From: Erick Erickson
  Sent: Thursday, June 26, 2014 12:37 PM
  To: solr-user@lucene.apache.org
  Subject: Re: Two solr instances access common index
 
  bq: But my scenario is that both solr instances would write to the
 common
  directory
 
  Do NOT do this. Don't even try. I guarantee Bad Things Will Happen.
 
  Why do you want to do this? To save disk space? Accomplish NRT
  searching on multiple machines?
 
  Please define the problem you're trying to solve and why existing
 supported
  ways of using Solr wouldn't work for you, e.g. SolrCloud or master/slave
  setups before asking for a specific solution, as this sounds very much
 like an
  XY problem.
 
  Best,
  Erick
 
  On Thu, Jun 26, 2014 at 4:25 AM, Prasi S prasi1...@gmail.com wrote:
  Can you please tell me which Solr version you have tried with? I tried
  giving

  <lockType>${solr.lock.type:none}</lockType> in 2 Solr instances, and now it
  is working. I am not getting the write-lock exception when starting the
  second instance.
 
  But my scenario is that both Solr instances would write to the common
  directory (but not both simultaneously, for sure). Is there any drawback of
  using lock type "none"?

  Please advise.
 
  Thanks,
  Prasi
 
 
  On Thu, Jun 26, 2014 at 3:20 PM, Uwe Reh r...@hebis.uni-frankfurt.de
 wrote:
 
  Hi,
 
  with the lock type 'simple' I have three instances (different JREs, due to
  a GC problem) running on the same files.
  You should use this option only for a read-only system. Otherwise it's easy
  to corrupt the index.

  Maybe you should have a look at replication or SolrCloud.
 
  Uwe
 
 
  Am 26.06.2014 11:25, schrieb Prasi S:
 
  Hi,
  Is it possible to point two Solr instances to a common index
  directory? Will this work with changing the lock type?
 
 
 
  Thanks,
  Prasi
 
 
 
  --
  Walter Underwood
  wun...@wunderwood.org
 
 
 



Two solr instances access common index

2014-06-26 Thread Prasi S
Hi,
Is it possible to point two Solr instances to a common index
directory? Will this work with changing the lock type?



Thanks,
Prasi


Re: Two solr instances access common index

2014-06-26 Thread Prasi S
Can you please tell me which Solr version you have tried with? I tried
giving

<lockType>${solr.lock.type:none}</lockType> in 2 Solr instances, and now it
is working. I am not getting the write-lock exception when starting the
second instance.

But my scenario is that both Solr instances would write to the common
directory (but not both simultaneously, for sure). Is there any drawback of
using lock type "none"?

Please advise.

Thanks,
Prasi


On Thu, Jun 26, 2014 at 3:20 PM, Uwe Reh r...@hebis.uni-frankfurt.de wrote:

 Hi,

 with the lock type 'simple' I have three instances (different JREs, due to
 a GC problem) running on the same files.
 You should use this option only for a read-only system. Otherwise it's easy
 to corrupt the index.

 Maybe you should have a look at replication or SolrCloud.

 Uwe


 Am 26.06.2014 11:25, schrieb Prasi S:

  Hi,
 Is it possible to point two Solr instances to a common index
 directory? Will this work with changing the lock type?



 Thanks,
 Prasi




Solr index pdf/word document with attachments

2014-06-19 Thread Prasi S
Hi ,
How can I index Word/PDF documents with attachments to Solr?

I have tried indexing a simple file with an attachment using Tika, but it
does not index the attachment separately. Only the original document is
getting indexed.



Thanks,
Prasi


Inconsistent behavior with Solr replication

2014-06-13 Thread Prasi S
Hi,
I am using Solr 4.4, replication set up with one master and 1 slave. The
master is set to replicate after startup and commit. It has an internal
autocommit with maxTime 15000.

The slave polls the master every 45 sec to check for updates.

I indexed the master with DIH.

First, I indexed half a million docs in the master and they were replicated
to the slave.

Second, I indexed the next half million docs and the same appeared in the
slave.

Third, while starting the index, by mistake I enabled clean=true but
immediately gave another import with clean=false. Now the master has 1.5 M
docs but the slave has zero docs. How is this possible?

If I issue replication?command=indexversion on both, the version and
generation numbers are the same, but the slave has zero.
*Master:*
<long name="indexversion">1402644323820</long>
<long name="generation">695</long>
*Slave:*
<long name="indexversion">1402644323820</long>
<long name="generation">695</long>

*Admin UI Master replication page:*

  Index Version / Gen / Size
  Master (Searching):  1402644188557 / 694 / 63.6 MB
  Master (Replicable): 1402647815765 / 711 / -

Settings (Master):

   - replication enable:
   - replicateAfter: commit, startup

*Admin UI Slave replication page:*

  Index Version / Gen / Size
  Master (Searching):  1402644188557 / 694 / 96.4 MB
  Master (Replicable): 1402647831725 / 712 / -
  Slave (Searching):   1402647831725 / 712 / 67.69 MB

Settings:

   - master url: http:/ip:port/solr/col1
   - polling enable: (interval: 00:00:45)

Settings (Master):

   - replication enable:
   - replicateAfter: commit, startup


Thanks,
Prasi
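A general diagnostic step for this kind of mismatch (not a fix confirmed in the thread; host, port, and core name are placeholders) is to compare the full replication details on both sides rather than only indexversion:

```text
http://host:port/solr/col1/replication?command=details

Compare in the two responses:
 - indexVersion/generation of the searching index vs. the replicable one
 - isReplicating and any replication-failure timestamps on the slave
 - actual doc counts via /solr/col1/select?q=*:*&rows=0 on each side
```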


Solr indexing javabean

2014-03-31 Thread Prasi S
Hi,
One field in my Solr document is an XML string. I index the XML as-is into
Solr, and at runtime I fetch the XML, parse it, and display it. Instead of
XML, can we index that XML as a Java bean?


Thanks,
Prasi


Re: Solr dih to read Clob contents

2014-03-28 Thread Prasi S
The column in my database is of XML datatype. But if I do not use
XMLSERIALIZE(SMRY as CLOB(1M)) as SMRY, and instead take the SMRY field
directly, as

select ID, SMRY from BOOK_REC

I get the below error:

Exception while processing: x document : SolrInputDocument(fields:
[id=45768734]):org.apache.solr.handler.dataimport.DataImportHandlerException:

Parsing failed for xml, url:null rows processed:0 Processing Document # 1

Caused by: com.ctc.wstx.exc.WstxUnexpectedCharException: Unexpected
character 'c' (code 99) in prolog; expected '<'
 at javax.xml.stream.SerializableLocation@5780578


Thanks,
Prasi


On Mon, Mar 24, 2014 at 3:51 PM, Prasi S prasi1...@gmail.com wrote:

 Below is my full configuration:

 <dataConfig>
   <dataSource driver="com.ibm.db2.jcc.DB2Driver"
       url="jdbc:db2://IP:port/dbname" user="" password="" />
   <dataSource name="xmldata" type="FieldReaderDataSource"/>

   <document>
     <entity name="x" query="SELECT ID, XMLSERIALIZE(SMRY as CLOB(1M)) as SMRY
         FROM BOOK_REC fetch first 40 rows only"
         transformer="ClobTransformer">
       <field column="MBR" name="mbr" />
       <entity name="y" dataSource="xmldata" dataField="x.SMRY"
           processor="XPathEntityProcessor"
           forEach="/*:summary" rootEntity="true">
         <field column="card_no" xpath="/cardNo" />
       </entity>
     </entity>
   </document>
 </dataConfig>

 And this is my XML data:

 <ns:summary xmlns:ns="***">
   <cardNo>ZAYQ5181</cardNo>
   <firstName>Sam</firstName>
   <lastName>Mathews</lastName>
   <date>2013-01-18T23:29:04.492</date>
 </ns:summary>

 Thanks,
 Prasi


 On Mon, Mar 24, 2014 at 3:23 PM, Shalin Shekhar Mangar 
 shalinman...@gmail.com wrote:

 1. I don't see the definition of a datasource named 'xmldata' in your
 data-config.
 2. You have forEach=/*:summary but I don't think that is a syntax
 supported by XPathRecordReader.

 If you can give a sample of the xml stored as Clob in your database,
 then we can help you write the right xpaths.

 On Mon, Mar 24, 2014 at 12:55 PM, Prasi S prasi1...@gmail.com wrote:
  My database configuration is as below:

  <entity name="x" query="SELECT ID, XMLSERIALIZE(SMRY as CLOB(1M)) as SMRY
      FROM BOOK_REC fetch first 40 rows only"
      transformer="ClobTransformer">
    <field column="MBR" name="mbr" />
    <entity name="y" dataSource="xmldata" dataField="x.SMRY"
        processor="XPathEntityProcessor"
        forEach="/*:summary" rootEntity="true">
      <field column="card_no" xpath="/cardNo" />
    </entity>
  </entity>

  and I get my response from Solr as below:

  <doc>
  <str name="card_no">org...@1c8e807</str>

  Am I missing anything?

  Thanks,
  Prasi

  On Thu, Mar 20, 2014 at 4:25 PM, Gora Mohanty g...@mimirtech.com wrote:

   On 20 March 2014 14:53, Prasi S prasi1...@gmail.com wrote:

    Hi,
    I have a requirement to index a database table with CLOB content. Each
    row in my table has a column which is an XML stored as a CLOB. I want to
    read the contents of the XML through DIH and map each XML tag to a
    separate Solr field.

    Below is my clob content:
    <root>
      <author>A</author>
      <date>02-Dec-2013</date>
      ...
    </root>

    I want to read the contents of the CLOB and map author to author_solr
    and date to date_solr. Is this possible with a ClobTransformer or a
    script transformer?

   You will need to use a FieldReaderDataSource and an XPathEntityProcessor
   along with the ClobTransformer. You do not provide details of your DIH
   data configuration file, but this should look something like:

   <dataSource name="xmldata" type="FieldReaderDataSource"/>
   ...
   <document>
     <entity name="x" query="..." transformer="ClobTransformer">
       <entity name="y" dataSource="xmldata" dataField="x.clob_column"
           processor="XPathEntityProcessor" forEach="/root">
         <field column="author_solr" xpath="/author" />
         <field column="date_solr" xpath="/date" />
       </entity>
     </entity>
   </document>

   Regards,
   Gora
 



 --
 Regards,
 Shalin Shekhar Mangar.





Re: Solr dih to read Clob contents

2014-03-24 Thread Prasi S
My database configuration is as below:

<entity name="x" query="SELECT ID, XMLSERIALIZE(SMRY as CLOB(1M)) as SMRY
    FROM BOOK_REC fetch first 40 rows only"
    transformer="ClobTransformer">
  <field column="MBR" name="mbr" />
  <entity name="y" dataSource="xmldata" dataField="x.SMRY"
      processor="XPathEntityProcessor"
      forEach="/*:summary" rootEntity="true">
    <field column="card_no" xpath="/cardNo" />
  </entity>
</entity>

and I get my response from Solr as below:

<doc>
<str name="card_no">org...@1c8e807</str>

Am I missing anything?



Thanks,
Prasi


On Thu, Mar 20, 2014 at 4:25 PM, Gora Mohanty g...@mimirtech.com wrote:

 On 20 March 2014 14:53, Prasi S prasi1...@gmail.com wrote:
 
  Hi,
  I have a requirement to index a database table with CLOB content. Each row
  in my table has a column which is an XML stored as a CLOB. I want to read
  the contents of the XML through DIH and map each XML tag to a separate
  Solr field.

  Below is my clob content:
  <root>
    <author>A</author>
    <date>02-Dec-2013</date>
    ...
  </root>

  I want to read the contents of the CLOB and map author to author_solr and
  date to date_solr. Is this possible with a ClobTransformer or a script
  transformer?

 You will need to use a FieldReaderDataSource and an XPathEntityProcessor
 along with the ClobTransformer. You do not provide details of your DIH data
 configuration file, but this should look something like:

 <dataSource name="xmldata" type="FieldReaderDataSource"/>
 ...
 <document>
   <entity name="x" query="..." transformer="ClobTransformer">
     <entity name="y" dataSource="xmldata" dataField="x.clob_column"
         processor="XPathEntityProcessor" forEach="/root">
       <field column="author_solr" xpath="/author" />
       <field column="date_solr" xpath="/date" />
     </entity>
   </entity>
 </document>

 Regards,
 Gora



Re: Solr dih to read Clob contents

2014-03-24 Thread Prasi S
Below is my full configuration:

<dataConfig>
  <dataSource driver="com.ibm.db2.jcc.DB2Driver"
      url="jdbc:db2://IP:port/dbname" user="" password="" />
  <dataSource name="xmldata" type="FieldReaderDataSource"/>

  <document>
    <entity name="x" query="SELECT ID, XMLSERIALIZE(SMRY as CLOB(1M)) as SMRY
        FROM BOOK_REC fetch first 40 rows only"
        transformer="ClobTransformer">
      <field column="MBR" name="mbr" />
      <entity name="y" dataSource="xmldata" dataField="x.SMRY"
          processor="XPathEntityProcessor"
          forEach="/*:summary" rootEntity="true">
        <field column="card_no" xpath="/cardNo" />
      </entity>
    </entity>
  </document>
</dataConfig>

And this is my XML data:

<ns:summary xmlns:ns="***">
  <cardNo>ZAYQ5181</cardNo>
  <firstName>Sam</firstName>
  <lastName>Mathews</lastName>
  <date>2013-01-18T23:29:04.492</date>
</ns:summary>

Thanks,
Prasi


On Mon, Mar 24, 2014 at 3:23 PM, Shalin Shekhar Mangar 
shalinman...@gmail.com wrote:

 1. I don't see the definition of a datasource named 'xmldata' in your
 data-config.
 2. You have forEach=/*:summary but I don't think that is a syntax
 supported by XPathRecordReader.

 If you can give a sample of the xml stored as Clob in your database,
 then we can help you write the right xpaths.

 On Mon, Mar 24, 2014 at 12:55 PM, Prasi S prasi1...@gmail.com wrote:
   My database configuration is as below:

   <entity name="x" query="SELECT ID, XMLSERIALIZE(SMRY as CLOB(1M)) as SMRY
       FROM BOOK_REC fetch first 40 rows only"
       transformer="ClobTransformer">
     <field column="MBR" name="mbr" />
     <entity name="y" dataSource="xmldata" dataField="x.SMRY"
         processor="XPathEntityProcessor"
         forEach="/*:summary" rootEntity="true">
       <field column="card_no" xpath="/cardNo" />
     </entity>
   </entity>

   and I get my response from Solr as below:

   <doc>
   <str name="card_no">org...@1c8e807</str>

   Am I missing anything?

   Thanks,
   Prasi

   On Thu, Mar 20, 2014 at 4:25 PM, Gora Mohanty g...@mimirtech.com wrote:

    On 20 March 2014 14:53, Prasi S prasi1...@gmail.com wrote:

     Hi,
     I have a requirement to index a database table with CLOB content. Each
     row in my table has a column which is an XML stored as a CLOB. I want
     to read the contents of the XML through DIH and map each XML tag to a
     separate Solr field.

     Below is my clob content:
     <root>
       <author>A</author>
       <date>02-Dec-2013</date>
       ...
     </root>

     I want to read the contents of the CLOB and map author to author_solr
     and date to date_solr. Is this possible with a ClobTransformer or a
     script transformer?

    You will need to use a FieldReaderDataSource and an XPathEntityProcessor
    along with the ClobTransformer. You do not provide details of your DIH
    data configuration file, but this should look something like:

    <dataSource name="xmldata" type="FieldReaderDataSource"/>
    ...
    <document>
      <entity name="x" query="..." transformer="ClobTransformer">
        <entity name="y" dataSource="xmldata" dataField="x.clob_column"
            processor="XPathEntityProcessor" forEach="/root">
          <field column="author_solr" xpath="/author" />
          <field column="date_solr" xpath="/date" />
        </entity>
      </entity>
    </document>

    Regards,
    Gora
 



 --
 Regards,
 Shalin Shekhar Mangar.



Solr dih to read Clob contents

2014-03-20 Thread Prasi S
Hi,
I have a requirement to index a database table with CLOB content. Each row
in my table has a column which is an XML stored as a CLOB. I want to read
the contents of the XML through DIH and map each XML tag to a separate Solr
field.

Below is my clob content:

<root>
   <author>A</author>
   <date>02-Dec-2013</date>
   ...
</root>

I want to read the contents of the CLOB and map author to author_solr and
date to date_solr. Is this possible with a ClobTransformer or a script
transformer?


Thanks,
Prasi


Network path for data directory

2014-03-13 Thread Prasi S
Hi,
I have a Solr index directory on one machine. I want a second Solr instance
on a different server to use this index. Is it possible to specify the path
of a remote machine for the data directory?

Thanks,
Prasi
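For reference, a core's data directory is just a path in solrconfig.xml, so a mounted network location can be used (a sketch; the mount point below is a placeholder, and sharing one index between two live instances raises the locking issues discussed elsewhere in this archive):

```xml
<!-- solrconfig.xml: point the core's data at a mounted remote path -->
<dataDir>/mnt/solr-shared/col1/data</dataDir>
```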


solr securing index files

2014-03-13 Thread Prasi S
Hi,
Is there any way to secure the Solr index directory? I have many users on
a server and I want to restrict file access to only the administrator.

Does securing the index directory affect Solr's access to the folder?


Thanks,
Prasi


Setting up solr with IBM Websphere 7

2014-02-20 Thread Prasi S
Hi,
I have a requirement to set up Solr in IBM WebSphere server 7.x. Has anybody
done the same in your project? Is there any blog/link with the set of
instructions for doing the same?
Please advise.

Thanks,
Prasi


Solr and SDL Tridion Integration

2014-02-03 Thread Prasi S
Hi,
I want to index SDL Tridion content to Solr. Can you suggest how this can
be achieved? Is there any document/tutorial for this?

Thanks,
Prasi


Re: Solr and SDL Tridion Integration

2014-02-03 Thread Prasi S
Thanks a lot for the options. Our site has dynamic content as well. I would
look into what best suits.

Thanks,
Prasi


On Mon, Feb 3, 2014 at 10:34 PM, Chris Warner chris_war...@yahoo.com wrote:

 There are many ways to do this, Prasi. You have a lot of thinking to do on
 the subject.

 You could decide to publish your content to database, and then index that
 database in Solr.

 You could publish XML or CSV files of your content for Solr to read and
 index.

 You could use nutch or some other tool to crawl your web server.

 There are many more methods, probably. These being some of the more common.

 Does your site have dynamic content presentation? If so, you may want to
 consider having Solr examine your broker database.

 Static pages on your site? You may want to go with either a crawler or
 publishing a special file for Solr.

 Please check out https://tridion.stackexchange.com/ for more on this
 topic.

 --
 chris_war...@yahoo.com



 On Monday, February 3, 2014 3:54 AM, Jack Krupansky 
 j...@basetechnology.com wrote:
 If SDL Tridion can export to CSV format, Solr can then import from CSV
 format.

 Otherwise, you may have to write a custom script or even maybe Java code to
 read from SDL Tridion and output a supported Solr format, such as Solr XML,
 Solr JSON, or CSV.

 -- Jack Krupansky


 -Original Message-
 From: Prasi S
 Sent: Monday, February 3, 2014 4:16 AM
 To: solr-user@lucene.apache.org
 Subject: Solr and SDL Tridion Integration

 Hi,
 I want to index sdl tridion content to solr. Can you suggest how this can
 be achieved. Is there any document/tutorial for this? Thanks

 Thanks,
 Prasi



Solr standard score

2013-12-09 Thread Prasi S
Hi,
I have a requirement to standardize Solr scores. For example:

docs with score > 7: most relevant
docs with score between 7 and 4: moderate
docs with score < 4: less relevant

But in the real scenario this does not happen, as in some cases the top
document may have a score of 3.5.

Can I have the scores standardized in some way (by index/query boosting) so
that I can achieve this?



Thanks,
Prasi
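Lucene/Solr scores are not absolute: they are only comparable within a single result set, so fixed cutoffs like 7 and 4 cannot work reliably. A common client-side workaround (an editor's sketch, not an official Solr feature) is to bucket by the ratio of each score to the response's maxScore:

```java
// Bucket documents by their score relative to maxScore from the same
// response. The 0.8 / 0.5 cutoffs are arbitrary placeholders to tune.
public class ScoreBuckets {
    static String bucket(float score, float maxScore) {
        float r = score / maxScore;          // 1.0 for the top document
        if (r >= 0.8f) return "most relevant";
        if (r >= 0.5f) return "moderate";
        return "less relevant";
    }
}
```

Requesting the score pseudo-field (fl=*,score) and reading maxScore from the response header provides the inputs; the top document always lands in the first bucket regardless of its raw score.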


Solr non-supported languages

2013-12-02 Thread Prasi S
Hi,
I have a requirement to index and search a few languages that are not
supported by Solr (e.g. the languages of countries like Slovenia, Moldova,
Belarus, etc.).

If I need to do only exact match against these languages, what sort of
analyzers and tokenizers would suit?


Thanks,
Prasi
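Exact matching needs no language-specific analysis at all, so unsupported languages are not a problem for it; a schema sketch (the type name is mine, not from the thread) that treats the whole value as one token, optionally case-folded:

```xml
<!-- schema.xml: whole-value matching, language-agnostic -->
<fieldType name="text_exact" class="solr.TextField">
  <analyzer>
    <tokenizer class="solr.KeywordTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
  </analyzer>
</fieldType>
<!-- or simply class="solr.StrField" if case must match exactly too -->
```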


Solrj Query Performance

2013-11-28 Thread Prasi S
Hi,
We recently saw a behavior which I wanted to confirm. We are using SolrJ to
query Solr. From the code, we use HttpSolrServer to issue the query and
return the response.

1. When a sample query is issued using SolrJ, we get the QTime as 4 seconds.
The same query, when we hit Solr in the browser, we get it in
50 milliseconds.

Initially we thought it was because of caching.

But then we tried the reverse way. We hit a new query to Solr in the
browser first. We got it in milliseconds. Then we used SolrJ, and it came to
4.5 seconds. (We take the QTime from the response object header.)

Is this anything to do with SolrJ's internal implementation?

Thanks,
Prasi


Persist solr cache

2013-11-27 Thread Prasi S
Hi all,
Is there any way the Solr caches (document/field/query cache) can be
persisted on disk? In case of a system crash, can I make the new cache load
from the persisted cache?

Thanks,
Prasi


Re: Persist solr cache

2013-11-27 Thread Prasi S
Currently, once Solr is started, we run a batch that fires queries at
Solr (just like the firstSearcher does). Once this is done, the users
start using search.

In case the server is restarted or anything crashes, then again I have to
run this batch, which I cannot control. That's why I asked if there is any
way we can persist the caches.

This was only for our business scenario.



Thanks,
Prasi
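Since the caches themselves cannot be persisted, the usual substitute is to move such a warm-up batch into solrconfig.xml so Solr replays it automatically at startup and on each new searcher (a sketch; the queries are placeholders for your own warm-up batch):

```xml
<listener event="firstSearcher" class="solr.QuerySenderListener">
  <arr name="queries">
    <lst><str name="q">a popular query</str><str name="rows">10</str></lst>
  </arr>
</listener>
<listener event="newSearcher" class="solr.QuerySenderListener">
  <arr name="queries">
    <lst><str name="q">a popular query</str></lst>
  </arr>
</listener>
```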


On Wed, Nov 27, 2013 at 2:05 PM, michael.boom my_sky...@yahoo.com wrote:

 Caches are only valid as long as the Index Searcher is valid. So, if you
 make a commit that opens a new searcher, then the caches will be
 invalidated. However, in this scenario you can configure your caches so
 that the new searcher will keep a certain number of cache entries from the
 previous one (autowarmCount).
 That's the only cache persistence Solr can offer. On restart or crash you
 can't reuse caches.

 Why do you need to persist caches in case of a crash? What's your usage
 scenario?
 Do you have problems with performance after startup?

 You can read more at http://wiki.apache.org/solr/SolrCaching#Overview



 -
 Thanks,
 Michael
 --
 View this message in context:
 http://lucene.472066.n3.nabble.com/Persist-solr-cache-tp4103463p4103469.html
 Sent from the Solr - User mailing list archive at Nabble.com.



Solr Autowarmed queries on jvm crash

2013-11-26 Thread Prasi S
Hi,
What happens to the autowarmed queries if the server is shut down or the JVM
crashes?

Is there any possibility to recover them from physical storage (the
transaction log?)


Thanks,
Sinduja


Re: Solr Autowarmed queries on jvm crash

2013-11-26 Thread Prasi S
Thanks Shawn for the reply.

In that case, when the system is restarted, a new searcher would be opened?
It cannot be populated from the previous searchers?

I may be wrong here, but i wanted to confirm.


Thanks,
Prasi


On Wed, Nov 27, 2013 at 12:04 PM, Shawn Heisey s...@elyograg.org wrote:

 On 11/26/2013 11:15 PM, Prasi S wrote:
  What happens to the autowarmed queries if the server is shut down or the
  JVM crashes?
 
  Is there any possibility to recover them from physical storage (the
  transaction log?)

 The transaction log only contains data that is sent to Solr for
 indexing.  Cached query data is lost when the program exits, so it
 cannot be used for autowarming.  If the logs are set to at least INFO
 severity, they will contain a query history, but Solr doesn't have any
 way to pull those back out of the logfile and re-use them.

 If firstSearcher and/or newSearcher warming queries are defined in
 solrconfig.xml, then those will be re-done when Solr starts back up.

 Thanks,
 Shawn




Re: Solr Autowarmed queries on jvm crash

2013-11-26 Thread Prasi S
OK. I have started Solr for the first time and have autowarmed a few
queries. Now my JVM crashes due to some other reason. Then I restart Solr.
What would happen to the autowarmed queries, cache, and old searcher now?

Thanks,
Prasi


On Wed, Nov 27, 2013 at 12:32 PM, Shawn Heisey s...@elyograg.org wrote:

 On 11/26/2013 11:49 PM, Prasi S wrote:
  Thanks Shawn for the reply.
 
  In that case, when the system is restarted, a new searcher would be
 opened?
  It cannot populate from its previous searchers?
 
  I may be wrong here, but i wanted to confirm.

 There are no previous searchers when Solr first starts up.  At startup,
 any queries defined as part of the firstSearcher event are executed.
 Each time a new searcher is created, any queries defined as part of the
 newSearcher event are executed.

 Thanks,
 Shawn




SolrCloud frequently hanging

2013-10-22 Thread Prasi S
Hi all,
We are using SolrCloud 4.4 (SolrCloud with external ZooKeeper, 2 Tomcats,
2 Solr instances, 1 in each Tomcat) for indexing delimited files. Our index
counts 220 million records. We have three different files, each with a
partial set of data.

We index the first file completely. Then the second and third files are
partial updates.

1. While testing indexing performance, we notice that Solr hangs
frequently after 2 days. It just hangs for about an hour or 2 and then, if
we hit the admin URL, it comes back and starts indexing. Why does this
happen?

We have noticed that in the last 12 hours the hanging was so frequent that
for almost 6 hours it was just in a hanged state.

2. Also, commit time increases for the partial upload.

Do we need to tweak any parameter, or is this the behavior of SolrCloud for
huge volumes of data?


Thanks,
Prasi


Re: SolrCloud frequently hanging

2013-10-22 Thread Prasi S
bq: ...three different files each with a partial set
of data.

We have to index around 170 metadata fields: around 120 fields are in the
first file, 50 in the second file, and 6 in the third file. All three
files have the same unique key. We use SolrJ to push these files to Solr.
First, we index the first file for the 220 million records. Then we take
the second file and do a partial update on the existing 220M. Then the same
is repeated for the third file.

We commit in batches. Our batch consists of 20,000 records. Once 5 such
batches are sent to Solr, we send a commit to Solr from the code. We have
disabled soft commit. The hard commit is as below:

 <autoCommit>
   <maxTime>${solr.autoCommit.maxTime:60}</maxTime>
   <openSearcher>false</openSearcher>
 </autoCommit>


Thanks,
Prasi


On Tue, Oct 22, 2013 at 2:34 PM, Erick Erickson erickerick...@gmail.com wrote:

 This is not a lot of data really.

 bq: ...three different files each with a partial set
 of data.

 OK, what does this mean? Are you importing as CSV files or
 something? Are you trying to commit 10s of M documents at once?

 This shouldn't be merging since you're in 4.4 unless you're committing
 far too frequently.

 What are your commit settings? Both soft and hard? How are you
 committing?

 In short, there's not a lot of information to go on here, you need to
 provide
 a number of details.

 Best,
 Erick


 On Tue, Oct 22, 2013 at 9:25 AM, Prasi S prasi1...@gmail.com wrote:

   Hi all,
   We are using SolrCloud 4.4 (SolrCloud with external ZooKeeper, 2
   Tomcats, 2 Solr instances, 1 in each Tomcat) for indexing delimited
   files. Our index counts 220 million records. We have three different
   files, each with a partial set of data.
  
   We index the first file completely. Then the second and third files are
   partial updates.
  
   1. While testing indexing performance, we notice that Solr hangs
   frequently after 2 days. It just hangs for about an hour or 2 and then,
   if we hit the admin URL, it comes back and starts indexing. Why does
   this happen?
  
   We have noticed that in the last 12 hours the hanging was so frequent
   that for almost 6 hours it was just in a hanged state.
  
   2. Also, commit time increases for the partial upload.
  
   Do we need to tweak any parameter, or is this the behavior of SolrCloud
   for huge volumes of data?
 
 
  Thanks,
  Prasi
 



DIH with SolrCloud

2013-10-08 Thread Prasi S
Hi ,
I have set up SolrCloud with Solr 4.4. The cloud has 2 Tomcat instances
with a separate ZooKeeper.

I execute the below command in the URL:

http://localhost:8180/solr/colindexer/dataimportmssql?command=full-import&commit=true&clean=false


<response>
<lst name="responseHeader">
  <int name="status">0</int>
  <int name="QTime">0</int>
</lst>
<lst name="initArgs">
  <lst name="defaults">
    <str name="config">data-config-mssql.xml</str>
  </lst>
</lst>
<str name="command">status</str>
<str name="status">idle</str>
<str name="importResponse"/>
<lst name="statusMessages">
  <str name="Total Requests made to DataSource">1</str>
  <str name="Total Rows Fetched">0</str>
  <str name="Total Documents Skipped">0</str>
  <str name="Full Dump Started">2013-10-08 10:55:27</str>
  <str name="Total Documents Processed">0</str>
  <str name="Time taken">0:0:1.585</str>
</lst>
<str name="WARNING">
This response format is experimental. It is likely to change in the future.
</str>
</response>

I don't get the "Indexing completed. Added ... documents" status message at
all. Also, when I check the dataimport page in the Solr admin, I get the
below status, and no documents are indexed.


[image: Inline image 1]

Not sure of the problem.


Re: DIH with SolrCloud

2013-10-08 Thread Prasi S
My select statement returns documents; I have checked the query in the SQL
server.

The problem is, with the same configuration given under the default handler
/dataimport, it was working. If I give it under the /dataimportmssql
handler, I get this type of behavior.


On Tue, Oct 8, 2013 at 1:28 PM, Raymond Wiker rwi...@gmail.com wrote:

 It looks like your select statement does not return any rows... have you
 verified it with some sort of SQL client?


 On Tue, Oct 8, 2013 at 8:57 AM, Prasi S prasi1...@gmail.com wrote:

  Hi ,
  I have set up SolrCloud with Solr 4.4. The cloud has 2 Tomcat instances
  with a separate ZooKeeper.
 
  I execute the below command in the URL:
 
  http://localhost:8180/solr/colindexer/dataimportmssql?command=full-import&commit=true&clean=false
 
 
  <response>
  <lst name="responseHeader">
    <int name="status">0</int>
    <int name="QTime">0</int>
  </lst>
  <lst name="initArgs">
    <lst name="defaults">
      <str name="config">data-config-mssql.xml</str>
    </lst>
  </lst>
  <str name="command">status</str>
  <str name="status">idle</str>
  <str name="importResponse"/>
  <lst name="statusMessages">
    <str name="Total Requests made to DataSource">1</str>
    <str name="Total Rows Fetched">0</str>
    <str name="Total Documents Skipped">0</str>
    <str name="Full Dump Started">2013-10-08 10:55:27</str>
    <str name="Total Documents Processed">0</str>
    <str name="Time taken">0:0:1.585</str>
  </lst>
  <str name="WARNING">
  This response format is experimental. It is likely to change in the
  future.
  </str>
  </response>
 
   I don't get the "Indexing completed. Added ... documents" status message
   at all. Also, when I check the dataimport page in the Solr admin, I get
   the below status, and no documents are indexed.
 
 
  [image: Inline image 1]
 
  Not sure of the problem.
 



Solr Commit Time

2013-09-27 Thread Prasi S
Hi,
What would be the maximum commit time for indexing 1 lakh (100,000)
documents in Solr on a 32 GB machine?



Thanks,
Prasi


Solr DIH call a java class

2013-09-24 Thread Prasi S
Hi,
Can we call a Java class inside a Solr data-config.xml file, similar to
calling a script function?

I have a few manipulations to do before sending data via DataImportHandler.
For each row, can I pass that row to a Java class in the same way we pass
it to a script function?


Thanks,
Prasi
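Yes: DIH accepts custom Java transformers alongside script ones. A minimal sketch (the class and column names are my own, not from the thread): DIH finds a transformRow(Map) method reflectively, so this simple form needs no Solr classes at compile time, and it is wired in with transformer="com.example.TrimUpperTransformer" on the entity.

```java
import java.util.Map;

// Hypothetical custom DIH transformer. DataImportHandler calls
// transformRow(Map) once per row; the returned map supplies the
// document's field values.
public class TrimUpperTransformer {
    public Object transformRow(Map<String, Object> row) {
        Object v = row.get("mbr");   // "mbr" is a placeholder column name
        if (v != null) {
            row.put("mbr", v.toString().trim().toUpperCase());
        }
        return row;
    }
}
```

The compiled class only needs to be on Solr's classpath; for row-level access to the request context there is also a two-argument transformRow(Map, Context) form.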


Zookeeper could not read dataimport.properties

2013-09-21 Thread Prasi S
Hi,
I'm using a Solr 4.4 cloud setup with an external ZooKeeper and Solr
running in Tomcat.

For the initial indexing, we use CSV files to load data into Solr. Then we
do delta indexing from a database table.

My ZooKeeper always throws an exception saying it could not read
dataimport.properties.

Where should we configure ZooKeeper to create/read dataimport.properties?

Thanks,
Prasi
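DIH can keep its last-index-time state in ZooKeeper instead of a local dataimport.properties file via the propertyWriter element and ZKPropertiesWriter (worth verifying against the DIH documentation for your exact Solr version; recent cloud-mode versions pick this writer automatically):

```xml
<dataConfig>
  <propertyWriter type="ZKPropertiesWriter"/>
  <!-- dataSource / document / entity definitions as before -->
</dataConfig>
```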


Re: Stop zookeeper from batch

2013-09-18 Thread Prasi S
Yes, but it is not yet in ZooKeeper's latest releases. Is it fine to use it?


On Wed, Sep 18, 2013 at 2:39 AM, Furkan KAMACI furkankam...@gmail.com wrote:

 Are you looking for that:

 https://issues.apache.org/jira/browse/ZOOKEEPER-1122

 On Monday, 16 September 2013, Prasi S prasi1...@gmail.com wrote:
  Hi,
  We have setup solrcloud with zookeeper and 2 tomcats . we are using a
 batch
  file to start the zookeeper, uplink config files and start tomcats.
 
  Now, i need to stop zookeeper from the batch file. How is this possible.
 
  Im using Windows server. Zookeeper 3.4.5 version.
 
  Pls help.
 
  Thanks,
  Prasi
 



Re: Facets with empty values are displayed in output

2013-09-18 Thread Prasi S
How to filter them in the query itself?

Thanks,
Prasi


On Wed, Sep 18, 2013 at 1:06 PM, Upayavira u...@odoko.co.uk wrote:

 Filter them out in your query, or in your display code.

 Upayavira

 On Wed, Sep 18, 2013, at 06:36 AM, Prasi S wrote:
  Hi ,
  I'm using Solr 4.4 for our search. When I query for a keyword, it returns
  empty-valued facets in the response:
 
  <lst name="facet_counts">
    <lst name="facet_queries"/>
    <lst name="facet_fields">
      <lst name="Country">
        <int name="">1</int>
        <int name="USA">1</int>
      </lst>
    </lst>
    <lst name="facet_dates"/>
    <lst name="facet_ranges"/>
  </lst>
 
  I have also tried using the facet.missing parameter, but no change. How
  can we handle this?
 
 
  Thanks,
  Prasi
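One query-side option, assuming the empty bucket comes from documents that were indexed with an empty string in Country (the field name from this thread; host, port, and the rest of the URL are illustrative), is a negative filter query:

```
http://localhost:8983/solr/collection1/select?q=keyword&facet=true&facet.field=Country&fq=-Country:""
```

Note this also drops those documents from the result set, so if they should still match, the cleaner fix is to skip empty values at index time or hide the empty entry in display code, as Upayavira suggested.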



Re: Facets with empty values are displayed in output

2013-09-18 Thread Prasi S
No analysis is done on the facets. The facets are string fields.


On Wed, Sep 18, 2013 at 11:59 PM, tamanjit.bin...@yahoo.co.in 
tamanjit.bin...@yahoo.co.in wrote:

 Any analysis happening on the country field during indexing? If so then
 facets are on tokens.



 --
 View this message in context:
 http://lucene.472066.n3.nabble.com/FAcet-with-values-are-displayes-in-output-tp4090777p4090904.html
 Sent from the Solr - User mailing list archive at Nabble.com.



Facets with empty values are displayed in output

2013-09-17 Thread Prasi S
Hi ,
I'm using Solr 4.4 for our search. When I query for a keyword, it returns
empty-valued facets in the response:

<lst name="facet_counts">
  <lst name="facet_queries"/>
  <lst name="facet_fields">
    <lst name="Country">
      <int name="">1</int>
      <int name="USA">1</int>
    </lst>
  </lst>
  <lst name="facet_dates"/>
  <lst name="facet_ranges"/>
</lst>

I have also tried using the facet.missing parameter, but no change. How can
we handle this?


Thanks,
Prasi


Stop zookeeper from batch

2013-09-16 Thread Prasi S
Hi,
We have set up SolrCloud with ZooKeeper and 2 Tomcats. We are using a batch
file to start ZooKeeper, upload/link the config files, and start the Tomcats.

Now, I need to stop ZooKeeper from the batch file. How is this possible?

I'm using Windows Server, with ZooKeeper version 3.4.5.

Pls help.

Thanks,
Prasi
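For reference, zkServer.cmd in the 3.4.x line has no stop command, so a common Windows workaround is to kill the ZooKeeper JVM from the batch file itself. A sketch, assuming ZooKeeper is started in a window titled "zookeeper" (the title, path, and filter are examples, not from the thread):

```bat
REM Start ZooKeeper in a titled window so it can be found later
START "zookeeper" F:\SolrCloud\zookeeper\zk-server-1\zookeeper-3.4.5\bin\zkServer.cmd

REM ...later, stop it by killing that window's process tree
TASKKILL /FI "WINDOWTITLE eq zookeeper*" /T /F
```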


Solr PingQuery

2013-09-14 Thread Prasi S
Hi,
I use the SolrPingResponse.getStatus method before starting to index into
Solr. I use SolrCloud with an external ZooKeeper.

If I send the ping to ZooKeeper and ZooKeeper is down, it returns NOTOK.

But if one of my Solr instances is up and the second Solr is down, the ping
returns OK status.

Is this the expected behavior?


Thanks,
Prasi


Escaping *, ? in Solr

2013-09-13 Thread Prasi S
Hi,
I want to do a regex-like (wildcard) search in Solr.

E.g. Googl*. In my query API, I have used the ClientUtils.escapeQueryChars
function to escape characters special to Solr.

In the above case, a search for
1. Google - gives 677 records.
2. Googl* - Escaped as Googl\* in code- gives 12 results
3. When given q=Google* directly in the Browser - gives 677 records.

Which is correct if I want to achieve wildcard search (Googl*)? Should I
refrain from escaping * and ? in the code when handling wildcards?

Pls suggest.

Thanks,
Prasi.
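One way to keep user-entered wildcards working is to escape everything ClientUtils.escapeQueryChars would escape except * and ?. A sketch: the class and method names are our own, and the character list is an abridged version of what ClientUtils handles:

```java
// Hypothetical helper: escape Lucene query syntax characters but leave the
// wildcard characters * and ? untouched so wildcard queries still parse.
public class WildcardSafeEscaper {
    // Abridged set of characters that are special to the Lucene query parser.
    private static final String SPECIALS = "\\+-!():^[]\"{}~|&;/ ";

    public static String escapeKeepingWildcards(String s) {
        StringBuilder sb = new StringBuilder();
        for (char c : s.toCharArray()) {
            if (SPECIALS.indexOf(c) >= 0) sb.append('\\');
            sb.append(c);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(escapeKeepingWildcards("Googl*"));       // Googl*
        System.out.println(escapeKeepingWildcards("AT&T? (test)")); // AT\&T? \(test\)
    }
}
```

With this, Googl* reaches Solr unescaped and runs as a wildcard query, while genuinely special characters are still neutralized.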


Solr wildcard search

2013-09-13 Thread Prasi S
Hi all,
I am working with wildcard queries, and a few things are confusing.

1. Does a wildcard search bypass the analyzers on a particular field?

2. I have searched for
q=google\ technology - gives result
q=google technology - Gives results
q=google tech*   - gives results
q=google\ tech* - 0 results. The debugQuery output for the last query is
<str name="parsedquery_toString">text:google tech*</str>

Why does this happen?


Thanks,
Prasi


Re: number of replicas in Cloud

2013-09-12 Thread Prasi S
Hi Anshum,
I'm using Solr 4.4. Is there a problem with using a replicationFactor of 2?




On Thu, Sep 12, 2013 at 11:20 AM, Anshum Gupta ans...@anshumgupta.net wrote:

 Prasi, a replicationFactor of 2 is what you want. However, as of the
 current releases, this is not persisted.



 On Thu, Sep 12, 2013 at 11:17 AM, Prasi S prasi1...@gmail.com wrote:

  Hi,
  I want to setup solrcloud with 2 shards and 1 replica for each shard.
 
  MyCollection
 
  shard1 , shard2
  shard1-replica , shard2-replica
 
  In this case, I would set numShards=2. For replicationFactor, should I
  give replicationFactor=1 or replicationFactor=2?
 
 
  Pls suggest me.
 
  thanks,
  Prasi
 



 --

 Anshum Gupta
 http://www.anshumgupta.net



number of replicas in Cloud

2013-09-11 Thread Prasi S
Hi,
I want to setup solrcloud with 2 shards and 1 replica for each shard.

MyCollection

shard1 , shard2
shard1-replica , shard2-replica

In this case, I would set numShards=2. For replicationFactor, should I give
replicationFactor=1 or replicationFactor=2?


Pls suggest me.

thanks,
Prasi
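For reference, this layout is what a single Collections API CREATE call produces; host, port, configName, and the maxShardsPerNode value are examples, and on Solr 4.x replicationFactor counts the leader itself, so 2 means two copies of each shard in total:

```
http://localhost:8983/solr/admin/collections?action=CREATE&name=MyCollection&numShards=2&replicationFactor=2&maxShardsPerNode=2&collection.configName=myconf
```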


Does configuration change requires Zookeeper restart?

2013-09-09 Thread Prasi S
Hi,
I have SolrCloud with two collections. I have indexed 100 million docs into
the first collection.

I need to make some changes to the Solr configuration files. I'm going to
index the new data into the second collection. What steps should I follow?
Should I restart ZooKeeper?

Pls suggest


Thanks,
Prasi


Re: SolrCloud - Path must not end with / character

2013-09-03 Thread Prasi S
The issue is resolved. I had given all the paths inside Tomcat (solr home,
solr war) as relative paths. That was creating the problem.


On Mon, Sep 2, 2013 at 2:19 PM, Prasi S prasi1...@gmail.com wrote:

 Does this have anything to do with Tomcat? I cannot go back, as we have
 already settled on Tomcat.

 Any suggestions, please? The same setup, if I copy and run it on a different
 machine, works fine. I am not sure what is missing. Is it because of some
 system parameter getting set?


 On Fri, Aug 30, 2013 at 9:11 PM, Jared Griffith 
 jgriff...@picsauditing.com wrote:

 I was getting the same errors when trying to implement SolrCloud with
 Tomcat.  I eventually gave up until something came out of this thread.
 This all works if you just ditch Tomcat and go with the native Jetty
 server.


 On Fri, Aug 30, 2013 at 6:28 AM, Prasi S prasi1...@gmail.com wrote:

  Also, this fails with the default Solr 4.4 downloaded configuration too
 
 
  On Fri, Aug 30, 2013 at 4:19 PM, Prasi S prasi1...@gmail.com wrote:
 
   Below is the script i run
  
   START /MAX
   F:\SolrCloud\zookeeper\zk-server-1\zookeeper-3.4.5\bin\zkServer.cmd
  
  
   START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
   org.apache.solr.cloud.ZkCLI -cmd upconfig -zkhost localhost:2182
 -confdir
   solr-conf -confname solrconf1
  
  
  
   START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
   org.apache.solr.cloud.ZkCLI -cmd linkconfig -zkhost 127.0.0.1:2182
 -collection
  firstcollection -confname solrconf1 -solrhome ../tomcat1/solr1
  
  
  
   START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
   org.apache.solr.cloud.ZkCLI -cmd upconfig -zkhost localhost:2182
 -confdir
   solr-conf -confname solrconf2
  
  
  
  
   START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
   org.apache.solr.cloud.ZkCLI -cmd linkconfig -zkhost 127.0.0.1:2182
 -collection
  seccollection -confname solrconf2 -solrhome ../tomcat1/solr1
  
  
  
   START /MAX F:\solrcloud\tomcat1\bin\startup.bat
  
  
  
   START /MAX F:\solrcloud\tomcat2\bin\startup.bat
  
  
   On Fri, Aug 30, 2013 at 4:07 PM, Prasi S prasi1...@gmail.com wrote:
  
   Im still clueless on where the issue could be. There is no much
   information in the solr logs.
  
   i had a running version of cloud in another server. I have copied the
   same to this server, and started zookeeper, then ran teh below
 commands,
  
   java -classpath .;solr-lib/* org.apache.solr.cloud.ZkCLI -cmd
 upconfig
   -zkhost localhost:2181 -confdir solr-conf -confname solrconfindex
  
   java -classpath .;solr-lib/* org.apache.solr.cloud.ZkCLI -cmd
 linkconfig
   -zkhost 127.0.0.1:2181 -collection colindexer -confname
 solrconfindex
   -solrhome ../tomcat1/solr1
  
   After this, when i started tomcat, the first tomcat starts fine. When
  the
   second tomcat is started, i get the above exception and it stops.
 Tehn
  the
   first tomcat also shows teh same exception.
  
  
  
  
   On Thu, Aug 29, 2013 at 7:18 PM, Mark Miller markrmil...@gmail.com
  wrote:
  
   Yeah, you see this when the core could not be created. Check the
 logs
  to
   see if you can find something more useful.
  
   I ran into this again the other day - it's something we should fix.
 You
   see the same thing in the UI when a core cannot be created and it
  gives you
   no hint about the problem and is confusing.
  
   - Mark
  
   On Aug 29, 2013, at 5:23 AM, sathish_ix skandhasw...@inautix.co.in
 
   wrote:
  
Hi ,
   
Check your configuration files uploaded into zookeeper is valid
 and
  no
   error
in config files uploaded.
I think due to this error, solr core will not be created.
   
Thanks,
Sathish
   
   
   
--
View this message in context:
  
 
 http://lucene.472066.n3.nabble.com/SolrCloud-Path-must-not-end-with-character-tp4087159p4087182.html
Sent from the Solr - User mailing list archive at Nabble.com.
  
  
  
  
 



 --

 Jared Griffith
 Linux Administrator, PICS Auditing, LLC
 P: (949) 936-4574
 C: (909) 653-7814

 http://www.picsauditing.com

 17701 Cowan #140 | Irvine, CA | 92614

 Join PICS on LinkedIn and Twitter!

 https://twitter.com/PICSAuditingLLC





Re: SolrCloud - Path must not end with / character

2013-09-02 Thread Prasi S
Does this have anything to do with Tomcat? I cannot go back, as we have
already settled on Tomcat.

Any suggestions, please? The same setup, if I copy and run it on a different
machine, works fine. I am not sure what is missing. Is it because of some
system parameter getting set?


On Fri, Aug 30, 2013 at 9:11 PM, Jared Griffith
jgriff...@picsauditing.com wrote:

 I was getting the same errors when trying to implement SolrCloud with
 Tomcat.  I eventually gave up until something came out of this thread.
 This all works if you just ditch Tomcat and go with the native Jetty
 server.


 On Fri, Aug 30, 2013 at 6:28 AM, Prasi S prasi1...@gmail.com wrote:

  Also, this fails with the default Solr 4.4 downloaded configuration too
 
 
  On Fri, Aug 30, 2013 at 4:19 PM, Prasi S prasi1...@gmail.com wrote:
 
   Below is the script i run
  
   START /MAX
   F:\SolrCloud\zookeeper\zk-server-1\zookeeper-3.4.5\bin\zkServer.cmd
  
  
   START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
   org.apache.solr.cloud.ZkCLI -cmd upconfig -zkhost localhost:2182
 -confdir
   solr-conf -confname solrconf1
  
  
  
   START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
   org.apache.solr.cloud.ZkCLI -cmd linkconfig -zkhost 127.0.0.1:2182
 -collection
  firstcollection -confname solrconf1 -solrhome ../tomcat1/solr1
  
  
  
   START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
   org.apache.solr.cloud.ZkCLI -cmd upconfig -zkhost localhost:2182
 -confdir
   solr-conf -confname solrconf2
  
  
  
  
   START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
   org.apache.solr.cloud.ZkCLI -cmd linkconfig -zkhost 127.0.0.1:2182
 -collection
  seccollection -confname solrconf2 -solrhome ../tomcat1/solr1
  
  
  
   START /MAX F:\solrcloud\tomcat1\bin\startup.bat
  
  
  
   START /MAX F:\solrcloud\tomcat2\bin\startup.bat
  
  
   On Fri, Aug 30, 2013 at 4:07 PM, Prasi S prasi1...@gmail.com wrote:
  
   Im still clueless on where the issue could be. There is no much
   information in the solr logs.
  
   i had a running version of cloud in another server. I have copied the
   same to this server, and started zookeeper, then ran teh below
 commands,
  
   java -classpath .;solr-lib/* org.apache.solr.cloud.ZkCLI -cmd upconfig
   -zkhost localhost:2181 -confdir solr-conf -confname solrconfindex
  
   java -classpath .;solr-lib/* org.apache.solr.cloud.ZkCLI -cmd
 linkconfig
   -zkhost 127.0.0.1:2181 -collection colindexer -confname solrconfindex
   -solrhome ../tomcat1/solr1
  
   After this, when i started tomcat, the first tomcat starts fine. When
  the
   second tomcat is started, i get the above exception and it stops. Tehn
  the
   first tomcat also shows teh same exception.
  
  
  
  
   On Thu, Aug 29, 2013 at 7:18 PM, Mark Miller markrmil...@gmail.com
  wrote:
  
   Yeah, you see this when the core could not be created. Check the logs
  to
   see if you can find something more useful.
  
   I ran into this again the other day - it's something we should fix.
 You
   see the same thing in the UI when a core cannot be created and it
  gives you
   no hint about the problem and is confusing.
  
   - Mark
  
   On Aug 29, 2013, at 5:23 AM, sathish_ix skandhasw...@inautix.co.in
   wrote:
  
Hi ,
   
Check your configuration files uploaded into zookeeper is valid and
  no
   error
in config files uploaded.
I think due to this error, solr core will not be created.
   
Thanks,
Sathish
   
   
   
--
View this message in context:
  
 
 http://lucene.472066.n3.nabble.com/SolrCloud-Path-must-not-end-with-character-tp4087159p4087182.html
Sent from the Solr - User mailing list archive at Nabble.com.
  
  
  
  
 



 --

 Jared Griffith
 Linux Administrator, PICS Auditing, LLC
 P: (949) 936-4574
 C: (909) 653-7814

 http://www.picsauditing.com

 17701 Cowan #140 | Irvine, CA | 92614

 Join PICS on LinkedIn and Twitter!

 https://twitter.com/PICSAuditingLLC



Re: SolrCloud - Path must not end with / character

2013-08-30 Thread Prasi S
I'm still clueless about where the issue could be. There is not much
information in the Solr logs.

I had a running version of cloud on another server. I copied the same setup
to this server, started ZooKeeper, then ran the below commands:

java -classpath .;solr-lib/* org.apache.solr.cloud.ZkCLI -cmd upconfig
-zkhost localhost:2181 -confdir solr-conf -confname solrconfindex

java -classpath .;solr-lib/* org.apache.solr.cloud.ZkCLI -cmd linkconfig
-zkhost 127.0.0.1:2181 -collection colindexer -confname solrconfindex
-solrhome ../tomcat1/solr1

After this, when I started Tomcat, the first Tomcat starts fine. When the
second Tomcat is started, I get the above exception and it stops. Then the
first Tomcat also shows the same exception.




On Thu, Aug 29, 2013 at 7:18 PM, Mark Miller markrmil...@gmail.com wrote:

 Yeah, you see this when the core could not be created. Check the logs to
 see if you can find something more useful.

 I ran into this again the other day - it's something we should fix. You
 see the same thing in the UI when a core cannot be created and it gives you
 no hint about the problem and is confusing.

 - Mark

 On Aug 29, 2013, at 5:23 AM, sathish_ix skandhasw...@inautix.co.in
 wrote:

  Hi ,
 
  Check your configuration files uploaded into zookeeper is valid and no
 error
  in config files uploaded.
  I think due to this error, solr core will not be created.
 
  Thanks,
  Sathish
 
 
 
  --
  View this message in context:
 http://lucene.472066.n3.nabble.com/SolrCloud-Path-must-not-end-with-character-tp4087159p4087182.html
  Sent from the Solr - User mailing list archive at Nabble.com.




Re: SolrCloud - Path must not end with / character

2013-08-30 Thread Prasi S
Below is the script i run

START /MAX
F:\SolrCloud\zookeeper\zk-server-1\zookeeper-3.4.5\bin\zkServer.cmd


START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
org.apache.solr.cloud.ZkCLI -cmd upconfig -zkhost localhost:2182 -confdir
solr-conf -confname solrconf1



START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
org.apache.solr.cloud.ZkCLI -cmd linkconfig -zkhost
127.0.0.1:2182 -collection firstcollection -confname solrconf1
-solrhome ../tomcat1/solr1



START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
org.apache.solr.cloud.ZkCLI -cmd upconfig -zkhost localhost:2182 -confdir
solr-conf -confname solrconf2




START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
org.apache.solr.cloud.ZkCLI -cmd linkconfig -zkhost
127.0.0.1:2182 -collection seccollection -confname solrconf2 -solrhome
../tomcat1/solr1



START /MAX F:\solrcloud\tomcat1\bin\startup.bat



START /MAX F:\solrcloud\tomcat2\bin\startup.bat


On Fri, Aug 30, 2013 at 4:07 PM, Prasi S prasi1...@gmail.com wrote:

 Im still clueless on where the issue could be. There is no much
 information in the solr logs.

 i had a running version of cloud in another server. I have copied the same
 to this server, and started zookeeper, then ran teh below commands,

 java -classpath .;solr-lib/* org.apache.solr.cloud.ZkCLI -cmd upconfig
 -zkhost localhost:2181 -confdir solr-conf -confname solrconfindex

 java -classpath .;solr-lib/* org.apache.solr.cloud.ZkCLI -cmd linkconfig
 -zkhost 127.0.0.1:2181 -collection colindexer -confname solrconfindex
 -solrhome ../tomcat1/solr1

 After this, when i started tomcat, the first tomcat starts fine. When the
 second tomcat is started, i get the above exception and it stops. Tehn the
 first tomcat also shows teh same exception.




 On Thu, Aug 29, 2013 at 7:18 PM, Mark Miller markrmil...@gmail.com wrote:

 Yeah, you see this when the core could not be created. Check the logs to
 see if you can find something more useful.

 I ran into this again the other day - it's something we should fix. You
 see the same thing in the UI when a core cannot be created and it gives you
 no hint about the problem and is confusing.

 - Mark

 On Aug 29, 2013, at 5:23 AM, sathish_ix skandhasw...@inautix.co.in
 wrote:

  Hi ,
 
  Check your configuration files uploaded into zookeeper is valid and no
 error
  in config files uploaded.
  I think due to this error, solr core will not be created.
 
  Thanks,
  Sathish
 
 
 
  --
  View this message in context:
 http://lucene.472066.n3.nabble.com/SolrCloud-Path-must-not-end-with-character-tp4087159p4087182.html
  Sent from the Solr - User mailing list archive at Nabble.com.





Re: SolrCloud - Path must not end with / character

2013-08-30 Thread Prasi S
Also, this fails with the default Solr 4.4 downloaded configuration too.


On Fri, Aug 30, 2013 at 4:19 PM, Prasi S prasi1...@gmail.com wrote:

 Below is the script i run

 START /MAX
 F:\SolrCloud\zookeeper\zk-server-1\zookeeper-3.4.5\bin\zkServer.cmd


 START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
 org.apache.solr.cloud.ZkCLI -cmd upconfig -zkhost localhost:2182 -confdir
 solr-conf -confname solrconf1



 START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
 org.apache.solr.cloud.ZkCLI -cmd linkconfig -zkhost 127.0.0.1:2182 -collection
 firstcollection -confname solrconf1 -solrhome ../tomcat1/solr1



 START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
 org.apache.solr.cloud.ZkCLI -cmd upconfig -zkhost localhost:2182 -confdir
 solr-conf -confname solrconf2




 START /MAX F:\solrcloud\zookeeper java -classpath .;solr-lib/*
 org.apache.solr.cloud.ZkCLI -cmd linkconfig -zkhost 127.0.0.1:2182 -collection
 seccollection -confname solrconf2 -solrhome ../tomcat1/solr1



 START /MAX F:\solrcloud\tomcat1\bin\startup.bat



 START /MAX F:\solrcloud\tomcat2\bin\startup.bat


 On Fri, Aug 30, 2013 at 4:07 PM, Prasi S prasi1...@gmail.com wrote:

 Im still clueless on where the issue could be. There is no much
 information in the solr logs.

 i had a running version of cloud in another server. I have copied the
 same to this server, and started zookeeper, then ran teh below commands,

 java -classpath .;solr-lib/* org.apache.solr.cloud.ZkCLI -cmd upconfig
 -zkhost localhost:2181 -confdir solr-conf -confname solrconfindex

 java -classpath .;solr-lib/* org.apache.solr.cloud.ZkCLI -cmd linkconfig
 -zkhost 127.0.0.1:2181 -collection colindexer -confname solrconfindex
 -solrhome ../tomcat1/solr1

 After this, when i started tomcat, the first tomcat starts fine. When the
 second tomcat is started, i get the above exception and it stops. Tehn the
 first tomcat also shows teh same exception.




 On Thu, Aug 29, 2013 at 7:18 PM, Mark Miller markrmil...@gmail.com wrote:

 Yeah, you see this when the core could not be created. Check the logs to
 see if you can find something more useful.

 I ran into this again the other day - it's something we should fix. You
 see the same thing in the UI when a core cannot be created and it gives you
 no hint about the problem and is confusing.

 - Mark

 On Aug 29, 2013, at 5:23 AM, sathish_ix skandhasw...@inautix.co.in
 wrote:

  Hi ,
 
  Check your configuration files uploaded into zookeeper is valid and no
 error
  in config files uploaded.
  I think due to this error, solr core will not be created.
 
  Thanks,
  Sathish
 
 
 
  --
  View this message in context:
 http://lucene.472066.n3.nabble.com/SolrCloud-Path-must-not-end-with-character-tp4087159p4087182.html
  Sent from the Solr - User mailing list archive at Nabble.com.






SolrCloud - Path must not end with / character

2013-08-29 Thread Prasi S
Hi ,
I have set up SolrCloud with Solr 4.4. It has two Tomcats, with 2 Solr
instances (one in each Tomcat).
I start ZooKeeper and run the commands for linking the configuration
files with ZooKeeper.

After that, when I start Tomcat, I get the below exception:

*Exception in Overseer main queue loop java.lang.IllegalArgumentException:
Path must not end with / character
*


*Full Trace:*
INFO  - 2013-08-29 12:52:30.368;
org.apache.solr.common.cloud.ZkStateReader; Updating cloud state from
ZooKeeper...
ERROR - 2013-08-29 12:52:30.370;
org.apache.solr.cloud.Overseer$ClusterStateUpdater; Exception in Overseer
main queue loop
java.lang.IllegalArgumentException: Path must not end with / character
at org.apache.zookeeper.common.PathUtils.validatePath(PathUtils.java:58)
at org.apache.zookeeper.ZooKeeper.getChildren(ZooKeeper.java:1450)
at
org.apache.solr.common.cloud.SolrZkClient$6.execute(SolrZkClient.java:235)
at
org.apache.solr.common.cloud.SolrZkClient$6.execute(SolrZkClient.java:232)
at
org.apache.solr.common.cloud.ZkCmdExecutor.retryOperation(ZkCmdExecutor.java:65)
at
org.apache.solr.common.cloud.SolrZkClient.getChildren(SolrZkClient.java:232)
at org.apache.solr.common.cloud.SolrZkClient.clean(SolrZkClient.java:618)
at
org.apache.solr.cloud.Overseer$ClusterStateUpdater.removeCore(Overseer.java:640)
at
org.apache.solr.cloud.Overseer$ClusterStateUpdater.processMessage(Overseer.java:182)
at org.apache.solr.cloud.Overseer$ClusterStateUpdater.run(Overseer.java:142)
at java.lang.Thread.run(Thread.java:662)


Thanks,
Prasi


Solr 4.0 - Fuzzy query and Proximity query

2013-08-28 Thread Prasi S
Hi,
With Solr 4.0, the fuzzy query syntax is like keyword~1 (or 2).
Proximity search is like value~20.

How does Solr differentiate between the two searches? My thought was
proximity would be on phrases and fuzzy on individual words. Is that
correct?

I wanted to do a proximity search on a text field and gave the below
query:
ip:port/collection1/select?q=trinity%20service~50&debugQuery=yes

It gives me results such as:

<result name="response" numFound="111" start="0" maxScore="4.1237307">
  <doc>
    <str name="business_name">Trinidad Services</str>
  </doc>
  <doc>
    <str name="business_name">Trinity Services</str>
  </doc>
  <doc>
    <str name="business_name">Trinity Services</str>
  </doc>
  <doc>
    <str name="business_name">Trinitee Service</str>
  </doc>
  ...

How to differentiate between fuzzy and proximity?


Thanks,
Prasi
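For reference, in the classic Lucene query syntax the parser distinguishes the two by what the ~ is attached to; the queries below are illustrative:

```
q=trinity~2                  fuzzy: ~ on a bare term = max edit distance
q="trinity services"~50      proximity: ~ on a quoted phrase = slop in positions
q=trinity service~50         not a proximity query: the ~50 attaches only to
                             the unquoted term "service"
```

Fuzzy matching in this syntax applies to single terms only; fuzzy inside phrases is not supported by the classic parser.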


Re: Solr 4.0 - Fuzzy query and Proximity query

2013-08-28 Thread Prasi S
hi Erick,
Yes, it is correct. These results are because of stemming + phonetic
matching. Below is the analysis output.

Index time:
  ST:  trinity | services
  SF:  trinity | services
  LCF: trinity | services
  SF:  trinity | services
  SF:  trinity | services
  WDF: trinity | services

Query time:
  SF:  triniti | servic
  PF:  TRNT triniti | SRFK servic
  HWF: TRNT triniti | SRFK servic
  PSF: TRNT triniti | SRFK servic
Apart from this, fuzzy would be for individual words and proximity would be
for phrases. Is this correct?
Also, can we have fuzzy on phrases?



On Wed, Aug 28, 2013 at 5:36 PM, Erick Erickson erickerick...@gmail.com wrote:

 The first thing I'd recommend is to look at the admin/analysis
 page. I suspect you aren't seeing fuzzy query results
 at all, what you're seeing is the result of stemming.

 Stemming is algorithmic, so sometimes produces very
 surprising results, i.e. Trinidad and Trinigee may stem
 to something like triniti.

 But you didn't provide the field definition so it's just a guess.

 Best
 Erick


 On Wed, Aug 28, 2013 at 7:43 AM, Prasi S prasi1...@gmail.com wrote:

  Hi,
  with solr 4.0 the fuzzy query syntax is like  keyword~1 (or 2)
  Proximity search is like value~20.
 
  How does this differentiate between the two searches. My thought was
  promiximity would be on phrases and fuzzy on individual words. Is that
  correct?
 
  I wasnted to do a promiximity search for text field and gave the below
  query,
  ip:port/collection1/select?q=trinity%20service~50debugQuery=yes,
 
  it gives me results as
 
  result name=response numFound=111 start=0 maxScore=4.1237307
  doc
  str name=business_name*Trinidad *Services/str
  /doc
  doc
  str name=business_nameTrinity Services/str
  /doc
  doc
  str name=business_nameTrinity Services/str
  /doc
  doc
  str name=business_name*Trinitee *Service/str
 
  How to differentiate between fuzzy and proximity.
 
 
  Thanks,
  Prasi
 



Re: Solr 4.0 - Fuzzy query and Proximity query

2013-08-28 Thread Prasi S
Sorry, I copied it wrong. Below is the correct analysis.

Index time:
  ST:  trinity | services
  SF:  trinity | services
  LCF: trinity | services
  SF:  trinity | services
  SF:  trinity | services
  WDF: trinity | services
  SF:  triniti | servic
  PF:  TRNT triniti | SRFK servic
  HWF: TRNT triniti | SRFK servic
  PSF: TRNT triniti | SRFK servic

Query time:
  ST:  trinity | services
  SF:  trinity | services
  LCF: trinity | services
  WDF: trinity | services
  SF:  triniti | servic
  PSF: triniti | servic
  PF:  TRNT triniti | SRFK servic

Apart from this, fuzzy would be for individual words and proximity would be
for phrases. Is this correct?
Also, can we have fuzzy on phrases?


On Wed, Aug 28, 2013 at 5:58 PM, Prasi S prasi1...@gmail.com wrote:

 hi Erick,
 Yes it is correct. These results are because of stemming + phonetic
 matching. Below is the

 Index time

  ST:  trinity | services
  SF:  trinity | services
  LCF: trinity | services
  SF:  trinity | services
  SF:  trinity | services
  WDF: trinity | services

 Query time:
  SF:  triniti | servic
  PF:  TRNT triniti | SRFK servic
  HWF: TRNT triniti | SRFK servic
  PSF: TRNT triniti | SRFK servic
 Apart from this, fuzzy would be for individual words and proximity would be
 for phrases. Is this correct?
 Also, can we have fuzzy on phrases?



 On Wed, Aug 28, 2013 at 5:36 PM, Erick Erickson 
  erickerick...@gmail.com wrote:

 The first thing I'd recommend is to look at the admin/analysis
 page. I suspect you aren't seeing fuzzy query results
 at all, what you're seeing is the result of stemming.

 Stemming is algorithmic, so sometimes produces very
 surprising results, i.e. Trinidad and Trinigee may stem
 to something like triniti.

 But you didn't provide the field definition so it's just a guess.

 Best
 Erick


 On Wed, Aug 28, 2013 at 7:43 AM, Prasi S prasi1...@gmail.com wrote:

  Hi,
  with solr 4.0 the fuzzy query syntax is like  keyword~1 (or 2)
  Proximity search is like value~20.
 
  How does this differentiate between the two searches. My thought was
  promiximity would be on phrases and fuzzy on individual words. Is that
  correct?
 
  I wasnted to do a promiximity search for text field and gave the below
  query,
  ip:port/collection1/select?q=trinity%20service~50debugQuery=yes,
 
  it gives me results as
 
  result name=response numFound=111 start=0 maxScore=4.1237307
  doc
  str name=business_name*Trinidad *Services/str
  /doc
  doc
  str name=business_nameTrinity Services/str
  /doc
  doc
  str name=business_nameTrinity Services/str
  /doc
  doc
  str name=business_name*Trinitee *Service/str
 
  How to differentiate between fuzzy and proximity.
 
 
  Thanks,
  Prasi
 





Indexing status when one tomcat goes down

2013-08-23 Thread Prasi S
Hi all,
I'm running SolrCloud with Solr 4.4. I have 2 Tomcat instances with 4
shards (2 in each).

What will happen if one of the Tomcats goes down during indexing? The other
Tomcat logs the status "Leader not active".

Regards,
Prasi


Re: Solr Indexing Status

2013-08-22 Thread Prasi S
I am not using DIH for indexing the CSV files; I'm pushing data through
SolrJ code. But I want a status like what DIH gives, i.e. fire a
command=status and get the response. Is anything like that available for
any type of file indexing done through the API?


On Thu, Aug 22, 2013 at 12:09 AM, Shalin Shekhar Mangar 
shalinman...@gmail.com wrote:

 Yes, you can invoke
 http://host:port/solr/dataimport?command=status which will return
 how many Solr docs have been added etc.

 On Wed, Aug 21, 2013 at 4:56 PM, Prasi S prasi1...@gmail.com wrote:
  Hi,
  I am using solr 4.4 to index csv files. I am using solrj for this. At
  frequent intervels my user may request for Status. I have to send get
  something like in DIH  Indexing in progress.. Added xxx documents.
 
  Is there anything like in dih, where we can fire a command=status to get
  the status of indexing for files.
 
 
  Thanks,
  Prasi



 --
 Regards,
 Shalin Shekhar Mangar.



DIH not proceeding after few millions

2013-08-22 Thread Prasi S
Hi, I'm using DIH to index data into Solr (version 4.4). Indexing proceeds
normally in the beginning.

I have some 10 data-config files.

file1 - select * from table where id between 1 and 100

file2 - select * from table where id between 100 and 300. and so
on.

Here 4 batches go normally. For the fifth batch, I get the status from the
Admin page (Dataimport) as:

*Duration: 2 hrs*
Indexed: 0 documents; deleted: 0 documents.

And indexing stops, but no documents were indexed. I use a single external
ZooKeeper for this.

I don't see any exception in the Solr logs; in ZooKeeper, below is the status:

INFO  [ProcessThread(sid:0 cport:-1)::PrepRequestProcessor@627] - Got
user-level KeeperException when processing sessionid:0x140a4ce824b0005
type:create cxid:0x29a zxid:0x157d txntype:-1 reqpath:n/a Error P

Any ideas?


Re: Solr Indexing Status

2013-08-22 Thread Prasi S
Thanks much. This was useful.


On Thu, Aug 22, 2013 at 2:24 PM, Shalin Shekhar Mangar 
shalinman...@gmail.com wrote:

 You can use the /admin/mbeans handler to get all system stats. You can
 find stats such as adds and cumulative_adds under the update
 handler section.

 http://localhost:8983/solr/collection1/admin/mbeans?stats=true

 On Thu, Aug 22, 2013 at 12:35 PM, Prasi S prasi1...@gmail.com wrote:
  I am not using dih for indexing csv files. Im pushing data through solrj
  code. But i want a status something like what dih gives. ie. fire a
  command=status and we get the response. Is anythin like that available
 for
  any type of file indexing which we do through api ?
 
 
  On Thu, Aug 22, 2013 at 12:09 AM, Shalin Shekhar Mangar 
  shalinman...@gmail.com wrote:
 
  Yes, you can invoke
  http://host:port/solr/dataimport?command=status which will return
  how many Solr docs have been added etc.
 
  On Wed, Aug 21, 2013 at 4:56 PM, Prasi S prasi1...@gmail.com wrote:
   Hi,
   I am using solr 4.4 to index csv files. I am using solrj for this. At
   frequent intervels my user may request for Status. I have to send
 get
   something like in DIH  Indexing in progress.. Added xxx documents.
  
   Is there anything like in dih, where we can fire a command=status to
 get
   the status of indexing for files.
  
  
   Thanks,
   Prasi
 
 
 
  --
  Regards,
  Shalin Shekhar Mangar.
 



 --
 Regards,
 Shalin Shekhar Mangar.
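
For anyone wiring this up in code: a minimal sketch of pulling the
update-handler counters out of an /admin/mbeans?stats=true&wt=json response.
The sample payload below is hypothetical (only its shape matters); with
wt=json, Solr 4.x returns "solr-mbeans" as a flat
[category, payload, category, payload, ...] list.

```python
import json

# Hypothetical fragment of an /admin/mbeans?stats=true&wt=json response.
sample = json.loads("""
{
  "solr-mbeans": [
    "UPDATEHANDLER",
    {
      "updateHandler": {
        "stats": {"adds": 120, "cumulative_adds": 4520}
      }
    }
  ]
}
""")

def update_stats(mbeans_response):
    """Extract update-handler stats; "solr-mbeans" arrives as a flat
    [category, payload, ...] list, so walk it in pairs."""
    beans = mbeans_response["solr-mbeans"]
    for category, payload in zip(beans[::2], beans[1::2]):
        if category == "UPDATEHANDLER":
            for bean in payload.values():
                return bean.get("stats", {})
    return {}

stats = update_stats(sample)
print(stats["adds"], stats["cumulative_adds"])
```

Polling this between batches gives the "added xxx documents" style progress
the thread asks about.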



Solr Indexing Status

2013-08-21 Thread Prasi S
Hi,
I am using Solr 4.4 to index CSV files. I am using SolrJ for this. At
frequent intervals my user may request for status. I have to get
something like in DIH: "Indexing in progress.. Added xxx documents."

Is there anything like in dih, where we can fire a command=status to get
the status of indexing for files.


Thanks,
Prasi


Solr 4.4 Query ignoring null values

2013-08-20 Thread Prasi S
Hi,
I have a few questions with Solr 4.4 query parser

1. http://localhost:8180/solr/collection1/select?q=business_name:Catherine
AND city:""&debugQuery=yes
  - 100 results
In this query, I can see the city has no value and Solr is automatically
omitting it. Was this feature included in previous Solr versions as well
(like Solr 3.5 or earlier), or is it new to Solr 4.4?

2. http://localhost:8180/solr/collection1/select?q=business_name:Catherine&fq=city:Burien
 - 43 results

This is normal.

3. http://localhost:8180/solr/collection1/select?q=business_name:Catherine&fq=city:""
  - 0 Results. If the first query works, why not the third?


Thanks,
Prasi


Re: Solr 4.4 Query ignoring null values

2013-08-20 Thread Prasi S
Hi Hoss,
Below is the debug output for query 1. We have values for physical_city.
It is not an empty-valued field. When I said "no value", I meant that the
query does not have a value.

<lst name="debug">
  <str name="rawquerystring">business_name:Catherine AND physical_city:""</str>
  <str name="querystring">business_name:Catherine AND physical_city:""</str>
  <str name="parsedquery">(business_name:K0RN business_name:catherin)/no_coord</str>
  <str name="parsedquery_toString">business_name:K0RN business_name:catherin</str>
</lst>


Here, the query parser has skipped physical_city, which had an empty value
in the query. Does this behavior hold for earlier versions of Solr (like
Solr 3.5 or earlier)?



On Tue, Aug 20, 2013 at 10:16 PM, Chris Hostetter
hossman_luc...@fucit.orgwrote:


 : Subject: Solr 4.4 Query ignoring null values

 First off: it's important that you understand there is no such thing as a
 null value in a Solr index -- the concept does not exist.  You can have
 documents that do not contain a value for a field, and you can have
 documents that contain the empty string ("") as an indexed value, but there
 is no such thing as querying for documents that have "null" indexed.

 : 1.
 http://localhost:8180/solr/collection1/select?q=business_name:Catherine
 : AND city:""&debugQuery=yes
 :   - 100 results
 : In this query , I can see the city has no value and solr is automatically
 : omitting it. Does this feature included in the previous solr versions
 also

 You haven't shown us the output, so we can only guess as to what you mean
 by the city has no value (no value where? in the query string? in the
 debug output? in the stored fields of the results?)

 I suspect that what's happening is this...

  * your city field is a TextField with an analyzer that produces no tokens
 for the input string ""
  * that means that the city clause of your query is a No-Op and the 100
 results you get are just the 100 docs that match business_name:Catherine

 : 3.  http://localhost:8180/solr/collection1/select?q=business_name:Catherine&fq=city:""
 :   - 0 Results. If the first query works, why not the third?

 In the first query, you have a No-Op 2nd clause -- not just a clause that
 doesn't match anything, but a clause that actually doesn't *mean* anything
 because the empty string doesn't produce any tokens from your analyzer, so
 it gets dropped by the query parser because there is no query to build
 from it -- but there is still the 1st clause which has meaning -- so as
 a result it is possible for documents to match that query, and 100
 documents do match.

 In your 3rd URL you have a query (it happens to be an fq (filter query)
 but that's not important to the point) consisting of a single clause which
 is a No-Op -- so the resulting query is incapable of matching any
 documents.

 ...

 If you want to be able to index documents containing the empty string ("")
 as a value, and then search for documents containing that value -- you
 need to use a field type / analyzer that respects and preserves the empty
 string as a legal field value.

 If however you want to query (or filter) against documents that have a
 value, you should either:

    * index a special boolean field "has_city"
    * filter on a range query over all values, ie...
        fq=city:[* TO *]
        fq=-city:[* TO *]


 -Hoss
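
A note on using Hoss's two filters in practice: the brackets, asterisks and
colon must be URL-escaped when the query goes on a URL. A sketch (host and
field names are just the ones from this thread):

```python
from urllib.parse import urlencode

base = "http://localhost:8180/solr/collection1/select"

# Documents that HAVE some value in city:
has_value = urlencode({"q": "business_name:Catherine", "fq": "city:[* TO *]"})

# Documents that have NO value in city (note the leading minus):
no_value = urlencode({"q": "business_name:Catherine", "fq": "-city:[* TO *]"})

print(base + "?" + has_value)
print(base + "?" + no_value)
```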



Solr Filter Query

2013-08-20 Thread Prasi S
Hi,
Is there any limit on how big a filter query can be?
What are the values that should be set properly for handling big filter
queries?


thanks,
Prasi
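
One practical aside (not from the thread): a filter query itself has no fixed
size limit in Solr, but on a GET request it is bounded by the servlet
container's URL/header size limit (commonly a few KB by default, tunable in
Tomcat/Jetty). Sending the parameters as a POST body sidesteps that. A sketch
with a placeholder URL and a deliberately huge fq:

```python
from urllib.parse import urlencode
from urllib.request import Request

# A deliberately huge filter query, e.g. thousands of OR'ed ids.
big_fq = " OR ".join("id:%d" % i for i in range(1000))

body = urlencode({"q": "*:*", "fq": big_fq}).encode("ascii")

req = Request(
    "http://localhost:8983/solr/collection1/select",  # placeholder URL
    data=body,  # supplying data turns the request into a POST
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
# urllib.request.urlopen(req) would run it against a live server.
print(req.get_method(), len(body))
```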


Giving OpenSearcher as false

2013-08-19 Thread Prasi S
Hi,
1. What is the impact / use of setting openSearcher to true?

 <autoCommit>
   <maxTime>${solr.autoCommit.maxTime:15000}</maxTime>
   <openSearcher>true</openSearcher>
 </autoCommit>

2. If the value is given as false, does this create the index in a temp file
and then commit?


Regards,
Prasi
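
For context (a sketch of common practice, not a quote from the thread): a hard
autoCommit with openSearcher=false still writes to the live index (flushing
the segments for durability, nothing goes to a separate temp file); it only
skips the expensive step of opening a new searcher. Visibility of new
documents is then handled by a soft commit:

```xml
<!-- Hard commit: durability only; does not open a new searcher. -->
<autoCommit>
  <maxTime>${solr.autoCommit.maxTime:15000}</maxTime>
  <openSearcher>false</openSearcher>
</autoCommit>

<!-- Soft commit: makes recently added documents searchable. -->
<autoSoftCommit>
  <maxTime>${solr.autoSoftCommit.maxTime:1000}</maxTime>
</autoSoftCommit>
```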


SolrCloud Zookeeper Exception

2013-08-19 Thread Prasi S
Hi,
I have setup SolrCloud with version 4.4. There is one external Zookeeper
and two instances of Solr (4 shards total - 2 shards in each instance).

I was using DIH to index from SQL Server. It was indexing fine initially.
Later when I shut down Solr and the Zookeepers and then restarted them, I get
the below info in the Zookeeper command prompt,

2013-08-19 05:23:45,257 [myid:] - INFO  [ProcessThread(sid:0
cport:-1)::PrepRequestProcessor@627] - Got user-level KeeperException when
processing sessionid:0x1409617c5750005 type:create cxid:0x29a zxid:0x842
txntype:-1 reqpath:n/a Error Path:/overseer Error:KeeperErrorCode =
NodeExists for /overseer


When I run Solr indexing in DIH, it is running but at the end it gives
"Added 0 documents". I have checked the DB. It has results.

What could be the problem?

Regards,
Prasi


Setting hostPort in System properties

2013-08-13 Thread Prasi S
Hi,
When I set the Solr hostPort in Tomcat system properties, it is not working.
If I specify it in solr.xml then it works. Is it mandatory that hostPort
should be set only in solr.xml?

Solr.xml setting:

<solr>

  <solrcloud>
    <str name="host">${host:}</str>
    <int name="hostPort">${port:}</int>

Tomcat runtime setting:
set JAVA_OPTS=-Dprogram.name=%PROGNAME%
-Dlogging.configuration=file:%DIRNAME%logging.properties -DhostContext=solr
-Dhost=10.239.30.27 -DjhostPort=8080

Thanks,
Prasi


Re: Solr 4.4 Cloud always indexing to only one shard

2013-08-13 Thread Prasi S
I create a collection prior to tomcat startup.

--java -classpath .;zoo-lib/* org.apache.solr.cloud.ZkCLI -cmd upconfig
-zkhost localhost:2181 -confdir solr-conf -confname solrconf1

--java -classpath .;zoo-lib/* org.apache.solr.cloud.ZkCLI -cmd linkconfig
-zkhost 127.0.0.1:2181 -collection firstcollection -confname solrconf1
-solrhome ../solr_instances/solr

1. Start Zookeeper server
2. Link the configuration to the collection
3. Check those in ZooClient
4. Start tomcats
5. Create cores and assign to collections.

http://localhost:8080/solr/admin/cores?action=CREATE&name=mycore_sh1&collection=firstcollection&shard=shard1

Are these ok or am I making a mistake?


On Mon, Aug 12, 2013 at 6:49 PM, Erick Erickson erickerick...@gmail.comwrote:

 Why are you using the core creation commands rather than the
 collection commands? The latter are intended for SolrCloud...

 Best
 Erick


 On Mon, Aug 12, 2013 at 4:51 AM, Prasi S prasi1...@gmail.com wrote:

  Hi,
  I have setup solrcloud in solr 4.4, with 2 solr's in 2 tomcat servers and
  Zookeeper.
 
   I setup Zookeeper with a collection firstcollection and then i give the
   below command
 
 
 
  http://localhost:8080/solr/admin/cores?action=CREATE&name=mycore_sh1&collection=firstcollection&shard=shard1
 
  Similarly, i create 4 shards. 2 shards in the first instance and two
 shards
  in the second instance.
 
  When i index files to
 
 http://localhost:8080/solr/firstcollection/dataimport?command=full-import,
  the data always gets indexed to the shard1.
 
  There are no documents in shard2, 3 ,4. I checked this with
 
  http://localhost:8080/solr/firstcollection/select?q=*:*&fl=[shard]
 
  But searching across any of the two gives full results. Is this a problem
  with the 4.4 version?

  A similar scenario I have tested in solr 4.0 and it was working fine.
 
  Pls help.
 



Solr 4.4 Cloud always indexing to only one shard

2013-08-12 Thread Prasi S
Hi,
I have setup solrcloud in solr 4.4, with 2 solr's in 2 tomcat servers and
Zookeeper.

I setup Zookeeper with a collection firstcollection and then i give the
below command

http://localhost:8080/solr/admin/cores?action=CREATE&name=mycore_sh1&collection=firstcollection&shard=shard1

Similarly, i create 4 shards. 2 shards in the first instance and two shards
in the second instance.

When i index files to
http://localhost:8080/solr/firstcollection/dataimport?command=full-import,
the data always gets indexed to the shard1.

There are no documents in shard2, 3 ,4. I checked this with

http://localhost:8080/solr/firstcollection/select?q=*:*&fl=[shard]

But searching across any of the two gives full results. Is this a problem
with the 4.4 version?

A similar scenario I have tested in solr 4.0 and it was working fine.

Pls help.


Solr 4.4 Default shard

2013-08-08 Thread Prasi S
I have setup Solr 4.4 with cloud and have created two cores mycore_shard1,
mycore_shard2. I have a few questions here:


1. Once the setup is ready, I could see a default collection "collection1"
with shard1 in the Admin > Cloud page. How do I remove it? I have deleted
the core.properties file (/solr/collection/core.properties), but it
still shows up. Where is it stored?

2. Where do I find the shard configuration for mycore_shard1,
mycore_shard2?


Thanks in advance.


Re: Solr 4.4 Default shard

2013-08-08 Thread Prasi S
Initially i created a single collection,

-java -classpath .;zoo-lib/* org.apache.solr.cloud.ZkCLI -cmd upconfig
-zkhost localhost:2181 -confdir solr-conf -confname *myconf1*

--java -classpath .;zoo-lib/* org.apache.solr.cloud.ZkCLI -cmd
linkconfig -zkhost 127.0.0.1:2181 -collection *firstcollection
*-confname *myconf1
*-solrhome example/solr

When i create a single collection there was no problem. The default
collection1 and the new firstcollection were both present.

But when i created a second collection using the same upconfig parameter,

- java -classpath .;zoo-lib/* org.apache.solr.cloud.ZkCLI -cmd
upconfig -zkhost localhost:2181 -confdir solr-conf -confname *myconf2*

--java -classpath .;zoo-lib/* org.apache.solr.cloud.ZkCLI -cmd
linkconfig -zkhost 127.0.0.1:2181 -collection *secondcollection *-confname *
myconf2 *-solrhome example/solr

Tomcat threw an exception: could not find configName for collection1,
found [myconf1, myconf2].

Any idea?





On Thu, Aug 8, 2013 at 3:04 PM, Prasi S prasi1...@gmail.com wrote:

 I have setup solr 4.4 with cloud and have created two cores mycore_shard1,
 mycore_shard2.  I have few questions here,


 1. Once the setup is ready, i could see a default collection collection
 with :shard1 in the admin - cloud page. How to remove it. I have deleted
 the core.properties file in the /solr/collection/core.propeties file. But
 still it shoots up. Where is it stored.

 2. Where do I find the shard configurationf for mycore_shard1 ,
 mycore_shard2 ?


 Thanks in advance.



Re: Solr 4.4 Default shard

2013-08-08 Thread Prasi S
This happens only with Solr 4.4. I didn't face the issue in any other version.


On Thu, Aug 8, 2013 at 4:23 PM, Prasi S prasi1...@gmail.com wrote:

 Initially i created a single collection,

 -java -classpath .;zoo-lib/* org.apache.solr.cloud.ZkCLI -cmd
 upconfig -zkhost localhost:2181 -confdir solr-conf -confname *myconf1*

 --java -classpath .;zoo-lib/* org.apache.solr.cloud.ZkCLI -cmd
 linkconfig -zkhost 127.0.0.1:2181 -collection *firstcollection *-confname
 *myconf1 *-solrhome example/solr

 When i create a single collection there was no problem. The default
 collection1 and the new firstcollection were both present.

 But when i created a second collection using the same upconfig parameter,

 - java -classpath .;zoo-lib/* org.apache.solr.cloud.ZkCLI -cmd
 upconfig -zkhost localhost:2181 -confdir solr-conf -confname *myconf2*

 --java -classpath .;zoo-lib/* org.apache.solr.cloud.ZkCLI -cmd
 linkconfig -zkhost 127.0.0.1:2181 -collection *secondcollection *
 -confname *myconf2 *-solrhome example/solr

 tomcat threw exception as , could not find Configname for collection1 :
 found[ myconf1, myconf2].

 Any idea?





 On Thu, Aug 8, 2013 at 3:04 PM, Prasi S prasi1...@gmail.com wrote:

 I have setup solr 4.4 with cloud and have created two cores
 mycore_shard1, mycore_shard2.  I have few questions here,


 1. Once the setup is ready, i could see a default collection collection
 with :shard1 in the admin - cloud page. How to remove it. I have deleted
 the core.properties file in the /solr/collection/core.propeties file. But
 still it shoots up. Where is it stored.

 2. Where do I find the shard configurationf for mycore_shard1 ,
 mycore_shard2 ?


 Thanks in advance.





Solr 4.4 ShingleFilterFactory exception

2013-08-07 Thread Prasi S
Hi,
I have setup solr 4.4 with cloud. When i start solr, I get an exception as
below,

ERROR [CoreContainer] Unable to create core: mycore_sh1:
org.apache.solr.common.SolrException: Plugin init failure for [schema.xml]
fieldType text_shingle: Plugin init failure for [schema.xml]
analyzer/filter: Error instantiating class:
'org.apache.lucene.analysis.shingle.ShingleFilterFactory'
at
org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:177)
[:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:467)
[:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
at org.apache.solr.schema.IndexSchema.init(IndexSchema.java:164) [:4.4.0
1504776 - sarowe - 2013-07-19 02:58:35]
at
org.apache.solr.schema.IndexSchemaFactory.create(IndexSchemaFactory.java:55)
[:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
at
org.apache.solr.schema.IndexSchemaFactory.buildIndexSchema(IndexSchemaFactory.java:69)
[:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
at org.apache.solr.core.ZkContainer.createFromZk(ZkContainer.java:268)
[:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:655)
[:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:364)
[:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:356)
[:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
[:1.6.0_43]
at java.util.concurrent.FutureTask.run(FutureTask.java:138) [:1.6.0_43]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
[:1.6.0_43]
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
[:1.6.0_43]
at java.util.concurrent.FutureTask.run(FutureTask.java:138) [:1.6.0_43]
at
java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
[:1.6.0_43]
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
[:1.6.0_43]
at java.lang.Thread.run(Thread.java:662) [:1.6.0_43]
Caused by: org.apache.solr.common.SolrException: Plugin init failure for
[schema.xml] analyzer/filter: Error instantiating class:
'org.apache.lucene.analysis.shingle.ShingleFilterFactory'



The same file works well with solr 4.2. Pls help.


Thanks,
Prasi


Re: Solr 4.4 ShingleFilterFactory exception

2013-08-07 Thread Prasi S
Any suggestions pls?


On Wed, Aug 7, 2013 at 5:17 PM, Prasi S prasi1...@gmail.com wrote:

 Hi,
 I have setup solr 4.4 with cloud. When i start solr, I get an exception as
 below,

 *ERROR [CoreContainer] Unable to create core: mycore_sh1:
 org.apache.solr.common.SolrException: Plugin init failure for [schema.xml]
 fieldType text_shingle: Plugin init failure for [schema.xml]
 analyzer/filter: Error instantiating class:
 'org.apache.lucene.analysis.shingle.ShingleFilterFactory'*
  at
 org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:177)
 [:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
 at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:467)
 [:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
  at org.apache.solr.schema.IndexSchema.init(IndexSchema.java:164)
 [:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
 at
 org.apache.solr.schema.IndexSchemaFactory.create(IndexSchemaFactory.java:55)
 [:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
  at
 org.apache.solr.schema.IndexSchemaFactory.buildIndexSchema(IndexSchemaFactory.java:69)
 [:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
 at org.apache.solr.core.ZkContainer.createFromZk(ZkContainer.java:268)
 [:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
  at org.apache.solr.core.CoreContainer.create(CoreContainer.java:655)
 [:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
 at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:364)
 [:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
  at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:356)
 [:4.4.0 1504776 - sarowe - 2013-07-19 02:58:35]
 at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
 [:1.6.0_43]
  at java.util.concurrent.FutureTask.run(FutureTask.java:138) [:1.6.0_43]
 at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
 [:1.6.0_43]
  at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
 [:1.6.0_43]
 at java.util.concurrent.FutureTask.run(FutureTask.java:138) [:1.6.0_43]
  at
 java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
 [:1.6.0_43]
 at
 java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
 [:1.6.0_43]
  at java.lang.Thread.run(Thread.java:662) [:1.6.0_43]
 Caused by: org.apache.solr.common.SolrException: Plugin init failure for
 [schema.xml] analyzer/filter: Error instantiating class:
 'org.apache.lucene.analysis.shingle.ShingleFilterFactory'



 The same file works well with solr 4.2. Pls help.


 Thanks,
 Prasi



Solr 4.3 log4j

2013-08-05 Thread Prasi S
Hi,
I'm using Solr 4.3 to set up SolrCloud. I have placed all jar files in a
folder zoo-lib. I have also placed the jar files from /solr/example/lib/ext
in the zoo-lib folder.

When I execute this command,

java -classpath .;zoo-lib/* org.apache.solr.cloud.ZkCLI -cmd upconfig
-zkhost localhost:2181 -confdir solr-conf -confname myconf

I get the below warning and it stops. Am i missing anything.

log4j:WARN No appenders could be found for logger
(org.apache.zookeeper.ZooKeeper).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
more info.
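
That warning means log4j found no configuration file on the classpath used
for the ZkCLI run. A minimal log4j.properties (a sketch; Solr's example
distribution ships a similar one under example/resources) dropped into one of
the classpath folders silences it:

```properties
# Minimal console logging for the ZkCLI run
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %-5p [%c] %m%n
```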


Re: Solr 4.3 log4j

2013-08-05 Thread Prasi S
It didn't work with either option.


On Mon, Aug 5, 2013 at 12:19 PM, Shawn Heisey s...@elyograg.org wrote:

 On 8/5/2013 12:19 AM, Prasi S wrote:
  Im using solr 4.3 to setup solrcloud. I haev placed all jar files in a
  folder zoo-lib. I have also placed the jar fiels from
 /solr/example/lib/ext
  to zoo-lib folder.
 
  When I execute this command,
 
  java -classpath .;zoo-lib/* org.apache.solr.cloud.ZkCLI -cmd upconfig
  -zkhost localhost:2181 -confdir solr-conf -confname myconf
 
  I get the below warning and it stops. Am i missing anything.
 
  log4j:WARN No appenders could be found for logger

 The classpath format looks wrong to me.  From what I understand, here's
 what you'd want:

 For UNIX/Linux: .:zoo-lib/*
 For Windows: .;zoo-lib\*

 Thanks,
 Shawn




Re: Solr 4.3 log4j

2013-08-05 Thread Prasi S
I could see there is a change in the logging for Solr from 4.3 onwards, and
the steps for setting it up right in Tomcat.

But this gives a problem while loading configurations to Zookeeper. Am I
missing anything?


On Mon, Aug 5, 2013 at 12:51 PM, Prasi S prasi1...@gmail.com wrote:

 It didn't work for both options..


 On Mon, Aug 5, 2013 at 12:19 PM, Shawn Heisey s...@elyograg.org wrote:

 On 8/5/2013 12:19 AM, Prasi S wrote:
  Im using solr 4.3 to setup solrcloud. I haev placed all jar files in a
  folder zoo-lib. I have also placed the jar fiels from
 /solr/example/lib/ext
  to zoo-lib folder.
 
  When I execute this command,
 
  java -classpath .;zoo-lib/* org.apache.solr.cloud.ZkCLI -cmd upconfig
  -zkhost localhost:2181 -confdir solr-conf -confname myconf
 
  I get the below warning and it stops. Am i missing anything.
 
  log4j:WARN No appenders could be found for logger

 The classpath format looks wrong to me.  From what I understand, here's
 what you'd want:

 For UNIX/Linux: .:zoo-lib/*
 For Windows: .;zoo-lib\*

 Thanks,
 Shawn





SolrCloud Exceptions

2013-08-02 Thread Prasi S
I'm using SolrCloud with two Solr instances in two separate Tomcats and
Zookeeper.

I have created a collection with 4 shards ( 2 in each solr) and was able to
distribute data.

Now I brought down solr2, then searched.

1.  As expected solr2 was down. But solr1 was throwing
<str name="msg">no servers hosting shard:</str>
<int name="code">503</int>

2. When solr 2 was brought up again , i get this exception in solr1


SolrCore Initialization Failures
java.nio.channels.OverlappingFileLockException:java.nio.channels.OverlappingFileLockException


Am I missing anything?


Solr DIH loggig skipped / error documents

2013-08-01 Thread Prasi S
Hi ,
I'm using Solr 4.0 for indexing database content. I have a few doubts here:
 1. If a document fails during indexing, does indexing proceed?
2. Does the failed document get logged in the Solr logs?
3. How can I reindex the skipped documents once indexing has completed?

Please help.


Thanks,
Prasi


SolrCloud separate indexer and Searcher

2013-08-01 Thread Prasi S
I am using Solr 4.0 for indexing DB content to a cloud. Currently I have
two Solr instances running in separate app servers. Can I have one Solr
instance separately as an indexer and another as a searcher?

This is possible in Master-Slave, but does this hold good for SolrCloud also?
Because from what I understand, if I set up two Solr instances, cloud
automatically routes documents to both the servers.

Please guide me on this.


thanks,
Prasi


SolrCloud replication server and Master Server

2013-08-01 Thread Prasi S
I have a requirement to set up SolrCloud with 2 instances of Solr (one on 8080
and the other on 9090 ports respectively) and a Zookeeper ensemble (3 nodes).
Can I have one Solr instance as a Master and the other as a replica of the
Master?

Because, when I set up a SolrCloud and index to one of the Solrs running on
8080, it automatically routes to the 9090 Solr also.

I need indexing only on 8080, and the 9090 Solr should be a replica of the
8080 Solr. Is this possible in Cloud?

Please guide me.


Re: SolrCloud replication server and Master Server

2013-08-01 Thread Prasi S
Here, when I create a single shard and a replica, my shard will be on
one server and the replica on the other, isn't it?


On Thu, Aug 1, 2013 at 6:29 PM, Rafał Kuć r@solr.pl wrote:

 Hello!

 There is no master - slave in SolrCloud. You can create a collection
 that has only a single shard and have one replica. When you send an
 indexing request it will be forwarded to the leader shard (in your
 case you want it to be the instance running on 8080), however
 both Solr instances will accept indexing requests.

 --
 Regards,
  Rafał Kuć
  Sematext :: http://sematext.com/ :: Solr - Lucene - ElasticSearch

  I have a requirement to set solrcloud with 2 instances of Solr( one on
 8080
  and otehr on 9090 ports respectivey)  and a Zookeeper ensembe( 3 modes).
  Can I have one solr instance as a Master and the other as a replcia of
 the
  Master.

  Because, when i set up a solrcloud and index to one of the solr running
 on
  8080, it automatically routes to 9090 solr also.

  I need indexing only on 8080 and 9090 solr should be a replica of
 8080solr.
  Is this possible in Cloud.'

  Pls guide me.




Re: SolrCloud replication server and Master Server

2013-08-01 Thread Prasi S
I am rephrasing my question,

Currently what I have is:

Solr 8080 - Shard1, Shard2-Replica
Solr 9090 - Shard2, Shard1-Replica

Can we have something like:

Solr 8080 - Shard1, Shard2
Solr 9090 - Shard2-Replica, Shard1-Replica



On Thu, Aug 1, 2013 at 6:40 PM, Prasi S prasi1...@gmail.com wrote:

 Here when i create a single shard and a replica, then my shard will be on
 one server and replcia in teh other isn't ?


 On Thu, Aug 1, 2013 at 6:29 PM, Rafał Kuć r@solr.pl wrote:

 Hello!

 There is no master - slave in SolrCloud. You can create a collection
 that has only a single shard and have one replica. When you send an
 indexing request it will be forwarded to the leader shard (in your
 case you want it to be the instance running on 8080), however
 both Solr instances will accept indexing requests.

 --
 Regards,
  Rafał Kuć
  Sematext :: http://sematext.com/ :: Solr - Lucene - ElasticSearch

  I have a requirement to set solrcloud with 2 instances of Solr( one on
 8080
  and otehr on 9090 ports respectivey)  and a Zookeeper ensembe( 3 modes).
  Can I have one solr instance as a Master and the other as a replcia of
 the
  Master.

  Because, when i set up a solrcloud and index to one of the solr running
 on
  8080, it automatically routes to 9090 solr also.

  I need indexing only on 8080 and 9090 solr should be a replica of
 8080solr.
  Is this possible in Cloud.'

  Pls guide me.





Re: SolrCloud replication server and Master Server

2013-08-01 Thread Prasi S
Thanks Rafał. I will check on that maxShardsPerNode.


On Thu, Aug 1, 2013 at 7:06 PM, Rafał Kuć r@solr.pl wrote:

 Hello!

 Take a look at http://wiki.apache.org/solr/SolrCloud - when creating a
 collection you can specify the maxShardsPerNode parameter, which
 allows you to control how many shards of a given collection is
 permitted to be placed on a single node. By default it is set to 1,
 try setting it to 2 when creating your collection.

 --
 Regards,
  Rafał Kuć
  Sematext :: http://sematext.com/ :: Solr - Lucene - ElasticSearch
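
Rafał's suggestion as a concrete call (a sketch; the host and collection name
are placeholders, parameter names per the Solr 4.x Collections API). With 2
nodes, numShards=2 and replicationFactor=2 mean 4 cores total, so
maxShardsPerNode must be 2:

```python
from urllib.parse import urlencode

params = urlencode({
    "action": "CREATE",
    "name": "mycollection",   # placeholder collection name
    "numShards": 2,
    "replicationFactor": 2,
    "maxShardsPerNode": 2,    # allow 2 shards/replicas per node
})
url = "http://localhost:8080/solr/admin/collections?" + params
print(url)
```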

  I am rephrasing my question,

  Currently what i haev is

 Solr 8080 - Shard1Shard2- Replica
 Solr 9090 - Shard 2 Shard 1 - Replica

  Can we have something like

 Solr 8080 - Shard 1, Shard 2
 solr 9090 -  Shard2-Replica , Shard1- Replica



  On Thu, Aug 1, 2013 at 6:40 PM, Prasi S prasi1...@gmail.com wrote:

  Here when i create a single shard and a replica, then my shard will be
 on
  one server and replcia in teh other isn't ?
 
 
  On Thu, Aug 1, 2013 at 6:29 PM, Rafał Kuć r@solr.pl wrote:
 
  Hello!
 
  There is no master - slave in SolrCloud. You can create a collection
  that has only a single shard and have one replica. When you send an
  indexing request it will be forwarded to the leader shard (in your
  case you want it to be the instance running on 8080), however
  both Solr instances will accept indexing requests.
 
  --
  Regards,
   Rafał Kuć
   Sematext :: http://sematext.com/ :: Solr - Lucene - ElasticSearch
 
   I have a requirement to set solrcloud with 2 instances of Solr( one
 on
  8080
   and otehr on 9090 ports respectivey)  and a Zookeeper ensembe( 3
 modes).
   Can I have one solr instance as a Master and the other as a replcia
 of
  the
   Master.
 
   Because, when i set up a solrcloud and index to one of the solr
 running
  on
   8080, it automatically routes to 9090 solr also.
 
   I need indexing only on 8080 and 9090 solr should be a replica of
  8080solr.
   Is this possible in Cloud.'
 
   Pls guide me.
 
 
 




Re: SolrCloud separate indexer and Searcher

2013-08-01 Thread Prasi S
Erick,
Can you please elaborate on what old-style master/slave means? Does it mean
that we can have a separate indexer and searcher?


On Thu, Aug 1, 2013 at 8:42 PM, Erick Erickson erickerick...@gmail.comwrote:

 To add to what Shawn said, do note that Solr 4.x can
 operate either in old-style master/slave mode or
 SolrCloud...

 Best
 Erick


 On Thu, Aug 1, 2013 at 10:34 AM, Shawn Heisey s...@elyograg.org wrote:

  On 8/1/2013 4:51 AM, Prasi S wrote:
   i am using solr 4.0 for indexing db content to a cloud . Currently i
 have
   two solr isntances running in separate app servers. Can I have a one
 solr
   separately as indexer instance and another as a searcher.
  
   This is possible in Master-Slave but does this hold good for SorlCloud
  also?
   Because from what i understand, if Isetup two solr isntances, cloud
   automatically routes docuemtns to both the servers.
 
  Master/Slave does exactly what you describe.
 
  SolrCloud doesn't have master/slave concepts - it's a true cluster.
  Indexing and querying both happen on all replicas in the cloud,
  automatically.
 
  Thanks,
  Shawn