Re: Solr Cluster Indexing data question

2010-09-30 Thread Steve Cohen
So how would one set it up to use multiple nodes for building an index? I
see a document for solr + hadoop (http://wiki.apache.org/solr/HadoopIndexing)
and it says it has an example but the example is missing.

Thanks,
Steve Cohen

On Thu, Sep 30, 2010 at 10:58 AM, Jak Akdemir  wrote:

> If you want to use both of your nodes for building the index (which means
> two masters), it makes them equivalent and collapses the master/slave
> relation.
>
> Would you take a look at the link below for the index snapshot problem?
> http://wiki.apache.org/solr/SolrCollectionDistributionScripts
>
> On Thu, Sep 30, 2010 at 11:03 AM, ZAROGKIKAS,GIORGOS
>  wrote:
> > Hi there solr experts
> >
> >
> > I have a Solr cluster with two nodes and separate index files for each
> > node.
> >
> > Node1 is master
> > Node2 is slave
> >
> >
> > Node1 is the one where I index my data and replicate it to Node2.
> >
> > How can I index my data at both nodes simultaneously?
> > Is there any specific setup?
> >
> >
> > The problem is that when Node1 is down and I index the data from Node2,
> > Solr creates backup index folders like "index.20100929060410",
> > which eats up space on my hard disk.
> >
> > Thanks in advance
>


Where is the lock file?

2010-09-29 Thread Steve Cohen
Hello,

We were testing nutch configurations and apparently we got heavy-handed with
our approach to stopping things.

Now when nutch starts indexing solr, we are seeing these messages:

org.apache.solr.common.SolrException: Lock obtain timed out: SingleInstanceLock: write.lock
org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: SingleInstanceLock: write.lock
    at org.apache.lucene.store.Lock.obtain(Lock.java:85)
    at org.apache.lucene.index.IndexWriter.init(IndexWriter.java:1140)
    at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:938)
    at org.apache.solr.update.SolrIndexWriter.<init>(SolrIndexWriter.java:116)
    at org.apache.solr.update.UpdateHandler.createMainIndexWriter(UpdateHandler.java:122)
    at org.apache.solr.update.DirectUpdateHandler2.openWriter(DirectUpdateHandler2.java:167)
    at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:221)
    at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:59)
    at org.apache.solr.handler.XmlUpdateRequestHandler.processUpdate(XmlUpdateRequestHandler.java:196)

I've looked through the configuration file. I can see where it defines the
lock type and I can see the unlock setting, but I don't see where it
specifies the lock file. Where is it, and what is its name?
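
Going by the exception, the lock is named write.lock; for the file-based lock
types it sits in the index data directory, while the SingleInstanceLock named
in the trace is held in the JVM rather than on disk. A sketch of checking for
and clearing a stale lock file, assuming a conventional solr.home layout (the
path below is an assumption; adjust it to your installation), and only while
Solr itself is stopped:

```shell
# Assumed layout: $SOLR_HOME/data/index -- adjust to your installation.
SOLR_INDEX_DIR="${SOLR_HOME:-/opt/solr}/data/index"

# Look for a stale lock file left behind by a killed indexing process.
ls -l "$SOLR_INDEX_DIR"/write.lock 2>/dev/null || echo "no lock file present"

# With Solr stopped, it is safe to delete it and restart.
rm -f "$SOLR_INDEX_DIR"/write.lock
```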

Also, to speed up nutch, we changed the configuration to start several map
tasks at once. Is nutch trying to kick off several solr sessions at once, and
is that causing messages like the above? Should we just change the lock type
to simple?
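
For reference, a sketch of the relevant solrconfig.xml fragment, assuming the
Solr 1.4-era <mainIndex> section (element names and placement may differ in
your version): "single" is the in-JVM lock named in the exception above,
"simple" uses a write.lock file on disk, and "native" uses OS-level locking.

```xml
<mainIndex>
  <!-- "single" (in-JVM), "simple" (lock file on disk), or "native" (OS-level) -->
  <lockType>simple</lockType>
  <!-- If true, Solr removes a leftover index lock on startup after a crash -->
  <unlockOnStartup>true</unlockOnStartup>
</mainIndex>
```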

Thanks,
Steve Cohen