Error while inserting into index

2009-06-30 Thread David Baker
I recently upgraded to a nightly build of 1.4.  The build works fine, and I 
can deploy fine, but when I go to insert data into the index, I get the 
following error:


26-Jun-2009 5:52:06 PM org.apache.solr.update.processor.LogUpdateProcessor finish
INFO: {} 0 4

26-Jun-2009 5:52:06 PM org.apache.solr.common.SolrException log
SEVERE: java.lang.NoSuchFieldError: log
    at com.pjaol.search.solr.update.LocalUpdaterProcessor.processAdd(LocalUpdateProcessorFactory.java:138)
    at org.apache.solr.handler.XMLLoader.processUpdate(XMLLoader.java:140)
    at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:69)
    at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:54)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:1292)
    at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:338)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:286)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:845)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:583)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:447)
    at java.lang.Thread.run(Thread.java:619)
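
The frame at the top of the trace is in com.pjaol.search.solr.update.LocalUpdaterProcessor, 
i.e. the localsolr plugin rather than Solr itself.  A NoSuchFieldError there usually 
points at a binary mismatch: the plugin jar was compiled against one Solr build and is 
now running against another in which the field it references ("log") no longer resolves, 
so the usual remedy is to rebuild the plugin against the exact 1.4 nightly jars (or pick 
up a localsolr build that matches them).  For orientation, the plugin implements Solr's 
UpdateRequestProcessor API; a minimal sketch of a processor in that chain, compiled 
against the deployed jars, would look roughly like this (the class name 
LoggingUpdateProcessor is made up for illustration):

    import java.io.IOException;

    import org.apache.solr.update.AddUpdateCommand;
    import org.apache.solr.update.processor.UpdateRequestProcessor;

    // Minimal sketch of a custom processor in the chain the stack trace passes through.
    // Anything compiled against one set of Solr jars must be rebuilt when those jars change.
    public class LoggingUpdateProcessor extends UpdateRequestProcessor {

        public LoggingUpdateProcessor(UpdateRequestProcessor next) {
            super(next);
        }

        @Override
        public void processAdd(AddUpdateCommand cmd) throws IOException {
            // inspect or modify the incoming document here, then hand it down the chain
            super.processAdd(cmd);
        }
    }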

Error while trying to index

2009-06-26 Thread David Baker
I am trying to index to a solr server built from a nightly build.  I get the 
following error in my catalina.out:


26-Jun-2009 5:52:06 PM org.apache.solr.update.processor.LogUpdateProcessor finish
INFO: {} 0 4

26-Jun-2009 5:52:06 PM org.apache.solr.common.SolrException log
SEVERE: java.lang.NoSuchFieldError: log
    at com.pjaol.search.solr.update.LocalUpdaterProcessor.processAdd(LocalUpdateProcessorFactory.java:138)
    at org.apache.solr.handler.XMLLoader.processUpdate(XMLLoader.java:140)
    at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:69)
    at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:54)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:1292)
    at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:338)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:241)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:235)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:286)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:845)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:583)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:447)
    at java.lang.Thread.run(Thread.java:619)
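
This is the same NoSuchFieldError reported in the thread above, so the same suspicion 
applies: a localsolr jar built against different Solr classes, or leftover 1.3 Solr jars 
still on the servlet container's classpath.  A quick, throwaway way to check which jar a 
Solr class is actually loaded from is a snippet like the following (the class name 
WhichJar is just an illustrative placeholder):

    // Prints the jar that org.apache.solr.core.SolrCore was loaded from,
    // which helps rule out mixed 1.3/1.4 Solr jars on the classpath.
    public class WhichJar {
        public static void main(String[] args) throws Exception {
            Class<?> c = Class.forName("org.apache.solr.core.SolrCore");
            System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
        }
    }

Run it with the same classpath Tomcat uses (or put an equivalent check in a JSP inside 
the Solr webapp) and compare the reported jar against the nightly build you deployed.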

Upgrade to solr 1.4

2009-06-26 Thread David Baker

Hi,

I need to upgrade from solr 1.3 to solr 1.4.  I was wondering if there 
is a particular revision of 1.4 that is considered stable enough for a 
production environment?


Re: Function query using Map

2009-06-25 Thread David Baker

Noble Paul നോബിള്‍ नोब्ळ् wrote:

The five-parameter form was added in solr 1.4.  Which version of solr
are you using?

On Wed, Jun 24, 2009 at 12:57 AM, David Baker wrote:

Hi,

I'm trying to use the map function with a function query.  I want to map a 
particular value to 1 and all other values to 0.  We currently use the map 
function that has 4 parameters with no problem.  However, for the map function 
with 5 parameters, I get a parse error.  The following are the query and error 
returned:

_query_
id:[* TO *] _val_:"map(ethnicity,3,3,1,0)"

_error message_

*type* Status report
*message* _org.apache.lucene.queryParser.ParseException: Cannot parse 'id:[* TO *] 
_val_:"map(ethnicity,3,3,1,0)"': Expected ')' at position 20 in 
'map(ethnicity,3,3,1,0)'_
*description* _The request sent by the client was syntactically incorrect 
(org.apache.lucene.queryParser.ParseException: Cannot parse 'id:[* TO *] 
_val_:"map(ethnicity,3,3,1,0)"': Expected ')' at position 20 in 
'map(ethnicity,3,3,1,0)')._

It appears that the parser never evaluates the map string for anything other 
than the 4-parameter version.  Could anyone give me some insight into this?  
Thanks in advance.






--
Noble Paul | Principal Engineer | AOL | http://aol.com

We're running 1.3, which explains this.  Thanks for the response.
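
For reference, the two map() forms behave differently, and the five-argument form only 
exists in 1.4 and later; the summary below is a sketch of the documented behaviour as I 
understand it, not something verified against this exact nightly:

    map(x,min,max,target)         returns target when x falls in [min,max], otherwise returns x unchanged
    map(x,min,max,target,value)   returns target when x falls in [min,max], otherwise returns value

So on 1.4, map(ethnicity,3,3,1,0) yields 1 when ethnicity is 3 and 0 for every other value:

    id:[* TO *] _val_:"map(ethnicity,3,3,1,0)"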


Function query using Map

2009-06-23 Thread David Baker

Hi,

I'm trying to use the map function with a function query.  I want to map 
a particular value to 1 and all other values to 0.  We currently use the 
map function that has 4 parameters with no problem.  However, for the 
map function with 5 parameters, I get a parse error.  The following are 
the query and error returned:


_query_
id:[* TO *] _val_:"map(ethnicity,3,3,1,0)"

_error message_

*type* Status report
*message* _org.apache.lucene.queryParser.ParseException: Cannot parse 
'id:[* TO *] _val_:"map(ethnicity,3,3,1,0)"': Expected ')' at position 
20 in 'map(ethnicity,3,3,1,0)'_
*description* _The request sent by the client was syntactically 
incorrect (org.apache.lucene.queryParser.ParseException: Cannot parse 
'id:[* TO *] _val_:"map(ethnicity,3,3,1,0)"': Expected ')' at position 
20 in 'map(ethnicity,3,3,1,0)')._

It appears that the parser never evaluates the map string for anything 
other than the 4-parameter version.  Could anyone give me some insight 
into this?  Thanks in advance.




Re: Garbage Collectors

2009-04-16 Thread David Baker

Otis Gospodnetic wrote:

Personally, I'd start from scratch:
-Xmx -Xms...

-server is not even needed any more.

If you are not using Java 1.6, I suggest you do.

Next, I'd try to investigate why objects are not being cleaned up - this should 
not be happening in the first place.  Is Solr the only webapp running?


Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch



- Original Message 
  

From: David Baker 
To: solr-user@lucene.apache.org
Sent: Thursday, April 16, 2009 3:33:18 PM
Subject: Garbage Collectors

I have an issue with garbage collection on our solr servers.  We have an issue 
where the old generation never gets cleaned up on one of our servers.  This 
server has a little over 2 million records which are updated every hour or so.  
I have tried the parallel GC and the concurrent GC.  The parallel seems more 
stable for us, but both end up running out of memory.  I have increased the 
memory allocated to the servers, but this just seems to delay the problem.  My 
question is: what are the suggested options for using the parallel GC?  
Currently we are using something of this nature:


-server -Xmx4096m -Xms512m -XX:+UseAdaptiveSizePolicy -XX:+UseParallelOldGC 
-XX:GCTimeRatio=19 -XX:NewSize=128m -XX:SurvivorRatio=2 
-Dsolr.solr.home=/usr/local/solr-tomcat-fi/solr


I am new to solr and GC tuning, so any advice is appreciated.

Thanks for the reply.  Yes, solr is the only app running under this 
tomcat server.  I will remove -server and the other options except the heap 
allocation options and see how it performs.  Any suggestions on how to go 
about finding out why objects are not being cleaned up if these changes 
don't work?
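
One way to go about it (a sketch, not something tried on this setup): with Java 6 you 
can pull a live-object histogram or a heap dump from the running Tomcat and see which 
classes dominate the old generation.  Assuming <pid> is the Tomcat process id and the 
file path is just a placeholder:

    jmap -histo:live <pid> | head -30
    jmap -dump:live,format=b,file=/tmp/solr-heap.bin <pid>

The histogram lists instance counts per class after a forced full GC; the dump can be 
opened in jhat or Eclipse MAT to see what keeps those objects reachable.  With Solr, the 
usual suspects are the Lucene field caches and the caches sized in solrconfig.xml.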




Garbage Collectors

2009-04-16 Thread David Baker
I have an issue with garbage collection on our solr servers.  We have an 
issue where the old generation never gets cleaned up on one of our 
servers.  This server has a little over 2 million records which are 
updated every hour or so.  I have tried the parallel GC and the 
concurrent GC.  The parallel seems more stable for us, but both end up 
running out of memory.  I have increased the memory allocated to the 
servers, but this just seems to delay the problem.  My question is: what 
are the suggested options for using the parallel GC?  Currently we are 
using something of this nature:


-server -Xmx4096m -Xms512m -XX:+UseAdaptiveSizePolicy 
-XX:+UseParallelOldGC -XX:GCTimeRatio=19 -XX:NewSize=128m 
-XX:SurvivorRatio=2 -Dsolr.solr.home=/usr/local/solr-tomcat-fi/solr


I am new to solr and GC tuning, so any advice is appreciated.
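
Before settling on a collector it may help to watch what the heap actually does over 
time.  A sketch of extra diagnostic flags that could be appended to the options above 
(the log path is only an example):

    -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Xloggc:/var/log/solr-gc.log
    -XX:+HeapDumpOnOutOfMemoryError

The GC log shows whether old-generation occupancy keeps climbing after every full 
collection (which suggests a leak or oversized caches) or only spikes during indexing, 
and the heap dump taken on OutOfMemoryError can be analyzed afterwards to see which 
objects filled the heap.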