Re: Verify a certain Replica contains a document

2015-05-15 Thread Shai Erera
Yes. Here's what I do:

Start two embedded Solr nodes (i.e. like using MiniSolrCloudCluster). They
were started on ports 63175 and 63201.

Create a collection with one shard and one replica.
/solr/admin/collections?action=clusterstatus shows it was created on
127.0.0.1:63201_solr.

Index a document: curl -i -X POST
http://127.0.0.1:63175/solr/mycollection/update/json?commit=true -d
'[{"id":"doc1"}]'

Verify 63175 contains no cores:
http://127.0.0.1:63175/solr/admin/cores?action=status
Verify 63201 contains one core:
http://127.0.0.1:63201/solr/admin/cores?action=status -- returns an index
w/ numDocs=maxDoc=1.

All of these return the document though:

http://127.0.0.1:63175/solr/mycollection/select?q=*
http://127.0.0.1:63175/solr/mycollection/select?q=*&distrib=false
http://127.0.0.1:63175/solr/mycollection_shard1_replica1/select?q=*
http://127.0.0.1:63175/solr/mycollection_shard1_replica1/select?q=*&distrib=false

This returns "Can not find: /solr/core_node1/select" on both nodes (which
is expected since there's no such core on any of the nodes):
http://127.0.0.1:63175/solr/core_node1/select?q=*
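
For scripting checks like these, the CLUSTERSTATUS response can be inspected
programmatically. A minimal Python sketch, assuming the usual shape of the
JSON (cluster -> collections -> shards -> replicas) and reusing the node and
core names from this test; the sample data is hand-built, not captured from
a live cluster:

```python
# Sketch: list the cores a given node hosts, from a CLUSTERSTATUS-style
# response. The nested JSON shape and the names below mirror this test's
# setup and are assumptions, not output copied from a running Solr.
def cores_on_node(status, node_name):
    cores = []
    for coll in status["cluster"]["collections"].values():
        for shard in coll["shards"].values():
            for replica in shard["replicas"].values():
                if replica["node_name"] == node_name:
                    cores.append(replica["core"])
    return cores

status = {
    "cluster": {
        "collections": {
            "mycollection": {
                "shards": {
                    "shard1": {
                        "replicas": {
                            "core_node1": {
                                "core": "mycollection_shard1_replica1",
                                "node_name": "127.0.0.1:63201_solr",
                            }
                        }
                    }
                }
            }
        }
    }
}

print(cores_on_node(status, "127.0.0.1:63201_solr"))  # the one replica lives here
print(cores_on_node(status, "127.0.0.1:63175_solr"))  # this node hosts nothing
```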

Shai

On Sat, May 16, 2015 at 8:08 AM, Anshum Gupta 
wrote:

> Did you also try querying /core.name/select with distrib=false ?
>
> On Fri, May 15, 2015 at 9:22 PM, Shai Erera  wrote:
>
> > Hi
> >
> > Is there a REST API in Solr that allows me to query a certain
> Replica/core?
> > I am writing some custom replica-recovery code and I'd like to verify
> that
> > it works well.
> >
> > I wanted to use the /collection/select API, passing
> > shards=host.under.test:ip/solr/collection, but that also works even if
> > 'host.under.test' does not hold any local replicas. This makes sense
> from a
> > distributed search perspective, but doesn't help me. Also, passing
> > distrib=false, which I found by searching the web, didn't help and seems
> to
> > be ignored, or at least there's still a fallback that makes
> > 'host.under.test' access the other nodes in the cluster to fulfill the
> > request.
> >
> > Next I looked at /admin/cores?action=STATUS API. This looks better as it
> > allows me to list the cores on 'host.under.test' and I can get index-wide
> > statistics such as numDocs and maxDoc. This is better cause in my tests I
> > know how many documents I should expect.
> >
> > But I was wondering if
> >
> > (1) Is Core admin API the proper way to achieve what I want, or is there
> a
> > better way?
> > (2) Is there core-specific API for select/get, like there is for
> > /collection. I tried /core.name/select, but again, I received results
> even
> > when querying the node w/ no local replicas.
> >
> > Shai
> >
>
>
>
> --
> Anshum Gupta
>


Re: Verify a certain Replica contains a document

2015-05-15 Thread Anshum Gupta
Did you also try querying /core.name/select with distrib=false ?

On Fri, May 15, 2015 at 9:22 PM, Shai Erera  wrote:

> Hi
>
> Is there a REST API in Solr that allows me to query a certain Replica/core?
> I am writing some custom replica-recovery code and I'd like to verify that
> it works well.
>
> I wanted to use the /collection/select API, passing
> shards=host.under.test:ip/solr/collection, but that also works even if
> 'host.under.test' does not hold any local replicas. This makes sense from a
> distributed search perspective, but doesn't help me. Also, passing
> distrib=false, which I found by searching the web, didn't help and seems to
> be ignored, or at least there's still a fallback that makes
> 'host.under.test' access the other nodes in the cluster to fulfill the
> request.
>
> Next I looked at /admin/cores?action=STATUS API. This looks better as it
> allows me to list the cores on 'host.under.test' and I can get index-wide
> statistics such as numDocs and maxDoc. This is better cause in my tests I
> know how many documents I should expect.
>
> But I was wondering if
>
> (1) Is Core admin API the proper way to achieve what I want, or is there a
> better way?
> (2) Is there core-specific API for select/get, like there is for
> /collection. I tried /core.name/select, but again, I received results even
> when querying the node w/ no local replicas.
>
> Shai
>



-- 
Anshum Gupta


Verify a certain Replica contains a document

2015-05-15 Thread Shai Erera
Hi

Is there a REST API in Solr that allows me to query a certain Replica/core?
I am writing some custom replica-recovery code and I'd like to verify that
it works well.

I wanted to use the /collection/select API, passing
shards=host.under.test:ip/solr/collection, but that also works even if
'host.under.test' does not hold any local replicas. This makes sense from a
distributed search perspective, but doesn't help me. Also, passing
distrib=false, which I found by searching the web, didn't help and seems to
be ignored, or at least there's still a fallback that makes
'host.under.test' access the other nodes in the cluster to fulfill the
request.

Next I looked at /admin/cores?action=STATUS API. This looks better as it
allows me to list the cores on 'host.under.test' and I can get index-wide
statistics such as numDocs and maxDoc. This is better because in my tests I
know how many documents I should expect.

But I was wondering:

(1) Is Core admin API the proper way to achieve what I want, or is there a
better way?
(2) Is there a core-specific API for select/get, like there is for
/collection. I tried /core.name/select, but again, I received results even
when querying the node w/ no local replicas.

Shai
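
The CoreAdmin STATUS check described above can be scripted. A sketch of
pulling per-core numDocs/maxDoc out of a STATUS response
(`/admin/cores?action=STATUS&wt=json`) — the nested layout
(status -> core name -> index -> counts) is my assumption of the usual JSON
shape, and the sample data is hand-built:

```python
# Sketch: extract numDocs/maxDoc per core from a CoreAdmin STATUS response.
# The nested layout here is an assumption about the usual JSON shape.
def doc_counts(status_response):
    counts = {}
    for core_name, core in status_response["status"].items():
        index = core["index"]
        counts[core_name] = (index["numDocs"], index["maxDoc"])
    return counts

sample = {
    "status": {
        "mycollection_shard1_replica1": {
            "index": {"numDocs": 1, "maxDoc": 1}
        }
    }
}

print(doc_counts(sample))  # one core, one doc indexed
```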


JSON

2015-05-15 Thread William Bell
Can we get this one fixed? If the body is empty, don't throw a
NullPointerException?

Thanks

> "http://localhost:8983/solr/gettingstarted/select?wt=json&indent=true&q=foundation"
> -H "Content-type:application/json"

You're telling Solr the body encoding is JSON, but then you don't send any
body.
We could catch that error earlier perhaps, but it still looks like an error?
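
The failure mode described here — a declared JSON Content-Type with no body
at all — can be caught up front. A minimal Python sketch of that kind of
guard (illustrative only; Solr itself is Java, and these names are not
Solr's actual internals):

```python
# Sketch: reject an empty request body before JSON parsing, so the client
# gets a clear error instead of a late null-dereference failure.
# parse_json_body is an illustrative name, not a Solr API.
import json

def parse_json_body(content_type, body):
    if content_type.startswith("application/json") and not body:
        # Fail fast with a clear client error rather than an NPE later.
        raise ValueError("Content-Type is application/json but the body is empty")
    return json.loads(body)

try:
    parse_json_body("application/json", "")
except ValueError as e:
    print(e)  # the early, descriptive error

print(parse_json_body("application/json", '[{"id": "doc1"}]'))
```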

-Yonik

-- 
Bill Bell
billnb...@gmail.com
cell 720-256-8076


Re: Searcher is opening twice on Reload

2015-05-15 Thread Aman Tandon
Any help here?

With Regards
Aman Tandon

On Fri, May 15, 2015 at 1:24 PM, Aman Tandon 
wrote:

> Thanks Chris, but the issue says the firstSearcher listener opens twice,
> whereas in my case the firstSearcher opens first and then the newSearcher.
> Is it the same?
>
> With Regards
> Aman Tandon
>
> On Thu, May 14, 2015 at 11:05 PM, Chris Hostetter <
> hossman_luc...@fucit.org> wrote:
>
>>
>> I suspect you aren't doing anything wrong, i think it's the same as this
>> bug...
>>
>> https://issues.apache.org/jira/browse/SOLR-7035
>>
>>
>> : Date: Thu, 14 May 2015 12:53:34 +0530
>> : From: Aman Tandon 
>> : Reply-To: solr-user@lucene.apache.org
>> : To: "solr-user@lucene.apache.org" 
>> : Subject: Searcher is opening twice on Reload
>> :
>> : Hi,
>> :
>> : Please help me here, when I am doing the reload of core, my searcher is
>> : being opening twice. I am also attaching the logs output, please
>> suggest me
>> : what wrong I am doing here or this is default behavior on reload.
>> :
>> : May 14, 2015 12:47:38 PM org.apache.solr.spelling.DirectSolrSpellChecker
>> : > INFO: init:
>> : >
>> {name=default,field=titlews,classname=solr.DirectSolrSpellChecker,distanceMeasure=internal,accuracy=0.5,maxEdits=1,minPrefix=1,maxInspections=5,minQueryLength=5,maxQueryFrequency=100.0,thresholdTokenFrequency=100.0}
>> : > May 14, 2015 12:47:38 PM
>> : > org.apache.solr.handler.component.SpellCheckComponent
>> : > INFO: No queryConverter defined, using default converter
>> : > May 14, 2015 12:47:38 PM
>> : > org.apache.solr.handler.component.QueryElevationComponent
>> : > INFO: Loading QueryElevation from data dir: elevate.xml
>> : > May 14, 2015 12:47:38 PM org.apache.solr.handler.ReplicationHandler
>> : > INFO: Commits will be reserved for  1
>> : > May 14, 2015 12:47:38 PM org.apache.solr.core.QuerySenderListener
>> : > INFO: QuerySenderListener sending requests to Searcher@41dc3c83
>> [IM-Search]
>> : > main{StandardDirectoryReader(segments_dd4:82296:nrt
>> : > _jdq(4.8):C5602938/2310052:delGen=3132
>> : > _jkq(4.8):C6860454/1398005:delGen=2992
>> : > _jx2(4.8):C5237053/1505048:delGen=3241
>> : > _joo(4.8):C5825253/1599671:delGen=3323
>> : > _k4d(4.8):C5860360/1916531:delGen=3150
>> : > _o27(4.8):C5290435/1018865:delGen=370
>> : > _mju(4.8):C5074973/1602707:delGen=1474
>> : > _jka(4.8):C5172599/1774839:delGen=3202
>> : > _nik(4.8):C4698916/1512091:delGen=804
>> _o8y(4.8):C1137592/521423:delGen=190
>> : > _oeu(4.8):C469094/86291:delGen=29 _odq(4.8):C217505/65596:delGen=55
>> : > _ogd(4.8):C50454/4155:delGen=5 _oea(4.8):C40833/7192:delGen=37
>> : > _ofy(4.8):C73614/7273:delGen=13 _ogx(4.8):C395681/1388:delGen=4
>> : > _ogh(4.8):C7676/70:delGen=2 _ohf(4.8):C108769/21:delGen=2
>> : > _ogc(4.8):C24435/384:delGen=4 _ogi(4.8):C23088/158:delGen=3
>> : > _ogj(4.8):C4217/2:delGen=1 _ohs(4.8):C7 _oh6(4.8):C20509/205:delGen=5
>> : > _oh7(4.8):C3171 _oho(4.8):C6/1:delGen=1 _ohq(4.8):C1
>> : > _ohv(4.8):C10484/996:delGen=2 _ohx(4.8):C500 _ohy(4.8):C1
>> _ohz(4.8):C1)}
>> : > ^[OFMay 14, 2015 12:47:43 PM org.apache.solr.core.SolrCore
>> : > INFO: [IM-Search] webapp=/solr path=/select
>> : >
>> params={spellcheck=true&lon=0&q=q&wt=json&qt=opsview.monitor&lat=0&rows=0&ps=1}
>> : > hits=6 status=0 QTime=1
>> : > May 14, 2015 12:47:44 PM org.apache.solr.core.SolrCore
>> : > INFO: [IM-Search] webapp=null path=null
>> : >
>> params={start=0&event=firstSearcher&q=rice&distrib=false&qt=im.search.intent&rows=25}
>> : > hits=42749 status=0 QTime=5667
>> : > May 14, 2015 12:47:58 PM org.apache.solr.request.UnInvertedField
>> : > INFO: UnInverted multi-valued field
>> : >
>> {field=city,memSize=209216385,tindexSize=11029,time=3904,phase1=3783,nTerms=77614,bigTerms=3,termInstances=31291566,uses=0}
>> : > May 14, 2015 12:48:01 PM org.apache.solr.request.UnInvertedField
>> : > INFO: UnInverted multi-valued field
>> : >
>> {field=biztype,memSize=208847178,tindexSize=40,time=1318,phase1=1193,nTerms=9,bigTerms=4,termInstances=1607459,uses=0}
>> : > May 14, 2015 12:48:01 PM org.apache.solr.core.SolrCore
>> : > INFO: [IM-Search] webapp=null path=null
>> : >
>> params={start=0&event=firstSearcher&q=rice&distrib=false&qt=im.search&rows=25}
>> : > hits=57619 status=0 QTime=17194
>> : > May 14, 2015 12:48:04 PM org.apache.solr.core.SolrCore
>> : > INFO: [IM-Search] webapp=null path=null
>> : >
>> params={start=0&event=firstSearcher&q=potassium+cyanide&distrib=false&qt=eto.search.offer&rows=20}
>> : > hits=443 status=0 QTime=3272
>> : > May 14, 2015 12:48:09 PM org.apache.solr.core.SolrCore
>> : > INFO: [IM-Search] webapp=null path=null
>> : >
>> params={start=0&event=firstSearcher&q=motor+spare+parts&distrib=false&qt=im.search&fq=attribs:(locprefglobal+locprefnational+locprefcity)&rows=20}
>> : > hits=107297 status=0 QTime=5254
>> : > May 14, 2015 12:48:09 PM org.apache.solr.core.QuerySenderListener
>> : > INFO: QuerySenderListener done.
>> : > May 14, 2015 12:48:09 PM
>> : >
>> org.apache.solr.handler.component.SpellCheckComponent$Spel

Re: schema.xml & xi:include -> copyField source :'_my_title' is not a glob and doesn't match any explicit field or dynamicField

2015-05-15 Thread Steve Rowe
Hi Clemens,

I forgot that XInclude requires well-formed XML, so schema-common.xml without
<schema> tags won't work, since it will have multiple root elements.

But instead of XInclude, you can define external entities for files you want to 
include, and then include a reference to them where you want the contents to be 
included.

This worked for me:

——
schema.xml
——

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE schema [
  <!ENTITY schema_common SYSTEM "schema-common.incl">
]>
<schema>
  &schema_common;
</schema>

——

——
schema-common.incl
——
 _my_id
 
 
 
   
   

 
  
   

 
——
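
The XML tags in the listings above were largely stripped by the archive. The
external-entity mechanism itself can be demonstrated end to end with
Python's stdlib SAX parser; the file contents below are a reconstruction of
the idea, not Steve's exact files:

```python
# Sketch of the external-entity include trick using Python's stdlib SAX
# parser. The two file bodies are a reconstruction (the archive stripped
# the original XML tags), not the exact files from this thread.
import os
import tempfile
import xml.sax
from xml.sax.handler import ContentHandler, feature_external_ges

class ElementCollector(ContentHandler):
    """Record every element name seen, in document order."""
    def __init__(self):
        super().__init__()
        self.elements = []
    def startElement(self, name, attrs):
        self.elements.append(name)

tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "schema-common.incl"), "w") as f:
    f.write("<uniqueKey>_my_id</uniqueKey>")
with open(os.path.join(tmpdir, "schema.xml"), "w") as f:
    f.write(
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        "<!DOCTYPE schema [\n"
        '  <!ENTITY schema_common SYSTEM "schema-common.incl">\n'
        "]>\n"
        "<schema>&schema_common;</schema>\n"
    )

parser = xml.sax.make_parser()
# External general entities are disabled by default; enable them so the
# &schema_common; reference actually pulls in the .incl file.
parser.setFeature(feature_external_ges, True)
collector = ElementCollector()
parser.setContentHandler(collector)
parser.parse(os.path.join(tmpdir, "schema.xml"))
print(collector.elements)  # uniqueKey arrives via the external entity
```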

Here’s what I get back from curl
"http://localhost:8983/solr/mycore/schema?wt=schema.xml&indent=on":

——


  _my_id
  
  
  
  
  
  
  

——

Steve

> On May 15, 2015, at 8:57 AM, Clemens Wyss DEV  wrote:
> 
> Thought about that too (should have written ;) ).
> When I remove the schema-tag from the composite xml I get:
> org.apache.solr.common.SolrException: Unable to create core [test]
>   at org.apache.solr.core.CoreContainer.create(CoreContainer.java:533)
>   at org.apache.solr.core.CoreContainer.create(CoreContainer.java:493)
> ...
>   at 
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
>   at 
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
>   at 
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
> Caused by: org.apache.solr.common.SolrException: Could not load conf for core 
> test: org.apache.solr.common.SolrException: org.xml.sax.SAXParseException; 
> systemId: solrres:/schema.xml; lineNumber: 3; columnNumber: 84; Error 
> attempting to parse XML file (href='schema-common.xml').. Schema file is 
> C:\source\search\search-impl\WebContent\WEB-INF\solr\configsets\test\conf\schema.xml
>   at 
> org.apache.solr.core.ConfigSetService.getConfig(ConfigSetService.java:78)
>   at org.apache.solr.core.CoreContainer.create(CoreContainer.java:516)
>   ... 12 more
> Caused by: com.google.common.util.concurrent.UncheckedExecutionException: 
> org.apache.solr.common.SolrException: org.xml.sax.SAXParseException; 
> systemId: solrres:/schema.xml; lineNumber: 3; columnNumber: 84; Error 
> attempting to parse XML file (href='schema-common.xml').. Schema file is 
> C:\source\search\search-impl\WebContent\WEB-INF\solr\configsets\test\conf\schema.xml
>   at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2199)
>   at com.google.common.cache.LocalCache.get(LocalCache.java:3932)
>   at 
> com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4721)
>   at 
> org.apache.solr.core.ConfigSetService$SchemaCaching.createIndexSchema(ConfigSetService.java:206)
>   at 
> org.apache.solr.core.ConfigSetService.getConfig(ConfigSetService.java:74)
>   ... 13 more
> Caused by: org.apache.solr.common.SolrException: 
> org.xml.sax.SAXParseException; systemId: solrres:/schema.xml; lineNumber: 3; 
> columnNumber: 84; Error attempting to parse XML file 
> (href='schema-common.xml').. Schema file is 
> C:\source\search\search-impl\WebContent\WEB-INF\solr\configsets\test\conf\schema.xml
>   at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:596)
>   at org.apache.solr.schema.IndexSchema.(IndexSchema.java:175)
>   at 
> org.apache.solr.schema.IndexSchemaFactory.create(IndexSchemaFactory.java:55)
>   at 
> org.apache.solr.schema.IndexSchemaFactory.buildIndexSchema(IndexSchemaFactory.java:69)
>   at 
> org.apache.solr.core.ConfigSetService$SchemaCaching$1.call(ConfigSetService.java:210)
>   at 
> org.apache.solr.core.ConfigSetService$SchemaCaching$1.call(ConfigSetService.java:206)
>   at 
> com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4724)
>   at 
> com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3522)
>   at 
> com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2315)
>   at 
> com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2278)
>   at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2193)
>   ... 17 more
> Caused by: org.apache.solr.common.SolrException: 
> org.xml.sax.SAXParseException; systemId: solrres:/schema.xml; lineNumber: 3; 
> columnNumber: 84; Error attempting to parse XML file 
> (href='schema-common.xml').
>   at org.apache.solr.core.Config.(Config.java:156)
>   at org.apache.solr.core.Config.(Config.java:92)
>   at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:455)
>   ... 27 more
> Caused by: org.xml.sax.SAXParseException; systemId: solrres:/schema.xml; 
> lineNumber: 3; columnNumber: 84; Error attempting to parse XML file 
> (href='schema-common.xml').
>   at org.apache.solr.core.Config.(Config.java:145)
>   ... 29 more
> 
> -Ursprüngliche Nachricht-
> Von: Steve Rowe [mailto:sar...@gmail.com] 
> Gesendet: Freitag, 15. Mai 2015 13:30
> An: solr-user@lucene.apache.org
> Betreff: Re: schema.xml & xi:include -> copyField source :'_

Re: Problem with solr.LengthFilterFactory

2015-05-15 Thread Jack Krupansky
Sorry that my brain has turned to mush... the issue you are hitting is due
to a known, undocumented limit in the whitespace tokenizer:

https://issues.apache.org/jira/browse/LUCENE-5785
"White space tokenizer has undocumented limit of 256 characters per token"

If you look at the parsed query you will see that two query terms were
generated. This is because the whitespace tokenizer will simply split long
tokens every 256 characters. So, your filter will never see a long term.

There is a note on the Jira (evidently by me!) that you can use the pattern
tokenizer as a workaround. But... if your term is a string anyway, you
could just use the keyword tokenizer.
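
The split described here is easy to simulate. In this sketch the
256-character limit comes from LUCENE-5785; the 300-character filter cutoff
is an illustrative stand-in for whatever the LengthFilter max actually was:

```python
# Simulate LUCENE-5785: the whitespace tokenizer silently splits any token
# longer than 256 chars into 256-char pieces, so a downstream length filter
# never sees the original long term.
MAX_TOKEN = 256

def whitespace_tokenize(text):
    tokens = []
    for word in text.split():
        # emulate the undocumented split every 256 characters
        tokens.extend(word[i:i + MAX_TOKEN] for i in range(0, len(word), MAX_TOKEN))
    return tokens

def length_filter(tokens, max_len):
    return [t for t in tokens if len(t) <= max_len]

long_term = "1234567890" * 50            # 500 chars, like the thread's test doc
tokens = whitespace_tokenize(long_term)
print([len(t) for t in tokens])          # two pieces, not one long token
# A filter meant to drop terms over 300 chars removes nothing, because
# every emitted piece is already 256 chars or shorter:
print(length_filter(tokens, 300) == tokens)
```

This matches the thread's debug output, where the parsed query contains two
portal_package terms instead of one.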


-- Jack Krupansky

On Fri, May 15, 2015 at 4:06 PM, Charles Sanders 
wrote:

> Shawn,
> Thanks a bunch for working with me on this.
>
> I have deleted all records from my index. Stopped solr. Made the schema
> changes as requested. Started solr. Then insert the one test record. Then
> search. Still see the same results. No portal_package is not the unique
> key, its uri. Which is a string field.
>
>  multiValued="true"/>
>
>  positionIncrementGap="100">
> 
> 
> 
>
> {
> "documentKind": "test",
> "uri": "test300",
> "id": "test300",
> "portal_package":"12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
> }
>
>
> {
> "responseHeader": {
> "status": 0,
> "QTime": 47,
> "params": {
> "spellcheck": "true",
> "enableElevation": "false",
> "df": "allText",
> "echoParams": "all",
> "spellcheck.maxCollations": "5",
> "spellcheck.dictionary": "andreasAutoComplete",
> "spellcheck.count": "5",
> "spellcheck.collate": "true",
> "spellcheck.onlyMorePopular": "true",
> "rows": "10",
> "indent": "true",
> "q":
> "portal_package:12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
> "_": "1431719989047",
> "debug": "query",
> "wt": "json"
> }
> },
> "response": {
> "numFound": 1,
> "start": 0,
> "docs": [
> {
> "documentKind": "test",
> "uri": "test300",
> "id": "test300",
> "portal_package": [
>
> "12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
> ],
> "_version_": 1501267024421060600,
> "timestamp": "2015-05-15T19:56:43.247Z",
> "language": "en"
> }
> ]
> },
> "debug": {
> "rawquerystring":
> "portal_package:12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
> "querystring":
> "portal_package:12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
> "parsedquery":
> "portal_package:1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456
> portal_package:7890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012

Re: Problem with solr.LengthFilterFactory

2015-05-15 Thread Shawn Heisey
On 5/15/2015 2:06 PM, Charles Sanders wrote:
> I have deleted all records from my index. Stopped solr. Made the schema 
> changes as requested. Started solr. Then insert the one test record. Then 
> search. Still see the same results. No portal_package is not the unique key, 
> its uri. Which is a string field. 
>
>  multiValued="true"/> 
>
>  
>  
>  
>  

You got rid of <analyzer> and its closing tag entirely; that's not
going to work.  This is what you need, and you'll need to reindex the
doc again:


  


  


Here's a temporary paste that you can copy/paste:

http://apaste.info/NXn

Thanks,
Shawn



Re: Problem with solr.LengthFilterFactory

2015-05-15 Thread Charles Sanders
Shawn, 
Thanks a bunch for working with me on this. 

I have deleted all records from my index. Stopped solr. Made the schema changes
as requested. Started solr. Then inserted the one test record. Then searched.
Still see the same results. No, portal_package is not the unique key; it's uri,
which is a string field.

 

 
 
 
 

{ 
"documentKind": "test", 
"uri": "test300", 
"id": "test300", 
"portal_package":"12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
 
} 


{ 
"responseHeader": { 
"status": 0, 
"QTime": 47, 
"params": { 
"spellcheck": "true", 
"enableElevation": "false", 
"df": "allText", 
"echoParams": "all", 
"spellcheck.maxCollations": "5", 
"spellcheck.dictionary": "andreasAutoComplete", 
"spellcheck.count": "5", 
"spellcheck.collate": "true", 
"spellcheck.onlyMorePopular": "true", 
"rows": "10", 
"indent": "true", 
"q": 
"portal_package:12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
 
"_": "1431719989047", 
"debug": "query", 
"wt": "json" 
} 
}, 
"response": { 
"numFound": 1, 
"start": 0, 
"docs": [ 
{ 
"documentKind": "test", 
"uri": "test300", 
"id": "test300", 
"portal_package": [ 
"12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
 
], 
"_version_": 1501267024421060600, 
"timestamp": "2015-05-15T19:56:43.247Z", 
"language": "en" 
} 
] 
}, 
"debug": { 
"rawquerystring": 
"portal_package:12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
 
"querystring": 
"portal_package:12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
 
"parsedquery": 
"portal_package:1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456
 
portal_package:7890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
 
"parsedquery_toString": 
"portal_package:1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456
 
portal_package:7890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
 
"QParser": "LuceneQParser" 
} 
} 





- Original Message -

From: "Shawn Heisey"  
To: solr-user@lucene.apache.org 
Sent: Friday, May 15, 2015 3:29

Re: Problem with solr.LengthFilterFactory

2015-05-15 Thread Shawn Heisey
On 5/15/2015 1:23 PM, Shawn Heisey wrote:
> Then I looked back at your fieldType definition and noticed that you
> are only defining an index analyzer. Remove the 'type="index"' part of
> the analyzer config so it happens at both index and query time,
> reindex, then try again.

The reindex may be very important here.  I would actually completely
delete your data directory and restart Solr before reindexing, to be
sure you don't have old records from any previous reindexes.

http://wiki.apache.org/solr/HowToReindex

I think this next part is unlikely, but I'm going to ask it anyway:  Is
the portal_package field your schema uniqueKey?  If it is, that might be
an additional source of problems.  Using a solr.TextField for a
uniqueKey field causes Solr to behave in unexpected ways.

Thanks,
Shawn



Re: Problem with solr.LengthFilterFactory

2015-05-15 Thread Shawn Heisey
On 5/15/2015 12:47 PM, Charles Sanders wrote:
> Ran the same test as below. Added echoParams=all and debug=query. Thanks for 
> the help! 

Initially, I was a little confused by something I saw in your debug
output.  I've broken up the query text with newlines into 50 character
increments for clarity in an email context:

"parsedquery": "portal_package:
12345678901234567890123456789012345678901234567890
12345678901234567890123456789012345678901234567890
12345678901234567890123456789012345678901234567890
12345678901234567890123456789012345678901234567890
12345678901234567890123456789012345678901234567890
12345
portal_package:67890123456789012345678901234567890
12345678901234567890123456789012345678901234567890
12345678901234567890123456789012345678901234567890
12345678901234567890123456789012345678901234567890
12345678901234567890123456789012345678901234567890
1234567890",

Then I looked back at your fieldType definition and noticed that you are
only defining an index analyzer.  Remove the 'type="index"' part of the
analyzer config so it happens at both index and query time, reindex,
then try again.

If your index is small enough (especially if this is the only doc it
contains), you can go to the admin UI and load the terms with the Schema
Browser after selecting the portal_package field and see what is
actually in your index for this document.

As long as your analysis chain is the way it is, Solr's behavior will be
odd.  Once we get it straightened out, then we can look for any other
problems.

Thanks,
Shawn



Re: Problem with solr.LengthFilterFactory

2015-05-15 Thread Charles Sanders
Ran the same test as below. Added echoParams=all and debug=query. Thanks for 
the help! 

Results 

{ 
"responseHeader": { 
"status": 0, 
"QTime": 46, 
"params": { 
"spellcheck": "true", 
"enableElevation": "false", 
"df": "allText", 
"echoParams": "all", 
"spellcheck.maxCollations": "5", 
"spellcheck.dictionary": "andreasAutoComplete", 
"spellcheck.count": "5", 
"spellcheck.collate": "true", 
"spellcheck.onlyMorePopular": "true", 
"rows": "10", 
"indent": "true", 
"q": 
"portal_package:12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
 
"_": "1431715486263", 
"debug": "query", 
"wt": "json" 
} 
}, 
"response": { 
"numFound": 1, 
"start": 0, 
"docs": [ 
{ 
"documentKind": "test", 
"uri": "test300", 
"id": "test300", 
"portal_package": [ 
"12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
 
], 
"_version_": 1501262378591846400, 
"timestamp": "2015-05-15T18:42:52.625Z", 
"language": "en" 
} 
] 
}, 
"debug": { 
"rawquerystring": 
"portal_package:12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
 
"querystring": 
"portal_package:12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
 
"parsedquery": 
"portal_package:123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345
 
portal_package:67890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
 
"parsedquery_toString": 
"portal_package:123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345
 
portal_package:67890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
 
"QParser": "LuceneQParser" 
} 
} 
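
The parsedquery above shows the single 500-character value parsed into two portal_package terms. That is consistent with a tokenizer that caps token length (StandardTokenizer's default maxTokenLength is 255) running before LengthFilter, so the filter never sees one 500-character token — only pieces that fit under max=300. A minimal Python sketch of that interaction; the 255 cap is an assumption about the field's tokenizer, not something confirmed in this thread:

```python
# Why a 500-char token can survive a LengthFilter with max=300 (sketch):
# if an upstream tokenizer caps tokens at 255 chars, the length filter
# only ever sees the split pieces, and each piece passes max=300.

def capped_tokenizer(text, max_token_length=255):
    # Emit the input as tokens no longer than max_token_length,
    # mimicking StandardTokenizer's maxTokenLength behavior.
    return [text[i:i + max_token_length]
            for i in range(0, len(text), max_token_length)]

def length_filter(tokens, min_length=1, max_length=300):
    # Mimic solr.LengthFilterFactory: drop tokens outside [min, max].
    return [t for t in tokens if min_length <= len(t) <= max_length]

value = "1234567890" * 50  # the 500-char test value from this thread
surviving = length_filter(capped_tokenizer(value))
print([len(t) for t in surviving])  # [255, 245] -- both pass max=300
```

This also matches the parsedquery above, where the second term begins with "67890" — exactly where a split at character 255 would fall.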




- Original Message -

From: "Shawn Heisey"  
To: solr-user@lucene.apache.org 
Sent: Friday, May 15, 2015 2:36:03 PM 
Subject: Re: Problem with solr.LengthFilterFactory 

On 5/15/2015 10:04 AM, Charles Sanders wrote: 
> Agree 100%. The value returned is the value stored. Not affected by the 
> analyzer. 
> 
> However, I searched for that token. See my query? I would expect the analyzer 
> to remove the large token. So that when I search for the large token I would 
> find nothing. Rather it returns my record. 
> 
> Am I missing something here? 

Can you add echoParams=all and debug=query to your request and send the 
response you get? 

Thanks, 
Shawn 




Re: Problem with solr.LengthFilterFactory

2015-05-15 Thread Shawn Heisey
On 5/15/2015 10:04 AM, Charles Sanders wrote:
> Agree 100%. The value returned is the value stored. Not affected by the 
> analyzer. 
>
> However, I searched for that token. See my query? I would expect the analyzer 
> to remove the large token. So that when I search for the large token I would 
> find nothing. Rather it returns my record. 
>
> Am I missing something here? 

Can you add echoParams=all and debug=query to your request and send the
response you get?

Thanks,
Shawn



NPE when Faceting with MoreLikeThis handler in Solr 5.1.0

2015-05-15 Thread Tim Hearn
Hi everyone,

Recently I upgraded to Solr 5.1.0. When trying to generate facets using
the MoreLikeThis handler, I now get a NullPointerException. I never
got this exception while using Solr 4.10.0. Details are below:

Stack Trace:
at org.apache.solr.request.SimpleFacets.getHeatmapCounts(SimpleFacets.java:1555)
at org.apache.solr.request.SimpleFacets.getFacetCounts(SimpleFacets.java:284)
at org.apache.solr.handler.MoreLikeThisHandler.handleRequestBody(MoreLikeThisHandler.java:233)
at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:143)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1984)
at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:829)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:446)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:220)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:455)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1075)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:384)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1009)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:154)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
at org.eclipse.jetty.server.Server.handle(Server.java:368)
at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:489)
at org.eclipse.jetty.server.BlockingHttpConnection.handleRequest(BlockingHttpConnection.java:53)
at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:942)
at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1004)
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:640)
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
at org.eclipse.jetty.server.BlockingHttpConnection.handle(BlockingHttpConnection.java:72)
at org.eclipse.jetty.server.bio.SocketConnector$ConnectorEndPoint.run(SocketConnector.java:264)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Thread.java:745)


Query:
qt=/mlt&
q=id:545dbb57b54c2403f286050e546dcdcab54cf2d074e5a2f7&
mlt.mindf=5&
mlt.mintf=1&
mlt.minwl=3&
mlt.boost=true&
fq=storeid:546dcdcab54cf2d074e5a2f7&
mlt.fl=overview_mlt,abstract_mlt,description_mlt,company_profile_mlt,bio_mlt&
mlt.interestingTerms=details&
fl=conceptid,score&
sort=score desc&
start=0&
rows=2&
facet=true&
facet.field=tags&
facet.field=locations&
facet.mincount=1&
facet.method=enum&
facet.limit=-1&
facet.sort=count

Schema.xml (relevant parts):
   

   

   


solrconfig.xml (relevant parts):
  
  


Re: Problem with solr.LengthFilterFactory

2015-05-15 Thread Charles Sanders
Agree 100%. The value returned is the value stored. Not affected by the 
analyzer. 

However, I searched for that token. See my query? I would expect the analyzer 
to remove the large token. So that when I search for the large token I would 
find nothing. Rather it returns my record. 

Am I missing something here? 


- Original Message -

From: "Jack Krupansky"  
To: solr-user@lucene.apache.org 
Sent: Friday, May 15, 2015 11:56:51 AM 
Subject: Re: Problem with solr.LengthFilterFactory 

The returned value is the stored or original source value - only the 
indexed terms are affected by token filtering. 

You could use an update processor if you want to adjust the actual source 
value, such as the truncate processor to truncate long source values: 

http://lucene.apache.org/solr/5_1_0/solr-core/org/apache/solr/update/processor/TruncateFieldUpdateProcessorFactory.html
 


-- Jack Krupansky 

On Fri, May 15, 2015 at 11:38 AM, Charles Sanders  
wrote: 

> Yes, that is what I am seeing. Looking in the code myself, I see no reason 
> for this behavior. That is why I assumed I was doing something very wrong. 
> 
> Below I have included an example. I set the max length to 300. I insert a 
> record with a single token of 500 characters. I expect the token to be 
> removed and not included in the index. When I query using the large token, 
> the record is returned. I can see the same result using the analysis page 
> in the solr console. 
> 
> Here is a test example: 
> 
>  multiValued="true"/> 
> 
>  positionIncrementGap="100"> 
>  
>  
>  
>  
>  
> 
> 
> A test record: 
> 
> { 
> "documentKind": "test", 
> "uri": "test300", 
> "id": "test300", 
> "portal_package": 
> "12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
>  
> } 
> 
> 
> Query result: 
> 
> { 
> "responseHeader": { 
> "status": 0, 
> "QTime": 55, 
> "params": { 
> "indent": "true", 
> "q": 
> "portal_package:12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
>  
> "_": "1431704135745", 
> "wt": "json" 
> } 
> }, 
> "response": { 
> "numFound": 1, 
> "start": 0, 
> "docs": [ 
> { 
> "documentKind": "test", 
> "uri": "test300", 
> "id": "test300", 
> "portal_package": [ 
> 
> "12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
>  
> ], 
> "_version_": 1501249997589446700, 
> "timestamp": "2015-05-15T15:26:05.205Z", 
> "language": "en" 
> } 
> ] 
> } 
> } 
> 
> 
> 
> 
> 
> - Original Message - 
> 
> From: "Shawn Heisey"  
> To: solr-user@lucene.apache.org 
> Sent: Friday, May 15, 2015 11:13:14 AM 
> Subject: Re: Problem with solr.LengthFilterFactory 
> 
> On 5/15/2015 8:49 AM, Charles Sanders wrote: 
> > I'm seeing a problem with the LengthFilter. It appears to work fine 
> until I increase the max value above 254. At that point it stops removing 
> the very large token from the stream. As a result I get the error: 
> > java.lang.IllegalArgumentException: Document contains at least one 
> immense term.. UTF8 encoding is longer than the max length 32766 
> > 
> > I'm certain I'm doing this wrong. Can someone please show me the light. 
> :) 
> > 
> >  positionIncrementGap="100"> 
> >  
> >  
> >  
> >  
> >  
> 
> So with max="254", you don't get the error? Looking at the code for 
> LengthFilter, I can't see any way for it to behave differently with a 
> max of 254 vs. a max of 255 or higher. All of the interfaces and 
> classes involved use "int" for length, which means it should work 
> perfectly with numbers above 254. 
> 
> Thanks, 
> Shawn 
> 
> 
> 



Re: Problem with solr.LengthFilterFactory

2015-05-15 Thread Jack Krupansky
The returned value is the stored or original source value - only the
indexed terms are affected by token filtering.

You could use an update processor if you want to adjust the actual source
value, such as the truncate processor to truncate long source values:

http://lucene.apache.org/solr/5_1_0/solr-core/org/apache/solr/update/processor/TruncateFieldUpdateProcessorFactory.html


-- Jack Krupansky
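
For reference, a hedged sketch of wiring that processor into solrconfig.xml — the chain name, the fieldName, and the maxLength value here are illustrative assumptions, not something given in this thread:

```xml
<!-- Sketch only: chain name, field name, and maxLength are assumptions. -->
<updateRequestProcessorChain name="truncate-long-values">
  <processor class="solr.TruncateFieldUpdateProcessorFactory">
    <str name="fieldName">portal_package</str>
    <int name="maxLength">300</int>
  </processor>
  <processor class="solr.LogUpdateProcessorFactory"/>
  <processor class="solr.RunUpdateProcessorFactory"/>
</updateRequestProcessorChain>
```

The chain would then be selected per update request with update.chain=truncate-long-values, or made the default chain. Unlike a token filter, this changes the stored source value itself, not just the indexed terms.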

On Fri, May 15, 2015 at 11:38 AM, Charles Sanders 
wrote:

> Yes, that is what I am seeing. Looking in the code myself, I see no reason
> for this behavior. That is why I assumed I was doing something very wrong.
>
> Below I have included an example. I set the max length to 300. I insert a
> record with a single token of 500 characters. I expect the token to be
> removed and not included in the index. When I query using the large token,
> the record is returned. I can see the same result using the analysis page
> in the solr console.
>
> Here is a test example:
>
>  multiValued="true"/>
>
>  positionIncrementGap="100">
> 
> 
> 
> 
> 
>
>
> A test record:
>
> {
> "documentKind": "test",
> "uri": "test300",
> "id": "test300",
> "portal_package":
> "12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
> }
>
>
> Query result:
>
> {
> "responseHeader": {
> "status": 0,
> "QTime": 55,
> "params": {
> "indent": "true",
> "q":
> "portal_package:12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
> "_": "1431704135745",
> "wt": "json"
> }
> },
> "response": {
> "numFound": 1,
> "start": 0,
> "docs": [
> {
> "documentKind": "test",
> "uri": "test300",
> "id": "test300",
> "portal_package": [
>
> "12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
> ],
> "_version_": 1501249997589446700,
> "timestamp": "2015-05-15T15:26:05.205Z",
> "language": "en"
> }
> ]
> }
> }
>
>
>
>
>
> - Original Message -
>
> From: "Shawn Heisey" 
> To: solr-user@lucene.apache.org
> Sent: Friday, May 15, 2015 11:13:14 AM
> Subject: Re: Problem with solr.LengthFilterFactory
>
> On 5/15/2015 8:49 AM, Charles Sanders wrote:
> > I'm seeing a problem with the LengthFilter. It appears to work fine
> until I increase the max value above 254. At that point it stops removing
> the very large token from the stream. As a result I get the error:
> > java.lang.IllegalArgumentException: Document contains at least one
> immense term.. UTF8 encoding is longer than the max length 32766
> >
> > I'm certain I'm doing this wrong. Can someone please show me the light.
> :)
> >
> >  positionIncrementGap="100">
> > 
> > 
> > 
> > 
> > 
>
> So with max="254", you don't get the error? Looking at the code for
> LengthFilter, I can't see any way for it to behave differently with a
> max of 254 vs. a max of 255 or higher. All of the interfaces and
> classes involved use "int" for length, which means it should work
> perfectly with numbers above 254.
>
> Thanks,
> Shawn
>
>
>


Re: Solr 5.1 json facets: buckets are empty for TrieIntField

2015-05-15 Thread Andrii Berezhynskyi
Indeed, it does work in a nightly build. Thank you Yonik

On Fri, May 15, 2015 at 5:27 PM, Yonik Seeley  wrote:

> That was previously found and fixed - can you try a recent nightly build?
>
> https://builds.apache.org/job/Solr-Artifacts-5.x/lastSuccessfulBuild/artifact/solr/package/
> -Yonik
>
>
> On Fri, May 15, 2015 at 4:04 AM, Andrii Berezhynskyi
>  wrote:
> > I have a strange issue of facet buckets being empty for tint fields.
> > I have the following schema:
> >
> >  > omitNorms="true"/>
> >
> >  > omitNorms="true"/>
> >
> >  sortMissingLast="true"
> > omitNorms="true"/>
> >
> > ...
> >
> >  > multiValued="false"/>
> >
> >  > multiValued="true"/>
> >
> >  > multiValued="true"/>
> >
> > Then I just import:
> >
> > [{
> > "sku": "TEST_FACET",
> > "color": "yellow",
> > "width": 100.23,
> > "price": 1200
> > }]
> >
> > when I do the following faceting request:
> >
> > json.facet={
> >
> > colors:{terms: {field:color}},
> >
> > width:{terms: {field:width}},
> >
> > price:{terms: {field:price}}
> >
> > }
> >
> > I get empty buckets for price (tint):
> >
> > "facets":{ "count":1,
> >
> >   "colors":{ "buckets":[{ "val":"yellow", "count":1}]},
> >
> >   "width":{ "buckets":[{ "val":100.23, "count":1}]},
> >
> >   "price":{ "buckets":[]}}}
> >
> > Is somebody else able to reproduce this issue?
> >
> > Best regards,
> > Andrii
>


Re: Problem with solr.LengthFilterFactory

2015-05-15 Thread Charles Sanders
Yes, that is what I am seeing. Looking in the code myself, I see no reason for 
this behavior. That is why I assumed I was doing something very wrong. 

Below I have included an example. I set the max length to 300. I insert a 
record with a single token of 500 characters. I expect the token to be removed 
and not included in the index. When I query using the large token, the record 
is returned. I can see the same result using the analysis page in the solr 
console. 

Here is a test example: 

 

 
 
 
 
 
 


A test record: 

{ 
"documentKind": "test", 
"uri": "test300", 
"id": "test300", 
"portal_package": 
"12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
 
} 


Query result: 

{ 
"responseHeader": { 
"status": 0, 
"QTime": 55, 
"params": { 
"indent": "true", 
"q": 
"portal_package:12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890",
 
"_": "1431704135745", 
"wt": "json" 
} 
}, 
"response": { 
"numFound": 1, 
"start": 0, 
"docs": [ 
{ 
"documentKind": "test", 
"uri": "test300", 
"id": "test300", 
"portal_package": [ 
"12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890"
 
], 
"_version_": 1501249997589446700, 
"timestamp": "2015-05-15T15:26:05.205Z", 
"language": "en" 
} 
] 
} 
} 





- Original Message -

From: "Shawn Heisey"  
To: solr-user@lucene.apache.org 
Sent: Friday, May 15, 2015 11:13:14 AM 
Subject: Re: Problem with solr.LengthFilterFactory 

On 5/15/2015 8:49 AM, Charles Sanders wrote: 
> I'm seeing a problem with the LengthFilter. It appears to work fine until I 
> increase the max value above 254. At that point it stops removing the very 
> large token from the stream. As a result I get the error: 
> java.lang.IllegalArgumentException: Document contains at least one immense 
> term.. UTF8 encoding is longer than the max length 32766 
> 
> I'm certain I'm doing this wrong. Can someone please show me the light. :) 
> 
>  
>  
>  
>  
>  
>  

So with max="254", you don't get the error? Looking at the code for 
LengthFilter, I can't see any way for it to behave differently with a 
max of 254 vs. a max of 255 or higher. All of the interfaces and 
classes involved use "int" for length, which means it should work 
perfectly with numbers above 254. 

Thanks, 
Shawn 




Re: Solr 5.1 json facets: buckets are empty for TrieIntField

2015-05-15 Thread Yonik Seeley
That was previously found and fixed - can you try a recent nightly build?
https://builds.apache.org/job/Solr-Artifacts-5.x/lastSuccessfulBuild/artifact/solr/package/
-Yonik


On Fri, May 15, 2015 at 4:04 AM, Andrii Berezhynskyi
 wrote:
> I have a strange issue of facet buckets being empty for tint fields.
> I have the following schema:
>
>  omitNorms="true"/>
>
>  omitNorms="true"/>
>
>  omitNorms="true"/>
>
> ...
>
>  multiValued="false"/>
>
>  multiValued="true"/>
>
>  multiValued="true"/>
>
> Then I just import:
>
> [{
> "sku": "TEST_FACET",
> "color": "yellow",
> "width": 100.23,
> "price": 1200
> }]
>
> when I do the following faceting request:
>
> json.facet={
>
> colors:{terms: {field:color}},
>
> width:{terms: {field:width}},
>
> price:{terms: {field:price}}
>
> }
>
> I get empty buckets for price (tint):
>
> "facets":{ "count":1,
>
>   "colors":{ "buckets":[{ "val":"yellow", "count":1}]},
>
>   "width":{ "buckets":[{ "val":100.23, "count":1}]},
>
>   "price":{ "buckets":[]}}}
>
> Is somebody else able to reproduce this issue?
>
> Best regards,
> Andrii


Re: Problem with solr.LengthFilterFactory

2015-05-15 Thread Shawn Heisey
On 5/15/2015 8:49 AM, Charles Sanders wrote:
> I'm seeing a problem with the LengthFilter. It appears to work fine until I 
> increase the max value above 254. At that point it stops removing the very 
> large token from the stream. As a result I get the error: 
> java.lang.IllegalArgumentException: Document contains at least one immense 
> term.. UTF8 encoding is longer than the max length 32766 
>
> I'm certain I'm doing this wrong. Can someone please show me the light. :) 
>
>  
>  
>  
>  
>  
>  

So with max="254", you don't get the error?  Looking at the code for
LengthFilter, I can't see any way for it to behave differently with a
max of 254 vs. a max of 255 or higher.  All of the interfaces and
classes involved use "int" for length, which means it should work
perfectly with numbers above 254.

Thanks,
Shawn



Problem with solr.LengthFilterFactory

2015-05-15 Thread Charles Sanders
I'm seeing a problem with the LengthFilter. It appears to work fine until I 
increase the max value above 254. At that point it stops removing the very large 
token from the stream. As a result I get the error: 
java.lang.IllegalArgumentException: Document contains at least one immense 
term.. UTF8 encoding is longer than the max length 32766 

I'm certain I'm doing this wrong. Can someone please show me the light. :) 

 
 
 
 
 
 


Solr Version - 4.8.1 


-Charles 




Re: how to index and query newly added field

2015-05-15 Thread Erick Erickson
Images are stripped by the mail server, but I did see the original.
Things to check
1> the field must have 'stored="true" ' set
2> you must add documents with that field.
3> Only documents that actually have a value in the field will show
anything. Solr doesn't return an empty field, so if a particular doc
doesn't have a value, the new field won't be shown.

Best,
Erick

On Fri, May 15, 2015 at 7:31 AM, pradeep
 wrote:
> Hi, I am new to
> Solr. From this image, I have added one new field 'startPrice'. I am
> selecting that field but it is not coming in the response. What would be the
> problem?
>
>
>
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/how-to-index-and-query-newly-added-filed-tp4205595.html
> Sent from the Solr - User mailing list archive at Nabble.com.


Re: Full Copy during cluster restarts

2015-05-15 Thread Erick Erickson
The most likely cause is that you were indexing when the nodes were
down and/or have not issued a hard commit. If you stop indexing _and_
issue a hard commit (openSearcher=true or false doesn't matter), then
the nodes should come back up without having to replicate the index.

This might help:
http://lucidworks.com/blog/understanding-transaction-logs-softcommit-and-commit-in-sorlcloud/

Best,
Erick

On Thu, May 14, 2015 at 9:07 PM, Paddy Krishnamoorthy  wrote:
> We are running a Solr 4.10.2 cluster with dozens of nodes. Whenever we
> restart the cluster we see that 2-3 nodes go into a fullCopy mode, which
> takes forever to recover. Because of the recovery our entire cluster runs
> very slowly and eventually becomes unresponsive.
>
> Any thoughts?
>
>
> Thanks
> Paddy
> --
> Did you breathe today?
> Learn from here: http://us.artofliving.org/index.html


how to index and query newly added field

2015-05-15 Thread pradeep
Hi, I am new to
Solr. From this image, I have added one new field 'startPrice'. I am
selecting that field but it is not coming in the response. What would be the
problem?



--
View this message in context: 
http://lucene.472066.n3.nabble.com/how-to-index-and-query-newly-added-filed-tp4205595.html
Sent from the Solr - User mailing list archive at Nabble.com.

AW: schema.xml & xi:include -> copyField source :'_my_title' is not a glob and doesn't match any explicit field or dynamicField

2015-05-15 Thread Clemens Wyss DEV
Thought about that too (should have written ;) ).
When I remove the schema-tag from the composite xml I get:
org.apache.solr.common.SolrException: Unable to create core [test]
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:533)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:493)
...
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
Caused by: org.apache.solr.common.SolrException: Could not load conf for core 
test: org.apache.solr.common.SolrException: org.xml.sax.SAXParseException; 
systemId: solrres:/schema.xml; lineNumber: 3; columnNumber: 84; Error 
attempting to parse XML file (href='schema-common.xml').. Schema file is 
C:\source\search\search-impl\WebContent\WEB-INF\solr\configsets\test\conf\schema.xml
at org.apache.solr.core.ConfigSetService.getConfig(ConfigSetService.java:78)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:516)
... 12 more
Caused by: com.google.common.util.concurrent.UncheckedExecutionException: 
org.apache.solr.common.SolrException: org.xml.sax.SAXParseException; systemId: 
solrres:/schema.xml; lineNumber: 3; columnNumber: 84; Error attempting to parse 
XML file (href='schema-common.xml').. Schema file is 
C:\source\search\search-impl\WebContent\WEB-INF\solr\configsets\test\conf\schema.xml
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2199)
at com.google.common.cache.LocalCache.get(LocalCache.java:3932)
at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4721)
at org.apache.solr.core.ConfigSetService$SchemaCaching.createIndexSchema(ConfigSetService.java:206)
at org.apache.solr.core.ConfigSetService.getConfig(ConfigSetService.java:74)
... 13 more
Caused by: org.apache.solr.common.SolrException: org.xml.sax.SAXParseException; 
systemId: solrres:/schema.xml; lineNumber: 3; columnNumber: 84; Error 
attempting to parse XML file (href='schema-common.xml').. Schema file is 
C:\source\search\search-impl\WebContent\WEB-INF\solr\configsets\test\conf\schema.xml
at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:596)
at org.apache.solr.schema.IndexSchema.<init>(IndexSchema.java:175)
at org.apache.solr.schema.IndexSchemaFactory.create(IndexSchemaFactory.java:55)
at org.apache.solr.schema.IndexSchemaFactory.buildIndexSchema(IndexSchemaFactory.java:69)
at org.apache.solr.core.ConfigSetService$SchemaCaching$1.call(ConfigSetService.java:210)
at org.apache.solr.core.ConfigSetService$SchemaCaching$1.call(ConfigSetService.java:206)
at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4724)
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3522)
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2315)
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2278)
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2193)
... 17 more
Caused by: org.apache.solr.common.SolrException: org.xml.sax.SAXParseException; 
systemId: solrres:/schema.xml; lineNumber: 3; columnNumber: 84; Error 
attempting to parse XML file (href='schema-common.xml').
at org.apache.solr.core.Config.<init>(Config.java:156)
at org.apache.solr.core.Config.<init>(Config.java:92)
at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:455)
... 27 more
Caused by: org.xml.sax.SAXParseException; systemId: solrres:/schema.xml; 
lineNumber: 3; columnNumber: 84; Error attempting to parse XML file 
(href='schema-common.xml').
at org.apache.solr.core.Config.<init>(Config.java:145)
... 29 more

- Original Message -
From: Steve Rowe [mailto:sar...@gmail.com] 
Sent: Friday, May 15, 2015 13:30
To: solr-user@lucene.apache.org
Subject: Re: schema.xml & xi:include -> copyField source :'_my_title' is not a 
glob and doesn't match any explicit field or dynamicField

Hi Clemens,

I think the problem is the structure of the composite schema - you’ll end up 
with:

   <schema>   <- your other schema file
   <schema>   <- the included schema-common.xml

I’d remove the <schema> tags from your schema-common.xml.  You won’t be able to use 
it alone in that case, but if you need to do that, you could just create 
another schema file that includes it inside wrapping <schema> tags.

Steve

> On May 15, 2015, at 4:01 AM, Clemens Wyss DEV  wrote:
> 
> Given the following schema.xml
>   version="1.5">  _my_id   name="_version_" stored="true" type="long"/>   dest="_my_suggest" source="_my_title"/>  
> stored="true" type="string"/>
>  stored="true" type="string"/>
> stored="false" type="str

Re: schema.xml & xi:include -> copyField source :'_my_title' is not a glob and doesn't match any explicit field or dynamicField

2015-05-15 Thread Steve Rowe
Hi Clemens,

I think the problem is the structure of the composite schema - you’ll end up 
with:

   <schema>   <- your other schema file
   <schema>   <- the included schema-common.xml

I’d remove the <schema> tags from your schema-common.xml.  You won’t be able to use 
it alone in that case, but if you need to do that, you could just create 
another schema file that includes it inside wrapping <schema> tags.

Steve

> On May 15, 2015, at 4:01 AM, Clemens Wyss DEV  wrote:
> 
> Given the following schema.xml
> 
> 
>  _my_id
>  
>  
>  
> stored="true" type="string"/>
>  stored="true" type="string"/>
> type="string"/> 
>  
>   
>
>  positionIncrementGap="0" precisionStep="0"/>
>  
> 
> 
> When I try to include the very schema from another schema file, e.g.:
> 
> 
>   xmlns:xi="http://www.w3.org/2001/XInclude"/> 
> 
> 
> I get SolrException
> copyField source :'_my_title' is not a glob and doesn't match any explicit 
> field or dynamicField
> 
> Am I facing a bug or a feature?
> 
> Thanks
> - Clemens



Solr 5.1 json facets: buckets are empty for TrieIntField

2015-05-15 Thread Andrii Berezhynskyi
I have a strange issue of facet buckets being empty for tint fields.
I have the following schema:







...







Then I just import:

[{
"sku": "TEST_FACET",
"color": "yellow",
"width": 100.23,
"price": 1200
}]

when I do the following faceting request:

json.facet={

colors:{terms: {field:color}},

width:{terms: {field:width}},

price:{terms: {field:price}}

}

I get empty buckets for price (tint):

"facets":{ "count":1,

  "colors":{ "buckets":[{ "val":"yellow", "count":1}]},

  "width":{ "buckets":[{ "val":100.23, "count":1}]},

  "price":{ "buckets":[]}}}

Is somebody else able to reproduce this issue?

Best regards,
Andrii
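
The faceting request above can also be sent from a script. A minimal sketch — the host, port, and collection name ("products") are assumptions, not from this thread:

```python
import json
from urllib.parse import urlencode

# Build the select URL carrying the json.facet request from this thread.
facets = {
    "colors": {"terms": {"field": "color"}},
    "width":  {"terms": {"field": "width"}},
    "price":  {"terms": {"field": "price"}},
}
params = urlencode({"q": "*:*", "rows": 0, "wt": "json",
                    "json.facet": json.dumps(facets)})
url = "http://localhost:8983/solr/products/select?" + params
print(url)  # pass this to urllib.request.urlopen() against a live node
```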


Re: A Synonym Searching for Phrase?

2015-05-15 Thread Rajani Maski
Hi Ryan,

I am not sure whether the solution [1] linked below will work for your
case, given its drawbacks, but I recommend taking a quick look at it.

@Chris, I will eagerly await your contribution.


[1] https://support.lucidworks.com/hc/en-us/articles/205359448
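For clarity, here is a toy token-level sketch of what Ryan's rule `s-pass, spass => s pass` is meant to do. This is plain Python, not Solr's SynonymFilter; the hard part, which the article above discusses, is carrying the multi-token output through query parsing intact:

```python
# Toy illustration of the mapping "s-pass, spass => s pass".
# NOT Solr's SynonymFilter; it only shows the intended expansion.
SYNONYMS = {"s-pass": ["s", "pass"], "spass": ["s", "pass"]}

def expand(tokens):
    out = []
    for token in tokens:
        # Replace single-token variants with the two-token phrase.
        out.extend(SYNONYMS.get(token.lower(), [token]))
    return out

print(expand(["apply", "for", "spass"]))
```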



On Thu, May 14, 2015 at 11:30 PM, Chris Morley  wrote:

> I have implemented that but it's not open sourced yet.  It will be soon.
>
>  -Chris.
>
>
>
>
> 
>  From: "Ryan Yacyshyn" 
> Sent: Thursday, May 14, 2015 12:07 PM
> To: solr-user@lucene.apache.org
> Subject: A Synonym Searching for Phrase?
> Hi All,
>
> I'm running into an issue where a single token really means the same
> thing as two. For example, there are a couple of ways users might search
> for a certain type of visa called the "s pass": they might query for
> spass or s-pass.
>
> I thought I could add a line in my synonym file to solve this, such as:
>
> s-pass, spass => s pass
>
> This doesn't seem to work. I found an Auto Phrase TokenFilter (
> https://github.com/LucidWorks/auto-phrase-tokenfilter) that looks like it
> might help, but it sounds like it needs to use a specific query parser as
> well (we're using edismax).
>
> Has anyone come across this specific problem before? I would really
> appreciate your suggestions / help.
>
> We're using Solr 4.8.x (and lucidWorks 2.9).
>
> Thanks!
> Ryan
>
>
>


schema.xml & xi:include -> copyField source :'_my_title' is not a glob and doesn't match any explicit field or dynamicField

2015-05-15 Thread Clemens Wyss DEV
Given the following schema.xml:

<schema name="..." version="1.5">
  <uniqueKey>_my_id</uniqueKey>
  <fields>
    <field name="_my_id" indexed="true" stored="true" type="string"/>
    <field name="_my_title" indexed="true" stored="true" type="string"/>
  </fields>
  <copyField source="_my_title" dest="..."/>
  <types>
    <fieldType name="..." class="..." positionIncrementGap="0" precisionStep="0"/>
  </types>
</schema>

When I try to include this very schema from another schema file, e.g.:

<schema name="..." version="1.5">
  <xi:include href="schema-common.xml"
              xmlns:xi="http://www.w3.org/2001/XInclude"/>
</schema>

I get SolrException
copyField source :'_my_title' is not a glob and doesn't match any explicit 
field or dynamicField

Am I facing a bug or a feature?

Thanks
- Clemens


Re: Searcher is opening twice on Reload

2015-05-15 Thread Aman Tandon
Thanks Chris, but that issue says the firstSearcher listener fires twice,
whereas in my case the firstSearcher opens first and then the newSearcher.
Is it the same problem?
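For reference, the two events come from separate listeners configured in solrconfig.xml; a sketch of a typical warming setup (the query values here are assumed, loosely based on the "rice" warming queries in the log below):

```xml
<!-- firstSearcher fires when a core opens with no prior registered
     searcher; newSearcher fires when a new searcher replaces an
     existing one. A reload can therefore trigger both in sequence. -->
<listener event="firstSearcher" class="solr.QuerySenderListener">
  <arr name="queries">
    <lst><str name="q">rice</str><str name="rows">25</str></lst>
  </arr>
</listener>
<listener event="newSearcher" class="solr.QuerySenderListener">
  <arr name="queries">
    <lst><str name="q">rice</str><str name="rows">25</str></lst>
  </arr>
</listener>
```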

With Regards
Aman Tandon

On Thu, May 14, 2015 at 11:05 PM, Chris Hostetter 
wrote:

>
> I suspect you aren't doing anything wrong, i think it's the same as this
> bug...
>
> https://issues.apache.org/jira/browse/SOLR-7035
>
>
> : Date: Thu, 14 May 2015 12:53:34 +0530
> : From: Aman Tandon 
> : Reply-To: solr-user@lucene.apache.org
> : To: "solr-user@lucene.apache.org" 
> : Subject: Searcher is opening twice on Reload
> :
> : Hi,
> :
> : Please help me here: when I reload a core, my searcher is being opened
> : twice. I am also attaching the log output; please tell me what I am
> : doing wrong here, or whether this is the default behavior on reload.
> :
> : May 14, 2015 12:47:38 PM org.apache.solr.spelling.DirectSolrSpellChecker
> : > INFO: init:
> : >
> {name=default,field=titlews,classname=solr.DirectSolrSpellChecker,distanceMeasure=internal,accuracy=0.5,maxEdits=1,minPrefix=1,maxInspections=5,minQueryLength=5,maxQueryFrequency=100.0,thresholdTokenFrequency=100.0}
> : > May 14, 2015 12:47:38 PM
> : > org.apache.solr.handler.component.SpellCheckComponent
> : > INFO: No queryConverter defined, using default converter
> : > May 14, 2015 12:47:38 PM
> : > org.apache.solr.handler.component.QueryElevationComponent
> : > INFO: Loading QueryElevation from data dir: elevate.xml
> : > May 14, 2015 12:47:38 PM org.apache.solr.handler.ReplicationHandler
> : > INFO: Commits will be reserved for  1
> : > May 14, 2015 12:47:38 PM org.apache.solr.core.QuerySenderListener
> : > INFO: QuerySenderListener sending requests to Searcher@41dc3c83
> [IM-Search]
> : > main{StandardDirectoryReader(segments_dd4:82296:nrt
> : > _jdq(4.8):C5602938/2310052:delGen=3132
> : > _jkq(4.8):C6860454/1398005:delGen=2992
> : > _jx2(4.8):C5237053/1505048:delGen=3241
> : > _joo(4.8):C5825253/1599671:delGen=3323
> : > _k4d(4.8):C5860360/1916531:delGen=3150
> : > _o27(4.8):C5290435/1018865:delGen=370
> : > _mju(4.8):C5074973/1602707:delGen=1474
> : > _jka(4.8):C5172599/1774839:delGen=3202
> : > _nik(4.8):C4698916/1512091:delGen=804
> _o8y(4.8):C1137592/521423:delGen=190
> : > _oeu(4.8):C469094/86291:delGen=29 _odq(4.8):C217505/65596:delGen=55
> : > _ogd(4.8):C50454/4155:delGen=5 _oea(4.8):C40833/7192:delGen=37
> : > _ofy(4.8):C73614/7273:delGen=13 _ogx(4.8):C395681/1388:delGen=4
> : > _ogh(4.8):C7676/70:delGen=2 _ohf(4.8):C108769/21:delGen=2
> : > _ogc(4.8):C24435/384:delGen=4 _ogi(4.8):C23088/158:delGen=3
> : > _ogj(4.8):C4217/2:delGen=1 _ohs(4.8):C7 _oh6(4.8):C20509/205:delGen=5
> : > _oh7(4.8):C3171 _oho(4.8):C6/1:delGen=1 _ohq(4.8):C1
> : > _ohv(4.8):C10484/996:delGen=2 _ohx(4.8):C500 _ohy(4.8):C1
> _ohz(4.8):C1)}
> : > May 14, 2015 12:47:43 PM org.apache.solr.core.SolrCore
> : > INFO: [IM-Search] webapp=/solr path=/select
> : >
> params={spellcheck=true&lon=0&q=q&wt=json&qt=opsview.monitor&lat=0&rows=0&ps=1}
> : > hits=6 status=0 QTime=1
> : > May 14, 2015 12:47:44 PM org.apache.solr.core.SolrCore
> : > INFO: [IM-Search] webapp=null path=null
> : >
> params={start=0&event=firstSearcher&q=rice&distrib=false&qt=im.search.intent&rows=25}
> : > hits=42749 status=0 QTime=5667
> : > May 14, 2015 12:47:58 PM org.apache.solr.request.UnInvertedField
> : > INFO: UnInverted multi-valued field
> : >
> {field=city,memSize=209216385,tindexSize=11029,time=3904,phase1=3783,nTerms=77614,bigTerms=3,termInstances=31291566,uses=0}
> : > May 14, 2015 12:48:01 PM org.apache.solr.request.UnInvertedField
> : > INFO: UnInverted multi-valued field
> : >
> {field=biztype,memSize=208847178,tindexSize=40,time=1318,phase1=1193,nTerms=9,bigTerms=4,termInstances=1607459,uses=0}
> : > May 14, 2015 12:48:01 PM org.apache.solr.core.SolrCore
> : > INFO: [IM-Search] webapp=null path=null
> : >
> params={start=0&event=firstSearcher&q=rice&distrib=false&qt=im.search&rows=25}
> : > hits=57619 status=0 QTime=17194
> : > May 14, 2015 12:48:04 PM org.apache.solr.core.SolrCore
> : > INFO: [IM-Search] webapp=null path=null
> : >
> params={start=0&event=firstSearcher&q=potassium+cyanide&distrib=false&qt=eto.search.offer&rows=20}
> : > hits=443 status=0 QTime=3272
> : > May 14, 2015 12:48:09 PM org.apache.solr.core.SolrCore
> : > INFO: [IM-Search] webapp=null path=null
> : >
> params={start=0&event=firstSearcher&q=motor+spare+parts&distrib=false&qt=im.search&fq=attribs:(locprefglobal+locprefnational+locprefcity)&rows=20}
> : > hits=107297 status=0 QTime=5254
> : > May 14, 2015 12:48:09 PM org.apache.solr.core.QuerySenderListener
> : > INFO: QuerySenderListener done.
> : > May 14, 2015 12:48:09 PM
> : >
> org.apache.solr.handler.component.SpellCheckComponent$SpellCheckerListener
> : > INFO: Loading spell index for spellchecker: default
> : > May 14, 2015 12:48:09 PM
> : >
> org.apache.solr.handler.component.SpellCheckComponent$SpellCheckerListener
> : > INFO: Loading spell index for spellchec