Re: Best example solrconfig.xml?

2020-12-16 Thread Walter Underwood
That sample solrconfig.xml includes <jmx/>, but the 7.0 release notes say that it 
is no longer supported. Should that be removed from the config?

" element in solrconfig.xml is no longer supported. Equivalent 
functionality can be configured in solr.xml using  
element and SolrJmxReporter implementation. Limited back-compatibility is 
offered by automatically adding a default instance of SolrJmxReporter if it's 
missing, AND when a local MBean server is found (which can be activated either 
via ENABLE_REMOTE_JMX_OPTS in solr.in.sh or via system properties, eg. 
-Dcom.sun.management.jmxremote). This default instance exports all Solr metrics 
from all registries as hierarchical MBeans. This behavior can be also disabled 
by specifying a SolrJmxReporter configuration with a boolean init arg "enabled" 
set to "false". For a more fine-grained control users should explicitly specify 
at least one SolrJmxReporter configuration.”

https://lucene.apache.org/solr/8_7_0/changes/Changes.html#v7.0.0.upgrading_from_solr_6.x
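
For reference, the solr.xml equivalent would look something like this (the reporter 
name is arbitrary; check the Metrics Reporting page for the exact attributes in 
your version):

  <metrics>
    <reporter name="jmxReporter" group="node, core"
              class="org.apache.solr.metrics.reporters.SolrJmxReporter"/>
  </metrics>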

wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/  (my blog)

> On Dec 15, 2020, at 7:36 PM, Walter Underwood  wrote:
> 
> Thanks. Yeah, already enabled the ClassicIndexSchemaFactory.
> 
> Nice tip about uninvertible=false.
> 
> The circuit breakers look really useful. I was ready to front each server 
> with nginx and let it do the limiting. I’ve now seen both Netflix and Chegg 
> search clusters take out the entire site because they got into a stable 
> congested state. People just don’t believe that will happen until they see it.
> 
> wunder
> Walter Underwood
> wun...@wunderwood.org <mailto:wun...@wunderwood.org>
> http://observer.wunderwood.org/  (my blog)
> 
>> On Dec 15, 2020, at 6:31 PM, Erick Erickson > <mailto:erickerick...@gmail.com>> wrote:
>> 
>> I’d start with that config set, making sure that “schemaless” is disabled.
>> 
>> Do be aware that some of the defaults have changed, although the big change 
>> for docValues was there in 6.0.
>> 
>> One thing you might want to do is set uninvertible=false in your schema. 
>> That’ll cause Solr to barf if you, say, sort, facet, group on a field that 
>> does _not_ have docValues=true. I suspect this will cause no surprises for 
>> you, but it’s kind of a nice backstop to keep from having surprises in terms 
>> of heap size…
>> 
>> Best,
>> Erick
>> 
>>> On Dec 15, 2020, at 6:56 PM, Walter Underwood >> <mailto:wun...@wunderwood.org>> wrote:
>>> 
>>> We’re moving from 6.6 to 8.7 and I’m thinking of starting with an 8.7 
>>> solrconfig.xml and porting our changes into it.
>>> 
>>> Is this the best one to start with?
>>> 
>>> solr/server/solr/configsets/_default/conf/solrconfig.xml
>>> 
>>> wunder
>>> Walter Underwood
>>> wun...@wunderwood.org <mailto:wun...@wunderwood.org>
>>> http://observer.wunderwood.org/  (my blog)
>>> 
>> 
> 



Re: Best example solrconfig.xml?

2020-12-15 Thread Walter Underwood
Thanks. Yeah, already enabled the ClassicIndexSchemaFactory.

Nice tip about uninvertible=false.

The circuit breakers look really useful. I was ready to front each server with 
nginx and let it do the limiting. I’ve now seen both Netflix and Chegg search 
clusters take out the entire site because they got into a stable congested 
state. People just don’t believe that will happen until they see it.

wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/  (my blog)

> On Dec 15, 2020, at 6:31 PM, Erick Erickson  wrote:
> 
> I’d start with that config set, making sure that “schemaless” is disabled.
> 
> Do be aware that some of the defaults have changed, although the big change 
> for docValues was there in 6.0.
> 
> One thing you might want to do is set uninvertible=false in your schema. 
> That’ll cause Solr to barf if you, say, sort, facet, group on a field that 
> does _not_ have docValues=true. I suspect this will cause no surprises for 
> you, but it’s kind of a nice backstop to keep from having surprises in terms 
> of heap size…
> 
> Best,
> Erick
> 
>> On Dec 15, 2020, at 6:56 PM, Walter Underwood  wrote:
>> 
>> We’re moving from 6.6 to 8.7 and I’m thinking of starting with an 8.7 
>> solrconfig.xml and porting our changes into it.
>> 
>> Is this the best one to start with?
>> 
>> solr/server/solr/configsets/_default/conf/solrconfig.xml
>> 
>> wunder
>> Walter Underwood
>> wun...@wunderwood.org
>> http://observer.wunderwood.org/  (my blog)
>> 
> 



Re: Best example solrconfig.xml?

2020-12-15 Thread Erick Erickson
I’d start with that config set, making sure that “schemaless” is disabled.

Do be aware that some of the defaults have changed, although the big change for 
docValues was there in 6.0.

One thing you might want to do is set uninvertible=false in your schema. 
That’ll cause Solr to barf if you, say, sort, facet, or group on a field that does 
_not_ have docValues=true. I suspect this will cause no surprises for you, but 
it’s a nice backstop against unexpected heap growth…
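
As an illustration (the field name is invented), a field declared like this will 
fail fast on sort/facet/group instead of silently un-inverting onto the heap:

  <field name="category" type="string" indexed="true" stored="true"
         docValues="true" uninvertible="false"/>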

Best,
Erick

> On Dec 15, 2020, at 6:56 PM, Walter Underwood  wrote:
> 
> We’re moving from 6.6 to 8.7 and I’m thinking of starting with an 8.7 
> solrconfig.xml and porting our changes into it.
> 
> Is this the best one to start with?
> 
> solr/server/solr/configsets/_default/conf/solrconfig.xml
> 
> wunder
> Walter Underwood
> wun...@wunderwood.org
> http://observer.wunderwood.org/  (my blog)
> 



Best example solrconfig.xml?

2020-12-15 Thread Walter Underwood
We’re moving from 6.6 to 8.7 and I’m thinking of starting with an 8.7 
solrconfig.xml and porting our changes into it.

Is this the best one to start with?

solr/server/solr/configsets/_default/conf/solrconfig.xml

wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/  (my blog)



Re: How to reflect changes of solrconfig.xml to all the cores without causing any conflict

2020-11-09 Thread Shawn Heisey

On 11/9/2020 5:44 AM, raj.yadav wrote:

*Question:*
Since reload is not done, none of the replica (including leader) will have
updated solrconfig. And if we restart replica and if it trys to sync up with
leader will it reflect the latest changes of solrconfig or it will be the
same as leader.





Solr Collection detail:
single collection having 6 shard. each Vm is hosting single replica.
Collection size: 60 GB (each shard size is 10 GB)
Average doc size: 1.0Kb


If you restart Solr, it is effectively the same thing as reloading all 
cores on that Solr instance.


Your description (use of the terms "collection" and "shards") suggests 
that you're running in SolrCloud mode.  If you are, then modifying 
solrconfig.xml on the disk will change nothing.  You need to modify the 
solrconfig.xml that lives in ZooKeeper, or re-upload the changes to ZK. 
 Is that what you're doing?  After that, to make any changes effective, 
you have to reload the collection or restart the correct Solr instances.
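
For example, assuming a configset named "myconf" and ZooKeeper at zk1:2181 
(substitute your own names), the upload-and-reload cycle looks like this:

  bin/solr zk upconfig -z zk1:2181 -n myconf -d /path/to/conf
  curl "http://localhost:8983/solr/admin/collections?action=RELOAD&name=your_collection"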


I cannot tell you exactly what will happen as far as SolrCloud index 
synchronization, because I know nothing about your setup.  If the 
follower replica type is TLOG or PULL, then the index will be an exact 
copy of the leader's index.  With NRT, all replicas will independently 
index the data.


Thanks,
Shawn


How to reflect changes of solrconfig.xml to all the cores without causing any conflict

2020-11-09 Thread raj.yadav
Recently we modified the `noCFSRatio` parameter of our merge policy.

 
8
5
50.0
4000
0.0
  

This is our current merge policy. Earlier `noCFSRatio` was set to `0.1`.

Generally, to apply any solrconfig change we reload the collection. But
we stopped doing this because we observed that during the reload operation some of
the replicas go into recovery.
So instead of reloading, we restart each replica one by one.

Our restart procedure:
1. Indexing is stopped on the collection and a hard commit is issued.
2. The non-leader replicas are restarted first, and the leader replica is
restarted last.

*Question:*
Since a reload is not done, none of the replicas (including the leader) will have
the updated solrconfig. If we restart a replica and it tries to sync up with the
leader, will it pick up the latest solrconfig changes, or will it stay the same as
the leader?

Also, after this exercise we have seen a sudden spike in CPU utilization on
a few replicas, though there is not much of an increase in our system load.
 

System config of VM:
disk size: 250 GB
cpu: (8 vcpus, 64 GiB memory)

Solr collection details:
A single collection with 6 shards; each VM hosts a single replica.
Collection size: 60 GB (each shard is 10 GB)
Average doc size: 1.0 KB




--
Sent from: https://lucene.472066.n3.nabble.com/Solr-User-f472068.html


Re: Setting <autoSoftCommit> in solrconfig.xml does it override Solr REST Post calls with parameters commit=true&softCommit=false

2020-07-21 Thread Erick Erickson
Yep, that assumes you can afford up to 5 minutes between the time you send a 
doc to Solr and the time your users can search it.

Part of it depends on what your indexing rate is. If you’re only sending docs 
occasionally, you may want to make that longer. Frankly, though, the interval 
there isn’t too important in practice; it’s opening a new searcher that impacts 
your setup most.

Best,
Erick

> On Jul 21, 2020, at 3:52 PM, Tyrone Tse  wrote:
> 
> Eric
> 
> Thanks for your quick response.
> So in the solrconfig.xml keep the  out of the box setting of 15 seconds
> 
>
>  15000
>  false
>
> 
> and also have the  setting set to something like 5 minutes
>  
>30
>  
> 
> Then in the existing SolrJ code just simply delete the line
> 
> up.setAction(AbstractUpdateRequest.ACTION.COMMIT, true, true);
> 
> Is this what you recommended I try.
> 
> Thanks
> 
> Tyrone
> 
> 
> On Tue, Jul 21, 2020 at 1:16 PM Erick Erickson 
> wrote:
> 
>> What you’re seeing is the input from the client. You’ve passed true, true,
>> which are waitFlush and waitSearcher
>> which sets softCommit _for that call_ to false. It has nothing to do with
>> the settings in your config file.
>> 
>> bq. I am not passing the parameter to do a softCommit  in the SolrJ
>> command.
>> 
>> I don’t think so. That’s a hard commit. This is a little tricky since the
>> waitFlush and waitSearcher params don’t
>> tell you that they are about hard commits. There used to only be hard
>> commits, so...
>> 
>> But…. these settings are highly suspicious. Here’s the long form:
>> 
>> 
>> https://lucidworks.com/post/understanding-transaction-logs-softcommit-and-commit-in-sorlcloud/
>> 
>> It is risky to have your autocommit settings commented out. You risk
>> transaction logs growing
>> forever to no  purpose whatsoever.
>> 
>> Your call does a hard commit _and_ opens a new searcher for, apparently,
>> every document.
>> 
>> But your autoSoftCommit settings also open a new searcher without doing
>> anything about flushing
>> data to disk every second. Usually, this is far too often unless you have
>> extremely stringent latency
>> requirements, and in this case unless you’re only indexing once in a great
>> while your caches
>> are pretty useless.
>> 
>> I strongly urge you to uncomment autocommit settings. Make the autoCommit
>> interval something
>> reasonable (15-60 seconds for instance) with openSearcher=false.
>> 
>> Then lengthen your autoSoftCommit settings to as long as you can stand.
>> The longer the interval,
>> the less work you’ll do opening new searchers, which is a rather expensive
>> operation. I like
>> 5-10 minutes if possible, but your app may require shorter intervals.
>> 
>> Then don’t send any commit settings in your SolrJ program at all.
>> 
>> Best,
>> Erick
>> 
>>> On Jul 21, 2020, at 1:32 PM, Tyrone Tse  wrote:
>>> 
>>> I am using Solr 8.5 cloud, and in my collection I have edited the
>>> solrconfig.xml file to use
>>> 
>>>   1000
>>> 
>>> 
>>> and commented out the default  configuration
>>> 
>>> 
>>> 
>>> We are using SolrJ to post files to the Solr here is the snippet of Java
>>> code that does it
>>> 
>>> try(HttpSolrClient solrClient = solr.build()){
>>>   ContentStreamUpdateRequest up = new
>>> ContentStreamUpdateRequest("/update/extract");
>>>   up.addFile(f, mimeType);
>>>   String tempId = f.getName() + (new Date()).toString();
>>>   up.setParam("literal.id", tempId);
>>>   up.setParam("literal.username", user);
>>>   up.setParam("literal.fileName", f.getName());
>>>   up.setParam("literal.filePath", path);
>>>   up.setParam("uprefix", "attr_");
>>>   up.setParam("fmap.content", "attr_content");
>>>   up.setAction(AbstractUpdateRequest.ACTION.COMMIT, true, true);
>>>   logger.info("PreRequest");
>>>   solrClient.request(up);
>>>   logger.info("PostRequest");
>>>   resultId = tempId;
>>> } catch (IOException | SolrServerException |
>>> HttpSolrClient.RemoteSolrException e) {
>>>   logger.error("Error connecting.committing to Solr", e);
>>> }
>>> 
>>> So I am not passing the parameter to do a softCommit  in the SolrJ
>> command.
>>> 
>>> When I posted a file to my Solr core, when I look at the solr.log file I
>>> see the following information
>>> 
>>> 2020-07-21 16:38:54.719 INFO  (qtp1546693040-302) [c:files s:shard1
>>> r:core_node5 x:files_shard1_replica_n2]
>> o.a.s.u.p.LogUpdateProcessorFactory
>>> [files_shard1_replica_n2]  webapp=/solr path=/update
>>> 
>> params={update.distrib=TOLEADER=files-update-processor=true=true=true=false=
>>> http://192.168.1.191:8983/solr/files_shard2_replica_n6/
>>> 
>>> Does having   set in the solrconfig.xml override REST
>> Post
>>> calls that have the parameter softCommit=false and force a softCommit
>> when
>>> the data is posted to Solr.
>>> 
>>> Thanks in advance.
>> 
>> 



Re: Setting <autoSoftCommit> in solrconfig.xml does it override Solr REST Post calls with parameters commit=true&softCommit=false

2020-07-21 Thread Tyrone Tse
Eric

Thanks for your quick response.
So in the solrconfig.xml keep the <autoCommit> out-of-the-box setting of 15 seconds

<autoCommit>
  <maxTime>15000</maxTime>
  <openSearcher>false</openSearcher>
</autoCommit>

and also have the <autoSoftCommit> setting set to something like 5 minutes
  
30
  

Then in the existing SolrJ code just simply delete the line

up.setAction(AbstractUpdateRequest.ACTION.COMMIT, true, true);

Is this what you recommended I try?

Thanks

Tyrone


On Tue, Jul 21, 2020 at 1:16 PM Erick Erickson 
wrote:

> What you’re seeing is the input from the client. You’ve passed true, true,
> which are waitFlush and waitSearcher
> which sets softCommit _for that call_ to false. It has nothing to do with
> the settings in your config file.
>
> bq. I am not passing the parameter to do a softCommit  in the SolrJ
> command.
>
> I don’t think so. That’s a hard commit. This is a little tricky since the
> waitFlush and waitSearcher params don’t
> tell you that they are about hard commits. There used to only be hard
> commits, so...
>
> But…. these settings are highly suspicious. Here’s the long form:
>
>
> https://lucidworks.com/post/understanding-transaction-logs-softcommit-and-commit-in-sorlcloud/
>
> It is risky to have your autocommit settings commented out. You risk
> transaction logs growing
> forever to no  purpose whatsoever.
>
> Your call does a hard commit _and_ opens a new searcher for, apparently,
> every document.
>
> But your autoSoftCommit settings also open a new searcher without doing
> anything about flushing
> data to disk every second. Usually, this is far too often unless you have
> extremely stringent latency
> requirements, and in this case unless you’re only indexing once in a great
> while your caches
> are pretty useless.
>
> I strongly urge you to uncomment autocommit settings. Make the autoCommit
> interval something
> reasonable (15-60 seconds for instance) with openSearcher=false.
>
> Then lengthen your autoSoftCommit settings to as long as you can stand.
> The longer the interval,
> the less work you’ll do opening new searchers, which is a rather expensive
> operation. I like
> 5-10 minutes if possible, but your app may require shorter intervals.
>
> Then don’t send any commit settings in your SolrJ program at all.
>
> Best,
> Erick
>
> > On Jul 21, 2020, at 1:32 PM, Tyrone Tse  wrote:
> >
> > I am using Solr 8.5 cloud, and in my collection I have edited the
> > solrconfig.xml file to use
> > 
> >1000
> >  
> >
> > and commented out the default  configuration
> >
> >  
> >
> > We are using SolrJ to post files to the Solr here is the snippet of Java
> > code that does it
> >
> > try(HttpSolrClient solrClient = solr.build()){
> >ContentStreamUpdateRequest up = new
> > ContentStreamUpdateRequest("/update/extract");
> >up.addFile(f, mimeType);
> >String tempId = f.getName() + (new Date()).toString();
> >up.setParam("literal.id", tempId);
> >up.setParam("literal.username", user);
> >up.setParam("literal.fileName", f.getName());
> >up.setParam("literal.filePath", path);
> >up.setParam("uprefix", "attr_");
> >up.setParam("fmap.content", "attr_content");
> >up.setAction(AbstractUpdateRequest.ACTION.COMMIT, true, true);
> >logger.info("PreRequest");
> >solrClient.request(up);
> >logger.info("PostRequest");
> >resultId = tempId;
> > } catch (IOException | SolrServerException |
> > HttpSolrClient.RemoteSolrException e) {
> >logger.error("Error connecting.committing to Solr", e);
> > }
> >
> > So I am not passing the parameter to do a softCommit  in the SolrJ
> command.
> >
> > When I posted a file to my Solr core, when I look at the solr.log file I
> > see the following information
> >
> > 2020-07-21 16:38:54.719 INFO  (qtp1546693040-302) [c:files s:shard1
> > r:core_node5 x:files_shard1_replica_n2]
> o.a.s.u.p.LogUpdateProcessorFactory
> > [files_shard1_replica_n2]  webapp=/solr path=/update
> >
> params={update.distrib=TOLEADER=files-update-processor=true=true=true=false=
> > http://192.168.1.191:8983/solr/files_shard2_replica_n6/
> >
> > Does having   set in the solrconfig.xml override REST
> Post
> > calls that have the parameter softCommit=false and force a softCommit
> when
> > the data is posted to Solr.
> >
> > Thanks in advance.
>
>


Re: Setting <autoSoftCommit> in solrconfig.xml does it override Solr REST Post calls with parameters commit=true&softCommit=false

2020-07-21 Thread Erick Erickson
What you’re seeing is the input from the client. You’ve passed true, true, 
which are waitFlush and waitSearcher
which sets softCommit _for that call_ to false. It has nothing to do with the 
settings in your config file.

bq. I am not passing the parameter to do a softCommit  in the SolrJ command.

I don’t think so. That’s a hard commit. This is a little tricky since the 
waitFlush and waitSearcher params don’t
tell you that they are about hard commits. There used to only be hard commits, 
so...

But…. these settings are highly suspicious. Here’s the long form:

https://lucidworks.com/post/understanding-transaction-logs-softcommit-and-commit-in-sorlcloud/

It is risky to have your autocommit settings commented out. You risk 
transaction logs growing
forever to no  purpose whatsoever.

Your call does a hard commit _and_ opens a new searcher for, apparently, every 
document.

But your autoSoftCommit settings also open a new searcher without doing 
anything about flushing
data to disk every second. Usually, this is far too often unless you have 
extremely stringent latency
requirements, and in this case unless you’re only indexing once in a great 
while your caches
are pretty useless.

I strongly urge you to uncomment autocommit settings. Make the autoCommit 
interval something
reasonable (15-60 seconds for instance) with openSearcher=false.

Then lengthen your autoSoftCommit settings to as long as you can stand. The 
longer the interval,
the less work you’ll do opening new searchers, which is a rather expensive 
operation. I like
5-10 minutes if possible, but your app may require shorter intervals.

Then don’t send any commit settings in your SolrJ program at all.
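
For reference, a sketch of what that looks like in solrconfig.xml (the intervals
are just examples, tune them to your latency needs):

  <autoCommit>
    <maxTime>60000</maxTime>
    <openSearcher>false</openSearcher>
  </autoCommit>

  <autoSoftCommit>
    <maxTime>300000</maxTime>
  </autoSoftCommit>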

Best,
Erick

> On Jul 21, 2020, at 1:32 PM, Tyrone Tse  wrote:
> 
> I am using Solr 8.5 cloud, and in my collection I have edited the
> solrconfig.xml file to use
> 
>1000
>  
> 
> and commented out the default  configuration
> 
>  
> 
> We are using SolrJ to post files to the Solr here is the snippet of Java
> code that does it
> 
> try(HttpSolrClient solrClient = solr.build()){
>ContentStreamUpdateRequest up = new
> ContentStreamUpdateRequest("/update/extract");
>up.addFile(f, mimeType);
>String tempId = f.getName() + (new Date()).toString();
>up.setParam("literal.id", tempId);
>up.setParam("literal.username", user);
>up.setParam("literal.fileName", f.getName());
>up.setParam("literal.filePath", path);
>up.setParam("uprefix", "attr_");
>up.setParam("fmap.content", "attr_content");
>up.setAction(AbstractUpdateRequest.ACTION.COMMIT, true, true);
>logger.info("PreRequest");
>solrClient.request(up);
>logger.info("PostRequest");
>resultId = tempId;
> } catch (IOException | SolrServerException |
> HttpSolrClient.RemoteSolrException e) {
>logger.error("Error connecting.committing to Solr", e);
> }
> 
> So I am not passing the parameter to do a softCommit  in the SolrJ command.
> 
> When I posted a file to my Solr core, when I look at the solr.log file I
> see the following information
> 
> 2020-07-21 16:38:54.719 INFO  (qtp1546693040-302) [c:files s:shard1
> r:core_node5 x:files_shard1_replica_n2] o.a.s.u.p.LogUpdateProcessorFactory
> [files_shard1_replica_n2]  webapp=/solr path=/update
> params={update.distrib=TOLEADER=files-update-processor=true=true=true=false=
> http://192.168.1.191:8983/solr/files_shard2_replica_n6/
> 
> Does having   set in the solrconfig.xml override REST Post
> calls that have the parameter softCommit=false and force a softCommit when
> the data is posted to Solr.
> 
> Thanks in advance.



Setting <autoSoftCommit> in solrconfig.xml does it override Solr REST Post calls with parameters commit=true&softCommit=false

2020-07-21 Thread Tyrone Tse
I am using Solr 8.5 cloud, and in my collection I have edited the
solrconfig.xml file to use

  <autoSoftCommit>
    <maxTime>1000</maxTime>
  </autoSoftCommit>

and commented out the default <autoCommit> configuration

We are using SolrJ to post files to the Solr here is the snippet of Java
code that does it

try (HttpSolrClient solrClient = solr.build()) {
    ContentStreamUpdateRequest up =
        new ContentStreamUpdateRequest("/update/extract");
    up.addFile(f, mimeType);
    String tempId = f.getName() + (new Date()).toString();
    up.setParam("literal.id", tempId);
    up.setParam("literal.username", user);
    up.setParam("literal.fileName", f.getName());
    up.setParam("literal.filePath", path);
    up.setParam("uprefix", "attr_");
    up.setParam("fmap.content", "attr_content");
    up.setAction(AbstractUpdateRequest.ACTION.COMMIT, true, true);
    logger.info("PreRequest");
    solrClient.request(up);
    logger.info("PostRequest");
    resultId = tempId;
} catch (IOException | SolrServerException |
         HttpSolrClient.RemoteSolrException e) {
    logger.error("Error connecting.committing to Solr", e);
}

So I am not passing the parameter to do a softCommit  in the SolrJ command.

When I posted a file to my Solr core, when I look at the solr.log file I
see the following information

2020-07-21 16:38:54.719 INFO  (qtp1546693040-302) [c:files s:shard1
r:core_node5 x:files_shard1_replica_n2] o.a.s.u.p.LogUpdateProcessorFactory
[files_shard1_replica_n2]  webapp=/solr path=/update
params={update.distrib=TOLEADER=files-update-processor=true=true=true=false=
http://192.168.1.191:8983/solr/files_shard2_replica_n6/

Does having <autoSoftCommit> set in the solrconfig.xml override REST Post
calls that have the parameter softCommit=false and force a softCommit when
the data is posted to Solr?

Thanks in advance.


Re: Which solrconfig.xml and schema file should I start with?

2020-05-10 Thread Jan Høydahl
Choose whichever example is closest to what you want to do. Then strip it down, 
removing everything you don’t use. Note that the _default configset has schema 
guessing enabled, which you don’t want in production.
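
If you do start from _default, field guessing can be switched off with the Config 
API, e.g. (the collection name here is just an example):

  curl http://localhost:8983/solr/mycollection/config -d '{"set-user-property": {"update.autoCreateFields":"false"}}'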

Jan Høydahl

> 9. mai 2020 kl. 22:34 skrev Steven White :
> 
> Hi everyone,
> 
> There are multiple copies with each a bit different of the
> files solrconfig.xml and the various schema files.  Should I be using
> what's under \solr-8.5.1\server\solr\configsets\_default\conf as my
> foundation to build on?
> 
> Thanks
> 
> Steve


Which solrconfig.xml and schema file should I start with?

2020-05-09 Thread Steven White
Hi everyone,

There are multiple copies with each a bit different of the
files solrconfig.xml and the various schema files.  Should I be using
what's under \solr-8.5.1\server\solr\configsets\_default\conf as my
foundation to build on?

Thanks

Steve


Re: Ignore faceting for particular fields in solr using Solrconfig.xml

2019-05-23 Thread Bernd Fehling

Have a look at "invariants" for your requestHandler in solrconfig.xml.
It might be an option for you.
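
Something like this (the field names are only examples) pins facet.field so that
client-supplied values are ignored:

  <requestHandler name="/select" class="solr.SearchHandler">
    <lst name="invariants">
      <str name="facet.field">category</str>
      <str name="facet.field">brand</str>
    </lst>
  </requestHandler>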

Regards
Bernd


On 22.05.19 at 22:23, RaviTeja wrote:

Hello Solr Expert,

How are you?

Am trying to ignore faceting for some of the fields. Can you please help me
out to ignore faceting using solrconfig.xml.
I tried but I can ignore faceting all the fields that useless. I'm trying
to ignore some specific fields.

Really Appreciate your help for the response!

Regards,
Ravi



Re: Ignore faceting for particular fields in solr using Solrconfig.xml

2019-05-22 Thread Erick Erickson
Just don’t ask for them. Or are you saying that users can specify arbitrary fields 
to facet on and you want to prevent certain fields from being possible?

No, there’s no good way to do that in solrconfig.xml. You could write a query 
component that stripped out certain fields from the facet.field parameter.

Likely the easiest would be to do that in the application I assume you have 
between Solr and your users.
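
If you do go the component route, a rough, untested sketch (class and field names 
invented) could look like the code below. It would be compiled into a jar, declared 
with <searchComponent>, and listed in the handler's first-components so it runs 
before faceting:

import java.io.IOException;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

import org.apache.solr.common.params.FacetParams;
import org.apache.solr.common.params.ModifiableSolrParams;
import org.apache.solr.common.params.SolrParams;
import org.apache.solr.handler.component.ResponseBuilder;
import org.apache.solr.handler.component.SearchComponent;

/** Strips disallowed fields from facet.field before the facet component sees them. */
public class FacetFieldFilterComponent extends SearchComponent {

  // Hypothetical blocked fields; in practice these could come from init args.
  private static final Set<String> BLOCKED =
      new HashSet<>(Arrays.asList("internal_id", "raw_payload"));

  @Override
  public void prepare(ResponseBuilder rb) throws IOException {
    SolrParams params = rb.req.getParams();
    String[] requested = params.getParams(FacetParams.FACET_FIELD);
    if (requested == null) {
      return;
    }
    ModifiableSolrParams filtered = new ModifiableSolrParams(params);
    filtered.remove(FacetParams.FACET_FIELD);
    for (String field : requested) {
      if (!BLOCKED.contains(field)) {
        filtered.add(FacetParams.FACET_FIELD, field);
      }
    }
    rb.req.setParams(filtered);
  }

  @Override
  public void process(ResponseBuilder rb) throws IOException {
    // Nothing to do at process time; the filtering happened in prepare().
  }

  @Override
  public String getDescription() {
    return "Strips blocked fields from facet.field";
  }
}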

Best,
Erick

> On May 22, 2019, at 1:23 PM, RaviTeja  wrote:
> 
> Hello Solr Expert,
> 
> How are you?
> 
> Am trying to ignore faceting for some of the fields. Can you please help me
> out to ignore faceting using solrconfig.xml.
> I tried but I can ignore faceting all the fields that useless. I'm trying
> to ignore some specific fields.
> 
> Really Appreciate your help for the response!
> 
> Regards,
> Ravi



Ignore faceting for particular fields in solr using Solrconfig.xml

2019-05-22 Thread RaviTeja
Hello Solr Expert,

How are you?

I am trying to ignore faceting for some of the fields. Can you please help me
ignore faceting using solrconfig.xml?
I tried, but I can only ignore faceting for all fields, which is not useful. I'm
trying to ignore it for some specific fields.

I really appreciate your help!

Regards,
Ravi


Re: SolrCloud Using Solrconfig.xml Instead of Configoverlay.json for RequestHandler QF

2018-11-07 Thread Jan Høydahl
Hi,

What is the output of these?

http://localhost:8983/solr/foo/config   This should give the unified 
config
http://localhost:8983/solr/foo/config/overlay   This should give the overlay 
only
http://localhost:8983/solr/foo/config/params     You may have some params 
defined that override things?

See 
https://lucene.apache.org/solr/guide/7_5/config-api.html#config-api-endpoints 
for the docs


If you believe it is a bug, please try to reproduce from scratch with a config 
as simple as possible and include in this email thread a step-by-step list of 
actions to reproduce from a clean solr.
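
If the overlay does turn out to hold a stale definition, one option is to re-issue 
the handler through the Config API so the overlay carries the qf you actually want, 
e.g. (handler name and fields are examples):

  curl http://localhost:8983/solr/foo/config -H 'Content-type:application/json' -d '{
    "update-requesthandler": {
      "name": "/select",
      "class": "solr.SearchHandler",
      "defaults": { "defType": "edismax", "qf": "title^10 body" }
    }
  }'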

--
Jan Høydahl, search solution architect
Cominvent AS - www.cominvent.com

> 5. nov. 2018 kl. 21:00 skrev Corey Ellsworth :
> 
> Hello,
> 
> I'm using Solr Cloud 6.6. I have a situation where I have a RequestHandler 
> configuration that exists in both the solrconfig.xml file and 
> configoverlay.json file (We inherited this application and are not sure why 
> it is set up like this). From reading the documentation, it seems the 
> configoverlay.json configuration should be the one used. However, while it 
> seems that Solr is pulling most configurations from the configoverlay.json 
> file, it is using the "qf" value from the solrconfig.xml file. Unfortunately, 
> the "qf" fields between the solrconfig and configoverlay files are different, 
> which is leading to unexpected search results.
> 
> Has anyone seen this before? If so, what might I do to force the qf  field to 
> pull from the configoverlay.json file? Thank you for any insight into this.
> 
> Best regards,
> 
> Corey Ellsworth



SolrCloud Using Solrconfig.xml Instead of Configoverlay.json for RequestHandler QF

2018-11-05 Thread Corey Ellsworth
Hello,

I'm using Solr Cloud 6.6. I have a situation where I have a RequestHandler 
configuration that exists in both the solrconfig.xml file and 
configoverlay.json file (We inherited this application and are not sure why it 
is set up like this). From reading the documentation, it seems the 
configoverlay.json configuration should be the one used. However, while it 
seems that Solr is pulling most configurations from the configoverlay.json 
file, it is using the "qf" value from the solrconfig.xml file. Unfortunately, 
the "qf" fields between the solrconfig and configoverlay files are different, 
which is leading to unexpected search results.

Has anyone seen this before? If so, what might I do to force the qf  field to 
pull from the configoverlay.json file? Thank you for any insight into this.

Best regards,

Corey Ellsworth


RE: Collection created error: Can't find resource 'solrconfig.xml'

2018-07-26 Thread Reem
Thanks for your reply Erich, that was really helpful!

From: Erick Erickson
Sent: Sunday, July 22, 2018 7:26 AM
To: solr-user
Subject: Re: Collection created error: Can't find resource 'solrconfig.xml'

From your SO post:

"I've also created a directory /configs (as indicated by the error) on
every node and copied the 'web' configset files
(/share/solr/server/solr/web) that I want to use for overriding the
collection default configuration. However, this didn't solve the
problem."

Do not copy files around like this, not only is it unnecessary, but it
also masks (at best) the real problem. I'd urge you to remove all
those directories, not so much that Solr will be confused as their
presence will be confusing later.

The problem has to be that you didn't really upload the configs to the
right place. Go to the admin UI page, cloud>>tree and then in the
right pane you should be seeing a view of your Zookeeper data, and
there should be a /configs node. Under that node you should see all of
the configsets you've uploaded, including a "web" znode with all the
files below it you'd expect. Be sure to look here on a node other than
node-n01, all your Solr nodes' admin screens should show the exact
same Zookeeper (tree) information.

Some possibilities:

> you didn't upload the configs to the Zookeeper ensemble that's being used by 
> the rest of Solr. The upconfig command uses "-zkhost node-n01:2181", do all 
> of the other Solr instances also start with "-z node-n01:2181"?

> your configset in Zookeeper is a level too low or high or just in some 
> unexpected place.

> your Solr instances aren't all pointing to the same Zookeeper ensemble that 
> you uploaded your "web" configset to.

> you're somehow running embedded Zookeeper on one node


How do you start your Solr instances? Do they all use the proper
external Zookeeper ensemble? Those messages rather look like you
aren't starting your Solr instances in cloud mode. I can't make that
square with your create command knowing they're there, but you get the
idea.

How do you start your Zookeeper ensemble?

As for why you can use "techproducts", again I think you're pointing
to different ZKs.

Best,
Erick

On Sat, Jul 21, 2018 at 6:41 PM, Reem  wrote:
> I have Solr-7.2.1 running on a cluster of 32 linux nodes/servers (+ a node 
> that hosts ZooKeeper). I wanted to create a collection with the following 
> command:
> [solr@node-n03 solr]$ curl 
> "node-n03:8983/solr/admin/collections?action=CREATE=web=32=1=1=web"
>
> Before that I uploaded the configuration into ZooKeeper using this command:
> ./server/scripts/cloud-scripts/zkcli.sh -zkhost node-n01:2181 -cmd upconfig 
> -confname web -confdir /share/solr-7.2.1/server/configs/web
>
> However, I'm getting this error for the 32 nodes:
> …"IP.IP.IP.56:8983_solr":"org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException:Error
>  from server at http://IP.IP.IP.56:8983/solr: Error CREATEing SolrCore 
> 'web_shard13_replica_n24': Unable to create core [web_shard13_replica_n24] 
> Caused by: Can't find resource 'solrconfig.xml' in classpath or 
> '/configs/web', cwd=/share/solr-7.2.1/server" …
>
> I have posted more details (including the full error message) and updates on 
> the question here:
>
> https://stackoverflow.com/questions/51458097/solr-collectio-created-erorr-cant-find-resource-solrconfig-xml
>
> Do you have any idea on how to solve this problem?
>
> Reem



Re: Collection created error: Can't find resource 'solrconfig.xml'

2018-07-21 Thread Erick Erickson
From your SO post:

"I've also created a directory /configs (as indicated by the error) on
every node and copied the 'web' configset files
(/share/solr/server/solr/web) that I want to use for overriding the
collection default configuration. However, this didn't solve the
problem."

Do not copy files around like this, not only is it unnecessary, but it
also masks (at best) the real problem. I'd urge you to remove all
those directories, not so much that Solr will be confused as their
presence will be confusing later.

The problem has to be that you didn't really upload the configs to the
right place. Go to the admin UI page, cloud>>tree and then in the
right pane you should be seeing a view of your Zookeeper data, and
there should be a /configs node. Under that node you should see all of
the configsets you've uploaded, including a "web" znode with all the
files below it you'd expect. Be sure to look here on a node other than
node-n01, all your Solr nodes' admin screens should show the exact
same Zookeeper (tree) information.

Some possibilities:

> you didn't upload the configs to the Zookeeper ensemble that's being used by 
> the rest of Solr. The upconfig command uses "-zkhost node-n01:2181", do all 
> of the other Solr instances also start with "-z node-n01:2181"?

> your configset in Zookeeper is a level too low or high or just in some 
> unexpected place.

> your Solr instances aren't all pointing to the same Zookeeper ensemble that 
> you uploaded your "web" configset to.

> you're somehow running embedded Zookeeper on one node


How do you start your Solr instances? Do they all use the proper
external Zookeeper ensemble? Those messages rather look like you
aren't starting your Solr instances in cloud mode. I can't make that
square with your create command knowing they're there, but you get the
idea.

How do you start your Zookeeper ensemble?

As for why you can use "techproducts", again I think you're pointing
to different ZKs.
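
A quick way to see exactly what is in ZooKeeper from the command line (use the 
same -z value your Solr nodes use):

  bin/solr zk ls -r /configs -z node-n01:2181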

Best,
Erick

On Sat, Jul 21, 2018 at 6:41 PM, Reem  wrote:
> I have Solr-7.2.1 running on a cluster of 32 linux nodes/servers (+ a node 
> that hosts ZooKeeper). I wanted to create a collection with the following 
> command:
> [solr@node-n03 solr]$ curl 
> "node-n03:8983/solr/admin/collections?action=CREATE=web=32=1=1=web"
>
> Before that I uploaded the configuration into ZooKeeper using this command:
> ./server/scripts/cloud-scripts/zkcli.sh -zkhost node-n01:2181 -cmd upconfig 
> -confname web -confdir /share/solr-7.2.1/server/configs/web
>
> However, I'm getting this error for the 32 nodes:
> …"IP.IP.IP.56:8983_solr":"org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException:Error
>  from server at http://IP.IP.IP.56:8983/solr: Error CREATEing SolrCore 
> 'web_shard13_replica_n24': Unable to create core [web_shard13_replica_n24] 
> Caused by: Can't find resource 'solrconfig.xml' in classpath or 
> '/configs/web', cwd=/share/solr-7.2.1/server" …
>
> I have posted more details (including the full error message) and updates on 
> the question here:
>
> https://stackoverflow.com/questions/51458097/solr-collectio-created-erorr-cant-find-resource-solrconfig-xml
>
> Do you have any idea on how to solve this problem?
>
> Reem


Collection created error: Can't find resource 'solrconfig.xml'

2018-07-21 Thread Reem
I have Solr-7.2.1 running on a cluster of 32 linux nodes/servers (+ a node that 
hosts ZooKeeper). I wanted to create a collection with the following command:
[solr@node-n03 solr]$ curl 
"node-n03:8983/solr/admin/collections?action=CREATE=web=32=1=1=web"

Before that I uploaded the configuration into ZooKeeper using this command:
./server/scripts/cloud-scripts/zkcli.sh -zkhost node-n01:2181 -cmd upconfig 
-confname web -confdir /share/solr-7.2.1/server/configs/web

However, I'm getting this error for the 32 nodes:
…"IP.IP.IP.56:8983_solr":"org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException:Error
 from server at http://IP.IP.IP.56:8983/solr: Error CREATEing SolrCore 
'web_shard13_replica_n24': Unable to create core [web_shard13_replica_n24] 
Caused by: Can't find resource 'solrconfig.xml' in classpath or '/configs/web', 
cwd=/share/solr-7.2.1/server" …

I have posted more details (including the full error message) and updates on 
the question here:

https://stackoverflow.com/questions/51458097/solr-collectio-created-erorr-cant-find-resource-solrconfig-xml

Do you have any idea on how to solve this problem?

Reem


RE: Change/Override Solrconfig.xml across collections

2018-06-27 Thread Markus Jelsma
https://lucene.apache.org/solr/guide/6_6/configuring-solrconfig-xml.html

This page has a section about parameter substitution in config via command line 
overrides.
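
For the slow-query case specifically, one approach (untested sketch; the property 
name is made up) is to reference a property in the shared solrconfig.xml and set 
it per JVM:

  <query>
    <slowQueryThresholdMillis>${solr.slowQueryThresholdMillis:-1}</slowQueryThresholdMillis>
  </query>

Starting Solr with -Dsolr.slowQueryThresholdMillis=1000 then enables slow-query 
logging everywhere that property is referenced, without editing each collection's 
config separately.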

 
 
-Original message-
> From:Ganesh Sethuraman 
> Sent: Wednesday 27th June 2018 1:06
> To: solr-user@lucene.apache.org
> Subject: Change/Override Solrconfig.xml across collections
> 
> I would like to implement the Slow Query logging feature (
> https://lucene.apache.org/solr/guide/6_6/configuring-logging.html#ConfiguringLogging-LoggingSlowQueries)
> across multiple collection without changing solrconfig.xml in each and
> every collection. Is that possible? I am using solr 7.2.1
> 
> If this is not possible, is it possible to update only the solrconfig.xml
> into zookeeper for each collection, without the schema update? i have both
> schema and solrconfig.xml in the same directory
> 
> Regards
> Ganesh
> 


Change/Override Solrconfig.xml across collections

2018-06-26 Thread Ganesh Sethuraman
I would like to implement the Slow Query logging feature (
https://lucene.apache.org/solr/guide/6_6/configuring-logging.html#ConfiguringLogging-LoggingSlowQueries)
across multiple collections without changing solrconfig.xml in each and
every collection. Is that possible? I am using Solr 7.2.1.

If this is not possible, is it possible to update only the solrconfig.xml
in ZooKeeper for each collection, without the schema update? I have both
schema and solrconfig.xml in the same directory.

Regards
Ganesh


Re: How to enable or disable component based on document field value in solrconfig.xml

2018-04-13 Thread Erick Erickson
Have you considered the StatelessScriptUpdateProcessorFactory? That
allows you to do pretty much anything you want.

Don't quite know whether TemplateUpdateProcessorFactory deals well
with empty fields or not, but it might be worth a shot.

And, of course, your ETL process could do that on the client side, but
sounds like that's not really a good option.
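
For reference, a rough sketch of such a chain (the chain and script names are 
invented; the script would live in the configset and copy ExtendedCode, or Code 
when ExtendedCode is absent, into FullCode):

  <updateRequestProcessorChain name="populate-fullcode">
    <processor class="solr.StatelessScriptUpdateProcessorFactory">
      <str name="script">populate-fullcode.js</str>
    </processor>
    <processor class="solr.LogUpdateProcessorFactory"/>
    <processor class="solr.RunUpdateProcessorFactory"/>
  </updateRequestProcessorChain>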

Best,
Erick

On Fri, Apr 13, 2018 at 3:22 AM, sarat chandra
<saratalwayssmi...@gmail.com> wrote:
>  We have a use case where we need to populate uniq field from multiple
> fields. Our solrconfig.xml file like below.
>
> **
> *   *
> * Code*
> * FullCode*
> *   *
> * *
> * ExtendedCode*
> * FullCode*
> *   *
> **
> * _*
> * FullCode*
> * *
> **
> The above configuration works fine if the document has both Code and
> ExtendedCode. But For some of the documents we don't have ExtendedCode. For
> those cases we are getting the error like Mandatory field not provided.
>
> How we need to modify the configuration to support both scenarios.
> i tried with below but no luck.
>  enable="${ExtendedCode}" >
> ExtendedCode
> FullCode
>
>
>
> Depend on the Field value, i need to enable or disable the processor
> dynamically.
>
> how can i achieve this kind of behavior ? , Thanks
>
> --
> --with regards
> SARAT CHANDRA


How to enable or disable component based on document field value in solrconfig.xml

2018-04-13 Thread sarat chandra
 We have a use case where we need to populate uniq field from multiple
fields. Our solrconfig.xml file like below.

**
*   *
* Code*
* FullCode*
*   *
* *
* ExtendedCode*
* FullCode*
*   *
**
* _*
* FullCode*
* *
**
The above configuration works fine if the document has both Code and
ExtendedCode. But for some of the documents we don't have ExtendedCode. For
those cases we are getting an error like "Mandatory field not provided".

How do we need to modify the configuration to support both scenarios?
I tried the below but had no luck.

ExtendedCode
FullCode
   


Depending on the field value, I need to enable or disable the processor
dynamically.

How can I achieve this kind of behavior? Thanks

-- 
--with regards
SARAT CHANDRA


Re: Rename solrconfig.xml

2018-02-27 Thread Zheng Lin Edwin Yeo
Hi Shawn,

Yes, I'm running SolrCloud.

Meaning we have to create all the cores in the collection with the default
solrconfig.xml first?
Then we have to modify the core.properties, and rename the solrconfig.xml.
After which, we have to reload the renamed config to ZooKeeper, then reload
the collection?

We will need to customize our program for the client, which is why we
wanted to have our own unique config file.

Regards,
Edwin


On 27 February 2018 at 17:21, Shawn Heisey <elyog...@elyograg.org> wrote:

> On 2/27/2018 12:59 AM, Zheng Lin Edwin Yeo wrote:
>
>> Regarding the core.properties, understand from the Solr guide that we need
>> to define the "config" properties first. However, my core.properties will
>> only be created when I create the collection from the command
>> http://localhost:8983/solr/admin/collections?action=CREATE;
>> name=collection
>>
>> The core.properties does not exists, and if I try to create one manually,
>> Solr will not read it, and it will still try to look for solrconfig.xml.
>>
>> What should be the right way to create the core.properties?
>>
>
> If you're running SolrCloud, you'll very likely have to allow it to create
> all the cores in the collection, then go back and modify the
> core.properties files that get created, and reload the collection once
> they're all changed.  If this actually works, keep in mind that the renamed
> config file is going to be loaded from zookeeper, right where
> solrconfig.xml would normally exist.
>
> Specifying options remotely in core.properties can only be done with the
> CoreAdmin API, but this is not used when in Cloud mode. The Collections API
> actually *does* use the CoreAdmin API behind the scenes, but because its
> usage in SolrCloud is very much an expert-level task, you shouldn't use it
> directly.
>
> The big question I have:  Why would you want to cause yourself difficulty
> by doing this?
>
> Thanks,
> Shawn
>
>


Re: Rename solrconfig.xml

2018-02-27 Thread Shawn Heisey

On 2/27/2018 12:59 AM, Zheng Lin Edwin Yeo wrote:

Regarding the core.properties, understand from the Solr guide that we need
to define the "config" properties first. However, my core.properties will
only be created when I create the collection from the command
http://localhost:8983/solr/admin/collections?action=CREATE=collection

The core.properties does not exists, and if I try to create one manually,
Solr will not read it, and it will still try to look for solrconfig.xml.

What should be the right way to create the core.properties?


If you're running SolrCloud, you'll very likely have to allow it to 
create all the cores in the collection, then go back and modify the 
core.properties files that get created, and reload the collection once 
they're all changed.  If this actually works, keep in mind that the 
renamed config file is going to be loaded from zookeeper, right where 
solrconfig.xml would normally exist.


Specifying options remotely in core.properties can only be done with the 
CoreAdmin API, but this is not used when in Cloud mode. The Collections 
API actually *does* use the CoreAdmin API behind the scenes, but because 
its usage in SolrCloud is very much an expert-level task, you shouldn't 
use it directly.


The big question I have:  Why would you want to cause yourself 
difficulty by doing this?


Thanks,
Shawn



Re: Rename solrconfig.xml

2018-02-26 Thread Zheng Lin Edwin Yeo
Regarding the core.properties, understand from the Solr guide that we need
to define the "config" properties first. However, my core.properties will
only be created when I create the collection from the command
http://localhost:8983/solr/admin/collections?action=CREATE=collection

The core.properties does not exists, and if I try to create one manually,
Solr will not read it, and it will still try to look for solrconfig.xml.

What should be the right way to create the core.properties?

Regards,
Edwin

On 27 February 2018 at 11:31, @Nandan@ <nandanpriyadarshi...@gmail.com>
wrote:

> You can change into core config file and then you can use any name .
> As i used as table_solrconfig.xml
> Same concept will applicable with schema.xml file too.
>
> On Feb 27, 2018 11:11 AM, "Zheng Lin Edwin Yeo" <edwinye...@gmail.com>
> wrote:
>
> Hi Alexandre,
>
> Thanks for your reply.
>
> Will this cause other issues with the functionality if it is renamed?
>
> Regards,
> Edwin
>
> On 27 February 2018 at 07:15, Alexandre Rafalovitch <arafa...@gmail.com>
> wrote:
>
> > I believe this can be set with "config" property in the
> > core.properties file:
> > https://lucene.apache.org/solr/guide/7_2/defining-core-
> > properties.html#defining-core-properties
> >
> > Whether it is a good idea longer term, is a different question.
> >
> > Regards,
> >Alex.
> >
> > On 23 February 2018 at 18:06, Zheng Lin Edwin Yeo <edwinye...@gmail.com>
> > wrote:
> > > Hi,
> > >
> > > Would like to check, how can we rename solrconfig.xml to something
> else?
> > > For example, I want to rename it to myconfig.xml. Is this possible?
> > >
> > > I'm using Solr 6.5.1, and planning to upgrade to Solr 7.2.1.
> > >
> > > Regards,
> > > Edwin
> >
>


Re: Rename solrconfig.xml

2018-02-26 Thread @Nandan@
You can change it in the core config file and then you can use any name.
I used table_solrconfig.xml, for example.
The same concept is applicable to the schema.xml file too.

On Feb 27, 2018 11:11 AM, "Zheng Lin Edwin Yeo" <edwinye...@gmail.com>
wrote:

Hi Alexandre,

Thanks for your reply.

Will this cause other issues with the functionality if it is renamed?

Regards,
Edwin

On 27 February 2018 at 07:15, Alexandre Rafalovitch <arafa...@gmail.com>
wrote:

> I believe this can be set with "config" property in the
> core.properties file:
> https://lucene.apache.org/solr/guide/7_2/defining-core-
> properties.html#defining-core-properties
>
> Whether it is a good idea longer term, is a different question.
>
> Regards,
>Alex.
>
> On 23 February 2018 at 18:06, Zheng Lin Edwin Yeo <edwinye...@gmail.com>
> wrote:
> > Hi,
> >
> > Would like to check, how can we rename solrconfig.xml to something else?
> > For example, I want to rename it to myconfig.xml. Is this possible?
> >
> > I'm using Solr 6.5.1, and planning to upgrade to Solr 7.2.1.
> >
> > Regards,
> > Edwin
>


Re: Rename solrconfig.xml

2018-02-26 Thread Zheng Lin Edwin Yeo
Hi Alexandre,

Thanks for your reply.

Will this cause other issues with the functionality if it is renamed?

Regards,
Edwin

On 27 February 2018 at 07:15, Alexandre Rafalovitch <arafa...@gmail.com>
wrote:

> I believe this can be set with "config" property in the
> core.properties file:
> https://lucene.apache.org/solr/guide/7_2/defining-core-
> properties.html#defining-core-properties
>
> Whether it is a good idea longer term, is a different question.
>
> Regards,
>Alex.
>
> On 23 February 2018 at 18:06, Zheng Lin Edwin Yeo <edwinye...@gmail.com>
> wrote:
> > Hi,
> >
> > Would like to check, how can we rename solrconfig.xml to something else?
> > For example, I want to rename it to myconfig.xml. Is this possible?
> >
> > I'm using Solr 6.5.1, and planning to upgrade to Solr 7.2.1.
> >
> > Regards,
> > Edwin
>


Re: Rename solrconfig.xml

2018-02-26 Thread Alexandre Rafalovitch
I believe this can be set with "config" property in the
core.properties file:
https://lucene.apache.org/solr/guide/7_2/defining-core-properties.html#defining-core-properties
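
For example, a core.properties along these lines (the core name is illustrative; 
"config" and "schema" point at the renamed files in the core's conf/ directory):

  name=collection1_shard1_replica_n1
  config=myconfig.xml
  schema=myschema.xml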

Whether it is a good idea longer term, is a different question.

Regards,
   Alex.

On 23 February 2018 at 18:06, Zheng Lin Edwin Yeo <edwinye...@gmail.com> wrote:
> Hi,
>
> Would like to check, how can we rename solrconfig.xml to something else?
> For example, I want to rename it to myconfig.xml. Is this possible?
>
> I'm using Solr 6.5.1, and planning to upgrade to Solr 7.2.1.
>
> Regards,
> Edwin


Rename solrconfig.xml

2018-02-23 Thread Zheng Lin Edwin Yeo
Hi,

Would like to check, how can we rename solrconfig.xml to something else?
For example, I want to rename it to myconfig.xml. Is this possible?

I'm using Solr 6.5.1, and planning to upgrade to Solr 7.2.1.

Regards,
Edwin


Re: Conditional based Filters on SolrConfig.xml

2017-12-08 Thread Shawn Heisey
On 12/8/2017 4:07 AM, sarat chandra wrote:
> Currently we have a request handler contains appends option like below
>
> 
>   inStock:true
>
> Now i want to append this filter to query on conditional based.
> If my request query contains a flag or if the flag is true, i need to
> append the above filter to query, otherwise the filter should not append to
> query.
>
> how can i achieve this kind of behavior ?

This is going to require custom code.  If you want it to happen inside
Solr, then the custom code would be a handler plugin.

It would probably be *much* easier on the client side, since it is
likely that you already have custom code there.  The client can make the
decision about what parameters should be in the request, instead of
relying on the server to add them.
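
In SolrJ terms the client-side version is just a few lines (the collection name and 
the flag are illustrative, and an already-built SolrClient is assumed):

  SolrQuery query = new SolrQuery("name:phone");
  if (inStockOnly) {                        // decided by the application
      query.addFilterQuery("inStock:true");
  }
  QueryResponse response = solrClient.query("products", query);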

Thanks,
Shawn



Conditional based Filters on SolrConfig.xml

2017-12-08 Thread sarat chandra
Hi,

Currently we have a request handler that contains an appends option like below:

<lst name="appends">
  <str name="fq">inStock:true</str>
</lst>


Now I want to append this filter to the query conditionally.
If my request query contains a flag, or if the flag is true, I need to
append the above filter to the query; otherwise the filter should not be
appended.

How can I achieve this kind of behavior?






-- 
--with regards
SARAT CHANDRA


Solr boost property through request handler in solrconfig.xml

2017-10-19 Thread ruby
If I'm not using edismax or dismax, is there a way to boost a specific
property through solrconfig.xml? I'm avoiding hard-coding boost in query.
Following is my the request handler  in solronfig.xml right now





 explicit
 10
 myFiled   
 OR
 fc
  




--
Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html



Re: edismax-with-bq-complexphrase-query not working in solrconfig.xml search handler

2017-09-04 Thread Doug Turnbull
The problem is that the ${q} macro syntax is interpreted by Solr as a Java
Property. Thus these two syntaxes conflict when we encode the macro
substitution into the solrconfig.xml. See
https://cwiki.apache.org/confluence/display/solr/Configuring+solrconfig.xml

The way to escape seems to be to indicate ${q} is the default value to use,
ie:

  {!complexphrase df=title}"${q:${q}}"

I must say as a user new to this syntax, this is a surprising complexity.
Also, while a great feature, macro substitution doesn't seem particularly
well documented, and it took us a while to figure this out. Should a syntax
that doesn't conflict with Java properties be considered? Perhaps sunset
the original syntax? At the least document how to port your macros to
solrconfig?

Best!
-Doug


On Mon, Sep 4, 2017 at 1:16 PM Bertrand Rigaldies <
brigald...@opensourceconnections.com> wrote:

> Hi there, on Solr 6.6.0, using the built-in "techproducts" example:
>
> bin/solr start -f -e techproducts
>
> I can successfully search with URL-based bq as shown in the URL (bq setting
> in *bold*) below:
>
> http://localhost:8983/solr/techproducts/select?
> *bq={!complexphrase%20inOrder=true%20df=name}%22${q}%22*
> =on=edismax=*,score=on=iPod%20Mini=json
>
> where I see the expected ComplexPhraseQuery expression in the parsed query:
>
> "parsedquery":"(+(DisjunctionMaxQuery((text:ipod))
> DisjunctionMaxQuery((text:mini))) *ComplexPhraseQuery(\"iPod Mini\")*
> )/no_coord"
>
> However, *I have been unable to accomplish the same query using a search
> handler *in solrconfig.xml, as shown below:
>
>   
> 
>   ALL
>   10
>
>   
>   edismax
>
>   
>   name
>
>   
>   name, score
>
>   
>   
>   *{!complexphrase inOrder=true df=name}"${q}"*
>
>   
>   
> 
>   
>
> The expression {!complexphrase inOrder=true
> df=name}"${q}" errors out when loading solrconfig.xml with the
> following error:
>
> Error: org.apache.solr.common.SolrException: No system property or default
> value specified for q value:{!complexphrase inOrder=true df=name}"${q}"
>
> as if Solr is looking for a Java property "q"!?
>
> The variation shows below without the macro expansion, but assuming a
> normal localParam substitution, seems to be taking "$q" as a literate:
>
> Setting:
>   {!complexphrase inOrder=true}name:"$q"
> Produces:
>   "parsedquery":"(+(DisjunctionMaxQuery((name:ipod))
> DisjunctionMaxQuery((name:mini))) *ComplexPhraseQuery(\"$q\")*)/no_coord"
>
> The variation with an attempt to escape the macro expansion does not work
> either:
>
> Setting:
>   {!complexphrase inOrder=true
> df=name}"\\$\\{q\\}"
> Produces:
>   "parsedquery":"(+(DisjunctionMaxQuery((name:ipod))
> DisjunctionMaxQuery((name:mini)))* ComplexPhraseQuery(\"\\$\\{q\\}\"))*
> /no_coord"
>
> Also, the following variations do *not* produce the expected
> ComplexPhraseQuery statement in the parsed query:
> Settings:
>   {!complexphrase inOrder=true}name:$q
>   {!complexphrase inOrder=true df=name}$q
>   {!complexphrase inOrder=true}name:($q)
>   {!complexphrase inOrder=true df=name}($q)
>   {!complexphrase inOrder=true}name:\"$q\"
>   {!complexphrase inOrder=true df=name}\"$q\"
>   {!complexphrase inOrder=true df=name v="$q"}
>   {!complexphrase inOrder=true df=name v=\"$q\"}
> Produces:
>   "parsedquery":"(+(DisjunctionMaxQuery((name:ipod))
> DisjunctionMaxQuery((name:mini))) *name:q*)/no_coord"
>
> And finally, also not producing the expected ComplexPhraseQuery statement:
> Setting:
>   {!complexphrase inOrder=true df=name v=$q}
> Produces:
>   "parsedquery":"(+(DisjunctionMaxQuery((name:ipod))
> DisjunctionMaxQuery((name:mini))) *(name:ipod name:mini)*)/no_coord"
>
> The documentation for ComplexPhraseQuery
> <https://lucene.apache.org/solr/guide/6_6/other-parsers.html#other-parsers
> >
> stipulates
> some required "escaping": Special care has to be given when escaping:
> clauses between double quotes (usually whole query) is parsed twice, these
> parts have to be escaped as twice. eg "foo\\: bar\\^"
>
> hence, it is possible that I have not used the proper escaping syntax.
>
> It is troubling that I cannot use the same URL parameter expression in a
> search handler to accomplish the same effect, a strong assumption of mine
> in how Solr can be used.
>
> Any suggestion, comment, or similar experience? Does it look like a bug?
>
> Thank you,
> Bertrand
>
-- 
Consultant, OpenSource Connections. Contact info at
http://o19s.com/about-us/doug-turnbull/; Free/Busy (http://bit.ly/dougs_cal)


edismax-with-bq-complexphrase-query not working in solrconfig.xml search handler

2017-09-04 Thread Bertrand Rigaldies
Hi there, on Solr 6.6.0, using the built-in "techproducts" example:

bin/solr start -f -e techproducts

I can successfully search with URL-based bq as shown in the URL (bq setting
in *bold*) below:

http://localhost:8983/solr/techproducts/select?
*bq={!complexphrase%20inOrder=true%20df=name}%22${q}%22*
=on=edismax=*,score=on=iPod%20Mini=json

where I see the expected ComplexPhraseQuery expression in the parsed query:

"parsedquery":"(+(DisjunctionMaxQuery((text:ipod))
DisjunctionMaxQuery((text:mini))) *ComplexPhraseQuery(\"iPod Mini\")*
)/no_coord"

However, *I have been unable to accomplish the same query using a search
handler *in solrconfig.xml, as shown below:

  

  ALL
  10

  
  edismax

  
  name

  
  name, score

  
  
  *{!complexphrase inOrder=true df=name}"${q}"*

  
  

  

The expression {!complexphrase inOrder=true
df=name}"${q}" errors out when loading solrconfig.xml with the
following error:

Error: org.apache.solr.common.SolrException: No system property or default
value specified for q value:{!complexphrase inOrder=true df=name}"${q}"

as if Solr is looking for a Java property "q"!?

The variation shown below without the macro expansion, but assuming a
normal localParam substitution, seems to be taking "$q" as a literal:

Setting:
  {!complexphrase inOrder=true}name:"$q"
Produces:
  "parsedquery":"(+(DisjunctionMaxQuery((name:ipod))
DisjunctionMaxQuery((name:mini))) *ComplexPhraseQuery(\"$q\")*)/no_coord"

The variation with an attempt to escape the macro expansion does not work
either:

Setting:
  {!complexphrase inOrder=true df=name}"\\$\\{q\\}"
Produces:
  "parsedquery":"(+(DisjunctionMaxQuery((name:ipod))
DisjunctionMaxQuery((name:mini)))* ComplexPhraseQuery(\"\\$\\{q\\}\"))*
/no_coord"

Also, the following variations do *not* produce the expected
ComplexPhraseQuery statement in the parsed query:
Settings:
  {!complexphrase inOrder=true}name:$q
  {!complexphrase inOrder=true df=name}$q
  {!complexphrase inOrder=true}name:($q)
  {!complexphrase inOrder=true df=name}($q)
  {!complexphrase inOrder=true}name:\"$q\"
  {!complexphrase inOrder=true df=name}\"$q\"
  {!complexphrase inOrder=true df=name v="$q"}
  {!complexphrase inOrder=true df=name v=\"$q\"}
Produces:
  "parsedquery":"(+(DisjunctionMaxQuery((name:ipod))
DisjunctionMaxQuery((name:mini))) *name:q*)/no_coord"

And finally, also not producing the expected ComplexPhraseQuery statement:
Setting:
  {!complexphrase inOrder=true df=name v=$q}
Produces:
  "parsedquery":"(+(DisjunctionMaxQuery((name:ipod))
DisjunctionMaxQuery((name:mini))) *(name:ipod name:mini)*)/no_coord"

The documentation for ComplexPhraseQuery
<https://lucene.apache.org/solr/guide/6_6/other-parsers.html#other-parsers>
stipulates
some required "escaping": Special care has to be given when escaping:
clauses between double quotes (usually whole query) is parsed twice, these
parts have to be escaped as twice. eg "foo\\: bar\\^"

hence, it is possible that I have not used the proper escaping syntax.

It is troubling that I cannot use the same URL parameter expression in a
search handler to accomplish the same effect, a strong assumption of mine
in how Solr can be used.

Any suggestion, comment, or similar experience? Does it look like a bug?

Thank you,
Bertrand
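
A note for readers of the archive: the ${...} syntax inside solrconfig.xml is
resolved once, at config-load time, against JVM system properties and core
properties, not against request parameters, which is why Solr reports that no
system property or default value is specified for "q". The archive also
stripped the XML markup from the handler shown above; a rough reconstruction,
assuming a plain solr.SearchHandler and a made-up handler name, looks
something like this:

  <requestHandler name="/namesearch" class="solr.SearchHandler">
    <lst name="defaults">
      <str name="echoParams">ALL</str>
      <int name="rows">10</int>
      <str name="defType">edismax</str>
      <str name="df">name</str>
      <str name="fl">name, score</str>
      <!-- ${q:} is property substitution with an empty default; it only
           silences the startup error and never sees the request's q param -->
      <str name="bq">{!complexphrase inOrder=true df=name}"${q:}"</str>
    </lst>
  </requestHandler>

Getting at the live q parameter from inside a handler default generally goes
through local-param dereferencing (v=$q) rather than ${q}; note, though, that
dereferencing passes the raw parameter value, not a quoted phrase.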


precedence for configurations in solrconfig.xml file

2017-07-26 Thread suresh pendap
Hi,
If I have a configoverlay.json file with the below content

{"props":{"updateHandler":{"autoCommit":{
"maxTime":5,
"maxDocs":1


and I also have a JVM properties set on the Solr JVM instance as


-Dsolr.autoCommit.maxtime=2 -Dsolr.autoCommit.maxDocs=10



I would like to know the order of precedence in which the

configurations are applied.


Regards

Suresh
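
For what it's worth, the documented behaviour is that anything set through
the Config API is stored in configoverlay.json and overrides what
solrconfig.xml provides, while -D system properties only take effect where
solrconfig.xml explicitly references them. The stock config wires autoCommit
to a property roughly like this (values shown are the shipped defaults):

  <updateHandler class="solr.DirectUpdateHandler2">
    <autoCommit>
      <!-- resolved from -Dsolr.autoCommit.maxTime at startup, 15000 ms otherwise -->
      <maxTime>${solr.autoCommit.maxTime:15000}</maxTime>
      <openSearcher>false</openSearcher>
    </autoCommit>
  </updateHandler>

So with the overlay entry above present, its values win; the JVM flags only
matter for settings that are still read from solrconfig.xml. Note also that
-Dsolr.autoCommit.maxtime (lowercase t) would not match the
solr.autoCommit.maxTime reference used in the stock config.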


Re: Modifying solrconfig.xml in solr cloud

2017-03-14 Thread Binoy Dalal
Thanks Eric. Missed that somehow.

On Tue, 14 Mar 2017, 10:44 Erick Erickson, <erickerick...@gmail.com> wrote:

> First hit from googling "solr config API"
>
> https://cwiki.apache.org/confluence/display/solr/Config+API
>
> Best,
> Erick
>
> On Mon, Mar 13, 2017 at 8:27 PM, Binoy Dalal <binoydala...@gmail.com>
> wrote:
> > Is there a simpler way of modifying solrconfig.xml in cloud mode without
> > having to download the file from zookeeper, modifying it and reuploading
> it?
> >
> > Something like the schema API maybe?
> > --
> > Regards,
> > Binoy Dalal
>
-- 
Regards,
Binoy Dalal


Re: Modifying solrconfig.xml in solr cloud

2017-03-13 Thread Erick Erickson
First hit from googling "solr config API"

https://cwiki.apache.org/confluence/display/solr/Config+API

Best,
Erick

On Mon, Mar 13, 2017 at 8:27 PM, Binoy Dalal <binoydala...@gmail.com> wrote:
> Is there a simpler way of modifying solrconfig.xml in cloud mode without
> having to download the file from zookeeper, modifying it and reuploading it?
>
> Something like the schema API maybe?
> --
> Regards,
> Binoy Dalal


Modifying solrconfig.xml in solr cloud

2017-03-13 Thread Binoy Dalal
Is there a simpler​ way of modifying solrconfig.xml in cloud mode without
having to download the file from zookeeper, modifying it and reuploading it?

Something like the schema API maybe?
-- 
Regards,
Binoy Dalal
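
To make the pointer concrete: the Config API edits the running configuration
over HTTP and writes the changes to configoverlay.json in ZooKeeper, so there
is no download/edit/re-upload cycle for the settings it covers. A typical
call, with the collection name being just a placeholder, might look like:

  curl -X POST -H 'Content-type:application/json' \
    'http://localhost:8983/solr/mycollection/config' \
    -d '{"set-property": {"updateHandler.autoCommit.maxTime": 15000}}'

Request handlers and similar components can be managed with
add-requesthandler / update-requesthandler / delete-requesthandler commands
against the same endpoint; anything the API does not cover still means
editing the file in ZooKeeper and reloading the collection.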


Re: Changing "configSetBaseDir" leads to, Can't find resource 'solrconfig.xml' in classpath error while creating core

2017-02-14 Thread Erick Erickson
Works for me if I camel-case configset, i.e. configSet. Is this a
misunderstanding on your part or are there  docs with it all lowercase
that we should fix? See:
https://cwiki.apache.org/confluence/display/solr/Config+Sets

I did get the same error you did with an all lower-case 'configset'.

Best,
Erick

On Tue, Feb 14, 2017 at 3:30 PM, saiks <karlapudisam...@gmail.com> wrote:
> Hi All,
>
> I have a core "core1" created with custom config
> {solr.solr.home}/configsets/custom_config
>
> I changed configSetBaseDir to a different directory in solr.xml and copied
> the folders over to the new dir and deleted the old configs
>  name="configSetBaseDir">${configSetBaseDir:/xxx/Desktop/changed-configset}
>
> Now, if I restart Solr, core1 has no problems it is reading solrConfig.xml
> and schema.xml from the new configset dir.
>
> But, if I try to create a new core using custom configset, it gives an error
> http://localhost:8983/solr/admin/cores?action=CREATE=core2=custom_config
>
> 
> 
> 400
> 3
> 
> 
> 
> org.apache.solr.common.SolrException
>  name="root-error-class">org.apache.solr.core.SolrResourceNotFoundException
> 
> 
> Error CREATEing SolrCore 'newcore': Unable to create core [newcore] Caused
> by: Can't find resource 'solrconfig.xml' in classpath or
> '/Users/karlapudis/TrySolr/solr-6.3.0/server/solr/newcore'
> 
> 400
> 
> 
>
> Is is because the new configSet directory is not under {solr.solr.home} ?
> Is there a way to add this new configSetBaseDir to classpath?
>
> Any help is appreciated.
>
> Thanks
>
>
>
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Changing-configSetBaseDir-leads-to-Can-t-find-resource-solrconfig-xml-in-classpath-error-while-create-tp4320390.html
> Sent from the Solr - User mailing list archive at Nabble.com.


Changing "configSetBaseDir" leads to, Can't find resource 'solrconfig.xml' in classpath error while creating core

2017-02-14 Thread saiks
Hi All,

I have a core "core1" created with custom config
{solr.solr.home}/configsets/custom_config

I changed configSetBaseDir to a different directory in solr.xml and copied
the folders over to the new dir and deleted the old configs
${configSetBaseDir:/xxx/Desktop/changed-configset}

Now, if I restart Solr, core1 has no problems it is reading solrConfig.xml
and schema.xml from the new configset dir.

But, if I try to create a new core using custom configset, it gives an error
http://localhost:8983/solr/admin/cores?action=CREATE=core2=custom_config



400
3



org.apache.solr.common.SolrException
org.apache.solr.core.SolrResourceNotFoundException


Error CREATEing SolrCore 'newcore': Unable to create core [newcore] Caused
by: Can't find resource 'solrconfig.xml' in classpath or
'/Users/karlapudis/TrySolr/solr-6.3.0/server/solr/newcore'

400



Is it because the new configSet directory is not under {solr.solr.home}?
Is there a way to add this new configSetBaseDir to the classpath?

Any help is appreciated.

Thanks



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Changing-configSetBaseDir-leads-to-Can-t-find-resource-solrconfig-xml-in-classpath-error-while-create-tp4320390.html
Sent from the Solr - User mailing list archive at Nabble.com.
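
A possible explanation, sketched here rather than verified against that exact
setup: core creation resolves the configSet parameter against the
configSetBaseDir declared in solr.xml, and the parameter name is camel-cased
(configSet), which is what Erick's reply above confirms. The relevant pieces
would look roughly like:

  <!-- solr.xml -->
  <solr>
    <str name="configSetBaseDir">${configSetBaseDir:/xxx/Desktop/changed-configset}</str>
  </solr>

  # CoreAdmin create, using the camel-cased parameter
  curl 'http://localhost:8983/solr/admin/cores?action=CREATE&name=core2&configSet=custom_config'

Nothing gets added to the classpath here; the "Can't find resource
'solrconfig.xml' in classpath" message appears to be simply how Solr reports
that no config could be resolved for the new core.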


Re: newSearcher autowarming queries in solrconfig.xml run but does not appear to warm cache

2016-10-19 Thread Dalton Gooding
e results
>> from the querResultCache.
>>
>> What _is_ relevant is populating the low-level Lucene caches with
>> values from the indexed terms. My
>> contention is that this is not happening with match-all queries, i.e.
>> field:* or field:[* TO *] because in
>> those cases, a doc matches or doesn't based on whether it has anything
>> in the field. There's no point
>> in finding values since it doesn't matter anyway. And "finding values"
>> means reading indexed terms
>> from disk into low-level Lucene caches.
>>
>> When I say "populate the low-level Lucene caches", what I'm really
>> talking about is reading them from
>> disk into your physical memory via MMapDirectory, see Uwe's excellent
>> blog:
>> http://blog.thetaphi.de/2012/07/use-lucenes-mmapdirectory-on-64bit.html
>>
>> So the suggestion is that you use real values from your index or
>> possibly ranges is so that part or all
>> of your disk files get read into MMapDirectorySpace via the first or
>> new Searcher event.
>>
>> Please just give it a try. My bet is that you'll see your QTime values
>> first time after autowarming
>> go down. Significantly. Be sure to use a wide variety of different
>> values for autowarming.
>>
>> BTW, the autowarmCounts in solrconfig.xml filterCache and
>> queryResultCache are intended
>> to warm by using the last N fq or q clauses on the theory that the
>> most recent N are predictive
>> of the next N.
>>
>> Best,
>> Erick
>>
>>
>> ***
>>
>> I believe the return time back to the command line from the curl
>> command and the QTime as shown below
>>
>> time curl -v
>>
>> 'http:///solr/core1/select?fq=DataType_s%3AProduct=WebSections_ms%3Ahouse=%28VisibleOnline_ms%3ANAT+OR+VisibleOnline_ms%3A7%29=%7B%21tag%3Dcurrent_group%7DGroupIds_ms%3A458=SalesRank_f+desc=true=%7B%21ex%3Dcurrent_group%7Dattr_GroupLevel0=BrandID_s=%7B%21ex%3Dcurrent_group%7Dattr_GroupLevel2=%7B%21ex%3Dcurrent_group%7Dattr_GroupLevel1=SubBrandID_s=ProductAttr_967_ms=ProductAttr_NEG21_ms=ProductAttr_1852_ms=Price_7_f%3A%5B%2A+TO+%2A%5D=Price_2_f%3A%5B%2A+TO+%2A%5D=Price_3_f%3A%5B%2A+TO+%2A%5D=Price_4_f%3A%5B%2A+TO+%2A%5D=Price_5_f%3A%5B%2A+TO+%2A%5D=Price_6_f%3A%5B%2A+TO+%2A%5D=1=json=map=%28title%3A%2A+OR+text%3A%2A%29+AND+%28ms%3ALive%29=0=24'
>>
>> real    0m1.436s
>> user    0m0.001s
>> sys    0m0.006s
>>
>> "QTime":1387
>>
>> From what you suggested, changing the rows value from 20 to something
>> greater should add more documents to the cache. Injunction with tuning
>> the queries to remove the * wild card, this should provide a better
>> warming query?
>>
>> Should I also increase the queryResultWindowSize in the solrconfig.xml
>> to help built out the cache?
>>
>> Cheers,
>>
>> Guy
>>
>>
>>
>>
>>
>> On Thu, Oct 6, 2016 at 4:43 PM, Dalton Gooding
>> <daltonwestco...@yahoo.com.au> wrote:
>>> Erick,
>>>
>>> Thanks for the response. After I run the initial query and get a long
>>> response time, if I change the query to remove or add additional query
>>> statements, I find the speed is good.
>>>
>>> If I run the modified query after a new searcher has registered, the
>>> response is slow but after the modified query has been completed, the
>>> warming query sent from CuRl is much faster. I assume it is because the
>>> document cache has updated with the documents from the modified query. A
>>> large number of our queries work with the same document set, I am trying
>>> to
>>> get a warming query to populate the document cache to be as big as
>>> feasible.
>>>
>>> Should the firstSearcher and newSearcher warm the document cache?
>>>
>>>
>>> On Friday, 7 October 2016, 9:31, Erick Erickson <erickerick...@gmail.com>
>>> wrote:
>>>
>>>
>>> Submitting the exact same query twice will return results from the
>>> queryResultCache. I'm not entirely
>>> sure that the firstSearcher events get put into the cache.
>>>
>>> So if you change the query even slighty my guess is that you'll see
>>> response times very close to your
>>> original ones of over a second.
>>>
>>> Best,
>>> Erick
>>>
>>> On Thu, Oct 6, 2016 at 2:56 PM, Dalton Gooding
>>> <daltonwestco...@yahoo.com.au.invalid> wrote:
>>>> After setting a number of newSearcher and firstSearcher queries,

Re: newSearcher autowarming queries in solrconfig.xml run but does not appear to warm cache

2016-10-16 Thread Dalton Gooding
ding values"
> means reading indexed terms
> from disk into low-level Lucene caches.
>
> When I say "populate the low-level Lucene caches", what I'm really
> talking about is reading them from
> disk into your physical memory via MMapDirectory, see Uwe's excellent blog:
> http://blog.thetaphi.de/2012/07/use-lucenes-mmapdirectory-on-64bit.html
>
> So the suggestion is that you use real values from your index or
> possibly ranges is so that part or all
> of your disk files get read into MMapDirectorySpace via the first or
> new Searcher event.
>
> Please just give it a try. My bet is that you'll see your QTime values
> first time after autowarming
> go down. Significantly. Be sure to use a wide variety of different
> values for autowarming.
>
> BTW, the autowarmCounts in solrconfig.xml filterCache and
> queryResultCache are intended
> to warm by using the last N fq or q clauses on the theory that the
> most recent N are predictive
> of the next N.
>
> Best,
> Erick
>
>
> ***
>
> I believe the return time back to the command line from the curl
> command and the QTime as shown below
>
> time curl -v
> 'http:///solr/core1/select?fq=DataType_s%3AProduct=WebSections_ms%3Ahouse=%28VisibleOnline_ms%3ANAT+OR+VisibleOnline_ms%3A7%29=%7B%21tag%3Dcurrent_group%7DGroupIds_ms%3A458=SalesRank_f+desc=true=%7B%21ex%3Dcurrent_group%7Dattr_GroupLevel0=BrandID_s=%7B%21ex%3Dcurrent_group%7Dattr_GroupLevel2=%7B%21ex%3Dcurrent_group%7Dattr_GroupLevel1=SubBrandID_s=ProductAttr_967_ms=ProductAttr_NEG21_ms=ProductAttr_1852_ms=Price_7_f%3A%5B%2A+TO+%2A%5D=Price_2_f%3A%5B%2A+TO+%2A%5D=Price_3_f%3A%5B%2A+TO+%2A%5D=Price_4_f%3A%5B%2A+TO+%2A%5D=Price_5_f%3A%5B%2A+TO+%2A%5D=Price_6_f%3A%5B%2A+TO+%2A%5D=1=json=map=%28title%3A%2A+OR+text%3A%2A%29+AND+%28ms%3ALive%29=0=24'
>
> real    0m1.436s
> user    0m0.001s
> sys    0m0.006s
>
> "QTime":1387
>
> From what you suggested, changing the rows value from 20 to something
> greater should add more documents to the cache. Injunction with tuning
> the queries to remove the * wild card, this should provide a better
> warming query?
>
> Should I also increase the queryResultWindowSize in the solrconfig.xml
> to help built out the cache?
>
> Cheers,
>
> Guy
>
>
>
>
>
> On Thu, Oct 6, 2016 at 4:43 PM, Dalton Gooding
> <daltonwestco...@yahoo.com.au> wrote:
>> Erick,
>>
>> Thanks for the response. After I run the initial query and get a long
>> response time, if I change the query to remove or add additional query
>> statements, I find the speed is good.
>>
>> If I run the modified query after a new searcher has registered, the
>> response is slow but after the modified query has been completed, the
>> warming query sent from CuRl is much faster. I assume it is because the
>> document cache has updated with the documents from the modified query. A
>> large number of our queries work with the same document set, I am trying
>> to
>> get a warming query to populate the document cache to be as big as
>> feasible.
>>
>> Should the firstSearcher and newSearcher warm the document cache?
>>
>>
>> On Friday, 7 October 2016, 9:31, Erick Erickson <erickerick...@gmail.com>
>> wrote:
>>
>>
>> Submitting the exact same query twice will return results from the
>> queryResultCache. I'm not entirely
>> sure that the firstSearcher events get put into the cache.
>>
>> So if you change the query even slighty my guess is that you'll see
>> response times very close to your
>> original ones of over a second.
>>
>> Best,
>> Erick
>>
>> On Thu, Oct 6, 2016 at 2:56 PM, Dalton Gooding
>> <daltonwestco...@yahoo.com.au.invalid> wrote:
>>> After setting a number of newSearcher and firstSearcher queries, I can
>>> see
>>> in the console logs that the queries are run, but when I run the same
>>> query
>>> against the new searcher (using CuRL), I get a slow response time for the
>>> first run.
>>>
>>> Config:
>>>    
>>>          DataType_s:Product
>>> WebSections_ms:house              >> name="fq">{!tag=current_group}GroupIds_ms:*
>>>              true              >> name="facet.field">BrandID_s              >> name="facet.query">Price_2_f:[* TO *]              >> name="facet.query">Price_3_f:[* TO *]              >> name="facet.query">Price_4_f:[* TO *]              >> name="facet.query">Price_5_f:[* TO *]              >> name="facet.query">Price_6_f:[* TO *

Re: newSearcher autowarming queries in solrconfig.xml run but does not appear to warm cache

2016-10-09 Thread Dalton Gooding
Erick,
I have tried tuning the queries with some limited success. I still get drastic 
differences between the first time I fire my warming query (after the newSearcher 
queries have run) and the second time; any variant of the query, i.e. removing 
fields or changing parameters, runs much faster.
I am not sure what I am missing here. I put a query into the newSearcher 
section that runs fine, but the exact same query run after warming still takes 
the full time of an un-warmed query.
Can you break it down to the most basic type of newSearcher query, to try and 
shrink the gap between the first query and subsequent queries sent?
I cannot see why sending the same query after a newSearcher is slow when 
subsequent queries run faster. I figured this was the idea of the newSearcher 
stanzas.

On Friday, 7 October 2016, 14:45, Erick Erickson <erickerick...@gmail.com> 
wrote:
 

 Replying on the public thread, somehow your mail was sent to me privately.

Pasted your email to me below for others.

You are still confusing documents and results. Forget about the rows
parameter, for this discussion it's irrelevant.

The QTime is the time spent searching. It is unaffected by whether a
document is in the documentCache or not.
It _solely_ measures the time that Solr/Lucene take to find the top N
documents (where N is the rows param) and
record their internal Lucene doc ID.

Increasing the rows or the document cache won't change anything about
the QTime. The documentCache is
totally the wrong place to focus.


The response when you re-submit the query suggests that getting the
top N docs' internal Lucene ID is
fetched from the queryResultCache. Changing the window size is also
irrelevant to this discussion. If you
vary the query even slightly you won't hit the queryResultCache. A
very easy way to check this is the
admin UI>>select core>>plugins/stats>>QueryHandler and then probably
the "select" handler. If you see
the hits go up after the fast query then you're getting the results
from the querResultCache.

What _is_ relevant is populating the low-level Lucene caches with
values from the indexed terms. My
contention is that this is not happening with match-all queries, i.e.
field:* or field:[* TO *] because in
those cases, a doc matches or doesn't based on whether it has anything
in the field. There's no point
in finding values since it doesn't matter anyway. And "finding values"
means reading indexed terms
from disk into low-level Lucene caches.

When I say "populate the low-level Lucene caches", what I'm really
talking about is reading them from
disk into your physical memory via MMapDirectory, see Uwe's excellent blog:
http://blog.thetaphi.de/2012/07/use-lucenes-mmapdirectory-on-64bit.html

So the suggestion is that you use real values from your index or
possibly ranges so that part or all
of your disk files get read into MMapDirectory space via the first or
new Searcher event.

Please just give it a try. My bet is that you'll see your QTime values
first time after autowarming
go down. Significantly. Be sure to use a wide variety of different
values for autowarming.

BTW, the autowarmCounts in solrconfig.xml filterCache and
queryResultCache are intended
to warm by using the last N fq or q clauses on the theory that the
most recent N are predictive
of the next N.

Best,
Erick


***

I believe the return time back to the command line from the curl
command and the QTime as shown below

time curl -v 
'http:///solr/core1/select?fq=DataType_s%3AProduct=WebSections_ms%3Ahouse=%28VisibleOnline_ms%3ANAT+OR+VisibleOnline_ms%3A7%29=%7B%21tag%3Dcurrent_group%7DGroupIds_ms%3A458=SalesRank_f+desc=true=%7B%21ex%3Dcurrent_group%7Dattr_GroupLevel0=BrandID_s=%7B%21ex%3Dcurrent_group%7Dattr_GroupLevel2=%7B%21ex%3Dcurrent_group%7Dattr_GroupLevel1=SubBrandID_s=ProductAttr_967_ms=ProductAttr_NEG21_ms=ProductAttr_1852_ms=Price_7_f%3A%5B%2A+TO+%2A%5D=Price_2_f%3A%5B%2A+TO+%2A%5D=Price_3_f%3A%5B%2A+TO+%2A%5D=Price_4_f%3A%5B%2A+TO+%2A%5D=Price_5_f%3A%5B%2A+TO+%2A%5D=Price_6_f%3A%5B%2A+TO+%2A%5D=1=json=map=%28title%3A%2A+OR+text%3A%2A%29+AND+%28ms%3ALive%29=0=24'

real    0m1.436s
user    0m0.001s
sys    0m0.006s

"QTime":1387

From what you suggested, changing the rows value from 20 to something
greater should add more documents to the cache. In conjunction with tuning
the queries to remove the * wildcard, this should provide a better
warming query?

Should I also increase the queryResultWindowSize in the solrconfig.xml
to help build out the cache?

Cheers,

Guy





On Thu, Oct 6, 2016 at 4:43 PM, Dalton Gooding
<daltonwestco...@yahoo.com.au> wrote:
> Erick,
>
> Thanks for the response. After I run the initial query and get a long
> response time, if I change the query to remove or add additional query
> statements, I find the speed is good.
>
> If I run the modified query after a new searcher has registered, the
> response is slow but 

Re: newSearcher autowarming queries in solrconfig.xml run but does not appear to warm cache

2016-10-06 Thread Dalton Gooding
Erick,
Thanks for the response. After I run the initial query and get a long response 
time, if I change the query to remove or add additional query statements, I 
find the speed is good.
If I run the modified query after a new searcher has registered, the response 
is slow but after the modified query has been completed, the warming query sent 
from CuRl is much faster. I assume it is because the document cache has updated 
with the documents from the modified query. A large number of our queries work 
with the same document set, I am trying to get a warming query to populate the 
document cache to be as big as feasible.
Should the firstSearcher and newSearcher warm the document cache? 

On Friday, 7 October 2016, 9:31, Erick Erickson  
wrote:
 

 Submitting the exact same query twice will return results from the
queryResultCache. I'm not entirely
sure that the firstSearcher events get put into the cache.

So if you change the query even slighty my guess is that you'll see
response times very close to your
original ones of over a second.

Best,
Erick

On Thu, Oct 6, 2016 at 2:56 PM, Dalton Gooding
 wrote:
> After setting a number of newSearcher and firstSearcher queries, I can see in 
> the console logs that the queries are run, but when I run the same query 
> against the new searcher (using CuRL), I get a slow response time for the 
> first run.
>
> Config:
>          name="queries">         DataType_s:Product           
>   WebSections_ms:house              name="fq">{!tag=current_group}GroupIds_ms:*
>              true              name="facet.field">BrandID_s              name="facet.query">Price_2_f:[* TO *]              name="facet.query">Price_3_f:[* TO *]              name="facet.query">Price_4_f:[* TO *]              name="facet.query">Price_5_f:[* TO *]              name="facet.query">Price_6_f:[* TO *]              name="facet.query">Price_7_f:[* TO *]              name="facet.query">Price_8_f:[* TO *]              name="facet.mincount">1              fc   
>           json              name="json.nl">map              (title:* OR text:*)  
>            0              20   
>             
>
> Console log:
> INFO  (searcherExecutor-7-thread-1-processing-x:core1) [  x:core1] 
> o.a.s.c.S.Request [core1] webapp=null path=null 
> params={facet=true=1=0=Price_2_f:[*+TO+*]=Price_3_f:[*+TO+*]=Price_4_f:[*+TO+*]=Price_5_f:[*+TO+*]=Price_6_f:[*+TO+*]=Price_7_f:[*+TO+*]=Price_8_f:[*+TO+*]=newSearcher=(title:*+OR+text:*)=false=map=BrandID_s=json=fc=DataType_s:Product=WebSections_ms:house=VisibleOnline_ms:7={!tag%3Dcurrent_group}GroupIds_ms:*=20}
>  hits=2549 status=0 QTime=1263
>
>
> If I run the same query after the index has registered I see a QTime of over 
> a second, the second time I run the query I see around 80ms. This leads me to 
> believe the warming did not occur or the query was not commited to cache on 
> start up of the new searcher.
> Can someone please advise on how to use the newSearcher queries to 
> effectively warm SolR caches. Should I see an improved response for the first 
> time I run the query if the same query has been used as a newSearcher query?
> Cheers,
> Dalton

   

Re: newSearcher autowarming queries in solrconfig.xml run but does not appear to warm cache

2016-10-06 Thread Erick Erickson
Submitting the exact same query twice will return results from the
queryResultCache. I'm not entirely
sure that the firstSearcher events get put into the cache.

So if you change the query even slighty my guess is that you'll see
response times very close to your
original ones of over a second.

Best,
Erick

On Thu, Oct 6, 2016 at 2:56 PM, Dalton Gooding
 wrote:
> After setting a number of newSearcher and firstSearcher queries, I can see in 
> the console logs that the queries are run, but when I run the same query 
> against the new searcher (using CuRL), I get a slow response time for the 
> first run.
>
> Config:
>name="queries"> DataType_s:Product  
> WebSections_ms:house   name="fq">{!tag=current_group}GroupIds_ms:*
>   true   name="facet.field">BrandID_s   name="facet.query">Price_2_f:[* TO *]   name="facet.query">Price_3_f:[* TO *]   name="facet.query">Price_4_f:[* TO *]   name="facet.query">Price_5_f:[* TO *]   name="facet.query">Price_6_f:[* TO *]   name="facet.query">Price_7_f:[* TO *]   name="facet.query">Price_8_f:[* TO *]   name="facet.mincount">1  fc  
> json   name="json.nl">map  (title:* OR text:*) 
>  0  20 
> 
>
> Console log:
> INFO  (searcherExecutor-7-thread-1-processing-x:core1) [   x:core1] 
> o.a.s.c.S.Request [core1] webapp=null path=null 
> params={facet=true=1=0=Price_2_f:[*+TO+*]=Price_3_f:[*+TO+*]=Price_4_f:[*+TO+*]=Price_5_f:[*+TO+*]=Price_6_f:[*+TO+*]=Price_7_f:[*+TO+*]=Price_8_f:[*+TO+*]=newSearcher=(title:*+OR+text:*)=false=map=BrandID_s=json=fc=DataType_s:Product=WebSections_ms:house=VisibleOnline_ms:7={!tag%3Dcurrent_group}GroupIds_ms:*=20}
>  hits=2549 status=0 QTime=1263
>
>
> If I run the same query after the index has registered I see a QTime of over 
> a second, the second time I run the query I see around 80ms. This leads me to 
> believe the warming did not occur or the query was not commited to cache on 
> start up of the new searcher.
> Can someone please advise on how to use the newSearcher queries to 
> effectively warm SolR caches. Should I see an improved response for the first 
> time I run the query if the same query has been used as a newSearcher query?
> Cheers,
> Dalton


newSearcher autowarming queries in solrconfig.xml run but does not appear to warm cache

2016-10-06 Thread Dalton Gooding
After setting a number of newSearcher and firstSearcher queries, I can see in 
the console logs that the queries are run, but when I run the same query 
against the new searcher (using CuRL), I get a slow response time for the first 
run. 

Config:
                   DataType_s:Product            
  WebSections_ms:house              {!tag=current_group}GroupIds_ms:*
              true              BrandID_s              Price_2_f:[* TO *]              Price_3_f:[* TO *]              Price_4_f:[* TO *]              Price_5_f:[* TO *]              Price_6_f:[* TO *]              Price_7_f:[* TO *]              Price_8_f:[* TO *]              1              fc    
          json              map  
            (title:* OR text:*)              0              20              
   

Console log:
INFO  (searcherExecutor-7-thread-1-processing-x:core1) [   x:core1] 
o.a.s.c.S.Request [core1] webapp=null path=null 
params={facet=true=1=0=Price_2_f:[*+TO+*]=Price_3_f:[*+TO+*]=Price_4_f:[*+TO+*]=Price_5_f:[*+TO+*]=Price_6_f:[*+TO+*]=Price_7_f:[*+TO+*]=Price_8_f:[*+TO+*]=newSearcher=(title:*+OR+text:*)=false=map=BrandID_s=json=fc=DataType_s:Product=WebSections_ms:house=VisibleOnline_ms:7={!tag%3Dcurrent_group}GroupIds_ms:*=20}
 hits=2549 status=0 QTime=1263


If I run the same query after the new searcher has registered I see a QTime of over a 
second; the second time I run the query I see around 80ms. This leads me to 
believe the warming did not occur or the query was not committed to cache on 
startup of the new searcher.
Can someone please advise on how to use the newSearcher queries to effectively 
warm Solr caches. Should I see an improved response for the first time I run 
the query if the same query has been used as a newSearcher query?
Cheers,
Dalton
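
Since the list archive stripped the XML markup from the warming config above,
here is a rough sketch of the shape being discussed, inside the <query>
section of solrconfig.xml, with Erick's advice from the replies above applied:
warm with real field values and ranges instead of field:* / [* TO *]
match-all clauses (the field names and values below are only illustrative):

  <listener event="newSearcher" class="solr.QuerySenderListener">
    <arr name="queries">
      <lst>
        <str name="q">ipod</str>
        <str name="fq">DataType_s:Product</str>
        <str name="fq">Price_2_f:[0 TO 100]</str>
        <str name="sort">SalesRank_f desc</str>
        <str name="rows">20</str>
      </lst>
    </arr>
  </listener>
  <listener event="firstSearcher" class="solr.QuerySenderListener">
    <arr name="queries">
      <lst>
        <str name="q">house</str>
        <str name="fq">WebSections_ms:house</str>
      </lst>
    </arr>
  </listener>

Worth keeping in mind when judging the results: the documentCache is not
autowarmed at all (internal Lucene document IDs change between searchers), so
the payoff from these events shows up as lower QTime on the first real
queries, via warmed filterCache/queryResultCache entries and index files
pulled into the OS page cache, not as documentCache hits.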

Re: solrcloud: How can I get the schema.xml and solrconfig.xml locate.

2016-09-06 Thread cuizhaohua
Thank you for your help!

I solved my problem with no config and schema. It's OK now!!   : )

Thanks,
cuizhaohua



--
View this message in context: 
http://lucene.472066.n3.nabble.com/solrcloud-How-can-I-get-the-schema-xml-and-solrconfig-xml-locate-tp4294736p4294885.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: solrcloud: How can I get the schema.xml and solrconfig.xml locate.

2016-09-06 Thread Shawn Heisey
On 9/6/2016 12:03 AM, cuizhaohua wrote:
> My env is 3 zookeeper server, 10 node of solrcloud.   one of my collections
> with 5 shard, and each one have two cores.
>
>
> On web UI. I was via "core admin" unload a core .   and now ,I want add or
> reload the core  via "Add Core"
>
> name:
> instanceDir:
> dataDir:
> config:
> schema:
> collection:
> shard:
>
> how can I confirm the "  config:  "  and  " schema:   "  input field. 

The CoreAdmin API should *not* be used when you're running SolrCloud. 
Your cloud may now be in a delicate state, and recovery will not be
entirely straightforward.

The config and schema parameters are rarely needed for CoreAdmin.  The
minimum that you need for SolrCloud when using the CoreAdmin API is
name, collection, and shard.  These are usually the only parameters
you'll need.  However ... I must repeat what I said above:  When you are
in cloud mode, do NOT use CoreAdmin.  In cloud mode, it is an expert
option that can easily get you into a bad state.  I would even call it
an expert option when running in standalone mode.

Exactly what you will need to do right now depends on whether you asked
Solr to delete the instanceDir and/or dataDir when you unloaded the core.

If you did not ask Solr to delete the instanceDir when you unloaded the
core, you will find that core.properties has been renamed something like
core.properties.unloaded ... if you rename the file back to
core.properties and restart Solr, then the core will be back.  If there
are additional replicas in the cloud for that shard, then the core will
be brought back into sync with the rest of the cloud, and everything
will be fine.

If you DID ask Solr to delete the instanceDir, then hopefully there are
additional replicas of that shard, so you can use the ADDREPLICA action
on the Collections API to build another replica.  If the core that you
deleted was the only copy of that shard, then deleting the collection
entirely and recreating it might be your only good option for proper
operation.

https://cwiki.apache.org/confluence/display/solr/Collections+API#CollectionsAPI-api_addreplica

I know a little bit about how to use CoreAdmin properly in cloud mode,
but it's best to simply avoid using it entirely, and only use the
Collections API.

Thanks,
Shawn
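
A sketch of the recovery path described above, assuming a collection named
mycollection whose shard1 still has at least one live replica (all names and
the node address are placeholders):

  # add a new replica of shard1, optionally pinned to a specific node
  curl 'http://localhost:8983/solr/admin/collections?action=ADDREPLICA&collection=mycollection&shard=shard1&node=192.168.1.5:8983_solr'

  # after it has recovered, a broken replica can be dropped
  curl 'http://localhost:8983/solr/admin/collections?action=DELETEREPLICA&collection=mycollection&shard=shard1&replica=core_node3'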



solrcloud: How can I get the schema.xml and solrconfig.xml locate.

2016-09-06 Thread cuizhaohua
My env is 3 ZooKeeper servers and 10 SolrCloud nodes. One of my collections
has 5 shards, and each shard has two cores.


On the web UI, I unloaded a core via "Core Admin". Now I want to add or
reload the core via "Add Core":

name:
instanceDir:
dataDir:
config:
schema:
collection:
shard:

How can I work out what to enter in the "config:" and "schema:" input fields?

Thank you for any help.  






--
View this message in context: 
http://lucene.472066.n3.nabble.com/solrcloud-How-can-I-get-the-schema-xml-and-solrconfig-xml-locate-tp4294736.html
Sent from the Solr - User mailing list archive at Nabble.com.


RE: Where should the listeners be defined in solrconfig.xml (Solr 6.0.1)

2016-08-03 Thread Alexandre Drouin
That's good to know thanks!

I should have thought to check the java code before asking.


Alexandre Drouin


-Original Message-
From: Mikhail Khludnev [mailto:m...@apache.org] 
Sent: August 3, 2016 10:52 AM
To: solr-user <solr-user@lucene.apache.org>
Subject: Re: Where should the listeners be defined in solrconfig.xml (Solr 
6.0.1)
Importance: High

As far as I remember the code, it captures listener elements everywhere
https://github.com/apache/lucene-solr/blob/master/solr/core/src/java/org/apache/solr/core/SolrConfig.java#L331
Double slash "//listener" means "everywhere".

On Wed, Aug 3, 2016 at 4:38 PM, Alexandre Drouin < 
alexandre.dro...@orckestra.com> wrote:

> Does anyone knows where the listener should be defined in solrconfig.xml?
>
>
> Alexandre Drouin
>
>
> -Original Message-
> From: Alexandre Drouin [mailto:alexandre.dro...@orckestra.com]
> Sent: July 29, 2016 10:46 AM
> To: solr-user@lucene.apache.org
> Subject: Where should the listeners be defined in solrconfig.xml (Solr
> 6.0.1)
> Importance: High
>
> Hello,
>
> I was wondering where I should put the  configuration.  I 
> can see from the sample solrconfig.xml that they are defined under the 
>  and  elements.
> The Schema API for listeners does not specify a parent of type 
> updateHandler or query so I wanted to know if I also define them 
> directly under the root of the xml document (config)?
>
> Alexandre Drouin
>
>


--
Sincerely yours
Mikhail Khludnev


Re: Where should the listeners be defined in solrconfig.xml (Solr 6.0.1)

2016-08-03 Thread Mikhail Khludnev
As far as I remember the code, it captures listener elements everywhere
https://github.com/apache/lucene-solr/blob/master/solr/core/src/java/org/apache/solr/core/SolrConfig.java#L331
Double slash "//listener" means "everywhere".

On Wed, Aug 3, 2016 at 4:38 PM, Alexandre Drouin <
alexandre.dro...@orckestra.com> wrote:

> Does anyone knows where the listener should be defined in solrconfig.xml?
>
>
> Alexandre Drouin
>
>
> -Original Message-
> From: Alexandre Drouin [mailto:alexandre.dro...@orckestra.com]
> Sent: July 29, 2016 10:46 AM
> To: solr-user@lucene.apache.org
> Subject: Where should the listeners be defined in solrconfig.xml (Solr
> 6.0.1)
> Importance: High
>
> Hello,
>
> I was wondering where I should put the  configuration.  I can
> see from the sample solrconfig.xml that they are defined under the
>  and  elements.
> The Schema API for listeners does not specify a parent of type
> updateHandler or query so I wanted to know if I also define them directly
> under the root of the xml document (config)?
>
> Alexandre Drouin
>
>


-- 
Sincerely yours
Mikhail Khludnev


RE: Where should the listeners be defined in solrconfig.xml (Solr 6.0.1)

2016-08-03 Thread Alexandre Drouin
Does anyone knows where the listener should be defined in solrconfig.xml?


Alexandre Drouin


-Original Message-
From: Alexandre Drouin [mailto:alexandre.dro...@orckestra.com] 
Sent: July 29, 2016 10:46 AM
To: solr-user@lucene.apache.org
Subject: Where should the listeners be defined in solrconfig.xml (Solr 6.0.1)
Importance: High

Hello,

I was wondering where I should put the listener configuration.  I can see 
from the sample solrconfig.xml that they are defined under the updateHandler 
and query elements.  
The Schema API for listeners does not specify a parent of type updateHandler or 
query, so I wanted to know if I can also define them directly under the root of the 
xml document (config)? 

Alexandre Drouin



Where should the listeners be defined in solrconfig.xml (Solr 6.0.1)

2016-07-29 Thread Alexandre Drouin
Hello,

I was wondering where I should put the listener configuration.  I can see 
from the sample solrconfig.xml that they are defined under the updateHandler 
and query elements.  
The Schema API for listeners does not specify a parent of type updateHandler or 
query, so I wanted to know if I can also define them directly under the root of the 
xml document (config)? 

Alexandre Drouin
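
Putting the two answers together: because SolrConfig reads listeners with the
//listener XPath, the element is picked up wherever it appears in the file,
but the sample configs conventionally keep searcher-related listeners inside
<query> (and commit-related ones inside <updateHandler>). A minimal sketch:

  <query>
    <listener event="newSearcher" class="solr.QuerySenderListener">
      <arr name="queries">
        <lst><str name="q">static warming query</str></lst>
      </arr>
    </listener>
  </query>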



Re: Is it possible to pass parameters through solrconfig.xml ?

2016-05-24 Thread Chris Hostetter

your question confuses me - pass "through" from where?  

when search components are defined in solrconfig.xml, they can be declared 
with any init params you want which will be passed to the init() method.   
Both the sample_techproducts_configs and data_driven_schema_configs that 
come with Solr show off examples of this (via SpellCheckComponent & 
QueryElevationComponent)

SearchComponents can also access any request params via the 
SolrQueryRequest (see ResponseBuilder.req).  These could also include 
default/invariant/appends params if they are defined on the requestHandler 
used (or in the recently added "initParams" options in solrconfig.xml)


: Date: Tue, 24 May 2016 09:08:30 -0700 (MST)
: From: vitaly bulgakov <bulgako...@yahoo.com>
: Reply-To: solr-user@lucene.apache.org
: To: solr-user@lucene.apache.org
: Subject: Is it possible to pass parameters through solrconfig.xml ?
: 
: I need to pass a parameter to one of my searchComponent class from
: solrconfog.xml file.
: Please advice me how to do it if it is possible.  
: 
: 
: 
: --
: View this message in context: 
http://lucene.472066.n3.nabble.com/Is-it-possible-to-pass-parameters-through-solrconfig-xml-tp4278852.html
: Sent from the Solr - User mailing list archive at Nabble.com.
: 

-Hoss
http://www.lucidworks.com/


Is it possible to pass parameters through solrconfig.xml ?

2016-05-24 Thread vitaly bulgakov
I need to pass a parameter to one of my searchComponent classes from the
solrconfig.xml file.
Please advise me how to do it if it is possible.  



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Is-it-possible-to-pass-parameters-through-solrconfig-xml-tp4278852.html
Sent from the Solr - User mailing list archive at Nabble.com.
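
To make Hoss's two mechanisms concrete, here is a sketch with made-up names
(mycomponent, myParam, /mysearch): init args declared on the component are
handed to its init(NamedList) method, while defaults declared on the handler
reach the component at request time through the request's parameters
(ResponseBuilder.req.getParams()).

  <searchComponent name="mycomponent" class="com.example.MyComponent">
    <!-- passed to the component's init(NamedList) at load time -->
    <str name="myParam">some value</str>
  </searchComponent>

  <requestHandler name="/mysearch" class="solr.SearchHandler">
    <lst name="defaults">
      <!-- visible to the component at request time via the request params -->
      <str name="myRequestParam">other value</str>
    </lst>
    <arr name="last-components">
      <str>mycomponent</str>
    </arr>
  </requestHandler>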


Re: solrconfig.xml location for cloud setup not working (works on single node)

2016-05-20 Thread Erick Erickson
I think you're missing the importance of Zookeeper here. You need to
upload the config to Zookeeper (as of Solr 5.5 you can do this with
bin/solr zk --upload, before that you have to use zkcli)...

Anyway, the pattern is:
1> create your config directory
2> upload it to Zookeeper.
3> create your collection _referencing that uploaded config_. There's
some logic in there that if you don't specify collection.configName on
the create command it'll look for a configset with the same name as
your collection

Thereafter, any time you want to change _anything_ in your configs,
you need to again upload them to Zookeeper and reload your collection.

Best,
Erick
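
A sketch of that cycle for a 6.x install; the ZooKeeper address, config name
and paths are placeholders, and the exact flag syntax differs a little
between 5.x and 6.x releases:

  # 1. upload (or re-upload) the conf directory to ZooKeeper
  bin/solr zk upconfig -z localhost:9983 -n myconfig -d /path/to/myconf/conf

  # older alternative shipped with Solr
  server/scripts/cloud-scripts/zkcli.sh -zkhost localhost:9983 -cmd upconfig \
      -confname myconfig -confdir /path/to/myconf/conf

  # 2. create the collection against that config, or reload it after changes
  curl 'http://localhost:8983/solr/admin/collections?action=CREATE&name=mycollection&numShards=1&collection.configName=myconfig'
  curl 'http://localhost:8983/solr/admin/collections?action=RELOAD&name=mycollection'

Editing the files under example/cloud/node1/.../conf on disk has no effect on
a running SolrCloud collection; the copy that matters is the one in ZooKeeper.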

On Fri, May 20, 2016 at 2:39 PM, Justin Edmands
<j.edma...@sagedining.com> wrote:
> I have configured a single node and configured a proper datahandler. I need 
> to move it to a clustered setup. Seems like I cannot, for the life of me, get 
> the datahandler to work with the cloud setup.
>
> ls 
> /opt/solr/solr-6.0.0/example/cloud/node1/solr/activityDigest_shard1_replica1/conf/
>
> currency.xml db-data-config.xml lang protwords.txt _rest_managed.json 
> schema.xml solrconfig.xml stopwords.txt synonyms.txt
>
> inside the solrconfig.xml, I have created a request handler to point to my 
> config file:
>
>  regex=".*\.jar" />
>  regex="solr-dataimporthandler-\d.*\.jar" />
>  class="org.apache.solr.handler.dataimport.DataImportHandler">
> 
> db-data-config.xml
> 
> 
>
>
> in the db-data-config.xml I have a working config (works in a single node 
> setup that is)
>
> 
>  driver="com.mysql.jdbc.Driver"
> url="jdbc:mysql://localhost:3306/ops"
> user="root"
> password="<"redacted" />
> 
>  pk="id"
> query="
> ...
> bunch of stuff
> ...
>
> 
>
> I expect to see a data import window but I just see "Sorry, no 
> dataimport-handler defined!" It's as if the directory is incorrect that I am 
> editing the conf files for.
>
>
> I guess a more direct question would be, does the cloud instance use a 
> solrconfig.xml from somewhere else?
>


solrconfig.xml location for cloud setup not working (works on single node)

2016-05-20 Thread Justin Edmands
I have configured a single node and configured a proper datahandler. I need to 
move it to a clustered setup. Seems like I cannot, for the life of me, get the 
datahandler to work with the cloud setup. 

ls 
/opt/solr/solr-6.0.0/example/cloud/node1/solr/activityDigest_shard1_replica1/conf/
 

currency.xml db-data-config.xml lang protwords.txt _rest_managed.json 
schema.xml solrconfig.xml stopwords.txt synonyms.txt 

inside the solrconfig.xml, I have created a request handler to point to my 
config file: 

 
 
 
 
db-data-config.xml 
 
 


in the db-data-config.xml I have a working config (works in a single node setup 
that is) 

 
 
 


Re: maxBooleanClauses in solrconfig.xml is ignored

2016-04-07 Thread Zaccheo Bagnati
No SolrCloud. However, I've found the problem (though the reason is not
completely clear to me).
I was passing terms as
field:("term1" "term2" ...)
and I simply changed it to
field:(term1 term2 ...) and it worked as expected.

I'm no expert at reading debugQuery output, but the parsed_filter_queries
value seems the same for the version with and without quotes. (I'm using
the condition as an fq.)
However, now it works.
Thanks


On Thu, Apr 7, 2016 at 4:28 PM Shawn Heisey <apa...@elyograg.org>
wrote:

> On 4/7/2016 8:05 AM, Zaccheo Bagnati wrote:
> > I'm trying to set the maxBooleanClauses parameter in solrconfig.xml to
> 1024
> > but I still have "Too many boolean clauses" error even with 513 terms
> (with
> > 512 terms it works).
> > I've read in the documentation (
> >
> https://cwiki.apache.org/confluence/display/solr/Query+Settings+in+SolrConfig
> )
> > the warning that it is a global setting but I have only 1 core so there
> are
> > not conflicting definitions. I don't know how to deal with this
> > I'm using SOLR 5.5.
>
> The default value for maxBooleanClauses is 1024, so if you're getting an
> error with 513 terms, then either your query is getting parsed so there
> are more terms, or you have a config somewhere that is setting the value
> to 512.
>
> Can you add "debugQuery=true" to your query and see what you are getting
> for the parsedquery?
>
> Are you running SolrCloud?  If you are, then editing a config file is
> not enough.  You also have to upload the changes to zookeeper.
>
> Thanks,
> Shawn
>
>


Re: maxBooleanClauses in solrconfig.xml is ignored

2016-04-07 Thread Jack Krupansky
Edismax phrase-boost terms?

-- Jack Krupansky

On Thu, Apr 7, 2016 at 10:28 AM, Shawn Heisey <apa...@elyograg.org> wrote:

> On 4/7/2016 8:05 AM, Zaccheo Bagnati wrote:
> > I'm trying to set the maxBooleanClauses parameter in solrconfig.xml to
> 1024
> > but I still have "Too many boolean clauses" error even with 513 terms
> (with
> > 512 terms it works).
> > I've read in the documentation (
> >
> https://cwiki.apache.org/confluence/display/solr/Query+Settings+in+SolrConfig
> )
> > the warning that it is a global setting but I have only 1 core so there
> are
> > not conflicting definitions. I don't know how to deal with this
> > I'm using SOLR 5.5.
>
> The default value for maxBooleanClauses is 1024, so if you're getting an
> error with 513 terms, then either your query is getting parsed so there
> are more terms, or you have a config somewhere that is setting the value
> to 512.
>
> Can you add "debugQuery=true" to your query and see what you are getting
> for the parsedquery?
>
> Are you running SolrCloud?  If you are, then editing a config file is
> not enough.  You also have to upload the changes to zookeeper.
>
> Thanks,
> Shawn
>
>


Re: maxBooleanClauses in solrconfig.xml is ignored

2016-04-07 Thread Shawn Heisey
On 4/7/2016 8:05 AM, Zaccheo Bagnati wrote:
> I'm trying to set the maxBooleanClauses parameter in solrconfig.xml to 1024
> but I still have "Too many boolean clauses" error even with 513 terms (with
> 512 terms it works).
> I've read in the documentation (
> https://cwiki.apache.org/confluence/display/solr/Query+Settings+in+SolrConfig)
> the warning that it is a global setting but I have only 1 core so there are
> not conflicting definitions. I don't know how to deal with this
> I'm using SOLR 5.5.

The default value for maxBooleanClauses is 1024, so if you're getting an
error with 513 terms, then either your query is getting parsed so there
are more terms, or you have a config somewhere that is setting the value
to 512.

Can you add "debugQuery=true" to your query and see what you are getting
for the parsedquery?

Are you running SolrCloud?  If you are, then editing a config file is
not enough.  You also have to upload the changes to zookeeper.

Thanks,
Shawn



maxBooleanClauses in solrconfig.xml is ignored

2016-04-07 Thread Zaccheo Bagnati
Hi all,
I'm trying to set the maxBooleanClauses parameter in solrconfig.xml to 1024
but I still have "Too many boolean clauses" error even with 513 terms (with
512 terms it works).
I've read in the documentation (
https://cwiki.apache.org/confluence/display/solr/Query+Settings+in+SolrConfig)
the warning that it is a global setting but I have only 1 core so there are
not conflicting definitions. I don't know how to deal with this
I'm using SOLR 5.5.
Thanks
Zaccheo
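
For reference, the setting sits in the <query> section of solrconfig.xml,
and (as the later "configuration scope" thread in this archive notes) it is
effectively a JVM-wide Lucene limit where the last core loaded wins, so it
pays to keep it identical in every core:

  <query>
    <!-- effectively global across all cores in the JVM; keep values in sync -->
    <maxBooleanClauses>2048</maxBooleanClauses>
  </query>

Adding debugQuery=true to the request, as Shawn suggests, shows the parsed
query and therefore the real clause count, e.g.:

  curl 'http://localhost:8983/solr/core1/select?q=*:*&fq=field:(term1+term2)&debugQuery=true'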


Re: [scottchu] Is it possible to create a new core in Solr 5.5 using my old schema.xml and solrconfig.xml?

2016-03-28 Thread Shawn Heisey
On 3/28/2016 12:54 AM, scott.chu wrote:
> I have old schema.xml and solrconfig.xml from Solr 3.5. I want to rebuild the 
> core structure in Solr 5.5. I got some questions to request for answer or 
> suggestions:
>
> 1. Can I just put these 2 old xmls into the config folder and issue 
> 'bin\solr.cmd -c corename -d config folder path' to build a Solr 5.5 core? If 
> not, what modification should I make?
> (Note: AFAIK, MainIndex and IndexDefaults are replaced IndexConfig and no 
> long exist in new solrconfig.xml. Besides, there're no any schema.xmls in 
> Solr 5.5's examples.)

Solr 5.5 is capable of using schema.xml, but every 5.5 example was
changed to use the Managed schema factory instead of the Classic.  In
earlier versions, at least one one of the example configsets used the
Classic schema.

Most likely a 3.5 config will *not* work as-is in any 5.x version. 
There have simply been too many changes from two major releases and a
LOT of minor releases.

When the version jump is small and doesn't include a major release,
upgrading and using your existing config is usually no big deal.  With a
jump from 3.5 to 5.5, the best option is to start with a 5.5 example and
modify it (using 5.5 options, not 3.5) until it does what you need. 
This will mean adding the replication configuration and anything else
that's custom in the 3.5 config.  You may need to compare the
collection1 example config from 3.5 to the configsets in 5.5 to get an
idea of what's changed.

Something else to consider is starting with 5.4.1 instead of 5.5. 
Between the managed-schema changes and a handful of bugs in 5.5, the
techproducts-sample-configs configset found in 5.4.1 will probably work
better for you, and that config will probably work well through the
first few releases in 6.x with only *minor* changes.

> 2. What should I put in that config folder? Are they same as in Solr 3.5's?

It needs solrconfig.xml and a schema file whose name may be controlled
by solrconfig.xml.  Also add any support files referenced by either of
those files.  Support files can include one or more DIH configs,
synonyms, stopwords, etc.

> 3. Is there any special rules where I put the config folder? For example, Do 
> I have to put that folder under solr-5.5.0\server\solr\configsets? Is this 
> path must be relative to Solr home folder?

For the most part, data under configsets is used when creating a new
core with the "bin/solr create" command.  The configset will be copied
to the correct location (either core/conf or zookeeper) when creating a
new core/collection.  There is a non-cloud feature called configsets
which actually does use/share the configset files directly, but this
feature has some quirks and some things may not work as expected.

> 4. If the core is created ok, where is the the core folder that Solr has 
> built?

I'm assuming that you're talking about using the "bin/solr create"
command.  This will create a new core instanceDir in the solr home. 
Where this lives will depend on how you've started Solr, and whether you
used the service install script for Linux/UNIX platforms.

> 5. Is creating core with schema.xml achievable via admin UI?

This is a difficult question to answer, because creating cores with the
CoreAdmin section of the UI usually doesn't work the way people expect
it to.  If you're running SolrCloud, the CoreAdmin feature should not be
used at all.  With SolrCloud, you can definitely create new collections
without touching the filesystem directly, using the Collections API.

Thanks,
Shawn



[scottchu] Is it possible to create a new core in Solr 5.5 using my old schema.xml and solrconfig.xml?

2016-03-28 Thread scott.chu
I have old schema.xml and solrconfig.xml from Solr 3.5. I want to rebuild the 
core structure in Solr 5.5. I got some questions to request for answer or 
suggestions:

1. Can I just put these 2 old xmls into the config folder and issue 
'bin\solr.cmd -c corename -d config folder path' to build a Solr 5.5 core? If 
not, what modification should I make?
(Note: AFAIK, mainIndex and indexDefaults are replaced by indexConfig and no 
longer exist in the new solrconfig.xml. Besides, there aren't any schema.xml files in 
Solr 5.5's examples.)
2. What should I put in that config folder? Are they same as in Solr 3.5's?
3. Are there any special rules for where I put the config folder? For example, do I 
have to put that folder under solr-5.5.0\server\solr\configsets? Must this path 
be relative to the Solr home folder?
4. If the core is created OK, where is the core folder that Solr has built?
5. Is creating a core with schema.xml achievable via the admin UI?

Thanks for help in advance!

scott.chu,scott@udngroup.com
2016/3/28 (週一)

P.S. Thanks to Reth RM's reply to my other post (How to rebuild master-slave 
multi-core with schema.xml from old version in Solr 5.5). I know I can just put 
a replicationHandler into the new solrconfig.xml but still need to try it.
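
Two concrete pieces that typically have to go back into a 5.5 solrconfig.xml
for this kind of 3.x migration, shown here as a rough sketch only: switching
from the managed schema back to the classic schema.xml, and re-adding the
replication handler (master side shown; a slave would carry masterUrl and
pollInterval instead):

  <!-- use the hand-edited schema.xml instead of the managed schema -->
  <schemaFactory class="ClassicIndexSchemaFactory"/>

  <requestHandler name="/replication" class="solr.ReplicationHandler">
    <lst name="master">
      <str name="replicateAfter">commit</str>
      <str name="confFiles">schema.xml,stopwords.txt</str>
    </lst>
  </requestHandler>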


RE: solrconfig.xml - configuration scope

2016-01-19 Thread Cario, Elaine
I know this is an old post, but I've seen the same behavior as well, 
specifically in setting invariants.  We had 2 cores with different 
schema/solrconfig, but they both had a (differently configured) request handler 
named /search.  The invariants set in one core leaked into requests for the 
other core.  I never drilled down further to see if it was because of the 
matching request handler names; we just worked around the issue in some other 
way (which was a bit of a hack, and we're working on a better solution for the 
original problem we were trying to solve, in any case).  This was with 4.10.
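
For what it's worth, invariants are declared per handler in each core's own
solrconfig.xml, along the lines of the sketch below (the field and value are
made up), so two cores can legitimately carry differently configured handlers
with the same /search name; if values appear to leak across cores, it is
worth double-checking which core the requests actually hit and which config
each core really loaded.

  <requestHandler name="/search" class="solr.SearchHandler">
    <lst name="defaults">
      <str name="defType">edismax</str>
    </lst>
    <lst name="invariants">
      <!-- an invariant cannot be overridden by request parameters -->
      <str name="fq">visibility:public</str>
    </lst>
  </requestHandler>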

-Original Message-
From: Erick Erickson [mailto:erickerick...@gmail.com] 
Sent: Tuesday, December 08, 2015 3:09 PM
To: solr-user <solr-user@lucene.apache.org>
Subject: Re: solrconfig.xml - configuration scope

What specifically are you seeing? Most things are per-core as you surmised.

There are a few things which, through interaction with Lucene global variables 
affect multiple cores, the one that comes to mind is maxBooleanClauses, where 
the value in the last core loaded "wins".

There might be some others though, that's the one that I remember.

What version of Solr are you running?

Best,
Erick

On Tue, Dec 8, 2015 at 9:52 AM, Fitzpatrick, Adrian <adri...@revenue.ie> wrote:
> Hi,
>
> This is probably a very basic question that has been asked many times before 
> - apologies in advance if so!
>
> I'm looking to validate whether something I **think** I have observed when 
> using Solr is a known behaviour:
>
> From my read of the docs etc. it was my understanding that 
> solrconfig.xml was the configuration for a core, and that if I had 
> multiple cores in my Solr server, each would have their own version of 
> that file with their own settings. However, in practice, when working 
> with such a multiple core setup, what I have observed suggests that 
> some (perhaps many?) of the settings within solrconfig.xml can have a 
> system-wide impact. I.e. I change a setting in core A and I see 
> behaviour in other cores B,C which suggests they are obeying the 
> changed value from the core A rather than the setting value from their 
> own copy of solrconfig.xml
>
> So, as I said, main question is this known/expected behaviour, or am I 
> imagining things! If the former, is there any documentation etc. that 
> provides any clarification around how the configuration scope operates?
>
> Thanks,
>
> Adrian
>
> Please note that Revenue cannot guarantee that any personal and sensitive 
> data, sent in plain text via standard email, is fully secure. Customers who 
> choose to use this channel are deemed to have accepted any risk involved. The 
> alternative communication methods offered by Revenue include standard post 
> and the option to use our (encrypted) MyEnquiries service which is available 
> within myAccount and ROS. You can register for either myAccount or ROS on the 
> Revenue website.
>


Re: error in initializing solrconfig.xml

2016-01-14 Thread Erick Erickson
Tell us a lot more. What exact error are you seeing in the Solr log?


On Wed, Jan 13, 2016 at 11:50 PM, Zap Org <zapor...@gmail.com> wrote:
> i have 2 running solr nodes in my cluster one node hot down. i restarted
> tomcat server and its throughing exception for initializing  solrconfig.xml
> and didnot recognize collection


error in initializing solrconfig.xml

2016-01-13 Thread Zap Org
I have 2 running Solr nodes in my cluster and one node went down. I restarted the
Tomcat server and it is throwing an exception while initializing solrconfig.xml
and does not recognize the collection.


solrconfig.xml - configuration scope

2015-12-08 Thread Fitzpatrick, Adrian
Hi,

This is probably a very basic question that has been asked many times before - 
apologies in advance if so!

I'm looking to validate whether something I **think** I have observed when 
using Solr is a known behaviour:

From my read of the docs etc. it was my understanding that solrconfig.xml was 
the configuration for a core, and that if I had multiple cores in my Solr 
server, each would have their own version of that file with their own 
settings. However, in practice, when working with such a multiple core setup, 
what I have observed suggests that some (perhaps many?) of the settings within 
solrconfig.xml can have a system-wide impact. I.e. I change a setting in core 
A and I see behaviour in other cores B,C which suggests they are obeying the 
changed value from the core A rather than the setting value from their own 
copy of solrconfig.xml

So, as I said, main question is this known/expected behaviour, or am I 
imagining things! If the former, is there any documentation etc. that provides 
any clarification around how the configuration scope operates?

Thanks,

Adrian

Please note that Revenue cannot guarantee that any personal and sensitive data, 
sent in plain text via standard email, is fully secure. Customers who choose to 
use this channel are deemed to have accepted any risk involved. The alternative 
communication methods offered by Revenue include standard post and the option 
to use our (encrypted) MyEnquiries service which is available within myAccount 
and ROS. You can register for either myAccount or ROS on the Revenue website.



Re: solrconfig.xml - configuration scope

2015-12-08 Thread Erick Erickson
What specifically are you seeing? Most things are per-core as you surmised.

There are a few things which, through interaction with Lucene global
variables affect multiple cores, the one that comes to mind is
maxBooleanClauses, where the value in the last core loaded "wins".

There might be some others though, that's the one that I remember.
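For reference, a minimal sketch of where that setting lives in solrconfig.xml
(the value 2048 is only an illustration, not a recommendation). Because of the
global Lucene variable described above, you would want the same value in every
core's config:

  <config>
    <query>
      <!-- effectively global: the last core loaded sets the Lucene-wide limit -->
      <maxBooleanClauses>2048</maxBooleanClauses>
    </query>
  </config>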

What version of Solr are you running?

Best,
Erick

On Tue, Dec 8, 2015 at 9:52 AM, Fitzpatrick, Adrian <adri...@revenue.ie> wrote:
> Hi,
>
> This is probably a very basic question that has been asked many times before 
> - apologies in advance if so!
>
> I'm looking to validate whether something I **think** I have observed when 
> using Solr is a known behaviour:
>
> From my read of the docs etc. it was my understanding that solrconfig.xml was 
> the configuration for a core, and that if I had multiple cores in my Solr 
> server, each would have their own version of that file with their own 
> settings. However, in practice, when working with such a multiple core setup, 
> what I have observed suggests that some (perhaps many?) of the settings 
> within solrconfig.xml can have a system-wide impact. I.e. I change a setting 
> in core A and I see behaviour in other cores B,C which suggests they are 
> obeying the changed value from the core A rather than the setting value from 
> their own copy of solrconfig.xml
>
> So, as I said, main question is this known/expected behaviour, or am I 
> imagining things! If the former, is there any documentation etc. that 
> provides any clarification around how the configuration scope operates?
>
> Thanks,
>
> Adrian
>


Re: which solrconfig.xml

2015-09-04 Thread Mark Fenbers

Chris,

The document "Uploading Structured Data Store Data with the Data Import 
Handler" has a number of references to solrconfig.xml, starting on Page 
2 and continuing on page 3 in the section "Configuring solrconfig.xml".  
It also is mentioned on Page 5 in the "Property Writer" and the "Data 
Sources" sections.  And other places in this document as well.


The solrconfig.xml file is also referenced (without a path) in the "Solr 
Quick Start" document, in the Design Overview section and other sections 
as well.  None of these references suggests the location of the 
solrconfig.xml file.  Doing a "find . -name solrconfig.xml" from the 
Solr home directory reveals about a dozen or so of these files in 
various subdirectories.  Thus, my confusion as to which one I need to 
customize...


I feel ready to graduate from the examples in "Solr Quick Start" 
document, e.g., using bin/solr -e dih and have fed in existing files on 
disk.  The tutorial was *excellent* for this part.  But now I want to 
build a "real" index using *my own* data from a database.  In doing 
this, I find the coaching in the tutorial to be rather absent.  For 
example, I haven't read in any of the documents I have found so far an 
explanation of why one might want to use more than one Solr node and 
more than one shard, or what the advantages are of using Solr in cloud 
mode vs stand-alone mode.  As a result, I had to 
improvise/guess/trial-and-error.  I did manage to configure my own data 
source and changed my queries to apply to my own data, but I did 
something wrong somewhere in solrconfig.xml because I get errors when 
running, now.  I solved some of them by copying the *.jar files from the 
./dist directory to the solr/lib directory (a tip I found when I googled 
the error message), but that only helped to a certain point.


I will post more specific questions about my issues when I have a chance 
to re-investigate that (hopefully later today).


I have *not* found specific Java code examples using Solr yet, but I 
haven't exhausted exploring the Solr website yet.  Hopefully, I'll find 
some examples using Solr in Java code...


Mark

On 9/2/2015 9:51 PM, Chris Hostetter wrote:

: various $HOME/solr-5.3.0 subdirectories.  The documents/tutorials say to edit
: the solrconfig.xml file for various configuration details, but they never say
: which one of these dozen to edit.  Moreover, I cannot determine which version

can you please give us a specific examples (ie: urls, page numbers &
version of the ref guide, etc...) of documentation that tell you to edit
the solrconfig.xml w/o being explicit about where to to find it so that we
can fix the docs?

FWIW: The official "Quick Start" tutorial does not mention editing
solrconfig.xml at all...

http://lucene.apache.org/solr/quickstart.html



-Hoss
http://www.lucidworks.com/





Re: which solrconfig.xml

2015-09-04 Thread Erick Erickson
Mark:

Right, the problem with Google searches (as you well know) is that you
get random snippets from all over the place, ones that often assume
some background knowledge.

There are several good books around that tend to have things arranged
progressively that might be a good investment, here's a good one:

http://www.amazon.com/Solr-Action-Trey-Grainger/dp/1617291021

and searching "apache solr" on Amazon turns up a bunch of others.

Best,
Erick



On Fri, Sep 4, 2015 at 4:43 AM, Mark Fenbers <mark.fenb...@noaa.gov> wrote:
> Chris,
>
> The document "Uploading Structured Data Store Data with the Data Import
> Handler" has a number of references to solrconfig.xml, starting on Page 2
> and continuing on page 3 in the section "Configuring solrconfig.xml".  It
> also is mentioned on Page 5 in the "Property Writer" and the "Data Sources"
> sections.  And other places in this document as well.
>
> The solrconfig.xml file is also referenced (without a path) in the "Solr
> Quick Start" document, in the Design Overview section and other sections as
> well.  None of these references suggests the location of the solrconfig.xml
> file.  Doing a "find . -name solrconfig.xml" from the Solr home directory
> reveals about a dozen or so of these files in various subdirectories.  Thus,
> my confusion as to which one I need to customize...
>
> I feel ready to graduate from the examples in "Solr Quick Start" document,
> e.g., using bin/solr -e dih and have fed in existing files on disk.  The
> tutorial was *excellent* for this part.  But now I want to build a "real"
> index using *my own* data from a database.  In doing this, I find the
> coaching in the tutorial to be rather absent.  For example, I haven't read
> in any of the documents I have found so far an explanation of why one might
> want to use more than one Solr node and more than one shard, or what the
> advantages are of using Solr in cloud mode vs stand-alone mode.  As a
> result, I had to improvise/guess/trial-and-error.  I did manage to configure
> my own data source and changed my queries to apply to my own data, but I did
> something wrong somewhere in solrconfig.xml because I get errors when
> running, now.  I solved some of them by copying the *.jar files from the
> ./dist directory to the solr/lib directory (a tip I found when I googled the
> error message), but that only helped to a certain point.
>
> I will post more specific questions about my issues when I have a chance to
> re-investigate that (hopefully later today).
>
> I have *not* found specific Java code examples using Solr yet, but I haven't
> exhausted exploring the Solr website yet.  Hopefully, I'll find some
> examples using Solr in Java code...
>
> Mark
>
> On 9/2/2015 9:51 PM, Chris Hostetter wrote:
>>
>> : various $HOME/solr-5.3.0 subdirectories.  The documents/tutorials say to
>> edit
>> : the solrconfig.xml file for various configuration details, but they
>> never say
>> : which one of these dozen to edit.  Moreover, I cannot determine which
>> version
>>
>> can you please give us a specific examples (ie: urls, page numbers &
>> version of the ref guide, etc...) of documentation that tell you to edit
>> the solrconfig.xml w/o being explicit about where to to find it so that we
>> can fix the docs?
>>
>> FWIW: The official "Quick Start" tutorial does not mention editing
>> solrconfig.xml at all...
>>
>> http://lucene.apache.org/solr/quickstart.html
>>
>>
>>
>> -Hoss
>> http://www.lucidworks.com/
>>
>


Re: which solrconfig.xml

2015-09-04 Thread Mikhail Khludnev
Mark,
Thanks for your feedback. Making Solr handy is important for us.

On Fri, Sep 4, 2015 at 1:43 PM, Mark Fenbers <mark.fenb...@noaa.gov> wrote:

> Chris,
>
> The document "Uploading Structured Data Store Data with the Data Import
> Handler" has a number of references to solrconfig.xml, starting on Page 2
> and continuing on page 3 in the section "Configuring solrconfig.xml".  It
> also is mentioned on Page 5 in the "Property Writer" and the "Data Sources"
> sections.  And other places in this document as well.
>
> The solrconfig.xml file is also referenced (without a path) in the "Solr
> Quick Start" document, in the Design Overview section and other sections as
> well.  None of these references suggests the location of the solrconfig.xml
> file.  Doing a "find . -name solrconfig.xml" from the Solr home directory
> reveals about a dozen or so of these files in various subdirectories.
> Thus, my confusion as to which one I need to customize...
>

Here I can only suggest getting into the Solr Admin UI, picking a particular core, and
finding the Instance directory on the Overview tab. That is the directory you
can run find for solrconfig.xml on.
I just wonder what exactly we could add to the guide?

We have
https://cwiki.apache.org/confluence/display/solr/Configuring+solrconfig.xml
The solrconfig.xml file is located in the conf/ directory for each
collection.
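For illustration, a typical stand-alone core layout looks roughly like this
(the core name mycore and the exact paths are examples only):

  server/solr/mycore/core.properties
  server/solr/mycore/conf/solrconfig.xml
  server/solr/mycore/conf/schema.xml
  server/solr/mycore/data/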


>
> I feel ready to graduate from the examples in "Solr Quick Start" document,
> e.g., using bin/solr -e dih and have fed in existing files on disk.  The
> tutorial was *excellent* for this part.  But now I want to build a "real"
> index using *my own* data from a database.  In doing this, I find the
> coaching in the tutorial to be rather absent.  For example, I haven't read
> in any of the documents I have found so far an explanation of why one might
> want to use more than one Solr node and more than one shard, or what the
> advantages are of using Solr in cloud mode vs stand-alone mode.

https://cwiki.apache.org/confluence/display/solr/SolrCloud

..., these capabilities provide distributed indexing and search
capabilities, supporting the following features:

   - ...
   - *Automatic load balancing and fail-over for queries*



> As a result, I had to improvise/guess/trial-and-error.  I did manage to
> configure my own data source and changed my queries to apply to my own
> data, but I did something wrong somewhere in solrconfig.xml because I get
> errors when running, now.  I solved some of them by copying the *.jar files
> from the ./dist directory to the solr/lib directory (a tip I found when I
> googled the error message), but that only helped to a certain point.
>
> I will post more specific questions about my issues when I have a chance
> to re-investigate that (hopefully later today).
>
> I have *not* found specific Java code examples using Solr yet, but I
> haven't exhausted exploring the Solr website yet.  Hopefully, I'll find
> some examples using Solr in Java code...
>

https://cwiki.apache.org/confluence/display/solr/Using+SolrJ
I think the essential parts are covered there.


>
> Mark
>
>
> On 9/2/2015 9:51 PM, Chris Hostetter wrote:
>
>> : various $HOME/solr-5.3.0 subdirectories.  The documents/tutorials say
>> to edit
>> : the solrconfig.xml file for various configuration details, but they
>> never say
>> : which one of these dozen to edit.  Moreover, I cannot determine which
>> version
>>
>> can you please give us a specific examples (ie: urls, page numbers &
>> version of the ref guide, etc...) of documentation that tell you to edit
>> the solrconfig.xml w/o being explicit about where to to find it so that we
>> can fix the docs?
>>
>> FWIW: The official "Quick Start" tutorial does not mention editing
>> solrconfig.xml at all...
>>
>> http://lucene.apache.org/solr/quickstart.html
>>
>>
>>
>> -Hoss
>> http://www.lucidworks.com/
>>
>>
>


-- 
Sincerely yours
Mikhail Khludnev
Principal Engineer,
Grid Dynamics

<http://www.griddynamics.com>
<mkhlud...@griddynamics.com>


Re: which solrconfig.xml

2015-09-03 Thread Mikhail Khludnev
Hello,

fwiw, a handy tool to answer such questions is $lsof -p 

On Wed, Sep 2, 2015 at 11:03 PM, Mark Fenbers <mark.fenb...@noaa.gov> wrote:

> Hi,  I've been fiddling with Solr for two whole days since
> downloading/unzipping it.  I've learned a lot by reading 4 documents and
> the web site.  However, there are a dozen or so instances of solrconfig.xml
> in various $HOME/solr-5.3.0 subdirectories.  The documents/tutorials say to
> edit the solrconfig.xml file for various configuration details, but they
> never say which one of these dozen to edit.  Moreover, I cannot determine
> which version is being used once I start solr, so that I would know which
> instance of this file to edit/customize.
>
> Can you help??
>
> Thanks!
> Mark
>



-- 
Sincerely yours
Mikhail Khludnev
Principal Engineer,
Grid Dynamics

<http://www.griddynamics.com>
<mkhlud...@griddynamics.com>


Re: which solrconfig.xml

2015-09-02 Thread Chris Hostetter
: various $HOME/solr-5.3.0 subdirectories.  The documents/tutorials say to edit
: the solrconfig.xml file for various configuration details, but they never say
: which one of these dozen to edit.  Moreover, I cannot determine which version

can you please give us a specific examples (ie: urls, page numbers & 
version of the ref guide, etc...) of documentation that tell you to edit 
the solrconfig.xml w/o being explicit about where to to find it so that we 
can fix the docs?

FWIW: The official "Quick Start" tutorial does not mention editing 
solrconfig.xml at all...

http://lucene.apache.org/solr/quickstart.html



-Hoss
http://www.lucidworks.com/


which solrconfig.xml

2015-09-02 Thread Mark Fenbers
Hi,  I've been fiddling with Solr for two whole days since 
downloading/unzipping it.  I've learned a lot by reading 4 documents and 
the web site.  However, there are a dozen or so instances of 
solrconfig.xml in various $HOME/solr-5.3.0 subdirectories.  The 
documents/tutorials say to edit the solrconfig.xml file for various 
configuration details, but they never say which one of these dozen to 
edit.  Moreover, I cannot determine which version is being used once I 
start solr, so that I would know which instance of this file to 
edit/customize.


Can you help??

Thanks!
Mark


Re: which solrconfig.xml

2015-09-02 Thread Alexandre Rafalovitch
Have you looked at Admin Web UI in details yet? When you look at the
"Overview" page, on the right hand side, it lists a bunch of
directories. You want one that says "Instance". Then, your
solrconfig.xml is in "conf" directory under that.

Regards,
   Alex.
P.s. Welcome!


Solr Analyzers, Tokenizers, Filters, URPs and even a newsletter:
http://www.solr-start.com/


On 2 September 2015 at 17:03, Mark Fenbers <mark.fenb...@noaa.gov> wrote:
> Hi,  I've been fiddling with Solr for two whole days since
> downloading/unzipping it.  I've learned a lot by reading 4 documents and the
> web site.  However, there are a dozen or so instances of solrconfig.xml in
> various $HOME/solr-5.3.0 subdirectories.  The documents/tutorials say to
> edit the solrconfig.xml file for various configuration details, but they
> never say which one of these dozen to edit.  Moreover, I cannot determine
> which version is being used once I start solr, so that I would know which
> instance of this file to edit/customize.
>
> Can you help??
>
> Thanks!
> Mark


Re: What's not a valid attribute data in Solr's schema.xml and solrconfig.xml

2015-06-12 Thread Steven White
Thank you Erik and Shawn for your support.

I'm using Solr's Schema API and Config API to manage and administer a Solr
deployment based on customer-specific settings that my application will need
to apply to that deployment.  A client application will be using my APIs and,
as part of data validation, I'm trying to figure out what to allow and what
not to, i.e. invalid attribute data that I cannot send to Solr.

For example, I wasn't sure whether a request-handler name can have spaces or
can be all numeric, etc.  What about fields and field types, is there a
restriction on the field names?

I know my question is broad, but if there is a starting point, I can use
that to help me write my application so that it is defensive against clients
who will use it to manage Solr.  If they send invalid data, I don't want to
pass it to Solr and cause Solr to break.

Steve

On Fri, Jun 12, 2015 at 4:41 PM, Erik Hatcher erik.hatc...@gmail.com
wrote:

 Do note that AdminHandler*s* (plural) is * A special Handler that
 registers all standard admin handlers”, so if you’re trying to do something
 tricky with admin handlers,  Note that AdminHandlers is also deprecated and
 these admin handlers are implicitly registered with ImplicitPlugins these
 days.

 What kind of handler are you adding?  Or are you trying to change the
 prefix of all the admin handlers instead of /admin?

 —
 Erik Hatcher, Senior Solutions Architect
 http://www.lucidworks.com




  On Jun 12, 2015, at 3:56 PM, Shawn Heisey apa...@elyograg.org wrote:
 
  On 6/12/2015 1:02 PM, Steven White wrote:
  You are right.  If I use solr.SearchHandler it works, but if I
  use solr.admin.AdminHandlers like so:
 
   <requestHandler name="987" class="solr.admin.AdminHandlers">
   </requestHandler>
 
  Solr reports this error:
 
  HTTP ERROR 500
 
  Problem accessing /solr/db/config/requestHandler. Reason:
 
 {msg=SolrCore 'db' is not available due to init failure: The
  AdminHandler needs to be registered to a path.  Typically this is
 
  With an admin handler, it won't be possible to access that handler if
  it's not a path that starts with a slash, so it can be incorporated into
  the request URL.  Solr is making sure the config will work, and throwing
  an error when it won't work.
 
  With search handlers, if you set up the config right, you *CAN* access
  that handler through a request parameter on the /select handler, it does
  not need to be part of the URL path.  The default config found in
  examples for 4.x and later will prevent that from working, but you can
  change the config to allow it ... so search handlers must work even if
  the name is not a path.  The config validation is not as strict as it is
  for an admin handler.
 
  Thanks,
  Shawn
 




Re: What's not a valid attribute data in Solr's schema.xml and solrconfig.xml

2015-06-12 Thread Steven White
Thanks Shawn.

Steve

On Fri, Jun 12, 2015 at 6:00 PM, Shawn Heisey apa...@elyograg.org wrote:

 On 6/12/2015 3:30 PM, Steven White wrote:
  Thank you Erik and Shawn for your support.
 
  I'm using Solr's Schema API and Config API to manage and administer a
 Solr
  deployment based on customer specific setting that my application will
 need
  to do to a Solr deployment.  A client application will be using my APIs
 and
  as part of data validation, I'm trying to figure out what to allow and
 what
  not too as invalid attributes data that I cannot send to Solr.
 
  For example, I wasn't sure that a request-handler name can have spaces or
  can be all numeric, etc.  What about fields and field types, is there a
  restriction for the field names?
 
  I know my question is broad, but if there is a starting point, I can use
  that to help me write application so that it is defensive against clients
  who will use it to manage Solr.  If they use invalid data, I don't want
 to
  send it to Solr and cause Solr to break.

 Even if things like spaces and punctuation are accepted, I wouldn't use
 them.  You can't be sure that all parts of Solr will support strange
 characters, much less third-party software.

 For handler names, they should always start with a forward slash and
 stick to letters, numbers, and the underscore, and also make sure you
 stick to ASCII characters numbered below 127, even if Solr would allow
 you to use other characters.  If you stick to that, you can be
 reasonably sure everything will work with any software.

 Field and type names should stick to letters, numbers, and the
 underscore character, also within the standard ASCII character set.

 I like to use only lowercase letters, but that's not a requirement.
 Note that if you do use mixed case, almost everything in Solr is case
 sensitive, so you must use the same case everywhere, and you should not
 use two names that differ only in the case of the letters, just in case
 something is NOT case sensitive.

 I also prefer to start identifiers with a letter, not a number, but I'm
 pretty sure that is also not a requirement.

 For best results, similar rules should be followed for other identifiers
 in a Solr config/schema.

 Thanks,
 Shawn




Re: What's not a valid attribute data in Solr's schema.xml and solrconfig.xml

2015-06-12 Thread Shawn Heisey
On 6/12/2015 3:30 PM, Steven White wrote:
 Thank you Erik and Shawn for your support.

 I'm using Solr's Schema API and Config API to manage and administer a Solr
 deployment based on customer specific setting that my application will need
 to do to a Solr deployment.  A client application will be using my APIs and
 as part of data validation, I'm trying to figure out what to allow and what
 not too as invalid attributes data that I cannot send to Solr.

 For example, I wasn't sure that a request-handler name can have spaces or
 can be all numeric, etc.  What about fields and field types, is there a
 restriction for the field names?

 I know my question is broad, but if there is a starting point, I can use
 that to help me write application so that it is defensive against clients
 who will use it to manage Solr.  If they use invalid data, I don't want to
 send it to Solr and cause Solr to break.

Even if things like spaces and punctuation are accepted, I wouldn't use
them.  You can't be sure that all parts of Solr will support strange
characters, much less third-party software.

For handler names, they should always start with a forward slash and
stick to letters, numbers, and the underscore, and also make sure you
stick to ASCII characters numbered below 127, even if Solr would allow
you to use other characters.  If you stick to that, you can be
reasonably sure everything will work with any software.

Field and type names should stick to letters, numbers, and the
underscore character, also within the standard ASCII character set.

I like to use only lowercase letters, but that's not a requirement. 
Note that if you do use mixed case, almost everything in Solr is case
sensitive, so you must use the same case everywhere, and you should not
use two names that differ only in the case of the letters, just in case
something is NOT case sensitive.

I also prefer to start identifiers with a letter, not a number, but I'm
pretty sure that is also not a requirement.

For best results, similar rules should be followed for other identifiers
in a Solr config/schema.
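As a purely illustrative sketch of those conventions, with made-up names and a
field type assumed to already exist in the schema:

  <!-- solrconfig.xml -->
  <requestHandler name="/product_search" class="solr.SearchHandler">
    <lst name="defaults">
      <str name="echoParams">explicit</str>
    </lst>
  </requestHandler>

  <!-- schema.xml -->
  <field name="product_title" type="text_general" indexed="true" stored="true"/>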

Thanks,
Shawn



Re: What's not a valid attribute data in Solr's schema.xml and solrconfig.xml

2015-06-12 Thread Erik Hatcher
Do note that AdminHandler*s* (plural) is * A special Handler that registers 
all standard admin handlers”, so if you’re trying to do something tricky with 
admin handlers,  Note that AdminHandlers is also deprecated and these admin 
handlers are implicitly registered with ImplicitPlugins these days.  

What kind of handler are you adding?  Or are you trying to change the prefix of 
all the admin handlers instead of /admin?

—
Erik Hatcher, Senior Solutions Architect
http://www.lucidworks.com




 On Jun 12, 2015, at 3:56 PM, Shawn Heisey apa...@elyograg.org wrote:
 
 On 6/12/2015 1:02 PM, Steven White wrote:
 You are right.  If I use solr.SearchHandler it works, but if I
 use solr.admin.AdminHandlers like so:
 
  <requestHandler name="987" class="solr.admin.AdminHandlers">
  </requestHandler>
 
 Solr reports this error:
 
 HTTP ERROR 500
 
 Problem accessing /solr/db/config/requestHandler. Reason:
 
{msg=SolrCore 'db' is not available due to init failure: The
 AdminHandler needs to be registered to a path.  Typically this is
 
 With an admin handler, it won't be possible to access that handler if
 it's not a path that starts with a slash, so it can be incorporated into
 the request URL.  Solr is making sure the config will work, and throwing
 an error when it won't work.
 
 With search handlers, if you set up the config right, you *CAN* access
 that handler through a request parameter on the /select handler, it does
 not need to be part of the URL path.  The default config found in
 examples for 4.x and later will prevent that from working, but you can
 change the config to allow it ... so search handlers must work even if
 the name is not a path.  The config validation is not as strict as it is
 for an admin handler.
 
 Thanks,
 Shawn
 



Re: What's not a valid attribute data in Solr's schema.xml and solrconfig.xml

2015-06-12 Thread Shawn Heisey
On 6/12/2015 12:24 PM, Steven White wrote:
 I'm trying to sort out what's not valid in Solr's files.  For example, the
 following request-handler will cause Solr to fail to load (notice the
 missing / from 987 in the 'name'):

    <requestHandler name="987" class="solr.SearchHandler">
    </requestHandler>

  But having a name with a space, such as "/ 987" or "/ 1 2 3 ", works.

  This is one example, but my question is much broader and extends to other
  attributes: where can I find what's not valid data in attributes used by
  both the schema.xml and solrconfig.xml files?

I added that exact text to solrconfig.xml in a core created by
solr-5.1.0, and everything worked just fine.  I can see the handler
named 987 in Plugins/Stats - QUERYHANDLER.

What errors did you get in your log, and what version of Solr are you
running?

Thanks,
Shawn




Re: What's not a valid attribute data in Solr's schema.xml and solrconfig.xml

2015-06-12 Thread Steven White
You are right.  If I use solr.SearchHandler it works, but if I
use solr.admin.AdminHandlers like so:

  <requestHandler name="987" class="solr.admin.AdminHandlers">
  </requestHandler>

Solr reports this error:

HTTP ERROR 500

Problem accessing /solr/db/config/requestHandler. Reason:

{msg=SolrCore 'db' is not available due to init failure: The
AdminHandler needs to be registered to a path.  Typically this is
'/admin',trace=org.apache.solr.common.SolrException: SolrCore 'db' is not
available due to init failure: The AdminHandler needs to be registered to a
path.  Typically this is '/admin'
at org.apache.solr.core.CoreContainer.getCore(CoreContainer.java:763)
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:307)
at
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:220)
at
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
at
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:455)

Does this mean each request handler has its own level of error checking?

Steve

On Fri, Jun 12, 2015 at 2:43 PM, Shawn Heisey apa...@elyograg.org wrote:

 On 6/12/2015 12:24 PM, Steven White wrote:
  I'm trying to sort out what's not valid in Solr's files.  For example,
 the
  following request-handler will cause Solr to fail to load (notice the
  missing / from 987 in the 'name'):
 
    <requestHandler name="987" class="solr.SearchHandler">
    </requestHandler>
 
   But having a name with a space, such as "/ 987" or "/ 1 2 3 ", works.
 
   This is one example, but my question is much broader and extends to other
   attributes: where can I find what's not valid data in attributes used by
   both the schema.xml and solrconfig.xml files?

 I added that exact text to solrconfig.xml in a core created by
 solr-5.1.0, and everything worked just fine.  I can see the handler
 named 987 in Plugins/Stats - QUERYHANDLER.

 What errors did you get in your log, and what version of Solr are you
 running?

 Thanks,
 Shawn





Re: What's not a valid attribute data in Solr's schema.xml and solrconfig.xml

2015-06-12 Thread Shawn Heisey
On 6/12/2015 1:02 PM, Steven White wrote:
 You are right.  If I use solr.SearchHandler it works, but if I
 use solr.admin.AdminHandlers like so:

    <requestHandler name="987" class="solr.admin.AdminHandlers">
    </requestHandler>

 Solr reports this error:

 HTTP ERROR 500

 Problem accessing /solr/db/config/requestHandler. Reason:

 {msg=SolrCore 'db' is not available due to init failure: The
 AdminHandler needs to be registered to a path.  Typically this is

With an admin handler, it won't be possible to access that handler if
it's not a path that starts with a slash, so it can be incorporated into
the request URL.  Solr is making sure the config will work, and throwing
an error when it won't work.

With search handlers, if you set up the config right, you *CAN* access
that handler through a request parameter on the /select handler, it does
not need to be part of the URL path.  The default config found in
examples for 4.x and later will prevent that from working, but you can
change the config to allow it ... so search handlers must work even if
the name is not a path.  The config validation is not as strict as it is
for an admin handler.
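For anyone curious, a rough sketch of that request-parameter route, assuming
you re-enable it (the stock 4.x/5.x configs ship with it disabled, and the
stock requestDispatcher element has more children than shown here):

  <!-- solrconfig.xml -->
  <requestDispatcher handleSelect="true"/>

and then a request such as /solr/db/select?qt=987&q=*:* would be routed to the
handler named 987.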

Thanks,
Shawn



What's not a valid attribute data in Solr's schema.xml and solrconfig.xml

2015-06-12 Thread Steven White
Hi,

I'm trying to sort out what's not valid in Solr's files.  For example, the
following request-handler will cause Solr to fail to load (notice the
missing / from 987 in the 'name'):

  <requestHandler name="987" class="solr.SearchHandler">
  </requestHandler>

But having a name with a space, such as "/ 987" or "/ 1 2 3 ", works.

This is one example, but my question is much broader and extends to other
attributes: where can I find what's not valid data in attributes used by
both the schema.xml and solrconfig.xml files?

Thanks in advance.

Steve


SolrCloud 4.8 - solrconfig.xml hot changes

2015-04-15 Thread Vincenzo D'Amore
Hi all,

can I change solrconfig.xml configuration when solrcloud is up and running?

Best regards,
Vincenzo


-- 
Vincenzo D'Amore
email: v.dam...@gmail.com
skype: free.dev
mobile: +39 349 8513251


Re: SolrCloud 4.8 - solrconfig.xml hot changes

2015-04-15 Thread Erick Erickson
Yes, but you must then push the changes up to Zookeeper (usually via
zkcli -cmd upconfig ) then reload the collection to get the
changes to take effect on all the replicas.
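For the concrete commands, a minimal sketch; the ZK address, paths and names
are examples, and the zkcli script lives under example/scripts/cloud-scripts/
in 4.x (server/scripts/cloud-scripts/ in 5.x):

  # push the edited config directory up to Zookeeper
  ./zkcli.sh -zkhost localhost:2181 -cmd upconfig \
    -confdir /path/to/myconf/conf -confname myconf

  # then reload the collection so every replica picks up the change
  curl 'http://localhost:8983/solr/admin/collections?action=RELOAD&name=mycollection'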

Best,
Erick

On Wed, Apr 15, 2015 at 6:12 AM, Vincenzo D'Amore v.dam...@gmail.com wrote:
 Hi all,

 can I change solrconfig.xml configuration when solrcloud is up and running?

 Best regards,
 Vincenzo


 --
 Vincenzo D'Amore
 email: v.dam...@gmail.com
 skype: free.dev
 mobile: +39 349 8513251


Re: SolrCloud 4.8 - solrconfig.xml hot changes

2015-04-15 Thread Vincenzo D'Amore
Thanks, it works :)

On Wed, Apr 15, 2015 at 4:38 PM, Erick Erickson erickerick...@gmail.com
wrote:

 Yes, but you must then push the changes up to Zookeeper (usually via
 zkcli -cmd upconfig ) then reload the collection to get the
 changes to take effect on all the replicas.

 Best,
 Erick

 On Wed, Apr 15, 2015 at 6:12 AM, Vincenzo D'Amore v.dam...@gmail.com
 wrote:
  Hi all,
 
  can I change solrconfig.xml configuration when solrcloud is up and
 running?
 
  Best regards,
  Vincenzo
 
 
  --
  Vincenzo D'Amore
  email: v.dam...@gmail.com
  skype: free.dev
  mobile: +39 349 8513251




-- 
Vincenzo D'Amore
email: v.dam...@gmail.com
skype: free.dev
mobile: +39 349 8513251


Re: solrconfig.xml error

2015-04-08 Thread Andrea Gazzarini

Hi Pradeep,
AFAIK the mailing list doesn't allow attachments. I think pasting the 
error should be enough


Best,
Andrea

On 04/08/2015 09:02 AM, Pradeep wrote:
We have installed solr-4.3.0 in our local environment but we are getting an error. 
Please find the attachment and help us to fix this error.


Thank You.
Regards,
Pradeep




solrconfig.xml error

2015-04-08 Thread Pradeep
We have installed solr-4.3.0 in our local environment but we are getting an error. 
Please find the attachment and help us to fix this error.


Thank You.
Regards,
Pradeep


Re: Config join parse in solrconfig.xml

2015-04-07 Thread Frank li
Cool. It actually works after I removed those extra columns. Thanks for
your help.

On Mon, Apr 6, 2015 at 8:19 PM, Erick Erickson erickerick...@gmail.com
wrote:

 df does not allow multiple fields, it stands for default field, not
 default fields. To get what you're looking for, you need to use
 edismax or explicitly create the multiple clauses.

 I'm not quite sure what the join parser is doing with the df
 parameter. So my first question is what happens if you just use a
 single field for df?.

 Best,
 Erick

 On Mon, Apr 6, 2015 at 11:51 AM, Frank li fudon...@gmail.com wrote:
  The error message was from the query with debug=query.
 
  On Mon, Apr 6, 2015 at 11:49 AM, Frank li fudon...@gmail.com wrote:
 
  Hi Erick,
 
 
  Thanks for your response.
 
  Here is the query I am sending:
 
 
  http://dev-solr:8080/solr/collection1/select?q={!join+from=litigation_id_ls+to=lit_id_lms}all_text:apple&fq=type:PartyLawyerLawfirm&facet=true&facet.field=lawyer_id_lms&facet.mincount=1&rows=0
 
 
  You can see it has all_text:apple. I added field name all_text,
  because it gives error without it.
 
  Errors:
 
  <lst name="error"><str name="msg">undefined field all_text number party name all_code ent_name</str><int name="code">400</int></lst>
 
 
  These fields are defined as the default search fields in our
  solr_config.xml file:
 
  <str name="df">all_text number party name all_code ent_name</str>
 
 
  Thanks,
 
  Fudong
 
  On Fri, Apr 3, 2015 at 1:31 PM, Erick Erickson erickerick...@gmail.com
 
  wrote:
 
  You have to show us several more things:
 
  1 what exactly does the query look like?
  2 what do you expect?
  3 output when you specify debug=query
  4 anything else that would help. You might review:
 
  http://wiki.apache.org/solr/UsingMailingLists
 
  Best,
  Erick
 
  On Fri, Apr 3, 2015 at 10:58 AM, Frank li fudon...@gmail.com wrote:
   Hi,
  
   I am starting using join parser with our solr. We have some default
  fields.
   They are defined in solrconfig.xml:
  
 <lst name="defaults">
   <str name="defType">edismax</str>
   <str name="echoParams">explicit</str>
   <int name="rows">10</int>
   <str name="df">all_text number party name all_code ent_name</str>
   <str name="qf">all_text number^3 name^5 party^3 all_code^2 ent_name^7</str>
   <str name="fl">id description market_sector_type parent ult_parent ent_name title patent_title *_ls *_lms *_is *_texts *_ac *_as *_s *_ss *_ds *_sms *_ss *_bs</str>
   <str name="q.op">AND</str>
 </lst>
  
  
   I found out once I use join parser, it does not recognize the default
   fields any more. How do I modify the configuration for this?
  
   Thanks,
  
   Fred
 
 
 



Re: Config join parse in solrconfig.xml

2015-04-06 Thread Frank li
Hi Erick,


Thanks for your response.

Here is the query I am sending:
http://dev-solr:8080/solr/collection1/select?q={!join+from=litigation_id_ls+to=lit_id_lms}all_text:apple&fq=type:PartyLawyerLawfirm&facet=true&facet.field=lawyer_id_lms&facet.mincount=1&rows=0

You can see it has all_text:apple. I added field name all_text, because
it gives error without it.

Errors:

<lst name="error"><str name="msg">undefined field all_text number party name all_code ent_name</str><int name="code">400</int></lst>


These fields are defined as the default search fields in our
solr_config.xml file:

<str name="df">all_text number party name all_code ent_name</str>


Thanks,

Fudong

On Fri, Apr 3, 2015 at 1:31 PM, Erick Erickson erickerick...@gmail.com
wrote:

 You have to show us several more things:

 1 what exactly does the query look like?
 2 what do you expect?
 3 output when you specify debug=query
 4 anything else that would help. You might review:

 http://wiki.apache.org/solr/UsingMailingLists

 Best,
 Erick

 On Fri, Apr 3, 2015 at 10:58 AM, Frank li fudon...@gmail.com wrote:
  Hi,
 
  I am starting using join parser with our solr. We have some default
 fields.
  They are defined in solrconfig.xml:
 
 <lst name="defaults">
   <str name="defType">edismax</str>
   <str name="echoParams">explicit</str>
   <int name="rows">10</int>
   <str name="df">all_text number party name all_code ent_name</str>
   <str name="qf">all_text number^3 name^5 party^3 all_code^2 ent_name^7</str>
   <str name="fl">id description market_sector_type parent ult_parent ent_name title patent_title *_ls *_lms *_is *_texts *_ac *_as *_s *_ss *_ds *_sms *_ss *_bs</str>
   <str name="q.op">AND</str>
 </lst>
 
 
  I found out once I use join parser, it does not recognize the default
  fields any more. How do I modify the configuration for this?
 
  Thanks,
 
  Fred



Re: Config join parse in solrconfig.xml

2015-04-06 Thread Frank li
The error message was from the query with debug=query.

On Mon, Apr 6, 2015 at 11:49 AM, Frank li fudon...@gmail.com wrote:

 Hi Erick,


 Thanks for your response.

 Here is the query I am sending:

 http://dev-solr:8080/solr/collection1/select?q={!join+from=litigation_id_ls+to=lit_id_lms}all_text:apple&fq=type:PartyLawyerLawfirm&facet=true&facet.field=lawyer_id_lms&facet.mincount=1&rows=0

 You can see it has all_text:apple. I added field name all_text,
 because it gives error without it.

 Errors:

 <lst name="error"><str name="msg">undefined field all_text number party name all_code ent_name</str><int name="code">400</int></lst>


 These fields are defined as the default search fields in our
 solr_config.xml file:

 <str name="df">all_text number party name all_code ent_name</str>


 Thanks,

 Fudong

 On Fri, Apr 3, 2015 at 1:31 PM, Erick Erickson erickerick...@gmail.com
 wrote:

 You have to show us several more things:

 1 what exactly does the query look like?
 2 what do you expect?
 3 output when you specify debug=query
 4 anything else that would help. You might review:

 http://wiki.apache.org/solr/UsingMailingLists

 Best,
 Erick

 On Fri, Apr 3, 2015 at 10:58 AM, Frank li fudon...@gmail.com wrote:
  Hi,
 
  I am starting using join parser with our solr. We have some default
 fields.
  They are defined in solrconfig.xml:
 
 <lst name="defaults">
   <str name="defType">edismax</str>
   <str name="echoParams">explicit</str>
   <int name="rows">10</int>
   <str name="df">all_text number party name all_code ent_name</str>
   <str name="qf">all_text number^3 name^5 party^3 all_code^2 ent_name^7</str>
   <str name="fl">id description market_sector_type parent ult_parent ent_name title patent_title *_ls *_lms *_is *_texts *_ac *_as *_s *_ss *_ds *_sms *_ss *_bs</str>
   <str name="q.op">AND</str>
 </lst>
 
 
  I found out once I use join parser, it does not recognize the default
  fields any more. How do I modify the configuration for this?
 
  Thanks,
 
  Fred





Re: Config join parse in solrconfig.xml

2015-04-06 Thread Erick Erickson
df does not allow multiple fields, it stands for default field, not
default fields. To get what you're looking for, you need to use
edismax or explicitly create the multiple clauses.

I'm not quite sure what the join parser is doing with the df
parameter. So my first question is what happens if you just use a
single field for df?.
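If it helps, one possible shape that keeps multi-field behaviour together with
the join is to nest an edismax query via parameter dereferencing; this is only
a sketch against your field names, not something I have run:

  q={!join from=litigation_id_ls to=lit_id_lms v=$jq}
    &jq={!edismax qf='all_text number party name all_code ent_name'}apple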

Best,
Erick

On Mon, Apr 6, 2015 at 11:51 AM, Frank li fudon...@gmail.com wrote:
 The error message was from the query with debug=query.

 On Mon, Apr 6, 2015 at 11:49 AM, Frank li fudon...@gmail.com wrote:

 Hi Erick,


 Thanks for your response.

 Here is the query I am sending:

 http://dev-solr:8080/solr/collection1/select?q={!join+from=litigation_id_ls+to=lit_id_lms}all_text:apple&fq=type:PartyLawyerLawfirm&facet=true&facet.field=lawyer_id_lms&facet.mincount=1&rows=0

 You can see it has all_text:apple. I added field name all_text,
 because it gives error without it.

 Errors:

 <lst name="error"><str name="msg">undefined field all_text number party name all_code ent_name</str><int name="code">400</int></lst>


 These fields are defined as the default search fields in our
 solr_config.xml file:

 <str name="df">all_text number party name all_code ent_name</str>


 Thanks,

 Fudong

 On Fri, Apr 3, 2015 at 1:31 PM, Erick Erickson erickerick...@gmail.com
 wrote:

 You have to show us several more things:

 1 what exactly does the query look like?
 2 what do you expect?
 3 output when you specify debug=query
 4 anything else that would help. You might review:

 http://wiki.apache.org/solr/UsingMailingLists

 Best,
 Erick

 On Fri, Apr 3, 2015 at 10:58 AM, Frank li fudon...@gmail.com wrote:
  Hi,
 
  I am starting using join parser with our solr. We have some default
 fields.
  They are defined in solrconfig.xml:
 
 <lst name="defaults">
   <str name="defType">edismax</str>
   <str name="echoParams">explicit</str>
   <int name="rows">10</int>
   <str name="df">all_text number party name all_code ent_name</str>
   <str name="qf">all_text number^3 name^5 party^3 all_code^2 ent_name^7</str>
   <str name="fl">id description market_sector_type parent ult_parent ent_name title patent_title *_ls *_lms *_is *_texts *_ac *_as *_s *_ss *_ds *_sms *_ss *_bs</str>
   <str name="q.op">AND</str>
 </lst>
 
 
  I found out once I use join parser, it does not recognize the default
  fields any more. How do I modify the configuration for this?
 
  Thanks,
 
  Fred





Config join parse in solrconfig.xml

2015-04-03 Thread Frank li
Hi,

I am starting using join parser with our solr. We have some default fields.
They are defined in solrconfig.xml:

  <lst name="defaults">
    <str name="defType">edismax</str>
    <str name="echoParams">explicit</str>
    <int name="rows">10</int>
    <str name="df">all_text number party name all_code ent_name</str>
    <str name="qf">all_text number^3 name^5 party^3 all_code^2 ent_name^7</str>
    <str name="fl">id description market_sector_type parent ult_parent ent_name title patent_title *_ls *_lms *_is *_texts *_ac *_as *_s *_ss *_ds *_sms *_ss *_bs</str>
    <str name="q.op">AND</str>
  </lst>


I found out once I use join parser, it does not recognize the default
fields any more. How do I modify the configuration for this?

Thanks,

Fred


Re: Config join parse in solrconfig.xml

2015-04-03 Thread Erick Erickson
You have to show us several more things:

1 what exactly does the query look like?
2 what do you expect?
3 output when you specify debug=query
4 anything else that would help. You might review:

http://wiki.apache.org/solr/UsingMailingLists

Best,
Erick

On Fri, Apr 3, 2015 at 10:58 AM, Frank li fudon...@gmail.com wrote:
 Hi,

 I am starting using join parser with our solr. We have some default fields.
 They are defined in solrconfig.xml:

  <lst name="defaults">
    <str name="defType">edismax</str>
    <str name="echoParams">explicit</str>
    <int name="rows">10</int>
    <str name="df">all_text number party name all_code ent_name</str>
    <str name="qf">all_text number^3 name^5 party^3 all_code^2 ent_name^7</str>
    <str name="fl">id description market_sector_type parent ult_parent ent_name title patent_title *_ls *_lms *_is *_texts *_ac *_as *_s *_ss *_ds *_sms *_ss *_bs</str>
    <str name="q.op">AND</str>
  </lst>


 I found out once I use join parser, it does not recognize the default
 fields any more. How do I modify the configuration for this?

 Thanks,

 Fred


Re: Where is schema.xml and solrconfig.xml in solr 5.0.0

2015-03-12 Thread Nitin Solanki
Hi. Erick..
   Would you please help me distinguish between
"Uploading a Configuration Directory" and "Linking a Collection to a
Configuration Set"?

On Thu, Mar 12, 2015 at 2:01 AM, Nitin Solanki nitinml...@gmail.com wrote:

 Thanks a lot Erick.. It will be helpful.

 On Wed, Mar 11, 2015 at 9:27 PM, Erick Erickson erickerick...@gmail.com
 wrote:

 The configs are in Zookeeper. So you have to switch your thinking,
 it's rather confusing at first.

 When you create a collection, you specify a config set, these are
 usually in

 ./server/solr/configsets/data_driven_schema,
 ./server/solr/configsets/techproducts and the like.

 The entire conf directory under one of these is copied to Zookeeper
 (which you can see
 from the admin screen Cloud->Tree, then in the right hand side you'll
 be able to find the config sets
 you uploaded.

 But, you cannot edit them there directly. You edit them on disk, then
 push them to Zookeeper,
 then reload the collection (or restart everything). See the reference
 guide here:
 https://cwiki.apache.org/confluence/display/solr/Command+Line+Utilities

 Best,
 Erick

 On Wed, Mar 11, 2015 at 6:01 AM, Nitin Solanki nitinml...@gmail.com
 wrote:
  Hi, alexandre..
 
  Thanks for responding...
  When I created a new collection (wikingram) using SolrCloud, it gets created
  under example/cloud/node* (node1, node2) and the like.
  I have used *schema.xml and solrconfig.xml of
 sample_techproducts_configs*
  configuration.
 
  Now, The problem is that.
  If I change the configuration of *solrconfig.xml of *
  *sample_techproducts_configs*. Its configuration doesn't reflect on
  *wikingram* collection.
  How to reflect the changes of configuration in the collection?
 
  On Wed, Mar 11, 2015 at 5:42 PM, Alexandre Rafalovitch 
 arafa...@gmail.com
  wrote:
 
  Which example are you using? Or how are you creating your collection?
 
  If you are using your example, it creates a new directory under
  example. If you are creating a new collection with -c, it creates
  a new directory under the server/solr. The actual files are a bit
  deeper than usual to allow for a log folder next to the collection
  folder. So, for example:
  example/schemaless/solr/gettingstarted/conf/solrconfig.xml
 
  If it's a dynamic schema configuration, you don't actually have
  schema.xml, but managed-schema, as you should be mostly using REST
  calls to configure it.
 
  If you want to see the configuration files before the collection
  actually created, they are under server/solr/configsets, though they
  are not configsets in Solr sense, as they do get copied when you
  create your collections (sharing them causes issues).
 
  Regards,
 Alex.
  
  Solr Analyzers, Tokenizers, Filters, URPs and even a newsletter:
  http://www.solr-start.com/
 
 
  On 11 March 2015 at 07:50, Nitin Solanki nitinml...@gmail.com wrote:
   Hello,
  I have switched from solr 4.10.2 to solr 5.0.0. In
 solr
   4-10.2, schema.xml and solrconfig.xml were in example/solr/conf/
 folder.
   Where is schema.xml and solrconfig.xml in solr 5.0.0 ? and also want
 to
   know how to configure in solrcloud ?
 





Re: Where is schema.xml and solrconfig.xml in solr 5.0.0

2015-03-12 Thread Nitin Solanki
Thanks Shawn and Erick for explanation...

On Thu, Mar 12, 2015 at 9:02 PM, Shawn Heisey apa...@elyograg.org wrote:

 On 3/12/2015 9:18 AM, Erick Erickson wrote:
  By and large, I really never use linking. But it's about associating a
  config set
  you've _already_ uploaded with a collection.
 
  So uploading is pushing the configset from your local machine up to
 Zookeeper,
  and linking is using that uploaded, named configuration with an
  arbitrary collection.
 
  But usually you just make this association when creating the collection.

 The primary use case that I see for linkconfig is in testing upgrades to
 configurations.  So let's say you have a production collection that uses
 a config that you name fooV1 for foo version 1.  You can build a test
 collection that uses a config named fooV2, work out all the bugs, and
 then when you're ready to deploy it, you can use linkconfig to link your
 production collection to fooV2, reload the collection, and you're using
 the new config.  I haven't discussed here how to handle the situation
 where a reindex is required.

 One thing you CAN do is run linkconfig for a collection that doesn't
 exist yet, and then you don't need to include collection.configName when
 you create the collection, because the link is already present in
 zookeeper.  I personally don't like doing things this way, but I'm
 pretty sure it works.

 Thanks,
 Shawn




Re: Where is schema.xml and solrconfig.xml in solr 5.0.0

2015-03-12 Thread Erick Erickson
By and large, I really never use linking. But it's about associating a
config set
you've _already_ uploaded with a collection.

So uploading is pushing the configset from your local machine up to Zookeeper,
and linking is using that uploaded, named configuration with an
arbitrary collection.

But usually you just make this association when creating the collection.

It's simple to test all this out, just upconfig a couple of config
sets, play with the linking
and reload the collections. From there the admin UI will show you what actually
happened.
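A minimal sketch of that sequence (the ZK address and the names are
illustrative; the zkcli script location depends on your install):

  ./zkcli.sh -zkhost localhost:2181 -cmd upconfig -confdir ./myconf/conf -confname fooV2
  ./zkcli.sh -zkhost localhost:2181 -cmd linkconfig -collection mycollection -confname fooV2
  curl 'http://localhost:8983/solr/admin/collections?action=RELOAD&name=mycollection'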

Best,
Erick

On Thu, Mar 12, 2015 at 2:39 AM, Nitin Solanki nitinml...@gmail.com wrote:
 Hi. Erick..
    Would you please help me distinguish between
 "Uploading a Configuration Directory" and "Linking a Collection to a
 Configuration Set"?

 On Thu, Mar 12, 2015 at 2:01 AM, Nitin Solanki nitinml...@gmail.com wrote:

 Thanks a lot Erick.. It will be helpful.

 On Wed, Mar 11, 2015 at 9:27 PM, Erick Erickson erickerick...@gmail.com
 wrote:

 The configs are in Zookeeper. So you have to switch your thinking,
 it's rather confusing at first.

 When you create a collection, you specify a config set, these are
 usually in

 ./server/solr/configsets/data_driven_schema,
 ./server/solr/configsets/techproducts and the like.

 The entire conf directory under one of these is copied to Zookeeper
 (which you can see
 from the admin screen Cloud->Tree, then in the right hand side you'll
 be able to find the config sets
 you uploaded.

 But, you cannot edit them there directly. You edit them on disk, then
 push them to Zookeeper,
 then reload the collection (or restart everything). See the reference
 guide here:
 https://cwiki.apache.org/confluence/display/solr/Command+Line+Utilities

 Best,
 Erick

 On Wed, Mar 11, 2015 at 6:01 AM, Nitin Solanki nitinml...@gmail.com
 wrote:
  Hi, alexandre..
 
  Thanks for responding...
  When I created a new collection (wikingram) using SolrCloud, it gets created
  under example/cloud/node* (node1, node2) and the like.
  I have used *schema.xml and solrconfig.xml of
 sample_techproducts_configs*
  configuration.
 
  Now, The problem is that.
  If I change the configuration of *solrconfig.xml of *
  *sample_techproducts_configs*. Its configuration doesn't reflect on
  *wikingram* collection.
  How to reflect the changes of configuration in the collection?
 
  On Wed, Mar 11, 2015 at 5:42 PM, Alexandre Rafalovitch 
 arafa...@gmail.com
  wrote:
 
  Which example are you using? Or how are you creating your collection?
 
  If you are using your example, it creates a new directory under
  example. If you are creating a new collection with -c, it creates
  a new directory under the server/solr. The actual files are a bit
  deeper than usual to allow for a log folder next to the collection
  folder. So, for example:
  example/schemaless/solr/gettingstarted/conf/solrconfig.xml
 
  If it's a dynamic schema configuration, you don't actually have
  schema.xml, but managed-schema, as you should be mostly using REST
  calls to configure it.
 
  If you want to see the configuration files before the collection
  actually created, they are under server/solr/configsets, though they
  are not configsets in Solr sense, as they do get copied when you
  create your collections (sharing them causes issues).
 
  Regards,
 Alex.
  
  Solr Analyzers, Tokenizers, Filters, URPs and even a newsletter:
  http://www.solr-start.com/
 
 
  On 11 March 2015 at 07:50, Nitin Solanki nitinml...@gmail.com wrote:
   Hello,
  I have switched from solr 4.10.2 to solr 5.0.0. In
 solr
   4-10.2, schema.xml and solrconfig.xml were in example/solr/conf/
 folder.
   Where is schema.xml and solrconfig.xml in solr 5.0.0 ? and also want
 to
   know how to configure in solrcloud ?
 




