Re: Collection deleted still in zookeeper
Yes, maybe it's more complicated than I was thinking. But it's good to know that newer versions of Solr still work in the same way. Thanks again

On Mon, 7 Dec 2020 at 13:08, Erick Erickson wrote:

> What should happen when you delete a collection and _only_ that
> collection references the configset has been discussed several
> times, and… whatever is chosen is wrong ;)
>
> 1> if we delete the configset, then if you want to delete a collection
> to ensure that you’re starting all over for whatever reason, your
> configset is gone and you need to find it again.
>
> 2> If we _don’t_ delete the configset, then you can wind up with
> obsolete configsets polluting Zookeeper…
>
> 3> If we make a copy of the configset every time we make a collection,
> then there can be a bazillion of them in a large installation.
>
> Best,
> Erick
Re: Collection deleted still in zookeeper
Thanks Erick for the answer, you gave me the clue to find the issue.

The real problem is that when I removed the collection using the Solr API
(http://solrintance:port/solr/admin/collections?action=DELETE&name=collectionname)
the config files are not deleted. I don't know if this is the normal
behavior in every version of Solr (I'm using version 6), but I think that
when deleting the collection, the config files for that collection should
be removed.

Anyway, I found that the configs were still in UI/cloud/tree/configs and
they can be removed using "bin/solr zk rm -r /configs/myconfig", and this
solved the issue.

Thanks

On Fri, 4 Dec 2020 at 15:46, Erick Erickson wrote:

> This is almost always a result of one of two things:
>
> 1> you didn’t upload the config to the correct place on the ZK that Solr
> uses.
> or
> 2> you still have a syntax problem in the config.
>
> The solr.log file on the node that’s failing may have a more useful
> error message about what’s wrong. Also, you can try validating the XML
> with one of the online tools.
>
> Are you totally and absolutely sure that, for instance, you’re uploading
> to the correct Zookeeper? You should be able to look at the admin UI
> screen and see the ZK address. I’ve seen this happen when people
> inadvertently use the embedded ZK for one operation but not for the
> other. Or have the ZK_HOST environment variable pointing to some
> ZK ensemble that’s used when you start Solr but not when you upload
> files. Or…
>
> Use the admin UI>>cloud>>tree>>configs>>your_config_name
> to see if the solrconfig has the correct changes. I’ll often add some
> bogus comment in the early part of the file that I can use to make
> sure I’ve uploaded the correct file to the correct place.
>
> FWIW, I use the "bin/solr zk upconfig” command to move files back and
> forth; that avoids, say, putting an individual file in the wrong
> directory...
>
> Best,
> Erick
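For reference, the cleanup described in the message above has two separate steps: the Collections API DELETE call (note that `action` and `name` are separate query parameters, which the archive collapsed into `action=DELETE=collectionname`), and then removing the leftover configset from ZooKeeper. A minimal sketch of building the DELETE URL, with placeholder host and collection names:

```python
from urllib.parse import urlencode

def delete_collection_url(base_url: str, collection: str) -> str:
    """Build a Collections API DELETE URL.

    'action' and 'name' are separate parameters joined by '&';
    base_url and collection are placeholders for your installation.
    """
    params = urlencode({"action": "DELETE", "name": collection})
    return f"{base_url}/solr/admin/collections?{params}"

print(delete_collection_url("http://localhost:8983", "mycollection"))
```

After the DELETE, the configset stays in ZooKeeper (as discussed in the thread); it can be removed separately with `bin/solr zk rm -r /configs/mycollection -z <zkhost>`.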
Collection deleted still in zookeeper
Hi,

When trying to modify the solrconfig.xml file for a collection I made a
mistake and the config was wrong, so I removed the collection to create it
again from a backup.

But although I'm sure I'm now using a correct solrconfig.xml, Solr is still
complaining about the error in the older solrconfig.xml.

I have tried to remove the collection more than once, and I have stopped
Solr and Zookeeper, but I'm still getting the same error. It's as if
Zookeeper were still storing the older solrconfig.xml and didn't load the
configuration file from the new collection.

I have tried to:
- upload the files
- remove the collection and create it again, but empty
- restore the collection from the backup

And I always get the same error:

collection_name_shard1_replica1:
org.apache.solr.common.SolrException:org.apache.solr.common.SolrException:
Could not load conf for core collection_name_shard1_replica1: Error loading
solr config from solrconfig.xml

Thanks for your help
Re: Porter Stem filter and employing
Following Erick's idea, I started to look at fields and queries other than
the title field itself. I started using the normal request handler
(/select) and adding parameters to see which of the parameters in my query
causes this problem. And I discovered that my custom request handler uses
defType=edismax and mm=100% (among other params); when I remove the mm
param, I get the documents.

I have been looking for information about this parameter and I've only
found one page in the Solr guide:
https://lucene.apache.org/solr/guide/6_6/the-dismax-query-parser.html.
Is there any other documentation that can help me understand how this
parameter works? I don't want to break all the searches by removing it.

Thanks for all your help

On Mon, 4 Mar 2019 at 17:11, Erick Erickson wrote:

> First, if you _changed_ the analysis chain without re-indexing all
> documents, that could account for it.
>
> Second, the analysis page is a little tricky. It _assumes_ that the words
> you put in the boxes have been parsed into the field you select. So let’s
> say you have this field “title” that has stemming turned on. Let’s further
> assume your default search field is “text” (this is configured in
> solrconfig.xml, the “df” parameter in your request handler).
>
> Now, if your search is "q=employ” the actual search will be against your
> default field, as though you had entered “q=text:employ”. This is a common
> problem; adding "debug=query" to the search and looking at the resulting
> parsed_query.toString() will show you what’s actually the result of the
> query parsing and may help.
>
> Best,
> Erick
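As a rough illustration of why mm=100% makes documents disappear, here is a hedged sketch of how simple mm (minimum-should-match) specs are interpreted, following the rules in the DisMax documentation. This is not Solr's actual code, and conditional specs like "2<75%" are omitted:

```python
def min_should_match(mm: str, num_clauses: int) -> int:
    """How many optional clauses must match, for simple mm specs.

    "3"    -> at least 3 clauses must match
    "-2"   -> all but 2 clauses must match
    "75%"  -> 75% of the clauses (rounded down) must match
    "-25%" -> all but 25% (rounded down) of the clauses must match
    """
    mm = mm.strip()
    negative = mm.startswith("-")
    body = mm[1:] if negative else mm
    if body.endswith("%"):
        count = num_clauses * int(body[:-1]) // 100  # percentages round down
    else:
        count = int(body)
    required = num_clauses - count if negative else count
    return max(0, min(num_clauses, required))
```

With mm=100% every term in the query must match, so a single term that stems differently at index and query time is enough to make the whole query return nothing, which fits the symptom described above.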
Re: Porter Stem filter and employing
Thank you for the answer and for heading me to this solution. But I've
already used this filter for index analysis and I'm still not getting any
result, so I don't understand why.

If I use the Analysis tool I get the same output for both sides, so maybe
the problem is something else? But I don't see what it can be, because when
using the Analysis tool I got the same result for index and query (the
input to this filter was "employing carer"):

PSF (Index) and PSF (Query) — both show the same two tokens:

  text            emploi                carer
  raw_bytes       [65 6d 70 6c 6f 69]   [63 61 72 65 72]
  start           0                     12
  end             9                     17
  positionLength  1                     1
  type
  position        1                     3
  keyword         FALSE                 FALSE

On Fri, 1 Mar 2019 at 15:51, Shawn Heisey wrote:

> On 3/1/2019 4:38 AM, Marisol Redondo wrote:
> > When using the PorterStemFilter, I saw that the word "employing" is
> > changed to "emploi" and my document is not found in the query to Solr
> > because of that.
> >
> > This also happens with other words that end in -ying, such as annoying
> > or deploying.
> >
> > Is there any patch for this filter, or should I create a new Jira issue?
>
> When you are using a stemming filter, you will need to use the same
> filter on both the query analysis and the index analysis, so that
> similar words are stemmed to the same root in both cases, leading to
> matches.
>
> If the other steps in your analysis chain are changing the words so that
> the stemming filter cannot recognize the word, that might also cause
> problems.
>
> Thanks,
> Shawn
Porter Stem filter and employing
Hi,

When using the PorterStemFilter, I saw that the word "employing" is changed
to "emploi" and my document is not found in the query to Solr because of
that.

This also happens with other words that end in -ying, such as annoying or
deploying.

Is there any patch for this filter, or should I create a new Jira issue?

Thanks
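The "employing" → "emploi" behavior described in this thread is expected Porter-stemmer output rather than a bug: step 1b strips the "-ing", and step 1c then rewrites a terminal "y" to "i" when the remaining stem contains a vowel. A much-simplified sketch of just those two steps (the real algorithm has many more rules and measure-based conditions, and also treats some y's as vowels):

```python
def _has_vowel(s: str) -> bool:
    # simplified vowel test; the real Porter algorithm is more nuanced
    return any(c in "aeiou" for c in s)

def mini_porter(word: str) -> str:
    """Only Porter steps 1b (-ing removal) and 1c (y -> i), simplified."""
    # step 1b (simplified): drop a trailing "ing" when the stem has a vowel
    if word.endswith("ing") and _has_vowel(word[:-3]):
        word = word[:-3]
    # step 1c: terminal y -> i when the stem contains a vowel
    if word.endswith("y") and _has_vowel(word[:-1]):
        word = word[:-1] + "i"
    return word
```

Because the same transform runs at both index and query time, a query for "employ" also stems to "emploi", so matching still works as long as both the index and query analysis chains include the filter — which is exactly the point made in the replies above.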