Re: Question about Luke
Hello, Ms. Tomoko Uchida:

Thank you for your response, and I apologize for the lack of clarity in my previous email. Yes, the "Luke" I mean is the Luke GUI. So, again, thank you for answering my question.

Sincerely,
Kaya Ota

On Wed, Nov 20, 2019 at 22:13, Tomoko Uchida wrote:
> Hello,
>
> > Is it different from the CheckIndex -exorcise option?
> > (As far as I recently learned, CheckIndex -exorcise will delete
> > unreadable indices.)
>
> If you mean the desktop app Luke, "Repair" is just a wrapper of
> CheckIndex.exorciseIndex(). There is no difference between doing
> "Repair" from the Luke GUI and calling "CheckIndex -exorcise" from the CLI.
>
> On Mon, Nov 11, 2019 at 20:36, Kayak28 wrote:
> > Hello, Community:
> >
> > I am currently using Solr 7.4.0, and I was testing how Solr actually
> > behaves when it has a corrupted index. I used Luke to fix the broken
> > index from the GUI, which led me to the following questions.
> >
> > Is it possible to use the repair index tool from the CLI (in case Solr
> > is on AWS, for example)?
> > Is it different from the CheckIndex -exorcise option?
> > (As far as I recently learned, CheckIndex -exorcise will delete
> > unreadable indices.)
> >
> > If anyone gives me a reply, I would be very thankful.
> >
> > Sincerely,
> > Kaya Ota
>
> -
> To unsubscribe, e-mail: dev-unsubscr...@lucene.apache.org
> For additional commands, e-mail: dev-h...@lucene.apache.org
Question about Luke
Hello, Community:

I am currently using Solr 7.4.0, and I was testing how Solr actually behaves when it has a corrupted index. I used Luke to fix the broken index from the GUI, which led me to the following questions.

Is it possible to use the repair index tool from the CLI (in case Solr is on AWS, for example)?
Is it different from the CheckIndex -exorcise option?
(As far as I recently learned, CheckIndex -exorcise will delete unreadable indices.)

If anyone gives me a reply, I would be very thankful.

Sincerely,
Kaya Ota
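Since "Repair" in the Luke GUI wraps CheckIndex.exorciseIndex(), the same operation can be run from the CLI. Here is a minimal sketch; the jar location and index path are assumptions for a default-ish Solr layout and must be adjusted for your install. The script only prints the command for review, because -exorcise permanently removes unreadable segments (back up the index directory first):

```shell
#!/bin/sh
# Sketch only: LUCENE_JAR and INDEX_DIR are assumed paths; point them at
# your own Solr install. -exorcise DELETES segments it cannot read, so
# copy the index directory somewhere safe before running it for real.
LUCENE_JAR="server/solr-webapp/webapp/WEB-INF/lib/lucene-core-7.4.0.jar"
INDEX_DIR="server/solr/corename1/data/index"
CMD="java -cp $LUCENE_JAR org.apache.lucene.index.CheckIndex $INDEX_DIR -exorcise"
# Print the command for review instead of running it blindly:
echo "$CMD"
# Uncomment to actually run it:
# $CMD
```

Running CheckIndex without -exorcise first is a safe, read-only way to see which segments are broken before deciding to drop them.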
Printing NULL characters in log files.
Hello, Solr Community:

I am using Solr 7.4.0, which uses log4j (version 2.11, per Solr's CHANGES.txt) as a component. Some of the log files that Solr generated contain <0x00> (NULL characters), like the examples below. Because of this issue, it is difficult for me to trace what actually happened to Solr. Has anyone seen this issue before? If anyone knows a way to fix it, or its cause, could you please let me know? Any clue would be very much appreciated.

[Example Log 1]
2019-10-20 06:02:03.643 INFO (coreCloseExecutor-140-thread-4) [ x:corename1] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.corename, tag=4c16<0x00><0x00><0x00><0x00>...<0x00><0x00>00ff
2019-10-20 06:02:03.643 INFO (coreCloseExecutor-140-thread-4) [ x:corename1] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@17281659: rootName = null, domain = solr.core.corename, service url = null, agent id = null] for registry solr.core.corename1 / com.codahale.metrics.MetricRegistry@6c9f45cc<0x00><0x00><0x00><0x00>...(continues printing <0x00> until the end of the file)
[Example Log 2]
2019-10-27 06:02:02.891 INFO (coreCloseExecutor-140-thread-17) [ x:core2] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@35e76d2e: rootName = null, domain = solr.core.core2, service url = null, agent id = null] for registry solr.core.core2 / com.codahale.metrics.MetricRegistry@76be90f4
2019-10-27 06:02:02.891 INFO (coreCloseExecutor-140-thread-26) [ x:core3]<0x00><0x00><0x00><0x00><0x00><0x00><0x00><0x00>...<0x00><0x00> o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.TUN000, tag=34f04984
2019-10-27 06:02:02.891 INFO (coreCloseExecutor-140-thread-26) [ x:TUN000] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@378cecb: rootName = null, domain = solr.core.TUN000, service url = null, agent id = null] for registry solr.core.TUN000 / com.codahale.metrics.MetricRegistry@9c3410c
2019-10-27 06:02:05.063 INFO (Thread-1) [ ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@5fbe4146 {/solr,null,UNAVAILABLE}{file:///E:/apatchSolr/RCSS-basic-4.0.1/LUSOLR/solr/server//solr-webapp/webapp}
<0x00><0x00><0x00><0x00><0x00><0x00>...(printing <0x00> until the end of the file)...<0x00><0x00>

Sincerely,
Kaya Ota
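The root cause is hard to pin down from the logs alone (long trailing runs of NUL bytes often suggest a file region that was preallocated or never flushed before the process died, but that is only a guess here). The affected files can at least be made readable again for tracing. A minimal sketch, using a simulated log file as a stand-in for the real one:

```shell
#!/bin/sh
# Sketch: count and strip NUL bytes (<0x00>) from a log file so ordinary
# text tools can read it. solr.log here is a simulated stand-in, written
# with a few trailing NULs to mimic the symptom described above.
LOG=solr.log
printf 'INFO  Closing metric reporters\n\000\000\000' > "$LOG"
# Count the NUL bytes in the file:
NULS=$(tr -cd '\000' < "$LOG" | wc -c)
echo "NUL bytes found: $NULS"
# Write a cleaned copy with the NULs removed; the original stays intact:
tr -d '\000' < "$LOG" > "${LOG}.clean"
```

Cleaning the copy rather than the original preserves the evidence in case someone wants to inspect where in the file the NUL run begins.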
Re: Questions about corrupted Segments files.
Hello, Mr. Erick, Mr. Dmitry, and the community members:

Thank you for your advice. I am going to try Luke and the -exorcise option this weekend. Again, I appreciate your replies.

Sincerely,
Kaya Ota

On Wed, Nov 6, 2019 at 22:36, Erick Erickson wrote:
> If Luke doesn't do the trick, use the -exorcise option and start your
> indexing process over again.
>
> Best,
> Erick
>
> > On Nov 6, 2019, at 6:24 AM, Dmitry Kan wrote:
> >
> > Hi Kaya,
> >
> > Try Luke:
> > http://dmitrykan.blogspot.com/2018/01/new-luke-on-javafx.html
> >
> > Best,
> > Dmitry
> >
> > On Wed 6. Nov 2019 at 3.24, Kayak28 wrote:
> > Hello, Community members:
> >
> > I am using Solr 7.7.2. The other day, while indexing to Solr, my
> > computer powered off. As a result, there are corrupted segment files.
> >
> > Is there any way to fix the corrupted segment files without re-indexing?
> >
> > I have read a blog post (in Japanese) about the CheckIndex tool, which
> > can be used to detect/fix corrupted segment files, but when I tried to
> > run the following command, I got an error message. So, I am not sure
> > whether CheckIndex can actually fix the index files.
> >
> > java -cp lucene-core-7.7.2.jar -ea:org.apache.lucene... org.apache.lucene.index.CheckIndex solr/server/solr/basic_copy/data/index -fix
> >
> > ERROR: unexpected extra argument '-fix'
> >
> > If anybody knows either a way to fix corrupted segment files or the way
> > to use CheckIndex's '-fix' option correctly, could you please let me know?
> >
> > Any clue would be very much appreciated.
> >
> > Sincerely,
> > Kaya Ota
> >
> > --
> > Dmitry Kan
> > Luke Toolbox: http://github.com/DmitryKey/luke
> > Blog: http://dmitrykan.blogspot.com
> > Twitter: http://twitter.com/dmitrykan
> > SemanticAnalyzer: www.semanticanalyzer.info
Questions about corrupted Segments files.
Hello, Community members:

I am using Solr 7.7.2. The other day, while indexing to Solr, my computer powered off. As a result, there are corrupted segment files.

Is there any way to fix the corrupted segment files without re-indexing?

I have read a blog post (in Japanese) about the CheckIndex tool, which can be used to detect/fix corrupted segment files, but when I tried to run the following command, I got an error message. So, I am not sure whether CheckIndex can actually fix the index files.

java -cp lucene-core-7.7.2.jar -ea:org.apache.lucene... org.apache.lucene.index.CheckIndex solr/server/solr/basic_copy/data/index -fix

ERROR: unexpected extra argument '-fix'

If anybody knows either a way to fix corrupted segment files or the way to use CheckIndex's '-fix' option correctly, could you please let me know?

Any clue would be very much appreciated.

Sincerely,
Kaya Ota
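The "unexpected extra argument" error is consistent with the -fix flag having been renamed -exorcise in later Lucene versions, which is why the blog post's command no longer works as written on 7.7.2. A sketch of the adjusted invocation, keeping the assertion flag and index path from the mail above; the script only echoes the command, since -exorcise deletes any segment it cannot read:

```shell
#!/bin/sh
# Sketch: the command from the mail above, with -exorcise replacing the
# removed -fix flag. The index path matches the original mail; adjust it
# for your own layout. WARNING: -exorcise drops unreadable segments, so
# back up the index directory before actually running it.
INDEX_DIR="solr/server/solr/basic_copy/data/index"
CMD="java -cp lucene-core-7.7.2.jar -ea:org.apache.lucene... org.apache.lucene.index.CheckIndex $INDEX_DIR -exorcise"
echo "$CMD"
# Uncomment after backing up the index:
# $CMD
```

Note that -exorcise does not repair data: documents in the dropped segments are gone and must be re-indexed, so it only "fixes" the index in the sense of making it openable again.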
Problem with the Shutdown Process on Windows Server
Hello, Community:

I use Solr on Windows servers and cannot shut it down reliably. When I try to stop Solr using solr.cmd, which is kicked off from the Windows Task Manager, it "looks" like Solr stops without any problem. Here, "looks" means that at least the log file Solr wrote does not seem to contain any errors. (I pasted the piece of the log where I believe the shutdown succeeded at the end of this email.)

However, the next time I start Solr, I get the error message "Address already in use." This problem happens occasionally, on different servers, at irregular times and dates, so I have not been able to reproduce it yet.

I wonder why Solr cannot shut down successfully. If any of you have faced a similar incident or know a solution, it would be very helpful if you could share your advice. Any clue would be very much appreciated.

*Environment*
OS: Windows Server 2012 R2
Java: Oracle JDK 1.8.0
Solr version: 5.2.1
Solr structure: 15 Solr servers, distributed search with sharding enabled (not using SolrCloud)
Memory (Solr / physical): 20GB / 32GB
Index size: around 300GB

*Logs*
INFO - 2019-05-25 21:06:15.996; [ ] org.apache.solr.core.CachingDirectoryFactory; looking to close D:\Documents\solr-home\collection1\data [CachedDir<>]
INFO - 2019-05-25 21:06:15.996; [ ] org.apache.solr.core.CachingDirectoryFactory; Closing directory: D:\Documents\solr-home\collection1\data
INFO - 2019-05-25 21:06:15.996; [ ] org.apache.solr.core.CachingDirectoryFactory; looking to close D:\Documents\solr-home\collection1\data\index [CachedDir<>]
INFO - 2019-05-25 21:06:15.996; [ ] org.apache.solr.core.CachingDirectoryFactory; Closing directory: D:\Documents\solr-home\collection1\data\index
INFO - 2019-05-25 21:06:16.199; [ ] org.eclipse.jetty.server.handler.ContextHandler; Stopped o.e.j.w.WebAppContext@4b9e13df {/solr,file:/D:/Documents/solr/server/solr-webapp/webapp/,UNAVAILABLE}{/solr.war}

* Note: the solr-home directory is the directory where I store the Solr cores.

Sincerely,
Kaya Ota
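"Address already in use" on restart usually means a previous java.exe is still alive and holding the listen port, even though the shutdown log looked clean. A small sketch of the checks one could run before restarting; the port number is an assumption (8983 is Solr's default), and the Windows commands are echoed here for reference rather than executed:

```shell
#!/bin/sh
# Sketch: before restarting Solr on the affected Windows server, check
# whether a leftover java.exe still holds the port. PORT is an assumption;
# use whatever port your Solr instance listens on.
PORT=8983
# Commands to run in cmd.exe on the server (printed for reference):
echo "netstat -ano | findstr :${PORT}"
# If a PID shows up in LISTENING state, verify it is the old java.exe,
# then terminate it before starting Solr again:
echo "taskkill /PID <pid-from-netstat> /F"
```

If the leftover process turns out to be a lingering Solr JVM, that would point at solr.cmd's stop sequence (or the task that triggers it) exiting before the JVM has fully terminated, rather than at a problem in Solr's own close logic.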