There is SmartChineseSentenceTokenizerFactory / SentenceTokenizer, which is being deprecated and replaced with HMMChineseTokenizer. I'm not aware of any other sentence tokenizer, so you may need to either build your own along the lines of SentenceTokenizer, or employ an external sentence detector/recognizer and build a Solr tokenizer on top of it; a rough sketch of that approach follows below.
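Something along these lines might work as a starting point. This is only a rough, untested sketch against the Lucene/Solr 4.x Tokenizer API, using java.text.BreakIterator for the actual sentence detection; the class name SimpleSentenceTokenizer and the package are made up for illustration.

package com.example.analysis;

import java.io.IOException;
import java.io.Reader;
import java.text.BreakIterator;
import java.util.Locale;

import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.lucene.analysis.tokenattributes.OffsetAttribute;

// Emits one token per sentence, as found by java.text.BreakIterator.
public final class SimpleSentenceTokenizer extends Tokenizer {

  private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
  private final OffsetAttribute offsetAtt = addAttribute(OffsetAttribute.class);

  private String text;             // whole field value, buffered on first use
  private BreakIterator sentences; // sentence boundary iterator over text
  private int start;               // start offset of the next sentence

  public SimpleSentenceTokenizer(Reader input) {
    super(input);
  }

  @Override
  public boolean incrementToken() throws IOException {
    clearAttributes();
    if (text == null) {
      // Buffer the entire input once; fine for normal field sizes,
      // not for very large documents.
      StringBuilder sb = new StringBuilder();
      char[] buf = new char[4096];
      int n;
      while ((n = input.read(buf)) != -1) {
        sb.append(buf, 0, n);
      }
      text = sb.toString();
      sentences = BreakIterator.getSentenceInstance(Locale.ROOT);
      sentences.setText(text);
      start = sentences.first();
    }
    int end = sentences.next();
    if (end == BreakIterator.DONE) {
      return false;
    }
    // Term is the trimmed sentence; offsets cover the raw span.
    termAtt.setEmpty().append(text.substring(start, end).trim());
    offsetAtt.setOffset(correctOffset(start), correctOffset(end));
    start = end;
    return true;
  }

  @Override
  public void reset() throws IOException {
    super.reset();
    text = null;
    sentences = null;
    start = 0;
  }

  @Override
  public void end() throws IOException {
    super.end();
    int finalOffset = (text == null) ? 0 : correctOffset(text.length());
    offsetAtt.setOffset(finalOffset, finalOffset);
  }
}

You could replace the BreakIterator call with any external sentence detector (OpenNLP's SentenceDetector, for instance), as long as you can map its output back to character offsets.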
Don't know how complex your use case is, but I would suggest looking at SentenceTokenizer and creating a similar tokenizer. A rough factory and schema.xml sketch follows below the quoted message.

Thanks,
Susheel

-----Original Message-----
From: Sandeep B A [mailto:belgavi.sand...@gmail.com]
Sent: Friday, September 05, 2014 10:40 AM
To: solr-user@lucene.apache.org
Subject: Re: Is there any sentence tokenizers in sold 4.9.0?

Sorry for the typo, it is solr 4.9.0 instead of sold 4.9.0

On Sep 5, 2014 7:48 PM, "Sandeep B A" <belgavi.sand...@gmail.com> wrote:
> Hi,
>
> I was looking for a default sentence tokenizer option in Solr
> but could not find one. Has anyone used one, or integrated a
> sentence tokenizer from another language (e.g. Python) into Solr?
> Please let me know.
>
>
> Thanks and regards,
> Sandeep
>
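To make a tokenizer like the sketch above usable from Solr you also need a TokenizerFactory and a fieldType in schema.xml. Again a rough, untested sketch; the class, package, and fieldType names are made up for illustration, and the create(AttributeFactory, Reader) signature is the one from the 4.x TokenizerFactory API as I recall it.

package com.example.analysis;

import java.io.Reader;
import java.util.Map;

import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.util.TokenizerFactory;
import org.apache.lucene.util.AttributeSource.AttributeFactory;

// Factory so the tokenizer can be referenced from schema.xml.
public class SimpleSentenceTokenizerFactory extends TokenizerFactory {

  public SimpleSentenceTokenizerFactory(Map<String, String> args) {
    super(args);
    if (!args.isEmpty()) {
      throw new IllegalArgumentException("Unknown parameters: " + args);
    }
  }

  @Override
  public Tokenizer create(AttributeFactory factory, Reader input) {
    return new SimpleSentenceTokenizer(input);
  }
}

Drop the jar into a lib directory of your core and reference the factory by its fully qualified class name:

<fieldType name="text_sentence" class="solr.TextField" positionIncrementGap="100">
  <analyzer>
    <tokenizer class="com.example.analysis.SimpleSentenceTokenizerFactory"/>
  </analyzer>
</fieldType>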