[ https://issues.apache.org/jira/browse/SOLR-4148?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Shawn Heisey closed SOLR-4148.
------------------------------
Resolution: Won't Fix
Fix Version/s: (was: 4.3)
This isn't a bug, as Robert pointed out. Closing this issue. If I get
ambitious later, I may create an issue and a patch that allows configuration
of the max token length on all tokenizers.
This is part of an effort to close old issues that I have reported. Search
tag: elyograg2013springclean
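
For illustration only (not part of the original report): StandardTokenizer
already exposes the kind of knob described above, so a hypothetical patch for
the whitespace/character tokenizers might look similar. This is a minimal
sketch assuming the Lucene 5+ API, where tokenizers take no Reader in the
constructor and input is supplied via setReader().

import org.apache.lucene.analysis.standard.StandardTokenizer;
import java.io.StringReader;

public class MaxTokenLengthExample {
    public static void main(String[] args) throws Exception {
        // StandardTokenizer's token-length cap is configurable; its default is 255.
        StandardTokenizer tok = new StandardTokenizer();
        tok.setMaxTokenLength(4096); // raise the cap well past the 255 default
        tok.setReader(new StringReader("one-very-long-whitespace-free-term ..."));
        tok.reset();
        while (tok.incrementToken()) {
            // terms longer than 255 characters now come through intact (up to 4096)
        }
        tok.end();
        tok.close();
    }
}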
> WhiteSpaceTokenizer splits long no-whitespace sequences every 255 characters
> ----------------------------------------------------------------------------
>
> Key: SOLR-4148
> URL: https://issues.apache.org/jira/browse/SOLR-4148
> Project: Solr
> Issue Type: Bug
> Components: Schema and Analysis
> Affects Versions: 3.5, 4.0
> Environment: for branch_4x:
> Linux bigindy5 2.6.32-279.14.1.el6.centos.plus.x86_64 #1 SMP Wed Nov 7
> 00:40:45 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux
> java version "1.7.0_09"
> Java(TM) SE Runtime Environment (build 1.7.0_09-b05)
> Java HotSpot(TM) 64-Bit Server VM (build 23.5-b02, mixed mode)
> for 3.5:
> Linux idxa1 2.6.32-279.14.1.el6.centos.plus.x86_64 #1 SMP Wed Nov 7 00:40:45
> UTC 2012 x86_64 x86_64 x86_64 GNU/Linux
> java version "1.6.0_29"
> Java(TM) SE Runtime Environment (build 1.6.0_29-b11)
> Java HotSpot(TM) 64-Bit Server VM (build 20.4-b02, mixed mode)
> Reporter: Shawn Heisey
>
> I have the following text input in a field with an analysis chain that starts
> with WhiteSpaceTokenizerFactory.
> Auto,DEU,Drogendezernat,Drogenfahnder,Drogenfahndung,Drug,Duesseldorf,Ermittler,Ermittlung,Fahnder,Fahndung,Fahrzeug,Germany,Innere,Kriminalitaet,Kriminalpolizei,Kriminalpolizisten,Kripo,Oeffentliche,Ordnung,Polizei,Polizeikelle,Polizist,Polizistin,Rauschgift,Rauschgiftdezernat,Sicherheit,Sicherheitsbehoerden,Sicherheitskraefte,Uniform,Verbrechen,Waffen,Zivilbeamte,Zivilfahnder,Zivilpolizisten,anhalten,car,crime,criminal,drug,enforcement,investigators,law,officers,police,policeman,policemen,policeofficer,public,saftey,security,squad,stop,stoppen,stopping,suspect,suspicious,ueberpruefen,verdaechtig
> This input has no whitespace, yet WhiteSpaceTokenizer breaks it into tokens
> after every 255th character. This happens on both 3.5 and a recent branch_4x
> checkout. I am using ICUTokenizerFactory on some fields in branch_4x, and it
> does not have this problem. I have not tried ICUTokenizerFactory on 3.5.
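
A minimal sketch (not from the original report) that reproduces the behaviour
described above, assuming the Lucene 5+ WhitespaceTokenizer API. Per the
report, a 300-character whitespace-free input should come back as a
255-character token followed by a 45-character token.

import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.core.WhitespaceTokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import java.io.StringReader;

public class TokenLengthSplitDemo {
    public static void main(String[] args) throws Exception {
        // Build a single "word" longer than 255 characters with no whitespace.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 300; i++) sb.append('a');

        Tokenizer tok = new WhitespaceTokenizer();
        tok.setReader(new StringReader(sb.toString()));
        CharTermAttribute term = tok.addAttribute(CharTermAttribute.class);
        tok.reset();
        while (tok.incrementToken()) {
            // Expected (per this report): 255, then 45, because the tokenizer's
            // internal buffer caps tokens at 255 characters.
            System.out.println(term.length());
        }
        tok.end();
        tok.close();
    }
}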