Re: Why does PayloadTermWeight explanation hide the scorePayload Explanation?
Payloads are not used much, so it is not surprising there is a hole in a side feature. Please file a bug. Better yet, file a patch!

On Fri, Jul 20, 2012 at 9:13 AM, Scott Smerchek wrote:
> I'm using the PayloadTermQuery and scoring documents using a custom
> algorithm based on the payloads of the matching terms. The algorithm is
> implemented in the custom PayloadFunction, and I have added an @Override for
> explain. However, the PayloadTermWeight explanation hides the details
> of the payload score:
>
> Explanation payloadExpl = new Explanation(scorer.getPayloadScore(), "scorePayload(...)");
>
> This is different from the way that PayloadNearSpanWeight explains the
> payload. It actually asks the payload function for the explanation rather
> than hiding it:
>
> Explanation payloadExpl = function.explain(doc, scorer.payloadsSeen, scorer.payloadScore);
>
> This seems like a bug/oversight. Should this be a JIRA issue, or is this
> intended? The fix is obviously very simple (just ask the PayloadFunction as
> in PayloadNearSpanWeight).
>
> - Scott

--
Lance Norskog
goks...@gmail.com
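For readers outside the thread, the difference between the two explain paths can be mirrored in a self-contained sketch. This is simplified, not Lucene code: the real PayloadFunction works with Explanation objects rather than strings, and the class names here only echo the real API.

```java
// Simplified mirror of the Lucene classes discussed above (not Lucene code).
interface PayloadFunction {
    float docScore(int doc, int payloadsSeen, float payloadScore);
    // The hook PayloadNearSpanWeight already calls, and which the proposed
    // fix would have PayloadTermWeight call too.
    String explain(int doc, int payloadsSeen, float payloadScore);
}

class MinPayloadFunction implements PayloadFunction {
    public float docScore(int doc, int payloadsSeen, float payloadScore) {
        return payloadsSeen > 0 ? payloadScore : 1.0f;
    }
    public String explain(int doc, int payloadsSeen, float payloadScore) {
        return "MinPayloadFunction(doc=" + doc + ", payloadsSeen=" + payloadsSeen
                + ") = " + docScore(doc, payloadsSeen, payloadScore);
    }
}

public class PayloadExplainSketch {
    public static void main(String[] args) {
        PayloadFunction function = new MinPayloadFunction();
        // What PayloadTermWeight emits today: an opaque label.
        String hidden = "scorePayload(...)";
        // What delegating to the function yields: the custom detail.
        String detailed = function.explain(0, 3, 0.5f);
        System.out.println(hidden);
        System.out.println(detailed);
    }
}
```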
Re: RE: How to setup SimpleFSDirectoryFactory
Interesting. Which version of Solr is this? What happens if you do a commit?

On Sat, Jul 21, 2012 at 8:01 AM, geetha anjali wrote:
> Hi Uwe,
>
> Great to know. We have files indexing at 1/min. After 30 mins I see all
> my physical memory at 100 percent used (Windows). On deep
> investigation I found that mmap is not releasing OS file handles. Do you
> find this behaviour?
>
> Thanks
>
> On 20 Jul 2012 14:04, "Uwe Schindler" wrote:
>
> Hi Bill,
>
> MMapDirectory uses the file system cache of your operating system, which has
> the following consequences: on Linux, top & free should normally report only
> *little* free memory, because the O/S uses all memory not allocated by
> applications to cache disk I/O (and shows it as allocated, so having 0% free
> memory is just normal on Linux and also Windows). If you have other
> applications, or Lucene/Solr itself, that allocate lots of heap space or
> malloc() a lot, then you are reducing free physical memory and so reducing FS
> cache. This also depends on your swappiness parameter (if swappiness is
> higher, inactive processes are swapped out more easily; the default is 60 on Linux,
> freeing more space for FS cache; the backside is of course that
> in-memory structures of Lucene and other applications may get paged out).
>
> You will only see no paging at all if all memory allocated by all applications
> plus all mmapped files fits into memory. But paging the mmapped Lucene
> index in and out is much cheaper than using SimpleFSDirectory or NIOFSDirectory: if
> you use SimpleFS or NIO and your index is not in the FS cache, it will also be read
> from physical disk again, so there is no difference. Paging is actually
> cheaper, as no syscalls are involved.
>
> If you want as much as possible of your index in physical RAM, copy it to
> /dev/null regularly and buy more RAM :-)
>
> -----
> Uwe Schindler
> H.-H.-Meier-Allee 63, D-28213 Bremen
> http://www.thetaphi.de
> eMail: uwe@thetaphi...
> >> From: Bill Bell [mailto:billnb...@gmail.com]
> >> Sent: Friday, July 20, 2012 5:17 AM
> >> Subject: Re: ...
> >> stop using it? The least used memory will be removed from the OS
> >> automatically? I see some paging. Wouldn't paging slow down the querying?
> >>
> >> My index is 10 GB and every 8 hours we get most of it in shared memory. The
> >> memory is 99 percent used, and that does not leave any room for other apps.
> >>
> >> Other implications?
> >>
> >> Sent from my mobile device
> >> 720-256-8076
> >>
> >> On Jul 19, 2012, at 9:49 A...
> >> Heap space or free system RAM:
> >> >
> >> > http://blog.thetaphi.de/2012/07/use-lucenes-mmapdirectory-on-64bit.html
> >> >
> >> > Uwe
> >> >...
> >> >> use it since you might run out of memory on large indexes right?
> >> >>
> >> >> Here is how I got SimpleFSDirectoryFactory to work. Just set -
> >> >> Dsolr.directoryFactor...
> >> >> set it all up with a helper in solrconfig.xml...
> >> >>
> >> >> if (Constants.WINDOWS) {
> >> >>   if (MMapDirectory.UNMAP_SUPPORTED && Constants.JRE_IS_64...

--
Lance Norskog
goks...@gmail.com
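For reference, the factory being discussed is selected in solrconfig.xml. A minimal sketch (the `${...}` default syntax lets the `-Dsolr.directoryFactory=...` system property mentioned in the thread override the default):

```xml
<!-- solrconfig.xml sketch: choose the Directory implementation.
     solr.MMapDirectoryFactory, solr.NIOFSDirectoryFactory and
     solr.SimpleFSDirectoryFactory are the stock alternatives;
     -Dsolr.directoryFactory=solr.SimpleFSDirectoryFactory on the JVM
     command line overrides the default given after the colon. -->
<directoryFactory name="DirectoryFactory"
                  class="${solr.directoryFactory:solr.MMapDirectoryFactory}"/>
```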
Index relational XML with DataImportHandler
Hi,

I'm trying to index a set of stores and their articles. I have two XML files: one that contains the data of the stores and one that contains the articles for each store. I'm using DIH with XPathEntityProcessor to process the file containing the stores, and using a nested entity I try to get all articles that belong to the specific store. The problem I encounter is that every store gets the same articles.

For testing purposes I've stripped the XML files down to only include ids. The store file (StoresTest.xml) contains the ids: 0102 0104. The Store-Articles relations file (StoreArticlesTest.xml) contains the article ids: 18004 17004 10004. And my dih-config file looks like this:

The result I get in Solr is this:

0102 18004
0104 18004

As you see, both stores get the article for the first store. I would have expected the second store to have two articles: 17004 and 10004. In the log messages printed using LogTransformer I see that each store.id is processed, but somehow it only picks up the articles for the first store. Any ideas?

/Tobias Berg
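The XML samples and dih-config were flattened by the archive. Purely to illustrate the shape being described, here is a hypothetical nested XPathEntityProcessor config; every element name, path and attribute is invented, and the `${store.id}` filter is not a verified fix but the point where such setups typically go wrong:

```xml
<!-- Hypothetical sketch only: names and paths invented, not a tested fix. -->
<dataConfig>
  <dataSource type="FileDataSource" encoding="UTF-8"/>
  <document>
    <entity name="store" processor="XPathEntityProcessor"
            url="StoresTest.xml" forEach="/stores/store"
            transformer="LogTransformer"
            logTemplate="processing store ${store.id}" logLevel="info">
      <field column="id" xpath="/stores/store/id"/>
      <!-- The nested entity re-reads the whole relations file for every
           store row; if the rows are not effectively filtered against
           ${store.id}, every store ends up with the same article set,
           which matches the symptom described above. Note that
           XPathRecordReader only supports a limited xpath subset
           (attribute predicates, not arbitrary expressions). -->
      <entity name="article" processor="XPathEntityProcessor"
              url="StoreArticlesTest.xml"
              forEach="/stores/store[@id='${store.id}']/article">
        <field column="articleId" xpath="/stores/store/article/id"/>
      </entity>
    </entity>
  </document>
</dataConfig>
```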
RE: Solr grouping / facet query
Thanks for the reply Robi. The key idea is to "search for authors", including text in their bio and any authored titles, then display any relevant titles next to the author's name. Currently, the only way to do this is to index by title, include the bio data in each document, and then group by author. This is how I now have things set up; however, the bio data is duplicated. I'm not sure if this duplicated data in the index will affect relevancy or performance, but I think the benefits of grouping this way will outweigh the cost. Thank you.
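With that layout (one document per title, bio text copied into each), the author-level view comes from Solr's result grouping. A sketch of the request, with hypothetical field names (`author_id`, the query term) standing in for the real schema:

```
http://localhost:8983/solr/select?q=some+author+or+title+text&group=true&group.field=author_id&group.limit=5
```

`group=true` switches grouping on, `group.field` collapses the title documents under each author, and `group.limit` controls how many titles come back per author group.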
Re: errors after update from 3.4 to 3.6
Resolved by setting batchSize to -1. See
http://wiki.apache.org/solr/DataImportHandler#Configuring_JdbcDataSource

From: Xue-Feng Yang
To: "solr-user@lucene.apache.org"
Sent: Saturday, July 21, 2012 9:54:40 AM
Subject: errors after update from 3.4 to 3.6

Hi all,

I have a DIH that works for Solr 3.4 but not for 3.6. I have tested the SQL, and it works in the MySQL console. The following errors are copied from the DataImportHandler Development Console. Does anyone know the reason for those errors?

Thanks, Xuefeng

// 0 15 db-data-config.xml uima full-import debug
SELECT CAST(concat(b.bk_name,' ',v.chap,':',v.sect) AS CHAR) as title_id, v.bk_id, v.chap, v.sect, v.content FROM Verse v JOIN book b WHERE v.bk_id=b.bk_id AND ver_id='8'

org.apache.solr.handler.dataimport.DataImportHandlerException: Unable to execute query: SELECT CAST(concat(b.bk_name,' ',v.chap,':',v.sect) AS CHAR) as title_id, v.bk_id, v.chap, v.sect, v.content FROM Verse v JOIN book b WHERE v.bk_id=b.bk_id AND ver_id='8' Processing Document # 1
    at org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(DataImportHandlerException.java:72)
    at org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.<init>(JdbcDataSource.java:253)
    at org.apache.solr.handler.dataimport.JdbcDataSource.getData(JdbcDataSource.java:210)
    at org.apache.solr.handler.dataimport.JdbcDataSource.getData(JdbcDataSource.java:39)
    at org.apache.solr.handler.dataimport.DebugLogger$2.getData(DebugLogger.java:188)
    at org.apache.solr.handler.dataimport.SqlEntityProcessor.initQuery(SqlEntityProcessor.java:59)
    at org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:73)
    at org.apache.solr.handler.dataimport.EntityProcessorWrapper.pullRow(EntityProcessorWrapper.java:330)
    at org.apache.solr.handler.dataimport.EntityProcessorWrapper.nextRow(EntityProcessorWrapper.java:296)
    at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:683)
    at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:619)
    at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:327)
    at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:225)
    at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:375)
    at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:445)
    at org.apache.solr.handler.dataimport.DataImportHandler.handleRequestBody(DataImportHandler.java:205)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:1376)
    at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:365)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:260)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:256)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:217)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:279)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:175)
    at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:655)
    at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:595)
    at com.sun.enterprise.web.WebPipeline.invoke(WebPipeline.java:98)
    at com.sun.enterprise.web.PESessionLockingStandardPipeline.invoke(PESessionLockingStandardPipeline.java:91)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:162)
    at org.apache.catalina.connector.CoyoteAdapter.doService(CoyoteAdapter.java:330)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:231)
    at com.sun.enterprise.v3.services.impl.ContainerMapper.service(ContainerMapper.java:174)
    at com.sun.grizzly.http.ProcessorTask.invokeAdapter(ProcessorTask.java:828)
    at com.sun.grizzly.http.ProcessorTask.doProcess(ProcessorTask.java:725)
    at com.sun.grizzly.http.ProcessorTask.process(ProcessorTask.java:1019)
    at com.sun.grizzly.http.DefaultProtocolFilter.execute(DefaultProtocolFilter.java:225)
    at com.sun.grizzly.DefaultProtocolChain.executeProtocolFilter(DefaultProtocolChain.java:137)
    at com.sun.grizzly.DefaultProtocolChain.execute(DefaultProtocolChain.java:104)
    at com.sun.grizzly.DefaultProtocolChain.execute(DefaultProtocolChain.java:90)
    at com.sun.grizzly.http.HttpProtocolChain.execute(HttpProtocolChain.java:79)
    at com.sun.grizzly.ProtocolChainContextTask.doCall(ProtocolChainContextTask.java:54)
    at com.sun.grizzly.SelectionKeyContextTask.call(SelectionKeyContextTask.java:59)
    at com.sun.grizzly.ContextTask.run(ContextTask.java:71)
    at com.sun.grizzly.util.AbstractThreadPool$Worker.doWork(AbstractThreadPool.java:532)
    at com.sun.grizzly.util.AbstractThreadPool$Worker.run(AbstractThre
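The resolution (batchSize set to -1) as a config fragment; the driver, url and credentials below are placeholders. With the MySQL driver, batchSize="-1" makes DIH call setFetchSize(Integer.MIN_VALUE), which puts the driver into row-streaming mode instead of the fetch size that triggers the "Illegal value for setFetchSize()" error:

```xml
<!-- db-data-config.xml sketch; url/user/password are placeholders. -->
<dataSource type="JdbcDataSource"
            driver="com.mysql.jdbc.Driver"
            url="jdbc:mysql://localhost:3306/mydb"
            user="solr" password="***"
            batchSize="-1"/>
```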
Re: How to Increase the number of connexion on Solr/Tomcat6?
Yeah, that's Tomcat's memory-leak detector. Technically that is a memory leak, but in practice it won't really amount to much. I'm surprised there are no errors related to your empty-response problem in the logs. That is strange, and might point to a problem with your Tomcat install. Perhaps your instinct to use Jetty was the right one after all.

Michael Della Bitta
Appinions, Inc. -- Where Influence Isn't a Game.
http://www.appinions.com

On Fri, Jul 20, 2012 at 6:36 PM, Bruno Mannina wrote:
> In catalina.out, I have only these few rows:
>
> ...
> INFO: Closing Searcher@1faa614 main
> fieldValueCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
> 15 juil. 2012 13:51:31 org.apache.catalina.loader.WebappClassLoader clearThreadLocalMap
> GRAVE: The web application [/solr] created a ThreadLocal with key of type
> [org.apache.solr.schema.DateField.ThreadLocalDateFormat] (value [org.apache.solr.schema.DateField$ThreadLocalDateFormat@75a744]) and a value
> of type [org.apache.solr.schema.DateField.ISO8601CanonicalDateFormat] (value [org.apache.solr.schema.DateField$ISO8601CanonicalDateFormat@6b2ed43a])
> but failed to remove it when the web application was stopped. This is very likely to create a memory leak.
> [the same GRAVE record is repeated three more times]
> 15 juil. 2012 13:51:31 org.apache.coyote.http11.Http11Protocol destroy
> INFO: Arrêt de Coyote HTTP/1.1 sur http-8983
> 15 juil. 2012 13:54:29 org.apache.catalina.startup.ClassLoaderFactory validateFile
> ATTENTION: Problem with directory [/usr/share/tomcat6/server/classes], exists: [false], isDirectory: [false], canRead: [false]
> 15 juil. 2012 13:54:29 org.apache.catalina.startup.ClassLoaderFactory validateFile
> ATTENTION: Problem with directory [/usr/share/tomcat6/server], exists: [false], isDirectory: [false], canRead: [false]
> 15 juil. 2012 13:54:29 org.apache.catalina.startup.ClassLoaderFactory validateFile
> ATTENTION: Problem with directory [/usr/share/tomcat6/shared/classes], exists: [false], isDirectory: [false], canRead: [false]
> 15 juil. 2012 13:54:29 org.apache.catalina.startup.ClassLoaderFactory validateFile
> ATTENTION: Problem with directory [/usr/share/tomcat6/shared], exists: [false], isDirectory: [false], canRead: [false]
> 15 juil. 2012 13:54:29 org.apache.coyote.http11.Http11Protocol init
> INFO: Initialisation de Coyote HTTP/1.1 sur http-8983
> ...
>
> On 21/07/2012 00:04, Bruno Mannina wrote:
>> On 21/07/2012 00:02, Bruno Mannina wrote:
>>> On 21/07/2012 00:00, Bruno Mannina wrote:
>>>> catalina.out <-- twice
>>>
>>> Sorry, concerning this file: I do a sudo cat .. | more and it's OK, I see the content.
>>
>> And inside catalina.out I have all my requests, without error or missing requests.
>>
>> :'( it's amazing
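The warning Tomcat prints is its leak detector noticing a ThreadLocal the webapp never cleared: when the value's class was loaded by the webapp classloader, a pooled thread keeps that classloader reachable after undeploy. A minimal, self-contained illustration of the pattern (not Solr code; names invented) and the remove() call the detector is looking for:

```java
public class ThreadLocalCleanup {
    // Per-thread scratch buffer: the same caching pattern DateField uses
    // for its per-thread date formatters.
    private static final ThreadLocal<StringBuilder> BUF =
            ThreadLocal.withInitial(StringBuilder::new);

    static String render(String s) {
        StringBuilder sb = BUF.get();
        sb.setLength(0);                       // reuse the thread's buffer
        return sb.append('[').append(s).append(']').toString();
    }

    // Called on shutdown (e.g. from a ServletContextListener). Note that
    // remove() only clears the *current* thread's slot, which is why
    // container-managed thread pools make this hard to do completely.
    static void cleanup() {
        BUF.remove();
    }

    public static void main(String[] args) {
        System.out.println(render("doc1"));
        cleanup();
    }
}
```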
Re: RE: How to setup SimpleFSDirectoryFactory
Hi Uwe,

Great to know. We have files indexing at 1/min. After 30 mins I see all my physical memory at 100 percent used (Windows). On deep investigation I found that mmap is not releasing OS file handles. Do you find this behaviour?

Thanks

On 20 Jul 2012 14:04, "Uwe Schindler" wrote:

Hi Bill,

MMapDirectory uses the file system cache of your operating system, which has the following consequences: on Linux, top & free should normally report only *little* free memory, because the O/S uses all memory not allocated by applications to cache disk I/O (and shows it as allocated, so having 0% free memory is just normal on Linux and also Windows). If you have other applications, or Lucene/Solr itself, that allocate lots of heap space or malloc() a lot, then you are reducing free physical memory and so reducing FS cache. This also depends on your swappiness parameter (if swappiness is higher, inactive processes are swapped out more easily; the default is 60 on Linux, freeing more space for FS cache; the backside is of course that in-memory structures of Lucene and other applications may get paged out).

You will only see no paging at all if all memory allocated by all applications plus all mmapped files fits into memory. But paging the mmapped Lucene index in and out is much cheaper than using SimpleFSDirectory or NIOFSDirectory: if you use SimpleFS or NIO and your index is not in the FS cache, it will also be read from physical disk again, so there is no difference. Paging is actually cheaper, as no syscalls are involved.

If you want as much as possible of your index in physical RAM, copy it to /dev/null regularly and buy more RAM :-)

-----
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: uwe@thetaphi...

> From: Bill Bell [mailto:billnb...@gmail.com]
> Sent: Friday, July 20, 2012 5:17 AM
> Subject: Re: ...
> stop using it? The least used memory will be removed from the OS
> automatically? I see some paging. Wouldn't paging slow down the querying?
>
> My index is 10 GB and every 8 hours we get most of it in shared memory. The
> memory is 99 percent used, and that does not leave any room for other apps.
>
> Other implications?
>
> Sent from my mobile device
> 720-256-8076
>
> On Jul 19, 2012, at 9:49 A...
> Heap space or free system RAM:
> >
> > http://blog.thetaphi.de/2012/07/use-lucenes-mmapdirectory-on-64bit.html
> >
> > Uwe
> >...
> >> use it since you might run out of memory on large indexes right?
> >>
> >> Here is how I got SimpleFSDirectoryFactory to work. Just set -
> >> Dsolr.directoryFactor...
> >> set it all up with a helper in solrconfig.xml...
> >>
> >> if (Constants.WINDOWS) {
> >>   if (MMapDirectory.UNMAP_SUPPORTED && Constants.JRE_IS_64...
errors after update from 3.4 to 3.6
Hi all,

I have a DIH that works for Solr 3.4 but not for 3.6. I have tested the SQL, and it works in the MySQL console. The following errors are copied from the DataImportHandler Development Console. Does anyone know the reason for those errors?

Thanks, Xuefeng

// 0 15 db-data-config.xml uima full-import debug
SELECT CAST(concat(b.bk_name,' ',v.chap,':',v.sect) AS CHAR) as title_id, v.bk_id, v.chap, v.sect, v.content FROM Verse v JOIN book b WHERE v.bk_id=b.bk_id AND ver_id='8'

org.apache.solr.handler.dataimport.DataImportHandlerException: Unable to execute query: SELECT CAST(concat(b.bk_name,' ',v.chap,':',v.sect) AS CHAR) as title_id, v.bk_id, v.chap, v.sect, v.content FROM Verse v JOIN book b WHERE v.bk_id=b.bk_id AND ver_id='8' Processing Document # 1
    at org.apache.solr.handler.dataimport.DataImportHandlerException.wrapAndThrow(DataImportHandlerException.java:72)
    at org.apache.solr.handler.dataimport.JdbcDataSource$ResultSetIterator.<init>(JdbcDataSource.java:253)
    at org.apache.solr.handler.dataimport.JdbcDataSource.getData(JdbcDataSource.java:210)
    at org.apache.solr.handler.dataimport.JdbcDataSource.getData(JdbcDataSource.java:39)
    at org.apache.solr.handler.dataimport.DebugLogger$2.getData(DebugLogger.java:188)
    at org.apache.solr.handler.dataimport.SqlEntityProcessor.initQuery(SqlEntityProcessor.java:59)
    at org.apache.solr.handler.dataimport.SqlEntityProcessor.nextRow(SqlEntityProcessor.java:73)
    at org.apache.solr.handler.dataimport.EntityProcessorWrapper.pullRow(EntityProcessorWrapper.java:330)
    at org.apache.solr.handler.dataimport.EntityProcessorWrapper.nextRow(EntityProcessorWrapper.java:296)
    at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:683)
    at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:619)
    at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:327)
    at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:225)
    at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:375)
    at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:445)
    at org.apache.solr.handler.dataimport.DataImportHandler.handleRequestBody(DataImportHandler.java:205)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:1376)
    at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:365)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:260)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:256)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:217)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:279)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:175)
    at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:655)
    at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:595)
    at com.sun.enterprise.web.WebPipeline.invoke(WebPipeline.java:98)
    at com.sun.enterprise.web.PESessionLockingStandardPipeline.invoke(PESessionLockingStandardPipeline.java:91)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:162)
    at org.apache.catalina.connector.CoyoteAdapter.doService(CoyoteAdapter.java:330)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:231)
    at com.sun.enterprise.v3.services.impl.ContainerMapper.service(ContainerMapper.java:174)
    at com.sun.grizzly.http.ProcessorTask.invokeAdapter(ProcessorTask.java:828)
    at com.sun.grizzly.http.ProcessorTask.doProcess(ProcessorTask.java:725)
    at com.sun.grizzly.http.ProcessorTask.process(ProcessorTask.java:1019)
    at com.sun.grizzly.http.DefaultProtocolFilter.execute(DefaultProtocolFilter.java:225)
    at com.sun.grizzly.DefaultProtocolChain.executeProtocolFilter(DefaultProtocolChain.java:137)
    at com.sun.grizzly.DefaultProtocolChain.execute(DefaultProtocolChain.java:104)
    at com.sun.grizzly.DefaultProtocolChain.execute(DefaultProtocolChain.java:90)
    at com.sun.grizzly.http.HttpProtocolChain.execute(HttpProtocolChain.java:79)
    at com.sun.grizzly.ProtocolChainContextTask.doCall(ProtocolChainContextTask.java:54)
    at com.sun.grizzly.SelectionKeyContextTask.call(SelectionKeyContextTask.java:59)
    at com.sun.grizzly.ContextTask.run(ContextTask.java:71)
    at com.sun.grizzly.util.AbstractThreadPool$Worker.doWork(AbstractThreadPool.java:532)
    at com.sun.grizzly.util.AbstractThreadPool$Worker.run(AbstractThreadPool.java:513)
    at java.lang.Thread.run(Thread.java:662)
Caused by: java.sql.SQLException: Illegal value for setFetchSize().
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:987)
    at com.mysql.jdbc.SQLError.c
Re: ICUCollation throws exception
It looks to me like a misconfiguration in your schema, as if you have specified ICUCollation*Field* as an analyzer? Can you check the configuration of 'alphaOnlySort' and tell me what it is?

On Sat, Jul 21, 2012 at 4:53 AM, Günter Hipler wrote:
> Hi Robert,
>
> I am replying instead of my colleague, who is currently away. Below you will see
> the entire log. Oliver told me about the problem, and I had the impression that
> something necessary for initialisation is missing. I googled a little bit
> and found a patch you have done,
> https://issues.apache.org/jira/secure/attachment/12505302/SOLR-2919.patch
> related to "(ICU)CollationKeyFilter". But even initializations like this:
>
> ***
> + class="solr.ICUCollationField"
> +   locale="en" strength="primary" alternate="shifted"/>
> + locale="en" strength="primary" alternate="shifted" variableTop=" "/>
> + locale="en" strength="primary" caseLevel="true"/>
> + locale="en" numeric="true"/>
> + class="solr.ICUCollationField"
> +   locale="en" strength="tertiary" caseFirst="upper"/>
> ***
>
> didn't work. At least this is what I heard from Oliver.
>
> Thanks and best wishes from Basel
>
> Günter
>
> ***Exception***
> INFO: Reading Solr Schema
> Jul 16, 2012 5:27:48 PM org.apache.solr.schema.IndexSchema readSchema
> INFO: Schema name=swissbib-default
> Jul 16, 2012 5:27:48 PM org.apache.solr.common.SolrException log
> SEVERE: null:org.apache.solr.common.SolrException: Plugin init failure for
> [schema.xml] fieldType "alphaOnlySort": Plugin init failure for
> [schema.xml] analyzer/filter: class org.apache.solr.schema.ICUCollationField
>     at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:168)
>     at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:359)
>     at org.apache.solr.schema.IndexSchema.<init>(IndexSchema.java:106)
>     at org.apache.solr.core.CoreContainer.create(CoreContainer.java:812)
>     at org.apache.solr.core.CoreContainer.load(CoreContainer.java:510)
>     at org.apache.solr.core.CoreContainer.load(CoreContainer.java:333)
>     at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:282)
>     at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:101)
>     at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:295)
>     at org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:422)
>     at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:115)
>     at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4072)
>     at org.apache.catalina.core.StandardContext.start(StandardContext.java:4726)
>     at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
>     at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
>     at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
>     at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943)
>     at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778)
>     at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504)
>     at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
>     at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
>     at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
>     at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
>     at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
>     at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
>     at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
>     at org.apache.catalina.core.StandardService.start(StandardService.java:525)
>     at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
>     at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     at java.lang.reflect.Method.invoke(Method.java:597)
>     at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
>     at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
> Caused by: org.apache.solr.common.SolrException: Plugin init failure for
> [schema.xml] analyzer/filter: class org.apache.solr.schema.ICUCollationField
>     at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:168)
>     at org.apache.solr.schema.FieldTypePluginLoader.readAnalyzer(FieldTypePluginLoader.java:356)
>     at org
Re: ICUCollation throws exception
Hi Robert,

I am replying instead of my colleague, who is currently away. Below you will see the entire log. Oliver told me about the problem, and I had the impression that something necessary for initialisation is missing. I googled a little bit and found a patch you have done, https://issues.apache.org/jira/secure/attachment/12505302/SOLR-2919.patch related to "(ICU)CollationKeyFilter". But even initializations like this:

***
+ + + + + + + + + + + +
***

didn't work. At least this is what I heard from Oliver.

Thanks and best wishes from Basel

Günter

***Exception***
INFO: Reading Solr Schema
Jul 16, 2012 5:27:48 PM org.apache.solr.schema.IndexSchema readSchema
INFO: Schema name=swissbib-default
Jul 16, 2012 5:27:48 PM org.apache.solr.common.SolrException log
SEVERE: null:org.apache.solr.common.SolrException: Plugin init failure for [schema.xml] fieldType "alphaOnlySort": Plugin init failure for [schema.xml] analyzer/filter: class org.apache.solr.schema.ICUCollationField
    at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:168)
    at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:359)
    at org.apache.solr.schema.IndexSchema.<init>(IndexSchema.java:106)
    at org.apache.solr.core.CoreContainer.create(CoreContainer.java:812)
    at org.apache.solr.core.CoreContainer.load(CoreContainer.java:510)
    at org.apache.solr.core.CoreContainer.load(CoreContainer.java:333)
    at org.apache.solr.core.CoreContainer$Initializer.initialize(CoreContainer.java:282)
    at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:101)
    at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:295)
    at org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:422)
    at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:115)
    at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4072)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4726)
    at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
    at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
    at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
    at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943)
    at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778)
    at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504)
    at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
    at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
    at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
    at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
    at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
    at org.apache.catalina.core.StandardService.start(StandardService.java:525)
    at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
    at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
    at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: org.apache.solr.common.SolrException: Plugin init failure for [schema.xml] analyzer/filter: class org.apache.solr.schema.ICUCollationField
    at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:168)
    at org.apache.solr.schema.FieldTypePluginLoader.readAnalyzer(FieldTypePluginLoader.java:356)
    at org.apache.solr.schema.FieldTypePluginLoader.create(FieldTypePluginLoader.java:95)
    at org.apache.solr.schema.FieldTypePluginLoader.create(FieldTypePluginLoader.java:43)
    at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:142)
    ... 34 more
Caused by: java.lang.ClassCastException: class org.apache.solr.schema.ICUCollationField
    at java.lang.Class.asSubclass(Class.java:3018)
    at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:409)
    at org.apache.solr.core.SolrResourceLoader.newInstance(SolrResourceLoader.java:430)
    at org.apache.solr.util.plugin.AbstractPluginLoader.create(AbstractPluginLoader.java:86)
    at org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:142)
    ... 38 more
Jul 16, 2012 5:27:48 PM org.apache.solr.core.CoreCon
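The ClassCastException at the bottom is the giveaway: ICUCollationField (from the analysis-extras contrib) is a field type, and the schema is trying to load it as an analyzer filter inside the "alphaOnlySort" chain. A sketch of the intended declaration; the type and field names below are hypothetical:

```xml
<!-- schema.xml sketch: ICUCollationField is declared as its own
     <fieldType>, not as a <filter> inside an <analyzer>. -->
<fieldType name="collatedSort" class="solr.ICUCollationField"
           locale="en" strength="primary"/>

<field name="title_sort" type="collatedSort" indexed="true" stored="false"/>
```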