Re: Index optimize runs in background.
Thanks everybody for your replies. I have noticed the optimization running in the background every time I index. This is a 5-node cluster with solr-5.1.0 and uses the CloudSolrClient. Kindly share your findings on this issue. Our index has almost 100M documents running on SolrCloud. We have been optimizing the index after indexing for years and it has worked well for us. Thanks, Modassar

On Fri, May 22, 2015 at 11:55 PM, Erick Erickson erickerick...@gmail.com wrote: Actually, I've recently seen very similar behavior in Solr 4.10.3, but involving hard commits with openSearcher=true, see: https://issues.apache.org/jira/browse/SOLR-7572. Of course I can't reproduce this at will, sigh. A unit test should be very simple to write though; maybe I can get to it today. Erick

On Fri, May 22, 2015 at 8:27 AM, Upayavira u...@odoko.co.uk wrote: On Fri, May 22, 2015, at 03:55 PM, Shawn Heisey wrote: On 5/21/2015 6:21 AM, Modassar Ather wrote: I am using Solr-5.1.0. I have an indexer class which invokes cloudSolrClient.optimize(true, true, 1). My indexer exits after the invocation of optimize, and the optimization keeps running in the background. Kindly let me know if this is by design, and how I can make my indexer wait until the optimization is over. Is there a configuration/parameter I need to set for this? Please note that the same indexer with cloudSolrServer.optimize(true, true, 1) on Solr-4.10 used to wait till the optimize was over before exiting.

This is very odd, because I could not get HttpSolrServer to optimize in the background, even when that was what I wanted. I wondered if maybe the Cloud object behaves differently with regard to blocking until an optimize is finished ... except that there is no code for optimizing in CloudSolrClient at all ... so I don't know where the different behavior would actually be happening.

A more important question is, why are you optimising? Generally it isn't recommended anymore, as it reduces the natural distribution of documents amongst segments and makes future merges more costly. Upayavira
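For reference, the call under discussion takes three arguments: waitFlush, waitSearcher and maxSegments. A minimal SolrJ sketch of the pattern (assumes SolrJ 5.x on the classpath; the ZooKeeper hosts and collection name are placeholders, not from the thread):

```java
// Sketch only -- requires the SolrJ library; host and collection names are placeholders.
CloudSolrClient client = new CloudSolrClient("zkhost1:2181,zkhost2:2181");
client.setDefaultCollection("collection1");
// optimize(waitFlush, waitSearcher, maxSegments): with (true, true, 1) the call
// is expected to block until the index is merged down to a single segment.
client.optimize(true, true, 1);
client.close();
```

As the thread notes, on Solr 5.1 the call may return while the optimize continues in the background, which is the behavior being reported.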
Difference in running Solr with Jetty internally or externally
Hi, I understand that Jetty comes together with the Solr installation package, and that by default, Solr uses Jetty internally to power its HTTP stack. I would like to check: will there be any performance difference when we run the Jetty internally as compared to running an external copy of Jetty? I have heard sources say that performance will be better if we run an external copy. Is that true? Also, as support for deploying Solr as a WAR in standalone servlet containers like Jetty is no longer supported from Solr 5.0, is it still possible to deploy Solr using an external copy of Jetty? Thank you in advance for your clarification. Regards, Edwin
Re: Running Solr 5.1.0 as a Service on Windows
I've managed to get Solr started as a Windows service after re-configuring the startup script, as I had previously missed out some of the custom configurations there. However, I still couldn't get ZooKeeper to start the same way. Are we able to use NSSM to start up ZooKeeper as a Microsoft Windows service too? Regards, Edwin

On 25 May 2015 at 12:16, Zheng Lin Edwin Yeo edwinye...@gmail.com wrote: Hi, Has anyone tried to run Solr 5.1.0 as a Microsoft Windows service? I've tried to follow the steps from this website, http://www.norconex.com/how-to-run-solr5-as-a-service-on-windows/, which uses NSSM. However, when I tried to start the service from Component Services in the Windows Control Panel Administrative Tools, I got the following message: "Windows could not start the Solr5 service on Local Computer. The service did not return an error. This could be an internal Windows error or an internal service error." Is this the correct way to set it up, or are there other methods? Regards, Edwin
Re: SolrCloud 4.8 - Transaction log size over 1GB
Hi Erick, thanks for your support. Reading the post I realised that my scenario does not apply the autoCommit configuration; we currently don't have autoCommit in our solrconfig.xml. We need docs to be searchable only after the indexing process, and all the documents are committed only at the end of the index process. Now I don't understand why the tlog files are so big, given that we have a hard commit at the end of every indexing run.

On Sun, May 24, 2015 at 5:49 PM, Erick Erickson erickerick...@gmail.com wrote: Vincenzo: Here's perhaps more than you want to know about hard commits, soft commits and transaction logs: http://lucidworks.com/blog/understanding-transaction-logs-softcommit-and-commit-in-sorlcloud/ Best, Erick

On Sun, May 24, 2015 at 12:04 AM, Vincenzo D'Amore v.dam...@gmail.com wrote: Thanks Shawn for your prompt support. Best regards, Vincenzo

On Sun, May 24, 2015 at 6:45 AM, Shawn Heisey apa...@elyograg.org wrote: On 5/23/2015 9:41 PM, Vincenzo D'Amore wrote: Thanks Shawn, maybe this is a silly question, but I looked around and didn't find an answer... Could I update solrconfig.xml for the collection while the instances are running, or should I restart the cluster/reload the cores?

You can upload a new config to ZooKeeper with the zkcli program while Solr is running, and nothing will change, at least not immediately. The new config will take effect when you reload the collection or restart all the Solr instances. Thanks, Shawn

-- Vincenzo D'Amore email: v.dam...@gmail.com skype: free.dev mobile: +39 349 8513251
Re: Is it possible to do term Search for the filtered result set
Many thanks Erick and Upayavira, the solution works for me.

On Thu, May 21, 2015 at 9:10 PM, Upayavira u...@odoko.co.uk wrote: and then facet on the tags field: facet=on&facet.field=tags Upayavira

On Thu, May 21, 2015, at 04:34 PM, Erick Erickson wrote: Have you tried fq=type:A Best, Erick

On Thu, May 21, 2015 at 5:49 AM, Danesh Kuruppu dknkuru...@gmail.com wrote: Hi all, Is it possible to do a term search for a filtered result set? We can do a term search for all documents. Can we do the term search only for a specified filtered result set? Let's say we have:

Doc1 -- type: A, tags: T1 T2
Doc2 -- type: A, tags: T1 T3
Doc3 -- type: B, tags: T1 T4 T5

Can we do a term search for tags only in type:A documents, so that it gives the results:

T1 - 02
T2 - 01
T3 - 01

Is this possible? If so, can you please share documents on this. Thanks, Danesh
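Combining the two replies above into a single request gives something like the following; facet.mincount=1 and rows=0 are illustrative additions (not from the thread) that hide zero-count terms and suppress the document list:

```text
q=*:*&fq=type:A&facet=true&facet.field=tags&facet.mincount=1&rows=0
```

The fq restricts the result set to type:A, and the facet counts are then computed only over that filtered set, which is exactly the T1/T2/T3 breakdown asked for.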
Re: Difference in running Solr with Jetty internally or externally
On 5/25/2015 3:28 AM, Zheng Lin Edwin Yeo wrote: I understand that Jetty comes together with the Solr installation package, and that by default, Solr uses Jetty internally to power its HTTP stack. I would like to check: will there be any performance difference when we run the Jetty internally as compared to running an external copy of Jetty? I have heard sources say that performance will be better if we run an external copy. Is that true?

The config for the included Jetty has had a small amount of tuning done specifically for Solr. You would lose that if you switched. Also, the Jetty included with Solr has had a bunch of the jars and their configuration stripped out. That makes it use less memory, and likely run a little bit faster.

Also, as support for deploying Solr as a WAR in standalone servlet containers like Jetty is no longer supported from Solr 5.0, is it still possible to deploy Solr using an external copy of Jetty?

Although we don't recommend it, for now you can find the .war file in the download and deploy it in other containers. That will be changing in a future release, but we don't have an ETA. There is a wiki page discussing the situation. It is still a work in progress: https://wiki.apache.org/solr/WhyNoWar Thanks, Shawn
Re: SolrCloud 4.8 - Transaction log size over 1GB
OK, assuming you're not doing any commits at all until the very end, then the tlog contains all the docs for the _entire_ run. The article really doesn't care whether the commits come from solrconfig.xml or a SolrJ client or curl. The tlog simply is not truncated until a hard commit happens, no matter where it comes from. So here's what I'd do:

1. Set autoCommit in your solrconfig.xml with openSearcher=false to fire every minute. Then the problem will probably go away. Or,
2. Periodically issue a hard commit (openSearcher=false) from the client.

Of the two, I _strongly_ recommend 1, as it's more graceful when there are multiple clients. Best, Erick

On Mon, May 25, 2015 at 4:45 AM, Vincenzo D'Amore v.dam...@gmail.com wrote: Hi Erick, thanks for your support. Reading the post I realised that my scenario does not apply the autoCommit configuration; we currently don't have autoCommit in our solrconfig.xml. We need docs to be searchable only after the indexing process, and all the documents are committed only at the end of the index process. Now I don't understand why the tlog files are so big, given that we have a hard commit at the end of every indexing run.
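Option 1 above, sketched as a solrconfig.xml fragment (60 seconds chosen to match the "every minute" suggestion; tune to taste):

```xml
<updateHandler class="solr.DirectUpdateHandler2">
  <autoCommit>
    <maxTime>60000</maxTime>           <!-- hard commit every minute -->
    <openSearcher>false</openSearcher> <!-- flush/truncate tlogs without exposing docs to searchers -->
  </autoCommit>
</updateHandler>
```

With openSearcher=false, these hard commits only make the index durable and truncate the tlog; documents still become visible only when the client issues its own commit at the end of the run, preserving the "searchable only after indexing" requirement.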
Re: Running Solr 5.1.0 as a Service on Windows
ZooKeeper is just Java, so there's no reason why it can't be started on Windows. However, the startup scripts for ZooKeeper on Windows are pathetic, so you are much more on your own than you are on Linux. There may be folks here who can answer your question (e.g. with Windows-specific startup scripts), or you might consider asking on the ZooKeeper mailing lists directly: https://zookeeper.apache.org/lists.html Upayavira

On Mon, May 25, 2015, at 10:34 AM, Zheng Lin Edwin Yeo wrote: I've managed to get Solr started as a Windows service after re-configuring the startup script, as I had previously missed out some of the custom configurations there. However, I still couldn't get ZooKeeper to start the same way. Are we able to use NSSM to start up ZooKeeper as a Microsoft Windows service too? Regards, Edwin
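For what it's worth, NSSM's generic install commands should work for ZooKeeper the same way they do for Solr; a sketch under assumptions (the install path and service name below are placeholders for a typical zookeeper-3.4.6 layout, untested here):

```shell
# Hypothetical paths -- adjust to your ZooKeeper install.
nssm install ZooKeeper "C:\zookeeper-3.4.6\bin\zkServer.cmd"
nssm set ZooKeeper AppDirectory "C:\zookeeper-3.4.6"
nssm start ZooKeeper
```

NSSM wraps any executable or batch file as a service, so the main thing to get right is that zkServer.cmd can find its conf\zoo.cfg, hence setting AppDirectory.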
Re: Applying gzip compression in Solr 5.1
Why do you want gzip compression? Solr generally sits close to your application server, so compression should be less necessary there. Solr isn't intended to serve the public over the internet directly, particularly because of its lack of security, so it should be feeding some other HTTP server that does the compression for you. Am I missing something? Upayavira

On Mon, May 25, 2015, at 03:26 AM, Zheng Lin Edwin Yeo wrote: Thanks for your reply. Do we still have to use the solr.war file in Solr 5.1 in order to get the gzip working? Regards, Edwin

On 25 May 2015 at 06:57, William Bell billnb...@gmail.com wrote: OK I got mine to work with 4.10.4 and it also works on 5.1...

mkdir war
cp ../solr-4.10.4/example/webapps/solr.war .
jar xvf solr.war
cd WEB-INF
vi web.xml

Add this above the filter already there:

<filter>
  <filter-name>GzipFilter</filter-name>
  <filter-class>org.eclipse.jetty.servlets.GzipFilter</filter-class>
  <init-param>
    <param-name>mimeTypes</param-name>
    <param-value>application/xml,application/json,text/html,text/plain,text/xml,application/xhtml+xml,text/css,application/javascript,image/svg+xml,application/x-javascript,text/css</param-value>
  </init-param>
</filter>
<filter-mapping>
  <filter-name>GzipFilter</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>

cd ..
rm solr.war
jar cvf solr.war *
cp solr.war ../solr-4.10.4/example/webapps/solr.war

Get jetty-servlets-8.1.10.v20130312.jar from http://www.eclipse.org/downloads/download.php?file=/jetty/updates/jetty-bundles-8.x/8.1.10.v20130312/Jetty-bundles-repository-8.1.10.v20130312.zip&mirror_id=454 (notice the name has to be exactly that), put it into solr-4.10.4/example/lib, and restart.

On Thu, May 21, 2015 at 11:31 PM, Zheng Lin Edwin Yeo edwinye...@gmail.com wrote: Hi, I'm trying to apply gzip compression in Solr 5.1. I understand that running Solr on Tomcat is no longer supported from Solr 5.0, so I've tried to implement it in Solr. I've downloaded jetty-servlets-9.3.0.RC0.jar and placed it in my webapp\WEB-INF folder, and have added the following in webapp\WEB-INF\web.xml:

<filter>
  <filter-name>GzipFilter</filter-name>
  <filter-class>org.eclipse.jetty.servlets.GzipFilter</filter-class>
  <init-param>
    <param-name>methods</param-name>
    <param-value>GET,POST</param-value>
  </init-param>
  <init-param>
    <param-name>mimeTypes</param-name>
    <param-value>text/html;charset=UTF-8,text/plain,text/xml,text/json,text/javascript,text/css,text/plain;charset=UTF-8,application/xhtml+xml,application/javascript,image/svg+xml,application/json,application/xml; charset=UTF-8</param-value>
  </init-param>
</filter>
<filter-mapping>
  <filter-name>GzipFilter</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>

However, when I start Solr and check the browser, there's no gzip compression. Is there anything which I configured wrongly or might have missed out? I'm also running zookeeper-3.4.6. Regards, Edwin

-- Bill Bell billnb...@gmail.com cell 720-256-8076
Re: SolrCloud 4.8 - Transaction log size over 1GB
Hi Erick, I have tried my indexing code a few times; this is the behaviour I have observed: when an indexing process starts, even if one or more tlog files exist, a new tlog file is created and all the new documents are stored there. When the indexing process ends and does a hard commit, the older tlog files are removed, but the newest one (the latest) remains. As far as I can see, since my indexing process loads a few million documents each time, at the end of the process the latest tlog file persists with all those documents in it. So I have such big tlog files. Now the question is: why does the latest tlog file persist even though the code has done a hard commit? When a hard commit is done successfully, why should we keep the latest tlog file?

On Mon, May 25, 2015 at 7:24 PM, Erick Erickson erickerick...@gmail.com wrote: OK, assuming you're not doing any commits at all until the very end, then the tlog contains all the docs for the _entire_ run. The article really doesn't care whether the commits come from solrconfig.xml or a SolrJ client or curl. The tlog simply is not truncated until a hard commit happens, no matter where it comes from. So here's what I'd do: 1. set autoCommit in your solrconfig.xml with openSearcher=false to fire every minute, or 2. periodically issue a hard commit (openSearcher=false) from the client. Of the two, I _strongly_ recommend 1, as it's more graceful when there are multiple clients. Best, Erick
Re: SolrCloud 4.8 - Transaction log size over 1GB
The design is that the latest successfully flushed tlog file is kept for peer sync in SolrCloud mode. When a replica comes up, there's a chance that it's not very many docs behind. So, if possible, some of the docs are taken from the leader's tlog and replayed to the follower that's just been started. If the follower is too far out of sync, a full old-style replication is done. So there will always be a tlog file (and occasionally more than one, if they're very small) kept around, even on a successful commit. It doesn't matter whether you have leaders and replicas or not; that's still the process that's followed.

Please re-read the link I sent earlier. There's absolutely no reason your tlog files have to be so big! Really, set your autoCommit to, say, 15 seconds and 10 docs, set openSearcher=false in your solrconfig.xml file, and the tlog file that's kept around will be much smaller, and it will still be available for peer sync. And if you really don't care about tlogs at all, just take this bit out of your solrconfig.xml:

<updateLog>
  <str name="dir">${solr.ulog.dir:}</str>
  <int name="numVersionBuckets">${solr.ulog.numVersionBuckets:256}</int>
</updateLog>

Best, Erick

On Mon, May 25, 2015 at 4:40 PM, Vincenzo D'Amore v.dam...@gmail.com wrote: Hi Erick, I have tried my indexing code a few times; this is the behaviour I have observed: when an indexing process starts, even if one or more tlog files exist, a new tlog file is created and all the new documents are stored there. When the indexing process ends and does a hard commit, the older tlog files are removed, but the newest one (the latest) remains. As far as I can see, since my indexing process loads a few million documents each time, at the end of the process the latest tlog file persists with all those documents in it. So I have such big tlog files. Now the question is: why does the latest tlog file persist even though the code has done a hard commit? When a hard commit is done successfully, why should we keep the latest tlog file?

-- Vincenzo D'Amore email: v.dam...@gmail.com skype: free.dev mobile: +39 349 8513251
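Erick's explanation boils down to a simple retention rule, which can be sketched as a toy model (plain Java, not Solr internals; all class and method names here are mine): each indexing run writes into a fresh log, and a hard commit prunes everything except the newest log, so a job that commits only at the end always leaves one log holding the entire run.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy model of the tlog retention rule described in the thread; NOT Solr code.
public class TlogModel {
    private final Deque<Integer> tlogs = new ArrayDeque<>(); // doc counts, newest last

    // Each indexing run appends its documents to a new tlog file.
    public void indexRun(int docs) { tlogs.addLast(docs); }

    // A hard commit removes older tlogs but keeps the newest one for peer sync.
    public void hardCommit() {
        while (tlogs.size() > 1) tlogs.removeFirst();
    }

    // Docs still sitting in the retained tlog after the last commit.
    public int retainedDocs() {
        return tlogs.isEmpty() ? 0 : tlogs.peekLast();
    }

    public static void main(String[] args) {
        TlogModel m = new TlogModel();
        m.indexRun(5_000_000); // one big run, no intermediate commits
        m.hardCommit();
        System.out.println("retained=" + m.retainedDocs()); // prints retained=5000000
    }
}
```

With frequent intermediate hard commits (Erick's autoCommit suggestion), each retained log only covers the docs since the previous commit, which is why the kept tlog stays small.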
Re: Applying gzip compression in Solr 5.1
Although we do not plan for Solr to be accessed over the internet directly, it will be accessed via a local area network (e.g. within the company). As there could potentially be a lot of data indexed in Solr (especially rich-text documents), which will probably take up a lot of bandwidth, we wanted gzip compression so that we can save on the bandwidth. Do you mean the compression should be done at the HTTP server and not at Solr? Regards, Edwin

On 26 May 2015 at 04:13, Upayavira u...@odoko.co.uk wrote: Why do you want gzip compression? Solr generally sits close to your application server, so compression should be less necessary there. Solr isn't intended to serve the public over the internet directly, particularly because of its lack of security, so it should be feeding some other HTTP server that does the compression for you. Am I missing something? Upayavira
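To put a rough number on the bandwidth argument: Solr's XML/JSON responses are highly repetitive and compress very well under gzip. A standalone sketch of the effect (plain JDK, nothing Solr-specific; the sample payload is made up for illustration):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

// Compares raw vs gzipped size of a repetitive XML-ish payload; NOT Solr code.
public class GzipSavings {
    // Returns the gzipped size in bytes of the given data.
    public static int gzipSize(byte[] data) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(data);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return bos.toByteArray().length;
    }

    public static void main(String[] args) {
        // Fabricate a repetitive search-response-like payload.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 1000; i++) {
            sb.append("<doc><field name=\"id\">").append(i).append("</field></doc>");
        }
        byte[] raw = sb.toString().getBytes(StandardCharsets.UTF_8);
        int zipped = gzipSize(raw);
        System.out.println("raw=" + raw.length + " gzipped=" + zipped
                + " smaller=" + (zipped < raw.length)); // smaller=true
    }
}
```

That said, Upayavira's point stands: terminating gzip at a fronting HTTP server gives the same bandwidth saving without rebuilding solr.war.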
Re: Applying gzip compression in Solr 5.1
You need to edit the web.xml as I stated above and put the solr.war file back. Instead of "example" it is "server". Note: 4.10.4 and 5.1.0 use the same Jetty: 8.1.10.v20130312. Here is the procedure for 5.1:

mkdir war
cp ../solr-5.1.0/example/webapps/solr.war .
jar xvf solr.war
cd WEB-INF
vi web.xml

Add this above the filter already there (and save it after adding this):

<filter>
  <filter-name>GzipFilter</filter-name>
  <filter-class>org.eclipse.jetty.servlets.GzipFilter</filter-class>
  <init-param>
    <param-name>mimeTypes</param-name>
    <param-value>application/xml,application/json,text/html,text/plain,text/xml,application/xhtml+xml,text/css,application/javascript,image/svg+xml,application/x-javascript,text/css</param-value>
  </init-param>
</filter>
<filter-mapping>
  <filter-name>GzipFilter</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>

cd ..
rm solr.war
jar cvf solr.war *
cp solr.war ../solr-5.1.0/example/webapps/solr.war

Get jetty-servlets-8.1.10.v20130312.jar from http://www.eclipse.org/downloads/download.php?file=/jetty/updates/jetty-bundles-8.x/8.1.10.v20130312/Jetty-bundles-repository-8.1.10.v20130312.zip&mirror_id=454 (notice the name has to be exactly that), put it into solr-5.1.0/example/lib, and restart.

On Sun, May 24, 2015 at 8:26 PM, Zheng Lin Edwin Yeo edwinye...@gmail.com wrote: Thanks for your reply. Do we still have to use the solr.war file in Solr 5.1 in order to get the gzip working? Regards, Edwin

-- Bill Bell billnb...@gmail.com cell 720-256-8076
Re: Applying gzip compression in Solr 5.1
Err.. Here is the procedure for 5.1:

mkdir war
cd war
cp ../solr-5.1.0/server/webapps/solr.war .
jar xvf solr.war
cd WEB-INF
vi web.xml

Add this above the filter already there (and save it after adding this):

<filter>
  <filter-name>GzipFilter</filter-name>
  <filter-class>org.eclipse.jetty.servlets.GzipFilter</filter-class>
  <init-param>
    <param-name>mimeTypes</param-name>
    <param-value>application/xml,application/json,text/html,text/plain,text/xml,application/xhtml+xml,text/css,application/javascript,image/svg+xml,application/x-javascript</param-value>
  </init-param>
</filter>
<filter-mapping>
  <filter-name>GzipFilter</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>

Then repackage and put the war back:

cd ..
rm solr.war
jar cvf solr.war *
cp solr.war ../solr-5.1.0/server/webapps/solr.war

Get jetty-servlets-8.1.10.v20130312.jar from http://www.eclipse.org/downloads/download.php?file=/jetty/updates/jetty-bundles-8.x/8.1.10.v20130312/Jetty-bundles-repository-8.1.10.v20130312.zip&mirror_id=454 (notice the name has to be exactly that), put it into solr-5.1.0/server/lib, and restart.

On Mon, May 25, 2015 at 8:42 PM, William Bell billnb...@gmail.com wrote: You need to edit the web.xml as I stated above and put the solr.war file back. Instead of example it is server. Note: 4.10.4 and 5.1.0 use the same Jetty, 8.1.10.v20130312. Here is the procedure for 5.1: [same steps as above, but with example/webapps and example/lib in place of server/webapps and server/lib]

On Sun, May 24, 2015 at 8:26 PM, Zheng Lin Edwin Yeo edwinye...@gmail.com wrote: Thanks for your reply. Do we still have to use the solr.war file in Solr 5.1 in order to get gzip working? Regards, Edwin

On 25 May 2015 at 06:57, William Bell billnb...@gmail.com wrote: OK I got mine to work with 4.10.4 and it also works on 5.1... [same steps, against solr-4.10.4/example/webapps and solr-4.10.4/example/lib]

On Thu, May 21, 2015 at 11:31 PM, Zheng Lin Edwin Yeo edwinye...@gmail.com wrote: Hi, I'm trying to apply gzip compression in Solr 5.1. I understand that running Solr on Tomcat is no longer supported from Solr 5.0, so I've tried to implement it in Solr. I've downloaded jetty-servlets-9.3.0.RC0.jar and placed it in my webapp\WEB-INF folder, and have added the following in webapp\WEB-INF\web.xml:

<filter>
  <filter-name>GzipFilter</filter-name>
  <filter-class>org.eclipse.jetty.servlets.GzipFilter</filter-class>
  <init-param>
    <param-name>methods</param-name>
    <param-value>GET,POST</param-value>
  </init-param>
  <init-param>
    <param-name>mimeTypes</param-name>
    <param-value>text/html;charset=UTF-8,text/plain,text/xml,text/json,text/javascript,text/css,text/plain;charset=UTF-8,application/xhtml+xml,application/javascript,image/svg+xml,application/json,application/xml; charset=UTF-8</param-value>
  </init-param>
</filter>
<filter-mapping>
  <filter-name>GzipFilter</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>
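For reference, the mimeTypes list in these filters is what gates compression: GzipFilter only compresses responses whose Content-Type appears in the configured list (and whose client sent Accept-Encoding: gzip). A rough Python sketch of just the type-matching step, for illustration only — this is a simplification, not Jetty's actual implementation:

```python
# Simplified model of GzipFilter's mimeTypes check: a response is a
# compression candidate only if its Content-Type (with parameters such
# as "; charset=UTF-8" stripped) is in the configured list.
MIME_TYPES = (
    "application/xml,application/json,text/html,text/plain,text/xml,"
    "application/xhtml+xml,text/css,application/javascript,"
    "image/svg+xml,application/x-javascript"
).split(",")

def is_compressible(content_type: str) -> bool:
    # Drop any "; charset=..." parameter before comparing.
    base = content_type.split(";")[0].strip().lower()
    return base in MIME_TYPES

print(is_compressible("application/json"))           # True
print(is_compressible("text/html; charset=UTF-8"))   # True
print(is_compressible("image/png"))                  # False: already compressed
```

This is also why binary types like image/png are left out of the list: they are already compressed and gzipping them mostly wastes CPU.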
Re: Difference in running Solr with Jetty internally or externally
Hi Shawn, Thanks for your reply. So the recommendation is still to stick with the Jetty that's included in Solr? From what you say, it seems that if we use an external Jetty, we have to do more configuration to tune it to fit Solr, and it will probably use more memory and run slower too. There might also be issues with a future release of Solr. Regards, Edwin On 25 May 2015 at 21:51, Shawn Heisey apa...@elyograg.org wrote: On 5/25/2015 3:28 AM, Zheng Lin Edwin Yeo wrote: I understand that Jetty comes together with the Solr installation package, and that by default, Solr uses Jetty internally to power its HTTP stack. Would like to check, will there be any performance difference when we run Jetty internally as compared to running an external copy of Jetty? I have heard sources saying that the performance will be better if we run an external copy. Is that true? The config for the included Jetty has had a small amount of tuning done specifically for Solr. You would lose that if you switched. Also, the Jetty included with Solr has had a bunch of the jars and their configuration stripped out. That makes it use less memory, and likely run a little bit faster. Also, as deploying Solr as a WAR in standalone servlet containers like Jetty is no longer supported from Solr 5.0, is it still possible to deploy Solr using an external copy of Jetty? Although we don't recommend it, for now you can find the .war file in the download and deploy it in other containers. That will be changing in a future release, but we don't have an ETA. There is a wiki page discussing the situation. It is still a work in progress: https://wiki.apache.org/solr/WhyNoWar Thanks, Shawn
Solr relevancy score in percentage
Hi, Would like to check: does the new version of Solr allow displaying the relevancy score as a percentage? I understand that older versions could not, and the only way is to take the highest score as 100% and calculate the other percentages from that number (for example, if the max score is 10 and the next result has a score of 5, you would do (5 / 10) * 100 = 50%). Is there a better way to do this now? I'm using Solr 5.1. Regards, Edwin
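The max-score normalization described above is simple to do client-side once the results are back. A minimal sketch (field and function names are illustrative, not part of any Solr API; request the pseudo-field `score` in `fl` to get scores in the response):

```python
def scores_as_percent(docs, score_field="score"):
    """Normalize Solr relevancy scores against the top score in the result set.

    Note the caveat from the thread: scores are query-relative, so 100%
    only means "best match for this query", not an absolute measure.
    """
    if not docs:
        return []
    max_score = max(d[score_field] for d in docs)
    return [round(d[score_field] / max_score * 100, 1) for d in docs]

docs = [{"score": 10.0}, {"score": 5.0}, {"score": 2.5}]
print(scores_as_percent(docs))  # [100.0, 50.0, 25.0]
```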
YAJar
I am stuck in Yet Another Jarmageddon of SOLR. This is a basic question. I noticed Solr 5.0 is using guava 14.0.1. My app needs guava 18.0. What is the pattern to override a jar version uploaded into Jetty? I am using Maven, and Solr is being started the old way: java -jar start.jar -Dsolr.solr.home=... -Djetty.home=... I tried to edit Jetty's start.config (then run java -DSTART=/my/dir/start.config -jar start.jar) but got nowhere. Any help would be much appreciated. Peyman
Re: Applying gzip compression in Solr 5.1
I've managed to get it to work. On Windows we have to use the full path to the JDK's jar tool. Thank you so much for your help! Regards, Edwin

On 26 May 2015 at 11:15, Zheng Lin Edwin Yeo edwinye...@gmail.com wrote: I have problems running the command 'jar xvf solr.war'. When I try to run it, it gives the error: 'jar' is not recognized as an internal or external command, operable program or batch file. Is there anything I might have missed? Regards, Edwin

On 26 May 2015 at 10:59, William Bell billnb...@gmail.com wrote: Err.. Here is the procedure for 5.1: [snip]
Re: Applying gzip compression in Solr 5.1
I have problems running the command 'jar xvf solr.war'. When I try to run that, it will give an error: 'jar' is not recognized as an internal or external command, operable program or batch file. Is there anything which I might have missed out? Regards, Edwin

On 26 May 2015 at 10:59, William Bell billnb...@gmail.com wrote: Err.. Here is the procedure for 5.1: [snip]
SUM in result grouping
Is it possible to SUM in a group query? I am using the Solr group function and it is retrieving the results. Now I want to SUM a numeric field. Is it possible? My query is like this:

http://localhost:8983/solr/glaas/select?q=abc&fl=TotalInvoices&wt=json&indent=true&debugQuery=true&group=true&group.field=vendor&group.limit=100

My output is like this:

{
  "groupValue": "abc",
  "doclist": {"numFound": 3, "start": 0, "docs": [
      {"TotalInvoices": 100},
      {"TotalInvoices": 200},
      {"TotalInvoices": 50}
  ]}
}

I want to SUM the TotalInvoices in the result set, something like:

{"TotalInvoices": 350}   (sum of all the returned set)

Is it possible in Solr? Please help. Regards, Abhijit Deka, Computer Scientist, Adobe Systems, Bangalore, Ph +91 80884 39067
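Solr can compute per-group sums server-side with the StatsComponent (e.g. stats=true&stats.field=TotalInvoices&stats.facet=vendor), and Solr 5.1 also introduced the experimental JSON Facet API. If you only have the grouped response in hand, though, a client-side sum is trivial. A sketch over the response shape shown above (key casing normalized to Solr's groupValue/numFound; the function name is mine, not a Solr API):

```python
def sum_group_field(group, field="TotalInvoices"):
    # Sum a numeric field over the docs returned for one group value.
    # Caveat: this only covers the docs actually returned, so group.limit
    # must be high enough to include every doc in the group.
    return sum(doc[field] for doc in group["doclist"]["docs"])

group = {
    "groupValue": "abc",
    "doclist": {"numFound": 3, "start": 0, "docs": [
        {"TotalInvoices": 100},
        {"TotalInvoices": 200},
        {"TotalInvoices": 50},
    ]},
}
print(sum_group_field(group))  # 350
```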
Re: Difference in running Solr with Jetty internally or externally
Actually, just use the new bin/solr start scripts and ignore whether it's running Jetty under the covers or not, I think. Best, Erick

On Mon, May 25, 2015 at 7:11 PM, Zheng Lin Edwin Yeo edwinye...@gmail.com wrote: Hi Shawn, Thanks for your reply. So the recommendation is still to stick with the Jetty that's included in Solr? [snip]