Re: java.util.concurrent.RejectedExecutionException: Max requests queued per destination 3000 exceeded for HttpDestination
Hi,

Sorry, the issue was not actually fixed; on that day a service restart cleared the TIMED_WAITING threads. The problem re-occurred recently: everything works fine until a disturbance hits the cluster, due to high load or a node responding slowly. We have 3 servers, and we see the following TIMED_WAITING threads:

Server 1: *7722* threads in TIMED_WAITING ("lock": "java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@151d5f2f")
Server 2: *4046* threads in TIMED_WAITING ("lock": "java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@1e0205c3")
Server 3: *4210* threads in TIMED_WAITING ("lock": "java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@5ee792c0")

Please help in resolving this issue.

Thanks,
Doss

On Wed, Aug 5, 2020 at 11:30 PM Doss wrote:
> [earlier message quoted in full below]
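The per-server counts above can be reproduced from a thread dump, grouped by the condition object the threads are parked on. A minimal sketch, assuming a dump saved as threaddump.txt; the excerpt below is fabricated from the lock IDs in this thread, so thread names and addresses are illustrative only:

```shell
# Hypothetical excerpt of a jstack thread dump; in practice capture one with:
#   jstack <solr-pid> > threaddump.txt
cat > threaddump.txt <<'EOF'
"updateExecutor-5-thread-1" #101 prio=5
   java.lang.Thread.State: TIMED_WAITING (parking)
	- parking to wait for  <0x0000000151d5f2f0> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
"updateExecutor-5-thread-2" #102 prio=5
   java.lang.Thread.State: TIMED_WAITING (parking)
	- parking to wait for  <0x0000000151d5f2f0> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
"qtp1234-55" #55 prio=5
   java.lang.Thread.State: RUNNABLE
EOF

# Total threads parked in TIMED_WAITING:
grep -c 'Thread.State: TIMED_WAITING' threaddump.txt

# Same, grouped by the condition object they are parked on; many threads
# sharing one lock usually points at a single saturated queue or pool:
grep -A1 'Thread.State: TIMED_WAITING' threaddump.txt \
  | grep -o '<0x[0-9a-f]*>' | sort | uniq -c | sort -rn
```

If one condition object dominates, the next step is to look at which pool or queue owns it in the full dump.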
Re: java.util.concurrent.RejectedExecutionException: Max requests queued per destination 3000 exceeded for HttpDestination
Hi All,

We have tried the system variable recommended here:

https://www.eclipse.org/jetty/documentation/current/high-load.html

The errors have stopped now, and the system is running stable.

Is there any disadvantage to using the following?

sysctl -w net.ipv4.tcp_tw_recycle=1

Somewhere I read that tw_reuse also helps. Can we use both? Please suggest.

Thanks,
Doss.

On Sunday, August 2, 2020, Doss wrote:
> [earlier message quoted in full below]
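On the tcp_tw_recycle question above: tcp_tw_recycle is known to break connections from clients behind NAT and was removed from Linux entirely in kernel 4.12, so it is risky to rely on. tcp_tw_reuse affects only outgoing connections and is generally considered the safer of the two. A hedged sysctl sketch (verify the settings against your own kernel's ip-sysctl documentation before applying):

```shell
# Prefer tcp_tw_reuse over tcp_tw_recycle; the latter misbehaves behind NAT
# and no longer exists on kernels >= 4.12.
sysctl -w net.ipv4.tcp_tw_reuse=1

# To persist across reboots, add to /etc/sysctl.conf:
#   net.ipv4.tcp_tw_reuse = 1
```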
java.util.concurrent.RejectedExecutionException: Max requests queued per destination 3000 exceeded for HttpDestination
Hi All,

We are running SOLR (8.3.1) Cloud (NRT) with a ZooKeeper ensemble, 3 nodes each, on CentOS VMs.

Each SOLR node has 66GB RAM, a 15GB heap, and 4 CPUs.
Record count: 33L (3.3 million). Avg doc size: 350KB.

Recently, while doing a full import (a few bulk data fields are computed on a daily basis), we are getting this error. What could be the problem? How do we increase the queue size? Please help.

2020-08-02 08:10:30.847 ERROR (updateExecutor-5-thread-190288-processing-x:userinfoindex_6jul20_shard4_replica_n13 r:core_node24 null n:172.29.3.23:8983_solr c:userinfoindex_6jul20 s:shard4) [c:userinfoindex_6jul20 s:shard4 r:core_node24 x:userinfoindex_6jul20_shard4_replica_n13] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{,id=1081904963}; node=ForwardNode: http://172.29.3.23:8983/solr/userinfoindex_6jul20_shard3_replica_n10/ to http://172.29.3.23:8983/solr/userinfoindex_6jul20_shard3_replica_n10/ =>
java.io.IOException: java.util.concurrent.RejectedExecutionException: Max requests queued per destination 3000 exceeded for HttpDestination[http://172.29.3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448[c=4/4,b=4,m=0,i=0]
java.io.IOException: java.util.concurrent.RejectedExecutionException: Max requests queued per destination 3000 exceeded for HttpDestination[http://172.29.3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448[c=4/4,b=4,m=0,i=0]
	Suppressed: java.io.IOException: java.util.concurrent.RejectedExecutionException: Max requests queued per destination 3000 exceeded for HttpDestination[http://172.29.3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448[c=4/4,b=4,m=0,i=0]
	Caused by: java.util.concurrent.RejectedExecutionException: Max requests queued per destination 3000 exceeded for HttpDestination[http://172.29.3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448[c=4/4,b=4,m=0,i=0]
	Caused by: java.util.concurrent.RejectedExecutionException: Max requests queued per destination 3000 exceeded for HttpDestination[http://172.29.3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448[c=4/4,b=4,m=0,i=0]

2020-08-02 08:10:30.911 ERROR (updateExecutor-5-thread-190288-processing-x:userinfoindex_6jul20_shard4_replica_n13 r:core_node24 null n:172.29.3.23:8983_solr c:userinfoindex_6jul20 s:shard4) [c:userinfoindex_6jul20 s:shard4 r:core_node24 x:userinfoindex_6jul20_shard4_replica_n13] o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling SolrCmdDistributor$Req: cmd=add{,id=1034918151}; node=ForwardNode: http://172.29.3.23:8983/solr/userinfoindex_6jul20_shard1_replica_n3/ to http://172.29.3.23:8983/solr/userinfoindex_6jul20_shard1_replica_n3/ =>
java.io.IOException: java.util.concurrent.RejectedExecutionException: Max requests queued per destination 3000 exceeded for HttpDestination[http://172.29.3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448[c=4/4,b=4,m=0,i=0]

Thanks,
Doss.
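A quick way to see whether the rejections concentrate on a single node (which would point at one slow or overloaded replica, matching the "node responds slow" observation later in this thread) is to group the error lines by destination. A minimal sketch, assuming the errors land in a file named solr.log; the path and the log excerpt below are illustrative, built from the two messages above:

```shell
# Hypothetical excerpt of solr.log, using the two error messages above
# (timestamps and destinations copied from this thread):
cat > solr.log <<'EOF'
2020-08-02 08:10:30.847 ERROR ... Max requests queued per destination 3000 exceeded for HttpDestination[http://172.29.3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448[c=4/4,b=4,m=0,i=0]
2020-08-02 08:10:30.911 ERROR ... Max requests queued per destination 3000 exceeded for HttpDestination[http://172.29.3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448[c=4/4,b=4,m=0,i=0]
EOF

# How many rejections, and toward which destination(s)?
grep -o 'HttpDestination\[[^]]*\]' solr.log | sort | uniq -c | sort -rn
```

If one destination dominates, the queue-size limit is a symptom rather than the cause: raising it only buys time while that node falls further behind.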