[ https://issues.apache.org/jira/browse/SOLR-6930?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14269695#comment-14269695 ]

Mike Drob commented on SOLR-6930:
---------------------------------

The tricky part here is, of course, estimating how much memory a query will 
require to complete before actually executing it. The ES page hints that one 
approach is to introspect the query for the fields it will touch and then 
compute an expected field data size from there.
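
A very rough, purely illustrative sketch of that shape (none of these 
interfaces exist in Solr today; all the names are made up):

{code:java}
import java.util.Set;

/**
 * Hypothetical pre-execution check: estimate the field data a query would
 * load and trip the breaker before running it.
 */
public class QueryMemoryBreaker {

  /** Hypothetical hook reporting which fields a parsed query touches. */
  interface FieldExtractor {
    Set<String> fieldsOf(Object parsedQuery);
  }

  /** Hypothetical cost model, e.g. maxDoc * bytesPerValue for the field. */
  interface FieldDataEstimator {
    long estimatedBytes(String field);
  }

  private final FieldExtractor extractor;
  private final FieldDataEstimator estimator;
  private final long limitBytes;

  QueryMemoryBreaker(FieldExtractor extractor, FieldDataEstimator estimator,
                     long limitBytes) {
    this.extractor = extractor;
    this.estimator = estimator;
    this.limitBytes = limitBytes;
  }

  /** Fails fast, before execution, if the estimate crosses the limit. */
  void checkOrTrip(Object parsedQuery) {
    long total = 0;
    for (String field : extractor.fieldsOf(parsedQuery)) {
      total += estimator.estimatedBytes(field);
      if (total > limitBytes) {
        throw new IllegalStateException("circuit breaker tripped: estimated "
            + total + "+ bytes exceeds limit of " + limitBytes);
      }
    }
  }
}
{code}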

I wonder if we can reuse some existing parsing logic to make that process much 
easier...

Getting the total heap size and the amount currently used by the field cache 
should be fairly straightforward, but ES warns that the accounting may be 
inaccurate due to stale references.
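
Reading the heap numbers themselves is just standard JMX; the field cache 
portion is the harder part, since accounting for it would need Lucene/Solr 
internals. For the heap side, something like:

{code:java}
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class HeapCheck {
  public static void main(String[] args) {
    MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
    long max = heap.getMax();   // -1 if the max is undefined
    long used = heap.getUsed(); // counts not-yet-collected garbage too, which
                                // is one source of the inaccuracy noted above
    System.out.printf("heap: %d / %d bytes (%.1f%% used)%n",
        used, max, 100.0 * used / max);
  }
}
{code}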

Any ideas?

> Provide "Circuit Breakers" For Expensive Solr Queries
> -----------------------------------------------------
>
>                 Key: SOLR-6930
>                 URL: https://issues.apache.org/jira/browse/SOLR-6930
>             Project: Solr
>          Issue Type: Bug
>          Components: search
>            Reporter: Mike Drob
>
> Ref: 
> http://www.elasticsearch.org/guide/en/elasticsearch/guide/current/_limiting_memory_usage.html
> ES currently allows operators to configure "circuit breakers" that preemptively 
> fail queries estimated to be too large, rather than allowing an OOM 
> Exception to happen. We might be able to do the same thing.


