[ 
https://issues.apache.org/jira/browse/SPARK-36816?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17420578#comment-17420578
 ] 

Ole commented on SPARK-36816:
-----------------------------

I am running a Thrift Server {{/spark/sbin/start-thriftserver.sh}} with 
{{--conf spark.sql.thriftServer.incrementalCollect=true}} to prevent 
OutOfMemory Exceptions. Querying data results in batched result sets (as 
intended) with log messages like this:
{code:bash}
21/09/27 08:25:33 INFO SparkExecuteStatementOperation: Returning result set 
with 1000 rows from offsets [932000, 933000) with 
50f346c0-02d4-40a2-a73c-30d326d2aae{code}
I'd like the batch size ({{1000 rows}} in this example) to be configurable so we 
can adjust it to our server capacity. The result would then look like this:
{code:bash}
21/09/27 08:25:33 INFO SparkExecuteStatementOperation: Returning result set 
with 10000 rows from offsets [932000, 942000) with 
50f346c0-02d4-40a2-a73c-30d326d2aae{code}
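To illustrate the request: the change would turn the hardcoded row count into a config lookup with the current value as the default. A minimal sketch of that pattern (this is not Spark's actual code, and the key name {{spark.sql.thriftServer.incrementalCollect.batchSize}} is a hypothetical choice for this proposal):
{code:java}
import java.util.HashMap;
import java.util.Map;

public class BatchSizeConfigSketch {
    // Hypothetical config key proposed by this issue; not an existing Spark setting.
    static final String KEY = "spark.sql.thriftServer.incrementalCollect.batchSize";
    // The value currently hardcoded in the Thrift server fetch path.
    static final int DEFAULT_BATCH_SIZE = 1000;

    // Read the batch size from a config map, falling back to the hardcoded default.
    static int batchSize(Map<String, String> conf) {
        String v = conf.get(KEY);
        return v == null ? DEFAULT_BATCH_SIZE : Integer.parseInt(v);
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        System.out.println(batchSize(conf));  // default: 1000 rows per fetch

        conf.put(KEY, "10000");
        System.out.println(batchSize(conf));  // user-tuned: 10000 rows per fetch
    }
}
{code}
With such a lookup in place, the value could be set at startup like any other conf, e.g. {{--conf spark.sql.thriftServer.incrementalCollect.batchSize=10000}} (again, a hypothetical key).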

> Introduce a config variable for the incrementalCollects row batch size
> ----------------------------------------------------------------------
>
>                 Key: SPARK-36816
>                 URL: https://issues.apache.org/jira/browse/SPARK-36816
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.2
>            Reporter: Ole
>            Priority: Minor
>
> After enabling *_spark.sql.thriftServer.incrementalCollect_*, the Thrift Server 
> executes queries in batches (as intended). Unfortunately, the batch size cannot 
> be configured; it appears to be hardcoded 
> [here|https://github.com/apache/spark/blob/6699f76fe2afa7f154b4ba424f3fe048fcee46df/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/thrift/ThriftCLIServiceClient.java#L404].
>  It would be useful to make that value configurable so it can be adjusted to 
> your environment.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
