[ https://issues.apache.org/jira/browse/HIVE-17291?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16123749#comment-16123749 ]

Xuefu Zhang commented on HIVE-17291:
------------------------------------

Thanks for working on this, [~pvary]. The patch looks good. However, I was a 
little confused: the description suggests we are fixing the case where dynamic 
allocation is not enabled, yet the code will seemingly be executed in either 
case. I'm not sure it's proper to use {{spark.executor.instances}} when dynamic 
allocation is enabled. Any thoughts?
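For illustration, the guard being discussed could look roughly like the sketch below. This is not Hive's actual code; the class and method names (`ExecutorCountResolver`, `resolveExecutorCount`) are hypothetical, and only the two Spark property keys come from the discussion. The idea is to consult {{spark.executor.instances}} only when {{spark.dynamicAllocation.enabled}} is off:

```java
import java.util.HashMap;
import java.util.Map;

public class ExecutorCountResolver {

    // Hypothetical helper: fall back to spark.executor.instances only
    // when dynamic allocation is disabled, since with dynamic allocation
    // the executor count is not fixed and that property can be misleading.
    static int resolveExecutorCount(Map<String, String> sparkConf, int defaultCount) {
        boolean dynamicAllocation = Boolean.parseBoolean(
            sparkConf.getOrDefault("spark.dynamicAllocation.enabled", "false"));
        if (dynamicAllocation) {
            // Executor count varies at runtime; use the caller's default.
            return defaultCount;
        }
        // Static allocation: honor the configured instance count if present.
        return Integer.parseInt(
            sparkConf.getOrDefault("spark.executor.instances",
                                   String.valueOf(defaultCount)));
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("spark.executor.instances", "4");
        // Static allocation: the configured value wins.
        System.out.println(resolveExecutorCount(conf, 2)); // prints 4

        conf.put("spark.dynamicAllocation.enabled", "true");
        // Dynamic allocation: ignore spark.executor.instances.
        System.out.println(resolveExecutorCount(conf, 2)); // prints 2
    }
}
```

Whether the fallback should also be skipped entirely under dynamic allocation (rather than returning a default) is exactly the open question in the comment above.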

> Set the number of executors based on config if client does not provide 
> information
> ----------------------------------------------------------------------------------
>
>                 Key: HIVE-17291
>                 URL: https://issues.apache.org/jira/browse/HIVE-17291
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>    Affects Versions: 3.0.0
>            Reporter: Peter Vary
>            Assignee: Peter Vary
>         Attachments: HIVE-17291.1.patch
>
>
> When calculating the memory and cores, if the client does not provide this 
> information, we should fall back to the values supplied by the configuration 
> defaults. This can happen on startup, when {{spark.dynamicAllocation.enabled}} 
> is not enabled.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
