[ https://issues.apache.org/jira/browse/HIVE-17964?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Rui Li updated HIVE-17964:
--------------------------
    Attachment: HIVE-17964.2.patch

Update to fix tests. There are some issues with {{spark_job_max_tasks.q}} and {{spark_stage_max_tasks.q}}: since we don't check the number of tasks when the job first reaches the RUNNING state, the check can be bypassed if the job finishes very quickly. Therefore I use a dummy script that does nothing but sleep, so that we make sure the check is enforced.

> HoS: some spark configs don't require re-creating a session
> -----------------------------------------------------------
>
>                 Key: HIVE-17964
>                 URL: https://issues.apache.org/jira/browse/HIVE-17964
>             Project: Hive
>          Issue Type: Improvement
>          Components: Spark
>            Reporter: Rui Li
>            Assignee: Rui Li
>            Priority: Minor
>         Attachments: HIVE-17964.1.patch, HIVE-17964.2.patch
>
>
> I guess the {{hive.spark.}} configs were initially intended for the RSC.
> Therefore when they're changed, we'll re-create the session for them to take
> effect. There are some configs not related to the RSC that also start with
> {{hive.spark.}}. We'd better rename them so that we don't unnecessarily
> re-create sessions, which is usually time-consuming.

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
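The "dummy script that does nothing but sleep" trick from the comment could be sketched roughly as below. This is an illustrative sketch, not the actual script from the patch: the function name and delay values are hypothetical. The idea is that each row is echoed back only after a short sleep, so the Spark job stays in the RUNNING state long enough for the task-count check to fire instead of being bypassed by a job that finishes instantly.

```python
#!/usr/bin/env python
# Hypothetical sketch of a slow identity transform script
# (names and delays are illustrative, not from the HIVE-17964 patch).
import sys
import time


def slow_identity(lines, delay=0.1):
    """Echo input lines unchanged, sleeping before each one so the
    job stays in RUNNING long enough for the check to be enforced."""
    out = []
    for line in lines:
        time.sleep(delay)  # stall; the script otherwise does nothing
        out.append(line)
    return out


if __name__ == "__main__":
    # Used as a streaming transform: read rows from stdin, write them back.
    for row in slow_identity(sys.stdin.readlines(), delay=0.5):
        sys.stdout.write(row)
```

In a qfile test, a script like this would typically be invoked through Hive's streaming TRANSFORM clause so the query cannot complete before the task-count limit is evaluated.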