[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16503674#comment-16503674 ] Sahil Takiar commented on HIVE-18533: - Thanks for catching that. Pushed an addendum commit to fix the issues. > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark > Reporter: Sahil Takiar > Assignee: Sahil Takiar > Priority: Major > Fix For: 4.0.0 > > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch, HIVE-18533.5.patch, > HIVE-18533.6.patch, HIVE-18533.7.patch, HIVE-18533.8.patch, > HIVE-18533.9.patch, HIVE-18533.91.patch, HIVE-18533.94.patch, > HIVE-18533.95.patch, HIVE-18533.96.patch, HIVE-18533.97.patch, > HIVE-18533.98.patch, HIVE-18831.93.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need to launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16502983#comment-16502983 ] Rui Li commented on HIVE-18533: --- [~stakiar], please take a look at the Yetus report. I think some warnings are valid.
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16501063#comment-16501063 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12926092/HIVE-18533.98.patch {color:green}SUCCESS:{color} +1 due to 4 test(s) being added or modified. {color:green}SUCCESS:{color} +1 due to 14462 tests passed Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/11508/testReport Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11508/console Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11508/ Messages: {noformat} Executing org.apache.hive.ptest.execution.TestCheckPhase Executing org.apache.hive.ptest.execution.PrepPhase Executing org.apache.hive.ptest.execution.YetusPhase Executing org.apache.hive.ptest.execution.ExecutionPhase Executing org.apache.hive.ptest.execution.ReportingPhase {noformat} This message is automatically generated. ATTACHMENT ID: 12926092 - PreCommit-HIVE-Build
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16500969#comment-16500969 ] Hive QA commented on HIVE-18533: | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 32s{color} | {color:blue} Maven dependency ordering for branch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 6m 39s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 52s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 1m 9s{color} | {color:green} master passed {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 29s{color} | {color:blue} common in master has 62 extant Findbugs warnings. {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 3m 27s{color} | {color:blue} ql in master has 2277 extant Findbugs warnings. {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 20s{color} | {color:blue} spark-client in master has 15 extant Findbugs warnings. {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 29s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 8s{color} | {color:blue} Maven dependency ordering for patch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 2m 13s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 59s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 1m 59s{color} | {color:green} the patch passed {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 10s{color} | {color:red} spark-client: The patch generated 11 new + 25 unchanged - 9 fixed = 36 total (was 34) {color} | | {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} | | {color:green}+1{color} | {color:green} xml {color} | {color:green} 0m 3s{color} | {color:green} The patch has no ill-formed XML file. {color} | | {color:red}-1{color} | {color:red} findbugs {color} | {color:red} 0m 32s{color} | {color:red} spark-client generated 2 new + 10 unchanged - 5 fixed = 12 total (was 15) {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 32s{color} | {color:green} the patch passed {color} | || || || || {color:brown} Other Tests {color} || | {color:red}-1{color} | {color:red} asflicense {color} | {color:red} 0m 12s{color} | {color:red} The patch generated 1 ASF License warnings. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 29m 25s{color} | {color:black} {color} | \\ \\ || Reason || Tests || | FindBugs | module:spark-client | | | Class org.apache.hive.spark.client.SparkLauncherSparkClient defines non-transient non-serializable instance field sparkLauncher In SparkLauncherSparkClient.java:instance field sparkLauncher In SparkLauncherSparkClient.java | | | org.apache.spark.launcher.InProcessLauncher stored into non-transient field SparkLauncherSparkClient.sparkLauncher At SparkLauncherSparkClient.java:SparkLauncherSparkClient.sparkLauncher At SparkLauncherSparkClient.java:[line 182] | \\ \\ || Subsystem || Report/Notes || | Optional Tests | asflicense javac javadoc findbugs checkstyle compile xml | | uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux | | Build tool | maven | | Personality | /data/hiveptest/working/yetus_PreCommit-HIVE-Build-11508/dev-support/hive-personality.sh | | git revision | master / 5667af3 | | Default Java | 1.8.0_111 | | findbugs | v3.0.0 | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-11508/yetus/diff-checkstyle-spark-client.txt | | findbugs | http://104.198.109.242/logs//PreCommit-HIVE-Build-11508/yetus/new-findbugs-spark-client.html | | asflicense | http://104.198.109.242/logs//PreCommit-HIVE-Build-11508/yetus/patch-asflicense-problems.txt | | modules | C: common itests itests/qtest-spark ql spark-client U: . | | Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-11508/yetus.txt | | Powered by | Apa
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16497842#comment-16497842 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12925610/HIVE-18533.97.patch {color:green}SUCCESS:{color} +1 due to 4 test(s) being added or modified. {color:red}ERROR:{color} -1 due to 2 failed/errored test(s), 14441 tests executed *Failed tests:* {noformat} org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[input31] (batchId=63) org.apache.hive.jdbc.TestJdbcWithMiniLlapArrow.testDataTypes (batchId=242) {noformat} Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/11406/testReport Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11406/console Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11406/ Messages: {noformat} Executing org.apache.hive.ptest.execution.TestCheckPhase Executing org.apache.hive.ptest.execution.PrepPhase Executing org.apache.hive.ptest.execution.YetusPhase Executing org.apache.hive.ptest.execution.ExecutionPhase Executing org.apache.hive.ptest.execution.ReportingPhase Tests exited with: TestsFailedException: 2 tests failed {noformat} This message is automatically generated. ATTACHMENT ID: 12925610 - PreCommit-HIVE-Build
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16497818#comment-16497818 ] Hive QA commented on HIVE-18533: | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 45s{color} | {color:blue} Maven dependency ordering for branch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 6m 36s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 51s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 1m 11s{color} | {color:green} master passed {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 31s{color} | {color:blue} common in master has 62 extant Findbugs warnings. {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 3m 39s{color} | {color:blue} ql in master has 2278 extant Findbugs warnings. {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 23s{color} | {color:blue} spark-client in master has 15 extant Findbugs warnings. {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 31s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 9s{color} | {color:blue} Maven dependency ordering for patch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 2m 14s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 52s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 1m 52s{color} | {color:green} the patch passed {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 9s{color} | {color:red} spark-client: The patch generated 11 new + 25 unchanged - 9 fixed = 36 total (was 34) {color} | | {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} | | {color:green}+1{color} | {color:green} xml {color} | {color:green} 0m 2s{color} | {color:green} The patch has no ill-formed XML file. {color} | | {color:red}-1{color} | {color:red} findbugs {color} | {color:red} 0m 31s{color} | {color:red} spark-client generated 2 new + 10 unchanged - 5 fixed = 12 total (was 15) {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 28s{color} | {color:green} the patch passed {color} | || || || || {color:brown} Other Tests {color} || | {color:red}-1{color} | {color:red} asflicense {color} | {color:red} 0m 11s{color} | {color:red} The patch generated 1 ASF License warnings. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 29m 22s{color} | {color:black} {color} | \\ \\ || Reason || Tests || | FindBugs | module:spark-client | | | Class org.apache.hive.spark.client.SparkLauncherSparkClient defines non-transient non-serializable instance field sparkLauncher In SparkLauncherSparkClient.java:instance field sparkLauncher In SparkLauncherSparkClient.java | | | org.apache.spark.launcher.InProcessLauncher stored into non-transient field SparkLauncherSparkClient.sparkLauncher At SparkLauncherSparkClient.java:SparkLauncherSparkClient.sparkLauncher At SparkLauncherSparkClient.java:[line 182] | \\ \\ || Subsystem || Report/Notes || | Optional Tests | asflicense javac javadoc findbugs checkstyle compile xml | | uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux | | Build tool | maven | | Personality | /data/hiveptest/working/yetus_PreCommit-HIVE-Build-11406/dev-support/hive-personality.sh | | git revision | master / 1c970d9 | | Default Java | 1.8.0_111 | | findbugs | v3.0.0 | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-11406/yetus/diff-checkstyle-spark-client.txt | | findbugs | http://104.198.109.242/logs//PreCommit-HIVE-Build-11406/yetus/new-findbugs-spark-client.html | | asflicense | http://104.198.109.242/logs//PreCommit-HIVE-Build-11406/yetus/patch-asflicense-problems.txt | | modules | C: common itests itests/qtest-spark ql spark-client U: . | | Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-11406/yetus.txt | | Powered by | Apa
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16494623#comment-16494623 ] Rui Li commented on HIVE-18533: --- [~stakiar] sorry I overlooked your previous comments. +1 pending tests bq. For your third point, Are there any code changes required for this? No, we only need to update the wiki. As for the spark-client.jar, I think it's bundled in hive-exec.jar so packaging.pom doesn't include it.
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16494016#comment-16494016 ] Sahil Takiar commented on HIVE-18533: - [~lirui] any other comments?
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16476305#comment-16476305 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12923334/HIVE-18533.96.patch {color:green}SUCCESS:{color} +1 due to 4 test(s) being added or modified. {color:red}ERROR:{color} -1 due to 4 failed/errored test(s), 14407 tests executed *Failed tests:* {noformat} org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample2] (batchId=6) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[explainuser_4] (batchId=164) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[union_stats] (batchId=159) org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_opt_shuffle_serde] (batchId=183) {noformat} Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/10964/testReport Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/10964/console Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-10964/ Messages: {noformat} Executing org.apache.hive.ptest.execution.TestCheckPhase Executing org.apache.hive.ptest.execution.PrepPhase Executing org.apache.hive.ptest.execution.YetusPhase Executing org.apache.hive.ptest.execution.ExecutionPhase Executing org.apache.hive.ptest.execution.ReportingPhase Tests exited with: TestsFailedException: 4 tests failed {noformat} This message is automatically generated. ATTACHMENT ID: 12923334 - PreCommit-HIVE-Build
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16476249#comment-16476249 ] Hive QA commented on HIVE-18533: | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 43s{color} | {color:blue} Maven dependency ordering for branch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 6m 50s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 52s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 1m 14s{color} | {color:green} master passed {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 30s{color} | {color:blue} common in master has 62 extant Findbugs warnings. {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 3m 55s{color} | {color:blue} ql in master has 2320 extant Findbugs warnings. {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 22s{color} | {color:blue} spark-client in master has 15 extant Findbugs warnings. {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 32s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 8s{color} | {color:blue} Maven dependency ordering for patch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 2m 14s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 55s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 1m 55s{color} | {color:green} the patch passed {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 10s{color} | {color:red} spark-client: The patch generated 11 new + 25 unchanged - 9 fixed = 36 total (was 34) {color} | | {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} | | {color:green}+1{color} | {color:green} xml {color} | {color:green} 0m 2s{color} | {color:green} The patch has no ill-formed XML file. {color} | | {color:red}-1{color} | {color:red} findbugs {color} | {color:red} 0m 32s{color} | {color:red} spark-client generated 2 new + 10 unchanged - 5 fixed = 12 total (was 15) {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 34s{color} | {color:green} the patch passed {color} | || || || || {color:brown} Other Tests {color} || | {color:red}-1{color} | {color:red} asflicense {color} | {color:red} 0m 12s{color} | {color:red} The patch generated 1 ASF License warnings. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 30m 28s{color} | {color:black} {color} | \\ \\ || Reason || Tests || | FindBugs | module:spark-client | | | Class org.apache.hive.spark.client.SparkLauncherSparkClient defines non-transient non-serializable instance field sparkLauncher In SparkLauncherSparkClient.java:instance field sparkLauncher In SparkLauncherSparkClient.java | | | org.apache.spark.launcher.InProcessLauncher stored into non-transient field SparkLauncherSparkClient.sparkLauncher At SparkLauncherSparkClient.java:SparkLauncherSparkClient.sparkLauncher At SparkLauncherSparkClient.java:[line 182] | \\ \\ || Subsystem || Report/Notes || | Optional Tests | asflicense javac javadoc findbugs checkstyle compile xml | | uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux | | Build tool | maven | | Personality | /data/hiveptest/working/yetus_PreCommit-HIVE-Build-10964/dev-support/hive-personality.sh | | git revision | master / 6e6b0cb | | Default Java | 1.8.0_111 | | findbugs | v3.0.0 | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-10964/yetus/diff-checkstyle-spark-client.txt | | findbugs | http://104.198.109.242/logs//PreCommit-HIVE-Build-10964/yetus/new-findbugs-spark-client.html | | asflicense | http://104.198.109.242/logs//PreCommit-HIVE-Build-10964/yetus/patch-asflicense-problems.txt | | modules | C: common itests itests/qtest-spark ql spark-client U: . | | Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-10964/yetus.txt | | Powered by | Apac
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16474738#comment-16474738 ] Sahil Takiar commented on HIVE-18533: - [~lirui] attached an updated patch and updated the RB. Addressed your first two comments. I agree simpler names are better. I removed the enum entirely since it's not really standard in Hive. I went with "spark-submit" and "spark-launcher"; I wanted to keep the "spark" part because "spark-submit" should be a familiar term to most Spark users. For your third point, are there any code changes required for this? Yes, we will have to update the wiki to point out that more jars need to be added. Do you know why the {{packaging/pom.xml}} file doesn't include the {{hive-spark-client.jar}} file?
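For illustration, a minimal sketch of switching a session to the launcher-based client. The property name {{hive.spark.client.type}} and the values "spark-submit" / "spark-launcher" are taken from this discussion; the {{HiveConf}} wiring below is an assumption for the example only, not the patch's code.
{code:java}
// Illustrative sketch only (not from the patch): selecting the launcher-based
// client for a Hive on Spark session, assuming the property is named
// hive.spark.client.type and accepts "spark-submit" (default) and "spark-launcher".
import org.apache.hadoop.hive.conf.HiveConf;

public class ClientTypeExample {
  public static void main(String[] args) {
    HiveConf conf = new HiveConf();
    // Roughly equivalent to running "SET hive.spark.client.type=spark-launcher;"
    // in a Hive session before the first Spark query.
    conf.set("hive.spark.client.type", "spark-launcher");
    System.out.println(conf.get("hive.spark.client.type"));
  }
}
{code}
As noted in the review comment below, the property is expected to be part of {{HIVE_SPARK_RSC_CONF_LIST}}, so changing it should create a new Spark session rather than silently reusing the old one.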
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16474138#comment-16474138 ] Rui Li commented on HIVE-18533: --- It seems the new qtest has unstable output. Please look into it. Some more comments:
# The new configuration takes values like {{SPARK_SUBMIT_CLIENT}} and {{SPARK_LAUNCHER_CLIENT}}. How about just using "submit" and "launcher" instead, which are more user-friendly and more like conventional Hive configs?
# {{hive.spark.client.type}} needs to be added to {{HIVE_SPARK_RSC_CONF_LIST}}, so changing it will create a new session.
# More Spark jars need to be added to Hive in order to use the new client. I found 3 extra jars: {{scala-reflect}}, {{spark-launcher}} and {{spark-yarn}}. We can update the [wiki|https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started#HiveonSpark:GettingStarted-ConfiguringHive] once the patch is in.
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16473955#comment-16473955 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12923228/HIVE-18533.95.patch {color:green}SUCCESS:{color} +1 due to 4 test(s) being added or modified. {color:red}ERROR:{color} -1 due to 29 failed/errored test(s), 14402 tests executed *Failed tests:* {noformat} TestMinimrCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=94) [infer_bucket_sort_num_buckets.q,infer_bucket_sort_reducers_power_two.q,parallel_orderby.q,bucket_num_reducers_acid.q,scriptfile1.q,infer_bucket_sort_map_operators.q,infer_bucket_sort_merge.q,root_dir_external_table.q,infer_bucket_sort_dyn_part.q,udf_using.q] TestNonCatCallsWithCatalog - did not produce a TEST-*.xml file (likely timed out) (batchId=217) TestTxnExIm - did not produce a TEST-*.xml file (likely timed out) (batchId=286) org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[fouter_join_ppr] (batchId=33) org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample2] (batchId=6) org.apache.hadoop.hive.cli.TestMiniDruidKafkaCliDriver.testCliDriver[druidkafkamini_basic] (batchId=253) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[bucket_map_join_tez1] (batchId=175) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[default_constraint] (batchId=163) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[mm_bhif] (batchId=157) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[schema_evol_orc_acidvec_part] (batchId=170) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[schema_evol_orc_vec_part_llap_io] (batchId=170) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[special_character_in_tabnames_1] (batchId=165) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[sysdb] (batchId=163) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[tez_smb_1] (batchId=172) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[tez_vector_dynpart_hashjoin_1] (batchId=172) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[union_fast_stats] (batchId=167) org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_in_process_launcher] (batchId=182) org.apache.hadoop.hive.cli.control.TestDanglingQOuts.checkDanglingQOut (batchId=225) org.apache.hadoop.hive.metastore.TestOldSchema.testPartitionOps (batchId=217) org.apache.hadoop.hive.metastore.TestStats.partitionedTableDeprecatedCalls (batchId=211) org.apache.hadoop.hive.ql.TestAcidOnTez.testCtasTezUnion (batchId=228) org.apache.hadoop.hive.ql.TestAcidOnTez.testNonStandardConversion01 (batchId=228) org.apache.hadoop.hive.ql.TestMTQueries.testMTQueries1 (batchId=232) org.apache.hive.beeline.TestBeeLineWithArgs.testQueryProgress (batchId=236) org.apache.hive.beeline.TestBeeLineWithArgs.testQueryProgressParallel (batchId=236) org.apache.hive.jdbc.TestJdbcWithMiniHS2.testHttpRetryOnServerIdleTimeout (batchId=243) org.apache.hive.jdbc.TestSSL.testSSLFetchHttp (batchId=239) org.apache.hive.jdbc.TestTriggersMoveWorkloadManager.testTriggerMoveBackKill (batchId=240) org.apache.hive.spark.client.rpc.TestRpc.testServerPort (batchId=304) {noformat} Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/10925/testReport Console output: 
https://builds.apache.org/job/PreCommit-HIVE-Build/10925/console Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-10925/ Messages: {noformat} Executing org.apache.hive.ptest.execution.TestCheckPhase Executing org.apache.hive.ptest.execution.PrepPhase Executing org.apache.hive.ptest.execution.YetusPhase Executing org.apache.hive.ptest.execution.ExecutionPhase Executing org.apache.hive.ptest.execution.ReportingPhase Tests exited with: TestsFailedException: 29 tests failed {noformat} This message is automatically generated. ATTACHMENT ID: 12923228 - PreCommit-HIVE-Build
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16473926#comment-16473926 ] Hive QA commented on HIVE-18533: | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 42s{color} | {color:blue} Maven dependency ordering for branch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 6m 37s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 59s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 1m 13s{color} | {color:green} master passed {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 30s{color} | {color:blue} common in master has 62 extant Findbugs warnings. {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 3m 52s{color} | {color:blue} ql in master has 2320 extant Findbugs warnings. {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 22s{color} | {color:blue} spark-client in master has 15 extant Findbugs warnings. {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 31s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 8s{color} | {color:blue} Maven dependency ordering for patch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 2m 13s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 51s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 1m 51s{color} | {color:green} the patch passed {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 9s{color} | {color:red} spark-client: The patch generated 11 new + 27 unchanged - 7 fixed = 38 total (was 34) {color} | | {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} | | {color:green}+1{color} | {color:green} xml {color} | {color:green} 0m 2s{color} | {color:green} The patch has no ill-formed XML file. {color} | | {color:red}-1{color} | {color:red} findbugs {color} | {color:red} 0m 32s{color} | {color:red} spark-client generated 2 new + 10 unchanged - 5 fixed = 12 total (was 15) {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 35s{color} | {color:green} the patch passed {color} | || || || || {color:brown} Other Tests {color} || | {color:red}-1{color} | {color:red} asflicense {color} | {color:red} 0m 13s{color} | {color:red} The patch generated 1 ASF License warnings. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 30m 18s{color} | {color:black} {color} | \\ \\ || Reason || Tests || | FindBugs | module:spark-client | | | Class org.apache.hive.spark.client.SparkLauncherSparkClient defines non-transient non-serializable instance field sparkLauncher In SparkLauncherSparkClient.java:instance field sparkLauncher In SparkLauncherSparkClient.java | | | org.apache.spark.launcher.InProcessLauncher stored into non-transient field SparkLauncherSparkClient.sparkLauncher At SparkLauncherSparkClient.java:SparkLauncherSparkClient.sparkLauncher At SparkLauncherSparkClient.java:[line 182] | \\ \\ || Subsystem || Report/Notes || | Optional Tests | asflicense javac javadoc findbugs checkstyle compile xml | | uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux | | Build tool | maven | | Personality | /data/hiveptest/working/yetus_PreCommit-HIVE-Build-10925/dev-support/hive-personality.sh | | git revision | master / 1542c88 | | Default Java | 1.8.0_111 | | findbugs | v3.0.0 | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-10925/yetus/diff-checkstyle-spark-client.txt | | findbugs | http://104.198.109.242/logs//PreCommit-HIVE-Build-10925/yetus/new-findbugs-spark-client.html | | asflicense | http://104.198.109.242/logs//PreCommit-HIVE-Build-10925/yetus/patch-asflicense-problems.txt | | modules | C: common itests itests/qtest-spark ql spark-client U: . | | Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-10925/yetus.txt | | Powered by | Apac
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16473684#comment-16473684 ] Sahil Takiar commented on HIVE-18533: - [~lirui] good catch. Updated.
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16473112#comment-16473112 ] Rui Li commented on HIVE-18533: --- Thanks for the update [~stakiar]. There's an issue with TestSparkLauncherSparkClient. I think we can't assert the Future is done right after we issue the state change -- there can be a race condition, and that isn't against the Future contracts. We can remove those assertions, or we can assert that Future::get returns within some reasonable amount of time.
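To illustrate the suggested fix, here is a rough sketch of a timeout-based assertion; the class, method, and parameter names are hypothetical, and this is not the actual TestSparkLauncherSparkClient code.
{code:java}
// Sketch: instead of asserting isDone() immediately after issuing the state
// change (which can race), bound the wait with Future::get plus a timeout.
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import static org.junit.Assert.assertTrue;

public class FutureAssertionSketch {
  static void assertCompletesSoon(Future<Void> sparkAppFuture) throws Exception {
    // Throws TimeoutException (failing the test) if the Future does not
    // complete within the allowed window.
    sparkAppFuture.get(30, TimeUnit.SECONDS);
    assertTrue(sparkAppFuture.isDone());
  }
}
{code}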
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16471743#comment-16471743 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12922715/HIVE-18533.94.patch {color:green}SUCCESS:{color} +1 due to 4 test(s) being added or modified. {color:red}ERROR:{color} -1 due to 44 failed/errored test(s), 13579 tests executed *Failed tests:* {noformat} TestDbNotificationListener - did not produce a TEST-*.xml file (likely timed out) (batchId=247) TestHCatHiveCompatibility - did not produce a TEST-*.xml file (likely timed out) (batchId=247) TestNegativeCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=95) [nopart_insert.q,insert_into_with_schema.q,input41.q,having1.q,create_table_failure3.q,default_constraint_invalid_default_value.q,database_drop_not_empty_restrict.q,windowing_after_orderby.q,orderbysortby.q,subquery_select_distinct2.q,authorization_uri_alterpart_loc.q,udf_last_day_error_1.q,constraint_duplicate_name.q,create_table_failure4.q,alter_tableprops_external_with_notnull_constraint.q,semijoin5.q,udf_format_number_wrong4.q,deletejar.q,exim_11_nonpart_noncompat_sorting.q,show_tables_bad_db2.q,drop_func_nonexistent.q,alter_table_non_partitioned_table_cascade.q,check_constraint_subquery.q,load_wrong_fileformat.q,check_constraint_udtf.q,lockneg_try_db_lock_conflict.q,udf_field_wrong_args_len.q,create_table_failure2.q,create_with_fk_constraints_enforced.q,groupby2_map_skew_multi_distinct.q,mm_update.q,authorization_update_noupdatepriv.q,show_columns2.q,authorization_insert_noselectpriv.q,orc_replace_columns3_acid.q,compare_double_bigint.q,authorization_set_nonexistent_conf.q,alter_rename_partition_failure3.q,split_sample_wrong_format2.q,create_with_fk_pk_same_tab.q,compare_double_bigint_2.q,authorization_show_roles_no_admin.q,materialized_view_authorization_rebuild_no_grant.q,unionLimit.q,authorization_revoke_table_fail2.q,duplicate_insert3.q,authorization_desc_table_nosel.q,stats_noscan_non_native.q,orc_change_serde_acid.q,create_or_replace_view7.q,exim_07_nonpart_noncompat_ifof.q,create_with_unique_constraints_enforced.q,udf_concat_ws_wrong2.q,fileformat_bad_class.q,merge_negative_2.q,exim_15_part_nonpart.q,authorization_not_owner_drop_view.q,external1.q,authorization_uri_insert.q,create_with_fk_wrong_ref.q,columnstats_tbllvl_incorrect_column.q,authorization_show_parts_nosel.q,authorization_not_owner_drop_tab.q,external2.q,authorization_deletejar.q,temp_table_create_like_partitions.q,udf_greatest_error_1.q,ptf_negative_AggrFuncsWithNoGBYNoPartDef.q,alter_view_as_select_not_exist.q,touch1.q,groupby3_map_skew_multi_distinct.q,insert_into_notnull_constraint.q,exchange_partition_neg_partition_missing.q,groupby_cube_multi_gby.q,columnstats_tbllvl.q,drop_invalid_constraint2.q,alter_table_add_partition.q,update_not_acid.q,archive5.q,alter_table_constraint_invalid_pk_col.q,ivyDownload.q,udf_instr_wrong_type.q,bad_sample_clause.q,authorization_not_owner_drop_tab2.q,authorization_alter_db_owner.q,show_columns1.q,orc_type_promotion3.q,create_view_failure8.q,strict_join.q,udf_add_months_error_1.q,groupby_cube2.q,groupby_cube1.q,groupby_rollup1.q,genericFileFormat.q,invalid_cast_from_binary_4.q,drop_invalid_constraint1.q,serde_regex.q,show_partitions1.q,check_constraint_nonboolean_expr.q,invalid_cast_from_binary_6.q,create_with_multi_pk_constraint.q,udf_field_wrong_type.q,groupby_grouping_sets4.q,groupby_grouping_set
s3.q,insertsel_fail.q,udf_locate_wrong_type.q,orc_type_promotion1_acid.q,set_table_property.q,create_or_replace_view2.q,groupby_grouping_sets2.q,alter_view_failure.q,distinct_windowing_failure1.q,invalid_t_alter2.q,alter_table_constraint_invalid_fk_col1.q,invalid_varchar_length_2.q,authorization_show_grant_otheruser_alltabs.q,subquery_windowing_corr.q,compact_non_acid_table.q,authorization_view_4.q,authorization_disallow_transform.q,materialized_view_authorization_rebuild_other.q,authorization_fail_4.q,dbtxnmgr_nodblock.q,set_hiveconf_internal_variable1.q,input_part0_neg.q,udf_printf_wrong3.q,load_orc_negative2.q,druid_buckets.q,archive2.q,authorization_addjar.q,invalid_sum_syntax.q,insert_into_with_schema1.q,udf_add_months_error_2.q,dyn_part_max_per_node.q,authorization_revoke_table_fail1.q,udf_printf_wrong2.q,archive_multi3.q,udf_printf_wrong1.q,subquery_subquery_chain.q,authorization_view_disable_cbo_4.q,no_matching_udf.q,create_view_failure7.q,drop_native_udf.q,truncate_column_list_bucketing.q,authorization_uri_add_partition.q,authorization_view_disable_cbo_3.q,bad_exec_hooks.q,authorization_view_disable_cbo_2.q,fetchtask_ioexception.q,char_pad_convert_fail2.q,authorization_set_role_neg1.q,serde_regex3.q,authorization_delete_nodeletepriv.q,materialized_view_delete.q,create_or_replace_view6.q,bucket_mapjoin_wrong_table_metadata_2.q,udf_sort_array_by_wrong2.q,
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16471709#comment-16471709 ] Hive QA commented on HIVE-18533: | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 42s{color} | {color:blue} Maven dependency ordering for branch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 6m 53s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 55s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 1m 15s{color} | {color:green} master passed {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 32s{color} | {color:blue} common in master has 62 extant Findbugs warnings. {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 4m 1s{color} | {color:blue} ql in master has 2321 extant Findbugs warnings. {color} | | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 23s{color} | {color:blue} spark-client in master has 15 extant Findbugs warnings. {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 38s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 8s{color} | {color:blue} Maven dependency ordering for patch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 2m 21s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 57s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 1m 57s{color} | {color:green} the patch passed {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 10s{color} | {color:red} spark-client: The patch generated 11 new + 27 unchanged - 7 fixed = 38 total (was 34) {color} | | {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} | | {color:green}+1{color} | {color:green} xml {color} | {color:green} 0m 2s{color} | {color:green} The patch has no ill-formed XML file. {color} | | {color:red}-1{color} | {color:red} findbugs {color} | {color:red} 0m 32s{color} | {color:red} spark-client generated 2 new + 10 unchanged - 5 fixed = 12 total (was 15) {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 35s{color} | {color:green} the patch passed {color} | || || || || {color:brown} Other Tests {color} || | {color:red}-1{color} | {color:red} asflicense {color} | {color:red} 0m 13s{color} | {color:red} The patch generated 2 ASF License warnings. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 31m 11s{color} | {color:black} {color} | \\ \\ || Reason || Tests || | FindBugs | module:spark-client | | | Class org.apache.hive.spark.client.SparkLauncherSparkClient defines non-transient non-serializable instance field sparkLauncher In SparkLauncherSparkClient.java:instance field sparkLauncher In SparkLauncherSparkClient.java | | | org.apache.spark.launcher.InProcessLauncher stored into non-transient field SparkLauncherSparkClient.sparkLauncher At SparkLauncherSparkClient.java:SparkLauncherSparkClient.sparkLauncher At SparkLauncherSparkClient.java:[line 182] | \\ \\ || Subsystem || Report/Notes || | Optional Tests | asflicense javac javadoc findbugs checkstyle compile xml | | uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux | | Build tool | maven | | Personality | /data/hiveptest/working/yetus_PreCommit-HIVE-Build-10821/dev-support/hive-personality.sh | | git revision | master / 68b66a6 | | Default Java | 1.8.0_111 | | findbugs | v3.0.0 | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-10821/yetus/diff-checkstyle-spark-client.txt | | findbugs | http://104.198.109.242/logs//PreCommit-HIVE-Build-10821/yetus/new-findbugs-spark-client.html | | asflicense | http://104.198.109.242/logs//PreCommit-HIVE-Build-10821/yetus/patch-asflicense-problems.txt | | modules | C: common itests itests/qtest-spark ql spark-client U: . | | Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-10821/yetus.txt | | Powered by | Apac
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16469407#comment-16469407 ] Sahil Takiar commented on HIVE-18533: - [~lirui] attached an updated patch that uses {{FutureTask}} instead of a custom {{Future}}. Agree that this makes the code simpler.
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16466816#comment-16466816 ] Rui Li commented on HIVE-18533: --- Hi [~stakiar], my concern for SparkLauncherFuture is mainly about the cancel logic. SparkLauncherFuture::cancel calls SparkAppHandle::stop, which I think is an async method, so it doesn't immediately unblock threads waiting on SparkLauncherFuture::get, and subsequent calls to isCancelled and isDone may not return true. Besides, the JavaDoc mentions SparkAppHandle::stop is only a best-effort attempt to ask the app to stop, so it doesn't even guarantee a state change. Another issue is that SparkLauncherFuture::isCancelled considers all failed states as cancelled, so it may return true even if cancel was never called. I know this might not be an issue given the way AbstractSparkClient works at the moment, but if we want to make changes to AbstractSparkClient in the future, it's better if the two sub-classes behave consistently and both honor the Future contracts. If we use a FutureTask, we can interrupt the thread when we cancel the Future. The thread can handle the interrupt exception and call SparkAppHandle::stop (we probably need to cancel the RPC as well) -- similar to what we do in SparkSubmitSparkClient.
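A minimal sketch of the FutureTask-based approach described here. {{SparkAppHandle}} and {{SparkAppHandle#stop}} are part of Spark's launcher API; the latch, thread, and class names are illustrative assumptions, and the RPC cancellation mentioned above is left out.
{code:java}
// Sketch: a FutureTask that waits for the app to reach a final state. Calling
// cancel(true) interrupts the waiting thread; the interrupt handler then asks
// Spark to stop the app (best effort), mirroring SparkSubmitSparkClient.
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.FutureTask;
import org.apache.spark.launcher.SparkAppHandle;

public class LauncherFutureSketch {
  public static FutureTask<Void> monitor(SparkAppHandle handle, CountDownLatch finalState) {
    FutureTask<Void> task = new FutureTask<>(() -> {
      try {
        finalState.await();                 // released by the state-change listener
      } catch (InterruptedException e) {
        handle.stop();                      // best-effort stop on cancellation
        Thread.currentThread().interrupt();
        throw e;
      }
      return null;
    });
    new Thread(task, "spark-launcher-monitor").start();
    return task;
  }
}
{code}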
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16466648#comment-16466648 ] Sahil Takiar commented on HIVE-18533: - [~lirui] yeah, that might be slightly simpler, although the specific example you mentioned shouldn't be an issue. Calling {{SparkAppHandle#stop}} should cause the Spark app to transition to a stopped state, in which case the {{SparkAppHandle.Listener}} will decrement the countdown latch. I don't think it would be as simple as wrapping the countdown latch in a {{FutureTask}}, though; there still needs to be a way to cancel the underlying Spark app if {{#interrupt}} is called. Plus, there are unit tests for {{SparkLauncherFuture}}, although I can add some more to ensure the {{Future}} contract isn't broken.
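For context on the listener described above, a sketch of a {{SparkAppHandle.Listener}} that releases a countdown latch once the application reaches a final state. {{SparkAppHandle.Listener}}, {{stateChanged}}, {{infoChanged}}, and {{State#isFinal}} are Spark launcher API; the latch wiring is illustrative and not necessarily how the patch implements it.
{code:java}
// Sketch: release the latch when the launched application reaches a final
// state (FINISHED, FAILED, KILLED, or LOST), unblocking whoever waits on it.
import java.util.concurrent.CountDownLatch;
import org.apache.spark.launcher.SparkAppHandle;

public class FinalStateListener implements SparkAppHandle.Listener {
  private final CountDownLatch finalState;

  public FinalStateListener(CountDownLatch finalState) {
    this.finalState = finalState;
  }

  @Override
  public void stateChanged(SparkAppHandle handle) {
    if (handle.getState().isFinal()) {
      finalState.countDown();
    }
  }

  @Override
  public void infoChanged(SparkAppHandle handle) {
    // no-op: only state transitions matter for completion
  }
}
{code}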
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16465624#comment-16465624 ] Rui Li commented on HIVE-18533: --- Hi [~stakiar], for SparkLauncherSparkClient, how about we use a thread to wait on the countdown latch and return a FutureTask like we did in SparkSubmitSparkClient? I think it makes the two clients more consistent and it's easier than implementing a custom Future. For example, when {{Future::cancel}} is called, threads waiting on {{Future::get}} should immediately be unblocked, and {{Future::isCancelled}} should return true. We don't have to worry about breaking these contracts if we use FutureTask. What do you think? > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch, HIVE-18533.5.patch, > HIVE-18533.6.patch, HIVE-18533.7.patch, HIVE-18533.8.patch, > HIVE-18533.9.patch, HIVE-18533.91.patch, HIVE-18831.93.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
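As a rough illustration of the approach suggested here (and of what the comment says SparkSubmitSparkClient already does), the sketch below wraps "wait for the bin/spark-submit child process" in a FutureTask run on its own thread. The method and thread names are made up for the example; the real client code is more involved (stdout/stderr redirection, error parsing, and so on).
{code:java}
import java.util.concurrent.Future;
import java.util.concurrent.FutureTask;

final class SparkSubmitMonitorSketch {

  // Starts bin/spark-submit and returns a Future that completes when the child
  // process exits. cancel(true) interrupts the monitor thread, which kills the
  // child process, so waiters on get() are unblocked immediately.
  static Future<Void> startDriver(ProcessBuilder sparkSubmitCmd) throws Exception {
    Process child = sparkSubmitCmd.start();
    FutureTask<Void> future = new FutureTask<>(() -> {
      try {
        int exitCode = child.waitFor();
        if (exitCode != 0) {
          throw new RuntimeException("spark-submit failed with exit code " + exitCode);
        }
      } catch (InterruptedException e) {
        child.destroy();
        Thread.currentThread().interrupt();
      }
      return null;
    });
    new Thread(future, "spark-submit-monitor").start();
    return future;
  }
}
{code}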
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16465530#comment-16465530 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12922206/HIVE-18831.93.patch {color:green}SUCCESS:{color} +1 due to 4 test(s) being added or modified. {color:red}ERROR:{color} -1 due to 35 failed/errored test(s), 14321 tests executed *Failed tests:* {noformat} TestCustomAuthentication - did not produce a TEST-*.xml file (likely timed out) (batchId=236) TestDbNotificationListener - did not produce a TEST-*.xml file (likely timed out) (batchId=247) TestHCatHiveCompatibility - did not produce a TEST-*.xml file (likely timed out) (batchId=247) TestNonCatCallsWithCatalog - did not produce a TEST-*.xml file (likely timed out) (batchId=217) TestSequenceFileReadWrite - did not produce a TEST-*.xml file (likely timed out) (batchId=247) TestTxnExIm - did not produce a TEST-*.xml file (likely timed out) (batchId=286) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[bucket_map_join_tez1] (batchId=175) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[bucket_map_join_tez2] (batchId=156) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[sysdb] (batchId=163) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[tez_dynpart_hashjoin_1] (batchId=174) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[tez_vector_dynpart_hashjoin_1] (batchId=172) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[union_fast_stats] (batchId=167) org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_5] (batchId=105) org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[udf_reflect_neg] (batchId=96) org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[udf_test_error] (batchId=96) org.apache.hadoop.hive.ql.TestAcidOnTez.testCtasTezUnion (batchId=228) org.apache.hadoop.hive.ql.TestAcidOnTez.testNonStandardConversion01 (batchId=228) org.apache.hadoop.hive.ql.TestMTQueries.testMTQueries1 (batchId=232) org.apache.hadoop.hive.ql.parse.TestCopyUtils.testPrivilegedDistCpWithSameUserAsCurrentDoesNotTryToImpersonate (batchId=231) org.apache.hadoop.hive.ql.parse.TestReplicationOnHDFSEncryptedZones.targetAndSourceHaveDifferentEncryptionZoneKeys (batchId=231) org.apache.hive.beeline.TestBeeLineWithArgs.testQueryProgress (batchId=235) org.apache.hive.beeline.TestBeeLineWithArgs.testQueryProgressParallel (batchId=235) org.apache.hive.jdbc.TestSSL.testSSLFetchHttp (batchId=239) org.apache.hive.jdbc.TestTriggersMoveWorkloadManager.testTriggerMoveAndKill (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testMultipleTriggers2 (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerCustomCreatedFiles (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerCustomNonExistent (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerCustomReadOps (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerHighBytesRead (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerHighBytesWrite (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerHighShuffleBytes (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerSlowQueryElapsedTime (batchId=241) 
org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerSlowQueryExecutionTime (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerVertexRawInputSplitsNoKill (batchId=241) org.apache.hive.spark.client.rpc.TestRpc.testServerPort (batchId=304) {noformat} Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/10740/testReport Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/10740/console Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-10740/ Messages: {noformat} Executing org.apache.hive.ptest.execution.TestCheckPhase Executing org.apache.hive.ptest.execution.PrepPhase Executing org.apache.hive.ptest.execution.YetusPhase Executing org.apache.hive.ptest.execution.ExecutionPhase Executing org.apache.hive.ptest.execution.ReportingPhase Tests exited with: TestsFailedException: 35 tests failed {noformat} This message is automatically generated. ATTACHMENT ID: 12922206 - PreCommit-HIVE-Build > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patc
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16465462#comment-16465462 ] Hive QA commented on HIVE-18533: | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || | {color:red}-1{color} | {color:red} patch {color} | {color:red} 0m 10s{color} | {color:red} /data/hiveptest/logs/PreCommit-HIVE-Build-10740/patches/PreCommit-HIVE-Build-10740.patch does not apply to master. Rebase required? Wrong Branch? See http://cwiki.apache.org/confluence/display/Hive/HowToContribute for help. {color} | \\ \\ || Subsystem || Report/Notes || | Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-10740/yetus.txt | | Powered by | Apache Yetushttp://yetus.apache.org | This message was automatically generated. > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch, HIVE-18533.5.patch, > HIVE-18533.6.patch, HIVE-18533.7.patch, HIVE-18533.8.patch, > HIVE-18533.9.patch, HIVE-18533.91.patch, HIVE-18831.93.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16465355#comment-16465355 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12922202/HIVE-18533.92.patch {color:red}ERROR:{color} -1 due to build exiting with an error Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/10737/testReport Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/10737/console Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-10737/ Messages: {noformat} Executing org.apache.hive.ptest.execution.TestCheckPhase Executing org.apache.hive.ptest.execution.PrepPhase Tests exited with: NonZeroExitCodeException Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N' 2018-05-07 01:52:05.718 + [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]] + export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64 + JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64 + export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games + PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m ' + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m ' + export 'MAVEN_OPTS=-Xmx1g ' + MAVEN_OPTS='-Xmx1g ' + cd /data/hiveptest/working/ + tee /data/hiveptest/logs/PreCommit-HIVE-Build-10737/source-prep.txt + [[ false == \t\r\u\e ]] + mkdir -p maven ivy + [[ git = \s\v\n ]] + [[ git = \g\i\t ]] + [[ -z master ]] + [[ -d apache-github-source-source ]] + [[ ! -d apache-github-source-source/.git ]] + [[ ! -d apache-github-source-source ]] + date '+%Y-%m-%d %T.%3N' 2018-05-07 01:52:05.721 + cd apache-github-source-source + git fetch origin + git reset --hard HEAD HEAD is now at 88d224f HIVE-19344 : Change default value of msck.repair.batch.size (Vihang Karajgaonkar reviewed by Sahil Takiar) + git clean -f -d Removing ${project.basedir}/ + git checkout master Already on 'master' Your branch is up-to-date with 'origin/master'. + git reset --hard origin/master HEAD is now at 88d224f HIVE-19344 : Change default value of msck.repair.batch.size (Vihang Karajgaonkar reviewed by Sahil Takiar) + git merge --ff-only origin/master Already up-to-date. + date '+%Y-%m-%d %T.%3N' 2018-05-07 01:52:06.991 + rm -rf ../yetus_PreCommit-HIVE-Build-10737 + mkdir ../yetus_PreCommit-HIVE-Build-10737 + git gc + cp -R . ../yetus_PreCommit-HIVE-Build-10737 + mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-10737/yetus + patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh + patchFilePath=/data/hiveptest/working/scratch/build.patch + [[ -f /data/hiveptest/working/scratch/build.patch ]] + chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh + /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch fatal: unrecognized input fatal: unrecognized input fatal: unrecognized input The patch does not appear to apply with p0, p1, or p2 + exit 1 ' {noformat} This message is automatically generated. 
ATTACHMENT ID: 12922202 - PreCommit-HIVE-Build > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch, HIVE-18533.5.patch, > HIVE-18533.6.patch, HIVE-18533.7.patch, HIVE-18533.8.patch, > HIVE-18533.9.patch, HIVE-18533.91.patch, HIVE-18533.92.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16464849#comment-16464849 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12922084/HIVE-18533.91.patch {color:red}ERROR:{color} -1 due to build exiting with an error Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/10713/testReport Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/10713/console Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-10713/ Messages: {noformat} Executing org.apache.hive.ptest.execution.TestCheckPhase Executing org.apache.hive.ptest.execution.PrepPhase Tests exited with: NonZeroExitCodeException Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N' 2018-05-05 17:43:01.824 + [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]] + export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64 + JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64 + export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games + PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games + export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m ' + ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m ' + export 'MAVEN_OPTS=-Xmx1g ' + MAVEN_OPTS='-Xmx1g ' + cd /data/hiveptest/working/ + tee /data/hiveptest/logs/PreCommit-HIVE-Build-10713/source-prep.txt + [[ false == \t\r\u\e ]] + mkdir -p maven ivy + [[ git = \s\v\n ]] + [[ git = \g\i\t ]] + [[ -z master ]] + [[ -d apache-github-source-source ]] + [[ ! -d apache-github-source-source/.git ]] + [[ ! -d apache-github-source-source ]] + date '+%Y-%m-%d %T.%3N' 2018-05-05 17:43:01.840 + cd apache-github-source-source + git fetch origin + git reset --hard HEAD HEAD is now at 0d517bf HIVE-19396 : HiveOperation is incorrectly set for analyze statement (Ashutosh Chauhan via Zoltan Haindrich) (addendum) + git clean -f -d + git checkout master Already on 'master' Your branch is up-to-date with 'origin/master'. + git reset --hard origin/master HEAD is now at 0d517bf HIVE-19396 : HiveOperation is incorrectly set for analyze statement (Ashutosh Chauhan via Zoltan Haindrich) (addendum) + git merge --ff-only origin/master Already up-to-date. + date '+%Y-%m-%d %T.%3N' 2018-05-05 17:43:11.377 + rm -rf ../yetus_PreCommit-HIVE-Build-10713 + mkdir ../yetus_PreCommit-HIVE-Build-10713 + git gc + cp -R . ../yetus_PreCommit-HIVE-Build-10713 + mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-10713/yetus + patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh + patchFilePath=/data/hiveptest/working/scratch/build.patch + [[ -f /data/hiveptest/working/scratch/build.patch ]] + chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh + /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch fatal: git apply: bad git-diff - inconsistent old filename on line 297 error: patch failed: spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java:42 Falling back to three-way merge... Applied patch to 'spark-client/src/main/java/org/apache/hive/spark/client/AbstractSparkClient.java' with conflicts. error: patch failed: spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java:82 Falling back to three-way merge... 
Applied patch to 'spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java' with conflicts. Going to apply patch with: git apply -p1 /data/hiveptest/working/scratch/build.patch:105: trailing whitespace. Map 1 /data/hiveptest/working/scratch/build.patch:128: trailing whitespace. Reducer 2 /data/hiveptest/working/scratch/build.patch:143: trailing whitespace. Reducer 3 error: patch failed: spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java:42 Falling back to three-way merge... Applied patch to 'spark-client/src/main/java/org/apache/hive/spark/client/AbstractSparkClient.java' with conflicts. error: patch failed: spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java:82 Falling back to three-way merge... Applied patch to 'spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java' with conflicts. U spark-client/src/main/java/org/apache/hive/spark/client/AbstractSparkClient.java U spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java warning: 3 lines add whitespace errors. + exit 1 ' {noformat} This message is automatically generated. ATTACHMENT ID: 12922084 - PreCommit-HIVE-Build > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 >
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16462702#comment-16462702 ] Sahil Takiar commented on HIVE-18533: - I think {{Future}} is a better abstraction than a {{Thread}} in this scenario. A {{Future}} provides a way to monitor an async operation, as well as get any errors or results from the async computation. {{launchDriver}} is basically an async operation: it launches the Spark app and then exits. > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch, HIVE-18533.5.patch, > HIVE-18533.6.patch, HIVE-18533.7.patch, HIVE-18533.8.patch, HIVE-18533.9.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
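A toy comparison (not Hive code) of the difference described above: with a bare Thread the outcome of the async launch is easy to lose, while a Future lets the caller wait for completion and rethrows any failure from get(). The launchDriver() below is a simulated stand-in, not the real method.
{code:java}
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class FutureVsThreadDemo {

  // Stand-in for the real async launch; pretend it always fails.
  static void launchDriver() {
    throw new IllegalStateException("simulated launch failure");
  }

  public static void main(String[] args) throws Exception {
    // With a plain Thread, the failure never reaches the caller -- it only hits the
    // thread's uncaught-exception handler, and there is no result to wait on.
    Thread t = new Thread(FutureVsThreadDemo::launchDriver);
    t.start();
    t.join();   // join() succeeds even though the launch failed

    // With a Future, the caller can wait for the launch, and the failure is
    // rethrown (wrapped in ExecutionException) from get().
    ExecutorService executor = Executors.newSingleThreadExecutor();
    Future<Void> launch = executor.submit(() -> {
      launchDriver();
      return null;
    });
    try {
      launch.get();
    } catch (ExecutionException e) {
      System.out.println("launch failed: " + e.getCause());
    } finally {
      executor.shutdown();
    }
  }
}
{code}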
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16462612#comment-16462612 ] Rui Li commented on HIVE-18533: --- Hi [~stakiar], could you explain why we now need to monitor the driver with a future instead of a thread? > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch, HIVE-18533.5.patch, > HIVE-18533.6.patch, HIVE-18533.7.patch, HIVE-18533.8.patch, HIVE-18533.9.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16462109#comment-16462109 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12921672/HIVE-18533.9.patch {color:green}SUCCESS:{color} +1 due to 4 test(s) being added or modified. {color:red}ERROR:{color} -1 due to 50 failed/errored test(s), 14319 tests executed *Failed tests:* {noformat} TestDbNotificationListener - did not produce a TEST-*.xml file (likely timed out) (batchId=247) TestHCatHiveCompatibility - did not produce a TEST-*.xml file (likely timed out) (batchId=247) TestNonCatCallsWithCatalog - did not produce a TEST-*.xml file (likely timed out) (batchId=217) TestSequenceFileReadWrite - did not produce a TEST-*.xml file (likely timed out) (batchId=247) TestTxnExIm - did not produce a TEST-*.xml file (likely timed out) (batchId=286) org.apache.hadoop.hive.cli.TestBeeLineDriver.testCliDriver[smb_mapjoin_13] (batchId=253) org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[bucket_map_join_1] (batchId=68) org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[bucket_map_join_2] (batchId=60) org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[bucketmapjoin_negative3] (batchId=29) org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[fouter_join_ppr] (batchId=33) org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[groupby_sort_1_23] (batchId=81) org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[groupby_sort_skew_1_23] (batchId=9) org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sample2] (batchId=6) org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[smb_mapjoin_13] (batchId=32) org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[sort_merge_join_desc_7] (batchId=27) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[explainuser_4] (batchId=164) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[sysdb] (batchId=163) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[tez_dynpart_hashjoin_1] (batchId=174) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[tez_smb_main] (batchId=160) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[tez_vector_dynpart_hashjoin_1] (batchId=172) org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[union_fast_stats] (batchId=167) org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_in_process_launcher] (batchId=182) org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver[explainanalyze_5] (batchId=105) org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[insertsel_fail] (batchId=95) org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[load_data_parquet_empty] (batchId=96) org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[udf_reflect_neg] (batchId=96) org.apache.hadoop.hive.cli.TestNegativeCliDriver.testCliDriver[udf_test_error] (batchId=96) org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[bucket_map_join_1] (batchId=137) org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[bucket_map_join_2] (batchId=133) org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[bucketmapjoin_negative3] (batchId=120) org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[groupby_sort_1_23] (batchId=143) org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[groupby_sort_skew_1_23] (batchId=111) 
org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver[smb_mapjoin_13] (batchId=122) org.apache.hadoop.hive.ql.TestAcidOnTez.testCtasTezUnion (batchId=228) org.apache.hadoop.hive.ql.TestAcidOnTez.testNonStandardConversion01 (batchId=228) org.apache.hadoop.hive.ql.TestMTQueries.testMTQueries1 (batchId=232) org.apache.hadoop.hive.ql.parse.TestCopyUtils.testPrivilegedDistCpWithSameUserAsCurrentDoesNotTryToImpersonate (batchId=231) org.apache.hadoop.hive.ql.parse.TestReplicationOnHDFSEncryptedZones.targetAndSourceHaveDifferentEncryptionZoneKeys (batchId=231) org.apache.hive.beeline.TestBeeLineWithArgs.testQueryProgress (batchId=235) org.apache.hive.jdbc.TestSSL.testSSLFetchHttp (batchId=239) org.apache.hive.jdbc.TestTriggersWorkloadManager.testMultipleTriggers2 (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerCustomCreatedFiles (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerCustomNonExistent (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerCustomReadOps (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerHighBytesRead (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerHighBytesWrite (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerHighShuffleBytes (batchId=241) org.apache.hive.jdbc.TestTriggersWorkloadManager.testTriggerSlowQueryElap
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16462054#comment-16462054 ] Hive QA commented on HIVE-18533: | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 1s{color} | {color:blue} Findbugs executables are not available. {color} | | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 49s{color} | {color:blue} Maven dependency ordering for branch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 7m 59s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 2m 18s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 1m 24s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 53s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 9s{color} | {color:blue} Maven dependency ordering for patch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 2m 43s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 2m 17s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 2m 17s{color} | {color:green} the patch passed {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 11s{color} | {color:red} spark-client: The patch generated 10 new + 26 unchanged - 7 fixed = 36 total (was 33) {color} | | {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} | | {color:green}+1{color} | {color:green} xml {color} | {color:green} 0m 3s{color} | {color:green} The patch has no ill-formed XML file. {color} | | {color:red}-1{color} | {color:red} javadoc {color} | {color:red} 0m 10s{color} | {color:red} spark-client generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) {color} | || || || || {color:brown} Other Tests {color} || | {color:red}-1{color} | {color:red} asflicense {color} | {color:red} 0m 18s{color} | {color:red} The patch generated 1 ASF License warnings. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 23m 47s{color} | {color:black} {color} | \\ \\ || Subsystem || Report/Notes || | Optional Tests | asflicense javac javadoc findbugs checkstyle compile xml | | uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux | | Build tool | maven | | Personality | /data/hiveptest/working/yetus_PreCommit-HIVE-Build-10650/dev-support/hive-personality.sh | | git revision | master / 2c7f9c2 | | Default Java | 1.8.0_111 | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-10650/yetus/diff-checkstyle-spark-client.txt | | javadoc | http://104.198.109.242/logs//PreCommit-HIVE-Build-10650/yetus/diff-javadoc-javadoc-spark-client.txt | | asflicense | http://104.198.109.242/logs//PreCommit-HIVE-Build-10650/yetus/patch-asflicense-problems.txt | | modules | C: common itests itests/qtest-spark ql spark-client U: . | | Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-10650/yetus.txt | | Powered by | Apache Yetushttp://yetus.apache.org | This message was automatically generated. > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch, HIVE-18533.5.patch, > HIVE-18533.6.patch, HIVE-18533.7.patch, HIVE-18533.8.patch, HIVE-18533.9.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16461318#comment-16461318 ] Rui Li commented on HIVE-18533: --- [~stakiar], thanks for working on this and sorry for the delay. I took a quick look at the patch and will go through the details tomorrow. [~xuefuz], you might also want to take a look. > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch, HIVE-18533.5.patch, > HIVE-18533.6.patch, HIVE-18533.7.patch, HIVE-18533.8.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16456664#comment-16456664 ] Sahil Takiar commented on HIVE-18533: - [~lirui] could you take a look? Below is a brief description of the change. RB: https://reviews.apache.org/r/66071/ * Added the ability to launch jobs via Spark's {{InProcessLauncher}} rather than invoking {{bin/spark-submit}} * Users can pick which launcher they want to use; by default the {{spark-submit}} launcher is used * Renamed {{SparkClientImpl}} to {{AbstractSparkClient}}; it contains all the common logic shared by the two launchers ** {{AbstractSparkClient}} has two subclasses: {{SparkLauncherSparkClient}} which uses the {{InProcessLauncher}} and {{SparkSubmitSparkClient}} which uses {{spark-submit}} ** The changes to {{SparkClientImpl}} are mostly just refactoring; I did my best to ensure there are no logic changes; the code is now mostly split between {{AbstractSparkClient}} and {{SparkSubmitSparkClient}} *** The biggest change in logic is that now {{SparkSubmitSparkClient#startDriver}} returns a {{Future}} object instead of a {{Thread}} object ** {{AbstractSparkClient}} has a number of {{abstract}} methods that decide how certain configuration options need to be set - e.g. how to add jars, specify the keytab / principal, etc. ** Its main method is {{launchDriver}}, which specifies how to actually launch the Spark app; it returns a {{Future}} object which is used to monitor the state of the Spark app * {{SparkLauncherSparkClient}} is essentially a wrapper around {{InProcessLauncher}} and it contains a custom {{Future}} implementation that monitors the underlying Spark app using the APIs exposed by the {{InProcessLauncher}} * Added unit tests and a q-test > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch, HIVE-18533.5.patch, > HIVE-18533.6.patch, HIVE-18533.7.patch, HIVE-18533.8.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
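For readers skimming the thread, here is a very rough sketch of the class shape described in the bullets above. Only the class names come from the description; the method signatures, parameters, and bodies below are simplified assumptions, not the patch's actual code.
{code:java}
import java.util.concurrent.Future;

// Common logic shared by both launchers lives in the abstract base class;
// subclasses decide how the Spark app is actually started and how
// launcher-specific options (jars, keytab/principal, ...) are applied.
abstract class AbstractSparkClient {

  // Launches the Spark app and returns a Future used to monitor it.
  protected abstract Future<Void> launchDriver() throws Exception;

  protected abstract void addJars(String jars);

  protected abstract void addKeytabAndPrincipal(String keytab, String principal);
}

// Launches the driver by invoking bin/spark-submit as a separate process.
class SparkSubmitSparkClient extends AbstractSparkClient {
  @Override
  protected Future<Void> launchDriver() throws Exception {
    throw new UnsupportedOperationException("sketch only");
  }

  @Override
  protected void addJars(String jars) { /* append a --jars argument */ }

  @Override
  protected void addKeytabAndPrincipal(String keytab, String principal) { /* --keytab / --principal */ }
}

// Launches the driver in-process via Spark's InProcessLauncher.
class SparkLauncherSparkClient extends AbstractSparkClient {
  @Override
  protected Future<Void> launchDriver() throws Exception {
    throw new UnsupportedOperationException("sketch only");
  }

  @Override
  protected void addJars(String jars) { /* SparkLauncher#addJar */ }

  @Override
  protected void addKeytabAndPrincipal(String keytab, String principal) { /* launcher config */ }
}
{code}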
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16415447#comment-16415447 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12916258/HIVE-18533.7.patch {color:green}SUCCESS:{color} +1 due to 4 test(s) being added or modified. {color:red}ERROR:{color} -1 due to 27 failed/errored test(s), 13045 tests executed *Failed tests:* {noformat} TestMinimrCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=92) [infer_bucket_sort_num_buckets.q,infer_bucket_sort_reducers_power_two.q,parallel_orderby.q,bucket_num_reducers_acid.q,infer_bucket_sort_map_operators.q,infer_bucket_sort_merge.q,root_dir_external_table.q,infer_bucket_sort_dyn_part.q,udf_using.q,bucket_num_reducers_acid2.q] TestNegativeCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=94) [nopart_insert.q,insert_into_with_schema.q,input41.q,having1.q,create_table_failure3.q,default_constraint_invalid_default_value.q,database_drop_not_empty_restrict.q,windowing_after_orderby.q,orderbysortby.q,subquery_select_distinct2.q,authorization_uri_alterpart_loc.q,udf_last_day_error_1.q,constraint_duplicate_name.q,create_table_failure4.q,alter_tableprops_external_with_notnull_constraint.q,semijoin5.q,udf_format_number_wrong4.q,deletejar.q,exim_11_nonpart_noncompat_sorting.q,show_tables_bad_db2.q,drop_func_nonexistent.q,nopart_load.q,alter_table_non_partitioned_table_cascade.q,check_constraint_subquery.q,load_wrong_fileformat.q,check_constraint_udtf.q,lockneg_try_db_lock_conflict.q,udf_field_wrong_args_len.q,create_table_failure2.q,create_with_fk_constraints_enforced.q,groupby2_map_skew_multi_distinct.q,authorization_update_noupdatepriv.q,show_columns2.q,authorization_insert_noselectpriv.q,orc_replace_columns3_acid.q,compare_double_bigint.q,authorization_set_nonexistent_conf.q,alter_rename_partition_failure3.q,split_sample_wrong_format2.q,create_with_fk_pk_same_tab.q,compare_double_bigint_2.q,authorization_show_roles_no_admin.q,materialized_view_authorization_rebuild_no_grant.q,unionLimit.q,authorization_revoke_table_fail2.q,authorization_insert_noinspriv.q,duplicate_insert3.q,authorization_desc_table_nosel.q,stats_noscan_non_native.q,orc_change_serde_acid.q,create_or_replace_view7.q,exim_07_nonpart_noncompat_ifof.q,create_with_unique_constraints_enforced.q,udf_concat_ws_wrong2.q,fileformat_bad_class.q,merge_negative_2.q,exim_15_part_nonpart.q,authorization_not_owner_drop_view.q,external1.q,authorization_uri_insert.q,create_with_fk_wrong_ref.q,columnstats_tbllvl_incorrect_column.q,authorization_show_parts_nosel.q,authorization_not_owner_drop_tab.q,external2.q,authorization_deletejar.q,temp_table_create_like_partitions.q,udf_greatest_error_1.q,ptf_negative_AggrFuncsWithNoGBYNoPartDef.q,alter_view_as_select_not_exist.q,touch1.q,groupby3_map_skew_multi_distinct.q,insert_into_notnull_constraint.q,exchange_partition_neg_partition_missing.q,groupby_cube_multi_gby.q,columnstats_tbllvl.q,drop_invalid_constraint2.q,alter_table_add_partition.q,update_not_acid.q,archive5.q,alter_table_constraint_invalid_pk_col.q,ivyDownload.q,udf_instr_wrong_type.q,bad_sample_clause.q,authorization_not_owner_drop_tab2.q,authorization_alter_db_owner.q,show_columns1.q,orc_type_promotion3.q,create_view_failure8.q,strict_join.q,udf_add_months_error_1.q,groupby_cube2.q,groupby_cube1.q,groupby_rollup1.q,genericFileFormat.q,invalid_cast_from_binary_4.q,drop_invalid_constrai
nt1.q,serde_regex.q,show_partitions1.q,check_constraint_nonboolean_expr.q,invalid_cast_from_binary_6.q,create_with_multi_pk_constraint.q,udf_field_wrong_type.q,groupby_grouping_sets4.q,groupby_grouping_sets3.q,insertsel_fail.q,udf_locate_wrong_type.q,orc_type_promotion1_acid.q,set_table_property.q,create_or_replace_view2.q,groupby_grouping_sets2.q,alter_view_failure.q,distinct_windowing_failure1.q,invalid_t_alter2.q,alter_table_constraint_invalid_fk_col1.q,invalid_varchar_length_2.q,authorization_show_grant_otheruser_alltabs.q,subquery_windowing_corr.q,compact_non_acid_table.q,authorization_view_4.q,authorization_disallow_transform.q,materialized_view_authorization_rebuild_other.q,authorization_fail_4.q,dbtxnmgr_nodblock.q,set_hiveconf_internal_variable1.q,input_part0_neg.q,udf_printf_wrong3.q,load_orc_negative2.q,druid_buckets.q,archive2.q,authorization_addjar.q,invalid_sum_syntax.q,insert_into_with_schema1.q,udf_add_months_error_2.q,dyn_part_max_per_node.q,authorization_revoke_table_fail1.q,udf_printf_wrong2.q,archive_multi3.q,udf_printf_wrong1.q,subquery_subquery_chain.q,authorization_view_disable_cbo_4.q,no_matching_udf.q,create_view_failure7.q,drop_native_udf.q,truncate_column_list_bucketing.q,authorization_uri_add_partition.q,authorization_view_disable_cbo_3.q,bad_exec_hooks.q,authorization_view_disable_cbo_2.q,fetchtask_ioexception.q,char_pad_conv
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16415398#comment-16415398 ] Hive QA commented on HIVE-18533: | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 1s{color} | {color:blue} Findbugs executables are not available. {color} | | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 44s{color} | {color:blue} Maven dependency ordering for branch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 6m 40s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 54s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 1m 11s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 34s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 8s{color} | {color:blue} Maven dependency ordering for patch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 2m 19s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 57s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 1m 57s{color} | {color:green} the patch passed {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 10s{color} | {color:red} spark-client: The patch generated 16 new + 30 unchanged - 16 fixed = 46 total (was 46) {color} | | {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} | | {color:green}+1{color} | {color:green} xml {color} | {color:green} 0m 2s{color} | {color:green} The patch has no ill-formed XML file. {color} | | {color:red}-1{color} | {color:red} javadoc {color} | {color:red} 0m 11s{color} | {color:red} spark-client generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) {color} | || || || || {color:brown} Other Tests {color} || | {color:red}-1{color} | {color:red} asflicense {color} | {color:red} 0m 14s{color} | {color:red} The patch generated 50 ASF License warnings. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 20m 14s{color} | {color:black} {color} | \\ \\ || Subsystem || Report/Notes || | Optional Tests | asflicense javac javadoc findbugs checkstyle compile xml | | uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux | | Build tool | maven | | Personality | /data/hiveptest/working/yetus_PreCommit-HIVE-Build-9865/dev-support/hive-personality.sh | | git revision | master / 1e884cc | | Default Java | 1.8.0_111 | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-9865/yetus/diff-checkstyle-spark-client.txt | | javadoc | http://104.198.109.242/logs//PreCommit-HIVE-Build-9865/yetus/diff-javadoc-javadoc-spark-client.txt | | asflicense | http://104.198.109.242/logs//PreCommit-HIVE-Build-9865/yetus/patch-asflicense-problems.txt | | modules | C: common itests itests/qtest-spark ql spark-client U: . | | Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-9865/yetus.txt | | Powered by | Apache Yetushttp://yetus.apache.org | This message was automatically generated. > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch, HIVE-18533.5.patch, > HIVE-18533.6.patch, HIVE-18533.7.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16413364#comment-16413364 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12916117/HIVE-18533.6.patch {color:green}SUCCESS:{color} +1 due to 3 test(s) being added or modified. {color:red}ERROR:{color} -1 due to 114 failed/errored test(s), 13431 tests executed *Failed tests:* {noformat} TestMinimrCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=92) [infer_bucket_sort_num_buckets.q,infer_bucket_sort_reducers_power_two.q,parallel_orderby.q,bucket_num_reducers_acid.q,infer_bucket_sort_map_operators.q,infer_bucket_sort_merge.q,root_dir_external_table.q,infer_bucket_sort_dyn_part.q,udf_using.q,bucket_num_reducers_acid2.q] TestNegativeCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=94) [nopart_insert.q,insert_into_with_schema.q,input41.q,having1.q,create_table_failure3.q,default_constraint_invalid_default_value.q,database_drop_not_empty_restrict.q,windowing_after_orderby.q,orderbysortby.q,subquery_select_distinct2.q,authorization_uri_alterpart_loc.q,udf_last_day_error_1.q,constraint_duplicate_name.q,create_table_failure4.q,alter_tableprops_external_with_notnull_constraint.q,semijoin5.q,udf_format_number_wrong4.q,deletejar.q,exim_11_nonpart_noncompat_sorting.q,show_tables_bad_db2.q,drop_func_nonexistent.q,nopart_load.q,alter_table_non_partitioned_table_cascade.q,check_constraint_subquery.q,load_wrong_fileformat.q,check_constraint_udtf.q,lockneg_try_db_lock_conflict.q,udf_field_wrong_args_len.q,create_table_failure2.q,create_with_fk_constraints_enforced.q,groupby2_map_skew_multi_distinct.q,authorization_update_noupdatepriv.q,show_columns2.q,authorization_insert_noselectpriv.q,orc_replace_columns3_acid.q,compare_double_bigint.q,authorization_set_nonexistent_conf.q,alter_rename_partition_failure3.q,split_sample_wrong_format2.q,create_with_fk_pk_same_tab.q,compare_double_bigint_2.q,authorization_show_roles_no_admin.q,materialized_view_authorization_rebuild_no_grant.q,unionLimit.q,authorization_revoke_table_fail2.q,authorization_insert_noinspriv.q,duplicate_insert3.q,authorization_desc_table_nosel.q,stats_noscan_non_native.q,orc_change_serde_acid.q,create_or_replace_view7.q,exim_07_nonpart_noncompat_ifof.q,create_with_unique_constraints_enforced.q,udf_concat_ws_wrong2.q,fileformat_bad_class.q,merge_negative_2.q,exim_15_part_nonpart.q,authorization_not_owner_drop_view.q,external1.q,authorization_uri_insert.q,create_with_fk_wrong_ref.q,columnstats_tbllvl_incorrect_column.q,authorization_show_parts_nosel.q,authorization_not_owner_drop_tab.q,external2.q,authorization_deletejar.q,temp_table_create_like_partitions.q,udf_greatest_error_1.q,ptf_negative_AggrFuncsWithNoGBYNoPartDef.q,alter_view_as_select_not_exist.q,touch1.q,groupby3_map_skew_multi_distinct.q,insert_into_notnull_constraint.q,exchange_partition_neg_partition_missing.q,groupby_cube_multi_gby.q,columnstats_tbllvl.q,drop_invalid_constraint2.q,alter_table_add_partition.q,update_not_acid.q,archive5.q,alter_table_constraint_invalid_pk_col.q,ivyDownload.q,udf_instr_wrong_type.q,bad_sample_clause.q,authorization_not_owner_drop_tab2.q,authorization_alter_db_owner.q,show_columns1.q,orc_type_promotion3.q,create_view_failure8.q,strict_join.q,udf_add_months_error_1.q,groupby_cube2.q,groupby_cube1.q,groupby_rollup1.q,genericFileFormat.q,invalid_cast_from_binary_4.q,drop_invalid_constra
int1.q,serde_regex.q,show_partitions1.q,check_constraint_nonboolean_expr.q,invalid_cast_from_binary_6.q,create_with_multi_pk_constraint.q,udf_field_wrong_type.q,groupby_grouping_sets4.q,groupby_grouping_sets3.q,insertsel_fail.q,udf_locate_wrong_type.q,orc_type_promotion1_acid.q,set_table_property.q,create_or_replace_view2.q,groupby_grouping_sets2.q,alter_view_failure.q,distinct_windowing_failure1.q,invalid_t_alter2.q,alter_table_constraint_invalid_fk_col1.q,invalid_varchar_length_2.q,authorization_show_grant_otheruser_alltabs.q,subquery_windowing_corr.q,compact_non_acid_table.q,authorization_view_4.q,authorization_disallow_transform.q,materialized_view_authorization_rebuild_other.q,authorization_fail_4.q,dbtxnmgr_nodblock.q,set_hiveconf_internal_variable1.q,input_part0_neg.q,udf_printf_wrong3.q,load_orc_negative2.q,druid_buckets.q,archive2.q,authorization_addjar.q,invalid_sum_syntax.q,insert_into_with_schema1.q,udf_add_months_error_2.q,dyn_part_max_per_node.q,authorization_revoke_table_fail1.q,udf_printf_wrong2.q,archive_multi3.q,udf_printf_wrong1.q,subquery_subquery_chain.q,authorization_view_disable_cbo_4.q,no_matching_udf.q,create_view_failure7.q,drop_native_udf.q,truncate_column_list_bucketing.q,authorization_uri_add_partition.q,authorization_view_disable_cbo_3.q,bad_exec_hooks.q,authorization_view_disable_cbo_2.q,fetchtask_ioexception.q,char_pad_con
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16413320#comment-16413320 ] Hive QA commented on HIVE-18533: | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 1s{color} | {color:blue} Findbugs executables are not available. {color} | | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 28s{color} | {color:blue} Maven dependency ordering for branch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 7m 33s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 2m 0s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 1m 15s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 40s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 9s{color} | {color:blue} Maven dependency ordering for patch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 2m 18s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 56s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 1m 56s{color} | {color:green} the patch passed {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 10s{color} | {color:red} spark-client: The patch generated 16 new + 21 unchanged - 16 fixed = 37 total (was 37) {color} | | {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} | | {color:green}+1{color} | {color:green} xml {color} | {color:green} 0m 2s{color} | {color:green} The patch has no ill-formed XML file. {color} | | {color:red}-1{color} | {color:red} javadoc {color} | {color:red} 0m 9s{color} | {color:red} spark-client generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) {color} | || || || || {color:brown} Other Tests {color} || | {color:red}-1{color} | {color:red} asflicense {color} | {color:red} 0m 13s{color} | {color:red} The patch generated 50 ASF License warnings. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 21m 0s{color} | {color:black} {color} | \\ \\ || Subsystem || Report/Notes || | Optional Tests | asflicense javac javadoc findbugs checkstyle compile xml | | uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux | | Build tool | maven | | Personality | /data/hiveptest/working/yetus_PreCommit-HIVE-Build-9838/dev-support/hive-personality.sh | | git revision | master / 5b222db | | Default Java | 1.8.0_111 | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-9838/yetus/diff-checkstyle-spark-client.txt | | javadoc | http://104.198.109.242/logs//PreCommit-HIVE-Build-9838/yetus/diff-javadoc-javadoc-spark-client.txt | | asflicense | http://104.198.109.242/logs//PreCommit-HIVE-Build-9838/yetus/patch-asflicense-problems.txt | | modules | C: common itests itests/qtest-spark ql spark-client U: . | | Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-9838/yetus.txt | | Powered by | Apache Yetushttp://yetus.apache.org | This message was automatically generated. > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch, HIVE-18533.5.patch, HIVE-18533.6.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16413100#comment-16413100 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12915994/HIVE-18533.5.patch {color:green}SUCCESS:{color} +1 due to 3 test(s) being added or modified. {color:red}ERROR:{color} -1 due to 85 failed/errored test(s), 12301 tests executed *Failed tests:* {noformat} TestCLIAuthzSessionContext - did not produce a TEST-*.xml file (likely timed out) (batchId=240) TestJdbcWithLocalClusterSpark - did not produce a TEST-*.xml file (likely timed out) (batchId=240) TestMinimrCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=92) [infer_bucket_sort_num_buckets.q,infer_bucket_sort_reducers_power_two.q,parallel_orderby.q,bucket_num_reducers_acid.q,infer_bucket_sort_map_operators.q,infer_bucket_sort_merge.q,root_dir_external_table.q,infer_bucket_sort_dyn_part.q,udf_using.q,bucket_num_reducers_acid2.q] TestMultiSessionsHS2WithLocalClusterSpark - did not produce a TEST-*.xml file (likely timed out) (batchId=240) TestNegativeCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=94) [nopart_insert.q,insert_into_with_schema.q,input41.q,having1.q,create_table_failure3.q,default_constraint_invalid_default_value.q,database_drop_not_empty_restrict.q,windowing_after_orderby.q,orderbysortby.q,subquery_select_distinct2.q,authorization_uri_alterpart_loc.q,udf_last_day_error_1.q,constraint_duplicate_name.q,create_table_failure4.q,alter_tableprops_external_with_notnull_constraint.q,semijoin5.q,udf_format_number_wrong4.q,deletejar.q,exim_11_nonpart_noncompat_sorting.q,show_tables_bad_db2.q,drop_func_nonexistent.q,nopart_load.q,alter_table_non_partitioned_table_cascade.q,check_constraint_subquery.q,load_wrong_fileformat.q,check_constraint_udtf.q,lockneg_try_db_lock_conflict.q,udf_field_wrong_args_len.q,create_table_failure2.q,create_with_fk_constraints_enforced.q,groupby2_map_skew_multi_distinct.q,authorization_update_noupdatepriv.q,show_columns2.q,authorization_insert_noselectpriv.q,orc_replace_columns3_acid.q,compare_double_bigint.q,authorization_set_nonexistent_conf.q,alter_rename_partition_failure3.q,split_sample_wrong_format2.q,create_with_fk_pk_same_tab.q,compare_double_bigint_2.q,authorization_show_roles_no_admin.q,materialized_view_authorization_rebuild_no_grant.q,unionLimit.q,authorization_revoke_table_fail2.q,authorization_insert_noinspriv.q,duplicate_insert3.q,authorization_desc_table_nosel.q,stats_noscan_non_native.q,orc_change_serde_acid.q,create_or_replace_view7.q,exim_07_nonpart_noncompat_ifof.q,create_with_unique_constraints_enforced.q,udf_concat_ws_wrong2.q,fileformat_bad_class.q,merge_negative_2.q,exim_15_part_nonpart.q,authorization_not_owner_drop_view.q,external1.q,authorization_uri_insert.q,create_with_fk_wrong_ref.q,columnstats_tbllvl_incorrect_column.q,authorization_show_parts_nosel.q,authorization_not_owner_drop_tab.q,external2.q,authorization_deletejar.q,temp_table_create_like_partitions.q,udf_greatest_error_1.q,ptf_negative_AggrFuncsWithNoGBYNoPartDef.q,alter_view_as_select_not_exist.q,touch1.q,groupby3_map_skew_multi_distinct.q,insert_into_notnull_constraint.q,exchange_partition_neg_partition_missing.q,groupby_cube_multi_gby.q,columnstats_tbllvl.q,drop_invalid_constraint2.q,alter_table_add_partition.q,update_not_acid.q,archive5.q,alter_table_constraint_invalid_pk_col.q,ivyDownload.q,udf_instr_wrong_type.q,b
ad_sample_clause.q,authorization_not_owner_drop_tab2.q,authorization_alter_db_owner.q,show_columns1.q,orc_type_promotion3.q,create_view_failure8.q,strict_join.q,udf_add_months_error_1.q,groupby_cube2.q,groupby_cube1.q,groupby_rollup1.q,genericFileFormat.q,invalid_cast_from_binary_4.q,drop_invalid_constraint1.q,serde_regex.q,show_partitions1.q,check_constraint_nonboolean_expr.q,invalid_cast_from_binary_6.q,create_with_multi_pk_constraint.q,udf_field_wrong_type.q,groupby_grouping_sets4.q,groupby_grouping_sets3.q,insertsel_fail.q,udf_locate_wrong_type.q,orc_type_promotion1_acid.q,set_table_property.q,create_or_replace_view2.q,groupby_grouping_sets2.q,alter_view_failure.q,distinct_windowing_failure1.q,invalid_t_alter2.q,alter_table_constraint_invalid_fk_col1.q,invalid_varchar_length_2.q,authorization_show_grant_otheruser_alltabs.q,subquery_windowing_corr.q,compact_non_acid_table.q,authorization_view_4.q,authorization_disallow_transform.q,materialized_view_authorization_rebuild_other.q,authorization_fail_4.q,dbtxnmgr_nodblock.q,set_hiveconf_internal_variable1.q,input_part0_neg.q,udf_printf_wrong3.q,load_orc_negative2.q,druid_buckets.q,archive2.q,authorization_addjar.q,invalid_sum_syntax.q,insert_into_with_schema1.q,udf_add_months_error_2.q,dyn_part_max_per_node.q,authorization_revoke_table_fail1.q,udf_printf_wrong2.q,archive_multi3.q,udf_printf_wrong1.q,su
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16413059#comment-16413059 ] Hive QA commented on HIVE-18533: | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 1s{color} | {color:blue} Findbugs executables are not available. {color} | | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 38s{color} | {color:blue} Maven dependency ordering for branch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 6m 46s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 58s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 1m 13s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 37s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 8s{color} | {color:blue} Maven dependency ordering for patch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 2m 21s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 52s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 1m 52s{color} | {color:green} the patch passed {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 10s{color} | {color:red} spark-client: The patch generated 16 new + 21 unchanged - 16 fixed = 37 total (was 37) {color} | | {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} | | {color:green}+1{color} | {color:green} xml {color} | {color:green} 0m 2s{color} | {color:green} The patch has no ill-formed XML file. {color} | | {color:red}-1{color} | {color:red} javadoc {color} | {color:red} 0m 10s{color} | {color:red} spark-client generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) {color} | || || || || {color:brown} Other Tests {color} || | {color:red}-1{color} | {color:red} asflicense {color} | {color:red} 0m 13s{color} | {color:red} The patch generated 50 ASF License warnings. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 20m 10s{color} | {color:black} {color} | \\ \\ || Subsystem || Report/Notes || | Optional Tests | asflicense javac javadoc findbugs checkstyle compile xml | | uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux | | Build tool | maven | | Personality | /data/hiveptest/working/yetus_PreCommit-HIVE-Build-9823/dev-support/hive-personality.sh | | git revision | master / 5b222db | | Default Java | 1.8.0_111 | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-9823/yetus/diff-checkstyle-spark-client.txt | | javadoc | http://104.198.109.242/logs//PreCommit-HIVE-Build-9823/yetus/diff-javadoc-javadoc-spark-client.txt | | asflicense | http://104.198.109.242/logs//PreCommit-HIVE-Build-9823/yetus/patch-asflicense-problems.txt | | modules | C: common itests itests/qtest-spark ql spark-client U: . | | Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-9823/yetus.txt | | Powered by | Apache Yetushttp://yetus.apache.org | This message was automatically generated. > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch, HIVE-18533.5.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16411987#comment-16411987 ] Sahil Takiar commented on HIVE-18533: - Thanks, filed SPARK-23785 to fix the issue. > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16410347#comment-16410347 ] Marcelo Vanzin commented on HIVE-18533: --- That looks like a bug in Spark's {{LauncherBackend}}. It shouldn't be trying to write to the connection socket if it's been disconnected. The code already tracks that state, just fails to use it in the write methods (such as {{setState}}). > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
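To make the suggestion above concrete: the launcher connection already knows whether it has been disconnected, so write paths such as {{setState}} should consult that flag and become no-ops instead of asserting on it. The following is only an illustrative Java sketch of that guard pattern; it is not Spark's actual {{LauncherBackend}}/{{LauncherConnection}} code, and the class and method names are invented for the example.

{code:java}
import java.io.IOException;
import java.util.concurrent.atomic.AtomicBoolean;

// Illustrative sketch only: track connection liveness and use it in the write path,
// instead of throwing IllegalStateException("Disconnected") on a late state update.
class GuardedLauncherConnection {

  private final AtomicBoolean connected = new AtomicBoolean(true);

  // Called when the launcher side closes its end of the socket.
  void onDisconnected() {
    connected.set(false);
  }

  // Write path analogous to setState(): drop the update once disconnected.
  void setState(String newState) throws IOException {
    if (!connected.get()) {
      return;
    }
    send("state=" + newState);
  }

  private void send(String message) throws IOException {
    // stand-in for the real socket write
  }
}
{code}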
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16408654#comment-16408654 ] Sahil Takiar commented on HIVE-18533: - [~vanzin] I'm seeing a race condition in one of the tests that invoke the {{SparkLauncher}} {code} 2018-03-21T14:11:26,064 ERROR [Driver-RPC-Handler-0] util.Utils: Uncaught exception in thread Driver-RPC-Handler-0 java.lang.IllegalStateException: Disconnected. at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248) ~[spark-launcher_2.11-2.3.0.jar:2.3.0] at org.apache.spark.launcher.LauncherConnection.send(LauncherConnection.java:81) ~[spark-launcher_2.11-2.3.0.jar:2.3.0] at org.apache.spark.launcher.LauncherBackend.setState(LauncherBackend.scala:77) ~[spark-core_2.11-2.3.0.jar:2.3.0] at org.apache.spark.scheduler.local.LocalSchedulerBackend.org$apache$spark$scheduler$local$LocalSchedulerBackend$$stop(LocalSchedulerBackend.scala:161) ~[spark-core_2.11-2.3.0.jar:2.3.0] at org.apache.spark.scheduler.local.LocalSchedulerBackend.stop(LocalSchedulerBackend.scala:137) ~[spark-core_2.11-2.3.0.jar:2.3.0] at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:508) ~[spark-core_2.11-2.3.0.jar:2.3.0] at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1752) ~[spark-core_2.11-2.3.0.jar:2.3.0] at org.apache.spark.SparkContext$$anonfun$stop$8.apply$mcV$sp(SparkContext.scala:1924) ~[spark-core_2.11-2.3.0.jar:2.3.0] at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1357) [spark-core_2.11-2.3.0.jar:2.3.0] at org.apache.spark.SparkContext.stop(SparkContext.scala:1923) [spark-core_2.11-2.3.0.jar:2.3.0] at org.apache.spark.api.java.JavaSparkContext.stop(JavaSparkContext.scala:654) [spark-core_2.11-2.3.0.jar:2.3.0] at org.apache.hive.spark.client.JobContextImpl.stop(JobContextImpl.java:81) [classes/:?] at org.apache.hive.spark.client.RemoteDriver.shutdown(RemoteDriver.java:223) [classes/:?] at org.apache.hive.spark.client.RemoteDriver.access$200(RemoteDriver.java:71) [classes/:?] at org.apache.hive.spark.client.RemoteDriver$DriverProtocol.handle(RemoteDriver.java:286) [classes/:?] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_92] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_92] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_92] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_92] at org.apache.hive.spark.client.rpc.RpcDispatcher.handleCall(RpcDispatcher.java:121) [classes/:?] at org.apache.hive.spark.client.rpc.RpcDispatcher.channelRead0(RpcDispatcher.java:80) [classes/:?] 
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) [netty-all-4.1.17.Final.jar:4.1.17.Final] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-all-4.1.17.Final.jar:4.1.17.Final] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-all-4.1.17.Final.jar:4.1.17.Final] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-all-4.1.17.Final.jar:4.1.17.Final] at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [netty-all-4.1.17.Final.jar:4.1.17.Final] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-all-4.1.17.Final.jar:4.1.17.Final] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-all-4.1.17.Final.jar:4.1.17.Final] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-all-4.1.17.Final.jar:4.1.17.Final] at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:310) [netty-all-4.1.17.Final.jar:4.1.17.Final] at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:284) [netty-all-4.1.17.Final.jar:4.1.17.Final] at io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:103) [netty-all-4.1.17.Final.jar:4.1.17.Final] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-all-4.1.17.Final.jar:4.1.17.Final] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-all-4.1.17.Final.jar:4.1.17.Final] at io.netty.
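Reading the trace above, the ordering that seems to trigger the exception is: the launcher-side handle has already dropped its connection by the time the driver-side shutdown ({{RemoteDriver.shutdown}} -> {{JavaSparkContext.stop}} -> {{LocalSchedulerBackend.stop}}) tries to push a final state through {{LauncherBackend.setState}}. Below is a hedged sketch of that ordering using Spark's public launcher API; the master URL, app resource and main class are placeholders, and this is not the actual Hive test.

{code:java}
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

// Hypothetical reproduction sketch of the disconnect-before-stop ordering.
public class LauncherDisconnectRace {
  public static void main(String[] args) throws Exception {
    SparkAppHandle handle = new SparkLauncher()
        .setMaster("local[*]")                          // placeholder master
        .setAppResource("/path/to/driver.jar")          // placeholder resource
        .setMainClass("com.example.PlaceholderDriver")  // placeholder main class
        .startApplication();

    // 1. The launcher side gives up on the connection first...
    handle.disconnect();

    // 2. ...while the driver keeps running; when it later calls SparkContext.stop(),
    //    LocalSchedulerBackend.stop() invokes LauncherBackend.setState(FINISHED) on the
    //    now-closed socket and the IllegalStateException above is thrown.
  }
}
{code}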
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16400291#comment-16400291 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12914601/HIVE-18533.4.patch {color:green}SUCCESS:{color} +1 due to 2 test(s) being added or modified. {color:red}ERROR:{color} -1 due to 34 failed/errored test(s), 13016 tests executed *Failed tests:* {noformat} TestNegativeCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=94) [nopart_insert.q,insert_into_with_schema.q,input41.q,having1.q,create_table_failure3.q,default_constraint_invalid_default_value.q,database_drop_not_empty_restrict.q,windowing_after_orderby.q,orderbysortby.q,subquery_select_distinct2.q,authorization_uri_alterpart_loc.q,udf_last_day_error_1.q,constraint_duplicate_name.q,create_table_failure4.q,alter_tableprops_external_with_notnull_constraint.q,semijoin5.q,udf_format_number_wrong4.q,deletejar.q,exim_11_nonpart_noncompat_sorting.q,show_tables_bad_db2.q,drop_func_nonexistent.q,nopart_load.q,alter_table_non_partitioned_table_cascade.q,load_wrong_fileformat.q,lockneg_try_db_lock_conflict.q,udf_field_wrong_args_len.q,create_table_failure2.q,create_with_fk_constraints_enforced.q,groupby2_map_skew_multi_distinct.q,udf_min.q,authorization_update_noupdatepriv.q,show_columns2.q,authorization_insert_noselectpriv.q,orc_replace_columns3_acid.q,compare_double_bigint.q,authorization_set_nonexistent_conf.q,alter_rename_partition_failure3.q,split_sample_wrong_format2.q,create_with_fk_pk_same_tab.q,compare_double_bigint_2.q,authorization_show_roles_no_admin.q,materialized_view_authorization_rebuild_no_grant.q,unionLimit.q,authorization_revoke_table_fail2.q,authorization_insert_noinspriv.q,duplicate_insert3.q,authorization_desc_table_nosel.q,stats_noscan_non_native.q,orc_change_serde_acid.q,create_or_replace_view7.q,exim_07_nonpart_noncompat_ifof.q,create_with_unique_constraints_enforced.q,udf_concat_ws_wrong2.q,fileformat_bad_class.q,merge_negative_2.q,exim_15_part_nonpart.q,authorization_not_owner_drop_view.q,external1.q,authorization_uri_insert.q,create_with_fk_wrong_ref.q,columnstats_tbllvl_incorrect_column.q,authorization_show_parts_nosel.q,authorization_not_owner_drop_tab.q,external2.q,authorization_deletejar.q,temp_table_create_like_partitions.q,udf_greatest_error_1.q,ptf_negative_AggrFuncsWithNoGBYNoPartDef.q,alter_view_as_select_not_exist.q,touch1.q,groupby3_map_skew_multi_distinct.q,insert_into_notnull_constraint.q,exchange_partition_neg_partition_missing.q,groupby_cube_multi_gby.q,columnstats_tbllvl.q,drop_invalid_constraint2.q,alter_table_add_partition.q,update_not_acid.q,archive5.q,alter_table_constraint_invalid_pk_col.q,ivyDownload.q,udf_instr_wrong_type.q,bad_sample_clause.q,authorization_not_owner_drop_tab2.q,authorization_alter_db_owner.q,show_columns1.q,orc_type_promotion3.q,create_view_failure8.q,strict_join.q,udf_add_months_error_1.q,groupby_cube2.q,groupby_cube1.q,groupby_rollup1.q,genericFileFormat.q,invalid_cast_from_binary_4.q,drop_invalid_constraint1.q,serde_regex.q,show_partitions1.q,invalid_cast_from_binary_6.q,create_with_multi_pk_constraint.q,udf_field_wrong_type.q,groupby_grouping_sets4.q,groupby_grouping_sets3.q,insertsel_fail.q,udf_locate_wrong_type.q,orc_type_promotion1_acid.q,set_table_property.q,create_or_replace_view2.q,groupby_grouping_sets2.q,alter_view_failure.q,distinct_windowing_failure1.q,invalid_t_alter2.q,alter_table_constra
int_invalid_fk_col1.q,invalid_varchar_length_2.q,authorization_show_grant_otheruser_alltabs.q,subquery_windowing_corr.q,compact_non_acid_table.q,authorization_view_4.q,authorization_disallow_transform.q,materialized_view_authorization_rebuild_other.q,authorization_fail_4.q,dbtxnmgr_nodblock.q,set_hiveconf_internal_variable1.q,input_part0_neg.q,udf_printf_wrong3.q,load_orc_negative2.q,druid_buckets.q,archive2.q,authorization_addjar.q,invalid_sum_syntax.q,insert_into_with_schema1.q,udf_add_months_error_2.q,dyn_part_max_per_node.q,authorization_revoke_table_fail1.q,udf_printf_wrong2.q,archive_multi3.q,udf_printf_wrong1.q,subquery_subquery_chain.q,authorization_view_disable_cbo_4.q,no_matching_udf.q,create_view_failure7.q,drop_native_udf.q,truncate_column_list_bucketing.q,authorization_uri_add_partition.q,authorization_view_disable_cbo_3.q,bad_exec_hooks.q,authorization_view_disable_cbo_2.q,fetchtask_ioexception.q,char_pad_convert_fail2.q,authorization_set_role_neg1.q,serde_regex3.q,authorization_delete_nodeletepriv.q,materialized_view_delete.q,create_or_replace_view6.q,bucket_mapjoin_wrong_table_metadata_2.q,msck_repair_3.q,udf_sort_array_by_wrong2.q,local_mapred_error_cache.q,alter_external_acid.q,mm_concatenate.q,authorization_fail_3.q,set_hiveconf_internal_variable0.q,udf_last_day_error_2.q,alter_table_constraint_invalid_ref.q,create_table_wrong_regex.q,describe_x
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16400236#comment-16400236 ] Hive QA commented on HIVE-18533: | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 1s{color} | {color:blue} Findbugs executables are not available. {color} | | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 37s{color} | {color:blue} Maven dependency ordering for branch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 6m 40s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 27s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 0m 58s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 1m 12s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 8s{color} | {color:blue} Maven dependency ordering for patch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 1m 52s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 1m 27s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 1m 27s{color} | {color:green} the patch passed {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 9s{color} | {color:red} spark-client: The patch generated 16 new + 21 unchanged - 16 fixed = 37 total (was 37) {color} | | {color:green}+1{color} | {color:green} whitespace {color} | {color:green} 0m 0s{color} | {color:green} The patch has no whitespace issues. {color} | | {color:green}+1{color} | {color:green} xml {color} | {color:green} 0m 1s{color} | {color:green} The patch has no ill-formed XML file. {color} | | {color:red}-1{color} | {color:red} javadoc {color} | {color:red} 0m 10s{color} | {color:red} spark-client generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) {color} | || || || || {color:brown} Other Tests {color} || | {color:red}-1{color} | {color:red} asflicense {color} | {color:red} 0m 12s{color} | {color:red} The patch generated 49 ASF License warnings. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 17m 11s{color} | {color:black} {color} | \\ \\ || Subsystem || Report/Notes || | Optional Tests | asflicense javac javadoc findbugs checkstyle compile xml | | uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux | | Build tool | maven | | Personality | /data/hiveptest/working/yetus_PreCommit-HIVE-Build-9641/dev-support/hive-personality.sh | | git revision | master / d5cb7f6 | | Default Java | 1.8.0_111 | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-9641/yetus/diff-checkstyle-spark-client.txt | | javadoc | http://104.198.109.242/logs//PreCommit-HIVE-Build-9641/yetus/diff-javadoc-javadoc-spark-client.txt | | asflicense | http://104.198.109.242/logs//PreCommit-HIVE-Build-9641/yetus/patch-asflicense-problems.txt | | modules | C: common itests ql spark-client U: . | | Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-9641/yetus.txt | | Powered by | Apache Yetushttp://yetus.apache.org | This message was automatically generated. > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch, HIVE-18533.2.patch, > HIVE-18533.3.patch, HIVE-18533.4.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16396682#comment-16396682 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12914202/HIVE-18533.2.patch {color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified. {color:red}ERROR:{color} -1 due to 23 failed/errored test(s), 13409 tests executed *Failed tests:* {noformat} TestNegativeCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=95) [udf_invalid.q,authorization_uri_export.q,default_constraint_complex_default_value.q,druid_datasource2.q,view_update.q,default_partition_name.q,authorization_public_create.q,load_wrong_fileformat_rc_seq.q,default_constraint_invalid_type.q,altern1.q,describe_xpath1.q,drop_view_failure2.q,temp_table_rename.q,invalid_select_column_with_subquery.q,udf_trunc_error1.q,insert_view_failure.q,dbtxnmgr_nodbunlock.q,authorization_show_columns.q,cte_recursion.q,merge_constraint_notnull.q,load_part_nospec.q,clusterbyorderby.q,orc_type_promotion2.q,ctas_noperm_loc.q,udf_instr_wrong_args_len.q,invalid_create_tbl2.q,part_col_complex_type.q,authorization_drop_db_empty.q,smb_mapjoin_14.q,subquery_scalar_multi_rows.q,alter_partition_coltype_2columns.q,subquery_corr_in_agg.q,insert_overwrite_notnull_constraint.q,authorization_show_grant_otheruser_wtab.q,regex_col_groupby.q,udaf_collect_set_unsupported.q,ptf_negative_DuplicateWindowAlias.q,exim_22_export_authfail.q,udf_likeany_wrong1.q,groupby_key.q,ambiguous_col.q,groupby3_multi_distinct.q,authorization_alter_drop_ptn.q,invalid_cast_from_binary_5.q,show_create_table_does_not_exist.q,invalid_select_column.q,exim_20_managed_location_over_existing.q,interval_3.q,authorization_compile.q,join35.q,merge_negative_3.q,udf_concat_ws_wrong3.q,create_or_replace_view8.q,create_external_with_notnull_constraint.q,split_sample_out_of_range.q,materialized_view_no_transactional_rewrite.q,authorization_show_grant_otherrole.q,create_with_constraints_duplicate_name.q,invalid_stddev_samp_syntax.q,authorization_view_disable_cbo_7.q,autolocal1.q,avro_non_nullable_union.q,load_orc_negative_part.q,drop_view_failure1.q,columnstats_partlvl_invalid_values_autogather.q,exim_13_nonnative_import.q,alter_table_wrong_regex.q,add_partition_with_whitelist.q,udf_next_day_error_2.q,authorization_select.q,udf_trunc_error2.q,authorization_view_7.q,udf_format_number_wrong5.q,touch2.q,exim_03_nonpart_noncompat_colschema.q,orc_type_promotion1.q,lateral_view_alias.q,show_tables_bad_db1.q,unset_table_property.q,alter_non_native.q,nvl_mismatch_type.q,load_orc_negative3.q,authorization_create_role_no_admin.q,invalid_distinct1.q,authorization_grant_server.q,orc_type_promotion3_acid.q,show_tables_bad1.q,macro_unused_parameter.q,drop_invalid_constraint3.q,drop_partition_filter_failure.q,char_pad_convert_fail3.q,exim_23_import_exist_authfail.q,drop_invalid_constraint4.q,authorization_create_macro1.q,archive1.q,subquery_multiple_cols_in_select.q,change_hive_hdfs_session_path.q,udf_trunc_error3.q,invalid_variance_syntax.q,authorization_truncate_2.q,invalid_avg_syntax.q,invalid_select_column_with_tablename.q,mm_truncate_cols.q,groupby_grouping_sets1.q,druid_location.q,groupby2_multi_distinct.q,authorization_sba_drop_table.q,dynamic_partitions_with_whitelist.q,compare_string_bigint_2.q,udf_greatest_error_2.q,authorization_view_6.q,show_tablestatus.q,duplicate_alias_in_transform_schema.q,create_with_fk_uk_same_tab.q,udtf
_not_supported3.q,alter_table_constraint_invalid_fk_col2.q,udtf_not_supported1.q,dbtxnmgr_notableunlock.q,ptf_negative_InvalidValueBoundary.q,alter_table_constraint_duplicate_pk.q,udf_printf_wrong4.q,create_view_failure9.q,udf_elt_wrong_type.q,selectDistinctStarNeg_1.q,invalid_mapjoin1.q,load_stored_as_dirs.q,input1.q,udf_sort_array_wrong1.q,invalid_distinct2.q,invalid_select_fn.q,authorization_role_grant_otherrole.q,archive4.q,load_nonpart_authfail.q,recursive_view.q,authorization_view_disable_cbo_1.q,desc_failure4.q,create_not_acid.q,udf_sort_array_wrong3.q,char_pad_convert_fail0.q,udf_map_values_arg_type.q,alter_view_failure6_2.q,alter_partition_change_col_nonexist.q,update_non_acid_table.q,authorization_view_disable_cbo_5.q,ct_noperm_loc.q,interval_1.q,authorization_show_grant_otheruser_all.q,authorization_view_2.q,show_tables_bad2.q,groupby_rollup2.q,truncate_column_seqfile.q,create_view_failure5.q,authorization_create_view.q,ptf_window_boundaries.q,ctasnullcol.q,input_part0_neg_2.q,create_or_replace_view1.q,udf_max.q,exim_01_nonpart_over_loaded.q,msck_repair_1.q,orc_change_fileformat_acid.q,udf_nonexistent_resource.q,exim_19_external_over_existing.q,serde_regex2.q,msck_repair_2.q,exim_06_nonpart_noncompat_storage.q,illegal_partition_type4.q,udf_sort_array_by_wrong1.q,create_or_replace_view5.q,windowing_leadlag_in_udaf.q,avro_decimal.q,materialized_view_updat
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16396654#comment-16396654 ] Hive QA commented on HIVE-18533: | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 0s{color} | {color:blue} Findbugs executables are not available. {color} | | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 34s{color} | {color:blue} Maven dependency ordering for branch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 6m 34s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 5m 6s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 2m 12s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 5m 33s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 8s{color} | {color:blue} Maven dependency ordering for patch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 6m 7s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 5m 9s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 5m 9s{color} | {color:green} the patch passed {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 13s{color} | {color:red} common: The patch generated 1 new + 426 unchanged - 0 fixed = 427 total (was 426) {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 1m 33s{color} | {color:red} root: The patch generated 39 new + 448 unchanged - 15 fixed = 487 total (was 463) {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 9s{color} | {color:red} spark-client: The patch generated 38 new + 22 unchanged - 15 fixed = 60 total (was 37) {color} | | {color:red}-1{color} | {color:red} whitespace {color} | {color:red} 0m 0s{color} | {color:red} The patch 13 line(s) with tabs. {color} | | {color:green}+1{color} | {color:green} xml {color} | {color:green} 0m 2s{color} | {color:green} The patch has no ill-formed XML file. {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 5m 49s{color} | {color:green} the patch passed {color} | || || || || {color:brown} Other Tests {color} || | {color:red}-1{color} | {color:red} asflicense {color} | {color:red} 0m 22s{color} | {color:red} The patch generated 50 ASF License warnings. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 40m 7s{color} | {color:black} {color} | \\ \\ || Subsystem || Report/Notes || | Optional Tests | asflicense javac javadoc findbugs checkstyle compile xml | | uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux | | Build tool | maven | | Personality | /data/hiveptest/working/yetus_PreCommit-HIVE-Build-9616/dev-support/hive-personality.sh | | git revision | master / 12041d3 | | Default Java | 1.8.0_111 | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-9616/yetus/diff-checkstyle-common.txt | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-9616/yetus/diff-checkstyle-root.txt | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-9616/yetus/diff-checkstyle-spark-client.txt | | whitespace | http://104.198.109.242/logs//PreCommit-HIVE-Build-9616/yetus/whitespace-tabs.txt | | asflicense | http://104.198.109.242/logs//PreCommit-HIVE-Build-9616/yetus/patch-asflicense-problems.txt | | modules | C: common . itests/qtest-spark spark-client U: . | | Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-9616/yetus.txt | | Powered by | Apache Yetushttp://yetus.apache.org | This message was automatically generated. > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachm
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16394826#comment-16394826 ] Hive QA commented on HIVE-18533: Here are the results of testing the latest attachment: https://issues.apache.org/jira/secure/attachment/12913962/HIVE-18533.1.patch {color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified. {color:red}ERROR:{color} -1 due to 29 failed/errored test(s), 12882 tests executed *Failed tests:* {noformat} TestNegativeCliDriver - did not produce a TEST-*.xml file (likely timed out) (batchId=94) [nopart_insert.q,insert_into_with_schema.q,input41.q,having1.q,create_table_failure3.q,default_constraint_invalid_default_value.q,database_drop_not_empty_restrict.q,windowing_after_orderby.q,orderbysortby.q,subquery_select_distinct2.q,authorization_uri_alterpart_loc.q,udf_last_day_error_1.q,constraint_duplicate_name.q,create_table_failure4.q,alter_tableprops_external_with_notnull_constraint.q,semijoin5.q,udf_format_number_wrong4.q,deletejar.q,exim_11_nonpart_noncompat_sorting.q,show_tables_bad_db2.q,drop_func_nonexistent.q,nopart_load.q,alter_table_non_partitioned_table_cascade.q,load_wrong_fileformat.q,lockneg_try_db_lock_conflict.q,udf_field_wrong_args_len.q,create_table_failure2.q,create_with_fk_constraints_enforced.q,groupby2_map_skew_multi_distinct.q,udf_min.q,authorization_update_noupdatepriv.q,show_columns2.q,authorization_insert_noselectpriv.q,orc_replace_columns3_acid.q,compare_double_bigint.q,authorization_set_nonexistent_conf.q,alter_rename_partition_failure3.q,split_sample_wrong_format2.q,create_with_fk_pk_same_tab.q,compare_double_bigint_2.q,authorization_show_roles_no_admin.q,materialized_view_authorization_rebuild_no_grant.q,unionLimit.q,authorization_revoke_table_fail2.q,authorization_insert_noinspriv.q,duplicate_insert3.q,authorization_desc_table_nosel.q,stats_noscan_non_native.q,orc_change_serde_acid.q,create_or_replace_view7.q,exim_07_nonpart_noncompat_ifof.q,create_with_unique_constraints_enforced.q,udf_concat_ws_wrong2.q,fileformat_bad_class.q,merge_negative_2.q,exim_15_part_nonpart.q,authorization_not_owner_drop_view.q,external1.q,authorization_uri_insert.q,create_with_fk_wrong_ref.q,columnstats_tbllvl_incorrect_column.q,authorization_show_parts_nosel.q,authorization_not_owner_drop_tab.q,external2.q,authorization_deletejar.q,temp_table_create_like_partitions.q,udf_greatest_error_1.q,ptf_negative_AggrFuncsWithNoGBYNoPartDef.q,alter_view_as_select_not_exist.q,touch1.q,groupby3_map_skew_multi_distinct.q,insert_into_notnull_constraint.q,exchange_partition_neg_partition_missing.q,groupby_cube_multi_gby.q,columnstats_tbllvl.q,drop_invalid_constraint2.q,alter_table_add_partition.q,update_not_acid.q,archive5.q,alter_table_constraint_invalid_pk_col.q,ivyDownload.q,udf_instr_wrong_type.q,bad_sample_clause.q,authorization_not_owner_drop_tab2.q,authorization_alter_db_owner.q,show_columns1.q,orc_type_promotion3.q,create_view_failure8.q,strict_join.q,udf_add_months_error_1.q,groupby_cube2.q,groupby_cube1.q,groupby_rollup1.q,genericFileFormat.q,invalid_cast_from_binary_4.q,drop_invalid_constraint1.q,serde_regex.q,show_partitions1.q,invalid_cast_from_binary_6.q,create_with_multi_pk_constraint.q,udf_field_wrong_type.q,groupby_grouping_sets4.q,groupby_grouping_sets3.q,insertsel_fail.q,udf_locate_wrong_type.q,orc_type_promotion1_acid.q,set_table_property.q,create_or_replace_view2.q,groupby_grouping_sets2.q,alter_view_failure.q,distinct_windowing_failure1.q,invalid_t_alter2.q,alter_table_constra
int_invalid_fk_col1.q,invalid_varchar_length_2.q,authorization_show_grant_otheruser_alltabs.q,subquery_windowing_corr.q,compact_non_acid_table.q,authorization_view_4.q,authorization_disallow_transform.q,materialized_view_authorization_rebuild_other.q,authorization_fail_4.q,dbtxnmgr_nodblock.q,set_hiveconf_internal_variable1.q,input_part0_neg.q,udf_printf_wrong3.q,load_orc_negative2.q,druid_buckets.q,archive2.q,authorization_addjar.q,invalid_sum_syntax.q,insert_into_with_schema1.q,udf_add_months_error_2.q,dyn_part_max_per_node.q,authorization_revoke_table_fail1.q,udf_printf_wrong2.q,archive_multi3.q,udf_printf_wrong1.q,subquery_subquery_chain.q,authorization_view_disable_cbo_4.q,no_matching_udf.q,create_view_failure7.q,drop_native_udf.q,truncate_column_list_bucketing.q,authorization_uri_add_partition.q,authorization_view_disable_cbo_3.q,bad_exec_hooks.q,authorization_view_disable_cbo_2.q,fetchtask_ioexception.q,char_pad_convert_fail2.q,authorization_set_role_neg1.q,serde_regex3.q,authorization_delete_nodeletepriv.q,materialized_view_delete.q,create_or_replace_view6.q,bucket_mapjoin_wrong_table_metadata_2.q,msck_repair_3.q,udf_sort_array_by_wrong2.q,local_mapred_error_cache.q,alter_external_acid.q,mm_concatenate.q,authorization_fail_3.q,set_hiveconf_internal_variable0.q,udf_last_day_error_2.q,alter_table_constraint_invalid_ref.q,create_table_wrong_regex.q,describe_x
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16394813#comment-16394813 ] Hive QA commented on HIVE-18533: | (x) *{color:red}-1 overall{color}* | \\ \\ || Vote || Subsystem || Runtime || Comment || || || || || {color:brown} Prechecks {color} || | {color:blue}0{color} | {color:blue} findbugs {color} | {color:blue} 0m 0s{color} | {color:blue} Findbugs executables are not available. {color} | | {color:green}+1{color} | {color:green} @author {color} | {color:green} 0m 0s{color} | {color:green} The patch does not contain any @author tags. {color} | || || || || {color:brown} master Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 21s{color} | {color:blue} Maven dependency ordering for branch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 7m 0s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 5m 4s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} checkstyle {color} | {color:green} 2m 11s{color} | {color:green} master passed {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 5m 41s{color} | {color:green} master passed {color} | || || || || {color:brown} Patch Compile Tests {color} || | {color:blue}0{color} | {color:blue} mvndep {color} | {color:blue} 0m 6s{color} | {color:blue} Maven dependency ordering for patch {color} | | {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 6m 18s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} compile {color} | {color:green} 5m 8s{color} | {color:green} the patch passed {color} | | {color:green}+1{color} | {color:green} javac {color} | {color:green} 5m 8s{color} | {color:green} the patch passed {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 15s{color} | {color:red} common: The patch generated 1 new + 426 unchanged - 0 fixed = 427 total (was 426) {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 1m 37s{color} | {color:red} root: The patch generated 39 new + 448 unchanged - 15 fixed = 487 total (was 463) {color} | | {color:red}-1{color} | {color:red} checkstyle {color} | {color:red} 0m 9s{color} | {color:red} spark-client: The patch generated 38 new + 22 unchanged - 15 fixed = 60 total (was 37) {color} | | {color:red}-1{color} | {color:red} whitespace {color} | {color:red} 0m 0s{color} | {color:red} The patch 13 line(s) with tabs. {color} | | {color:green}+1{color} | {color:green} xml {color} | {color:green} 0m 2s{color} | {color:green} The patch has no ill-formed XML file. {color} | | {color:green}+1{color} | {color:green} javadoc {color} | {color:green} 5m 46s{color} | {color:green} the patch passed {color} | || || || || {color:brown} Other Tests {color} || | {color:red}-1{color} | {color:red} asflicense {color} | {color:red} 0m 18s{color} | {color:red} The patch generated 50 ASF License warnings. 
{color} | | {color:black}{color} | {color:black} {color} | {color:black} 40m 31s{color} | {color:black} {color} | \\ \\ || Subsystem || Report/Notes || | Optional Tests | asflicense javac javadoc findbugs checkstyle compile xml | | uname | Linux hiveptest-server-upstream 3.16.0-4-amd64 #1 SMP Debian 3.16.36-1+deb8u1 (2016-09-03) x86_64 GNU/Linux | | Build tool | maven | | Personality | /data/hiveptest/working/yetus_PreCommit-HIVE-Build-9597/dev-support/hive-personality.sh | | git revision | master / e213c4c | | Default Java | 1.8.0_111 | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-9597/yetus/diff-checkstyle-common.txt | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-9597/yetus/diff-checkstyle-root.txt | | checkstyle | http://104.198.109.242/logs//PreCommit-HIVE-Build-9597/yetus/diff-checkstyle-spark-client.txt | | whitespace | http://104.198.109.242/logs//PreCommit-HIVE-Build-9597/yetus/whitespace-tabs.txt | | asflicense | http://104.198.109.242/logs//PreCommit-HIVE-Build-9597/yetus/patch-asflicense-problems.txt | | modules | C: common . itests/qtest-spark spark-client U: . | | Console output | http://104.198.109.242/logs//PreCommit-HIVE-Build-9597/yetus.txt | | Powered by | Apache Yetushttp://yetus.apache.org | This message was automatically generated. > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachm
[jira] [Commented] (HIVE-18533) Add option to use InProcessLauncher to submit spark jobs
[ https://issues.apache.org/jira/browse/HIVE-18533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16394793#comment-16394793 ] Sahil Takiar commented on HIVE-18533: - Attaching WIP patch with the {{SparkLauncher}} option enabled by default, but just for testing purposes. > Add option to use InProcessLauncher to submit spark jobs > > > Key: HIVE-18533 > URL: https://issues.apache.org/jira/browse/HIVE-18533 > Project: Hive > Issue Type: Improvement > Components: Spark >Reporter: Sahil Takiar >Assignee: Sahil Takiar >Priority: Major > Attachments: HIVE-18533.1.patch > > > See discussion in HIVE-16484 for details. > I think this will help with reducing the amount of time it takes to open a > HoS session + debuggability (no need launch a separate process to run a Spark > app). -- This message was sent by Atlassian JIRA (v7.6.3#76005)
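For readers who have not used it, this is roughly what submission through Spark's launcher API looks like; the in-process variant ({{InProcessLauncher}}, Spark 2.3+) runs the application in the caller's JVM instead of forking a {{spark-submit}} process. The sketch below only illustrates the public API, not the code in the attached patch; the master URL, jar path, main class and app name are placeholders, and it assumes Spark is already on the classpath.

{code:java}
import java.util.concurrent.CountDownLatch;
import org.apache.spark.launcher.InProcessLauncher;
import org.apache.spark.launcher.SparkAppHandle;

public class InProcessSubmitSketch {
  public static void main(String[] args) throws Exception {
    final CountDownLatch done = new CountDownLatch(1);

    // Build and start the application in the current JVM; the returned handle
    // reports state transitions back through a local launcher connection.
    SparkAppHandle handle = new InProcessLauncher()
        .setMaster("local[*]")                               // placeholder master
        .setAppResource("/path/to/remote-driver.jar")        // placeholder jar
        .setMainClass("com.example.PlaceholderRemoteDriver") // placeholder main class
        .setConf("spark.app.name", "in-process-example")     // placeholder app name
        .startApplication(new SparkAppHandle.Listener() {
          @Override
          public void stateChanged(SparkAppHandle h) {
            if (h.getState().isFinal()) {
              done.countDown();
            }
          }

          @Override
          public void infoChanged(SparkAppHandle h) {
          }
        });

    done.await();
    System.out.println("Final state: " + handle.getState());
  }
}
{code}

Compared to forking {{spark-submit}}, this avoids the extra process startup, which is the session-open latency and debuggability win described in the issue.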