[jira] [Commented] (HIVE-17929) Use sessionId for HoS Remote Driver Client id
[ https://issues.apache.org/jira/browse/HIVE-17929?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16309247#comment-16309247 ]

Rui Li commented on HIVE-17929:
-------------------------------

+1

> Use sessionId for HoS Remote Driver Client id
> ---------------------------------------------
>
>                 Key: HIVE-17929
>                 URL: https://issues.apache.org/jira/browse/HIVE-17929
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Sahil Takiar
>            Assignee: Sahil Takiar
>         Attachments: HIVE-17929.1.patch, HIVE-17929.2.patch, HIVE-17929.3.patch
>
>
> Each {{SparkClientImpl}} creates a client connection using a client id. The client id is created via {{UUID.randomUUID()}}.
> Since each HoS session has a single client connection we should just use the sessionId instead (which is also a UUID). This should help simplify the code and some of the client logging.

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
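The change the issue describes can be sketched as follows. This is an illustrative stand-in, not the actual {{SparkClientImpl}} code: the class and method names below are hypothetical, and only the id-handling idea from the description is shown.

```java
import java.util.UUID;

// Sketch of the proposed change: reuse the HoS sessionId as the remote
// driver's client id instead of minting a second random UUID. Names here
// are illustrative, not the real Hive classes.
public class ClientIdSketch {

    // Old behavior: every client connection generated its own random id,
    // unrelated to the session that owned it.
    static String randomClientId() {
        return UUID.randomUUID().toString();
    }

    // Proposed behavior: the session's existing UUID doubles as the client
    // id, so session logs and client logs carry the same identifier.
    static String clientIdFromSession(String sessionId) {
        return sessionId;
    }

    public static void main(String[] args) {
        String sessionId = UUID.randomUUID().toString();
        // The client id is now identical to the session id.
        System.out.println(clientIdFromSession(sessionId).equals(sessionId)); // prints "true"
    }
}
```

Since an HoS session holds exactly one client connection, making the ids coincide loses nothing and lets log lines from the session and its driver client be correlated by a single UUID.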
[ https://issues.apache.org/jira/browse/HIVE-17929?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16308301#comment-16308301 ]

Sahil Takiar commented on HIVE-17929:
-------------------------------------

[~lirui] can you take a look?
[ https://issues.apache.org/jira/browse/HIVE-17929?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16225029#comment-16225029 ]

Hive QA commented on HIVE-17929:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12894738/HIVE-17929.3.patch

{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 5 failed/errored test(s), 11340 tests executed

*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[insert_values_orig_table_use_metadata] (batchId=62)
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver[udaf_histogram_numeric] (batchId=13)
org.apache.hadoop.hive.cli.TestNegativeMinimrCliDriver.testCliDriver[ct_noperm_loc] (batchId=93)
org.apache.hadoop.hive.cli.control.TestDanglingQOuts.checkDanglingQOut (batchId=205)
org.apache.hadoop.hive.ql.parse.TestReplicationScenarios.testConstraints (batchId=222)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/7559/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/7559/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-7559/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 5 tests failed
{noformat}

This message is automatically generated.
ATTACHMENT ID: 12894738 - PreCommit-HIVE-Build
[ https://issues.apache.org/jira/browse/HIVE-17929?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16223810#comment-16223810 ]

Hive QA commented on HIVE-17929:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12894606/HIVE-17929.2.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/7546/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/7546/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-7546/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2017-10-29 01:39:07.376
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-7546/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2017-10-29 01:39:07.378
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at a9f25c0 HIVE-17778: Add support for custom counters in trigger expression (Prasanth Jayachandran reviewed by Sergey Shelukhin)
+ git clean -f -d
Removing ${project.basedir}/
Removing hbase-handler/src/java/org/apache/hadoop/hive/hbase/HBaseMetaHook.java
Removing standalone-metastore/src/gen/org/
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at a9f25c0 HIVE-17778: Add support for custom counters in trigger expression (Prasanth Jayachandran reviewed by Sergey Shelukhin)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2017-10-29 01:39:11.615
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
Going to apply patch with: patch -p1
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HiveSparkClientFactory.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java
patching file spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java
patching file spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: protoc version: 250, detected platform: linux/amd64
protoc-jar: executing: [/tmp/protoc7476871961870750455.exe, -I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore, --java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources, /data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/antlr3/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
[ERROR] COMPILATION ERROR :
[ERROR] /data/hiveptest/working/apache-github-source-source/spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java:[316,34] method createClient in class org.apache.hive.spark.client.SparkClientFactory cannot be applied to given types;
  required: java.util.Map,org.apache.hadoop.hive.conf.HiveConf,java.lang.String
  found: java.util.Map,org.apache.hadoop.hive.conf.HiveConf
  reason: actual and formal argument lists differ in length
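The compile error above says that {{SparkClientFactory.createClient}} now requires three arguments (Map, HiveConf, String) while the call at TestSparkClient.java:316 still passes two. The stand-in below reproduces only that arity change; the nested {{Conf}} class and the factory stub are hypothetical substitutes for the real Hive classes, shown to illustrate the kind of fix the test would need.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal reproduction of the signature mismatch: createClient gained a
// third String parameter (the client id / session id), so two-argument
// callers no longer compile. Conf stands in for HiveConf; the factory body
// is a stub, not the real implementation.
public class CreateClientFix {

    static class Conf {} // stand-in for org.apache.hadoop.hive.conf.HiveConf

    // New signature: the client id is now an explicit parameter.
    static String createClient(Map<String, String> sparkConf, Conf hiveConf, String clientId) {
        return clientId; // the real factory would launch the remote driver here
    }

    public static void main(String[] args) {
        Map<String, String> sparkConf = new HashMap<>();
        // Old two-argument call no longer compiles:
        //   createClient(sparkConf, new Conf());
        // Fixed call supplies the session id as the client id:
        String client = createClient(sparkConf, new Conf(), "session-uuid");
        System.out.println(client); // prints "session-uuid"
    }
}
```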
[ https://issues.apache.org/jira/browse/HIVE-17929?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16223773#comment-16223773 ]

Hive QA commented on HIVE-17929:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12894603/HIVE-17929.1.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/7544/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/7544/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-7544/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2017-10-28 23:13:51.529
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-7544/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2017-10-28 23:13:51.531
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at a9f25c0 HIVE-17778: Add support for custom counters in trigger expression (Prasanth Jayachandran reviewed by Sergey Shelukhin)
+ git clean -f -d
Removing itests/src/test/resources/testconfiguration.properties.orig
Removing ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java.orig
Removing ql/src/test/queries/clientpositive/semijoin6.q
Removing ql/src/test/results/clientpositive/llap/semijoin6.q.out
Removing standalone-metastore/src/gen/org/
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at a9f25c0 HIVE-17778: Add support for custom counters in trigger expression (Prasanth Jayachandran reviewed by Sergey Shelukhin)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2017-10-28 23:13:56.426
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
Going to apply patch with: patch -p1
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HiveSparkClientFactory.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/spark/RemoteHiveSparkClient.java
patching file spark-client/src/main/java/org/apache/hive/spark/client/SparkClientFactory.java
patching file spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: protoc version: 250, detected platform: linux/amd64
protoc-jar: executing: [/tmp/protoc2857838909825807169.exe, -I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore, --java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources, /data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/antlr3/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
[ERROR] COMPILATION ERROR :
[ERROR] /data/hiveptest/working/apache-github-source-source/spark-client/src/test/java/org/apache/hive/spark/client/TestSparkClient.java:[316,34] method createClient in class org.apache.hive.spark.client.SparkClientFactory cannot be applied to given types;
  required: java.util.Map,org.apache.ha