[ https://issues.apache.org/jira/browse/HIVE-12568?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15036693#comment-15036693 ]

Hive QA commented on HIVE-12568:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12775354/HIVE-12568.0-spark.patch

{color:red}ERROR:{color} -1 due to no test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 5 failed/errored test(s), 9866 tests executed
*Failed tests:*
{noformat}
TestHWISessionManager - did not produce a TEST-*.xml file
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_authorization_uri_import
org.apache.hadoop.hive.metastore.TestHiveMetaStorePartitionSpecs.testGetPartitionSpecs_WithAndWithoutPartitionGrouping
org.apache.hive.jdbc.TestSSL.testSSLVersion
org.apache.hive.jdbc.miniHS2.TestHs2Metrics.testMetrics
{noformat}

Test results: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-SPARK-Build/1018/testReport
Console output: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-SPARK-Build/1018/console
Test logs: 
http://ec2-50-18-27-0.us-west-1.compute.amazonaws.com/logs/PreCommit-HIVE-SPARK-Build-1018/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 5 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12775354 - PreCommit-HIVE-SPARK-Build

> Use the same logic finding HS2 host name in Spark client [Spark Branch]
> -----------------------------------------------------------------------
>
>                 Key: HIVE-12568
>                 URL: https://issues.apache.org/jira/browse/HIVE-12568
>             Project: Hive
>          Issue Type: Bug
>          Components: Spark
>    Affects Versions: 1.1.0
>            Reporter: Xuefu Zhang
>            Assignee: Xuefu Zhang
>         Attachments: HIVE-12568.0-spark.patch
>
>
> Spark client sends a pair of host name and port number to the remote driver 
> so that the driver can connect back to HS2, where the user session is. Spark 
> client has its own way of determining the host name, and picks one network 
> interface if the host happens to have multiple network interfaces. This can 
> be problematic. For that, there is a parameter, 
> hive.spark.client.server.address, with which the user can pick an interface. 
> Unfortunately, this parameter isn't exposed.
> Instead of exposing this parameter, we can use the same logic as Hive in 
> determining the host name. The remote driver would then connect to HS2 
> using the same network interface as an HS2 client would.
> There might be a case where a user wants the remote driver to use a 
> different network. This is rare, if it occurs at all. Thus, for now it should be 
> sufficient to use the same network interface.
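
For illustration only, below is a minimal, hypothetical Java sketch of the idea in the description: resolve the callback host the way HS2 itself would (an explicitly configured bind host first, the local host name as a fallback) instead of picking an arbitrary network interface. The class name, method name, and the exact config lookup are assumptions, not the actual HIVE-12568 patch.

{noformat}
import java.net.InetAddress;
import java.net.UnknownHostException;

// Hypothetical helper for illustration; not the actual HIVE-12568 change.
public final class ServerAddressResolver {

  private ServerAddressResolver() {}

  /**
   * Resolve the address the remote driver should connect back to, using the
   * same logic HS2 uses: honor an explicitly configured bind host if present,
   * otherwise fall back to the local host name.
   *
   * @param configuredBindHost the HS2 bind-host setting (assumed to come from
   *                           a config such as hive.server2.thrift.bind.host);
   *                           may be null or empty
   */
  public static String resolveServerAddress(String configuredBindHost)
      throws UnknownHostException {
    if (configuredBindHost != null && !configuredBindHost.trim().isEmpty()) {
      return configuredBindHost.trim();
    }
    // Fallback: the local host name, rather than an arbitrarily chosen
    // network interface on a multi-homed machine.
    return InetAddress.getLocalHost().getHostName();
  }
}
{noformat}

The Spark client would then pass the resolved value, together with the listening port, to the remote driver, so both sides agree on the interface without exposing an extra parameter.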



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
