[ 
https://issues.apache.org/jira/browse/SPARK-21377?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Saisai Shao updated SPARK-21377:
--------------------------------
    Description: 
STR:
* Set the following config in spark-defaults.conf
{code}
spark.yarn.security.credentials.hbase.enabled true
spark.hbase.connector.security.credentials.enabled false
{code}
* Set the following config in hdfs-site.xml
{code}
<property>
  <name>dfs.namenode.delegation.token.max-lifetime</name>
  <value>43200000</value>
</property>
<property>
  <name>dfs.namenode.delegation.token.renew-interval</name>
  <value>28800000</value>
</property>
{code}
* Set the following config in hbase-site.xml
{code}
<property>
  <name>hbase.auth.token.max.lifetime</name>
  <value>28800000</value>
</property>
{code}
* Run an application with the SHC package
{code}
spark-submit --class org.apache.spark.sql.execution.datasources.hbase.examples.LRJobForDataSources \
  --master yarn-client --packages xxxx --num-executors 4 --driver-memory 512m \
  --executor-memory 512m --executor-cores 1 --keytab /xxx/user.headless.keytab \
  --principal x...@xx.com spark-*jar hiveTableInClient 1800000
{code}

After 8 hours (matching the configured 28800000 ms token lifetime and renew interval), the application fails with the error below.
{code}
17/06/28 06:33:43 INFO ClientCnxn: Opening socket connection to server xxx/xxx:2181. Will not attempt to authenticate using SASL (unknown error)
17/06/28 06:33:43 INFO ClientCnxn: Socket connection established to xxx/xxx:2181, initiating session
17/06/28 06:33:43 INFO ClientCnxn: Session establishment complete on server xxx/xxx:2181, sessionid = 0x25ced1d3ac20022, negotiated timeout = 90000
17/06/28 06:33:43 WARN AbstractRpcClient: Exception encountered while connecting to the server : org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.token.SecretManager$InvalidToken): Token has expired
... (the same InvalidToken warning repeats at 06:33:45, 06:33:48, 06:33:52, 06:34:02 and 06:34:12){code}

The jars pulled in via "--packages" are not added to the AM classpath. That is why the AM cannot obtain fresh HBase tokens, and the application fails once the initial token expires.

So we should figure out a solution: either add these dependencies to the AM classpath automatically, or provide a way to extend the AM classpath with user-specified jars.
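
A possible workaround sketch (untested, and the jar path/name is a placeholder): ship the connector jar explicitly with {{--jars}} so YARN localizes it into the AM container, then add it to the AM classpath via {{spark.yarn.am.extraClassPath}}, which applies to the Application Master in yarn-client mode:
{code}
spark-submit --master yarn-client \
  --jars /path/to/shc-connector.jar \
  --conf spark.yarn.am.extraClassPath=shc-connector.jar \
  ...
{code}
This sidesteps "--packages" for the credential-provider dependency, at the cost of managing the jar manually; a proper fix would make "--packages" jars visible to the AM without such manual configuration.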


> Jars pulled from "--packages" are not added into AM classpath
> -------------------------------------------------------------
>
>                 Key: SPARK-21377
>                 URL: https://issues.apache.org/jira/browse/SPARK-21377
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 2.2.0
>            Reporter: Yesha Vora
>            Priority: Minor
>



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
