[ https://issues.apache.org/jira/browse/SPARK-21888?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Thomas Graves updated SPARK-21888:
----------------------------------
    Description: 
While running Spark on YARN in cluster mode, there is currently no way to add config files to the Client classpath. For example, suppose you want to run an application that uses HBase: unless the necessary HBase config files are copied into the Spark configuration directory, their locations cannot be placed on the Client classpath, which could previously be done by setting the environment variable "SPARK_CLASSPATH".

  (was: While running Spark on YARN in cluster mode, there is currently no way to add config files, jars etc. to the Client classpath. For example, suppose you want to run an application that uses HBase: unless the necessary HBase config files are copied into the Spark configuration directory, their locations cannot be placed on the Client classpath, which could previously be done by setting the environment variable "SPARK_CLASSPATH".)

> Cannot add stuff to Client Classpath for Yarn Cluster Mode
> ----------------------------------------------------------
>
>                 Key: SPARK-21888
>                 URL: https://issues.apache.org/jira/browse/SPARK-21888
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Parth Gandhi
>            Priority: Minor
>
> While running Spark on YARN in cluster mode, there is currently no way to add config files to the Client classpath. For example, suppose you want to run an application that uses HBase: unless the necessary HBase config files are copied into the Spark configuration directory, their locations cannot be placed on the Client classpath, which could previously be done by setting the environment variable "SPARK_CLASSPATH".
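
For illustration, a minimal sketch (not part of the original report) of why the HBase config files have to be visible on the classpath of whichever JVM builds the HBase connection: HBaseConfiguration.create() resolves hbase-site.xml from the JVM classpath. The snippet assumes the HBase client libraries are available, and the table name is hypothetical.

{code:scala}
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.ConnectionFactory

object HBaseClasspathSketch {
  def main(args: Array[String]): Unit = {
    // HBaseConfiguration.create() loads hbase-site.xml from the classpath.
    // If the file is not on the classpath, the configuration falls back to
    // defaults (e.g. a localhost ZooKeeper quorum) and the connection attempt fails.
    val conf = HBaseConfiguration.create()

    val connection = ConnectionFactory.createConnection(conf)
    try {
      // "my_table" is a hypothetical table name used only for illustration.
      val table = connection.getTable(TableName.valueOf("my_table"))
      println(s"Resolved table: ${table.getName}")
    } finally {
      connection.close()
    }
  }
}
{code}

Roughly speaking, spark.driver.extraClassPath (or the deprecated SPARK_CLASSPATH variable it replaced) extends the driver JVM's classpath in client mode, and --files ships hbase-site.xml to the YARN containers in cluster mode; neither appears to put the file on the classpath of the Client process that submits the application in cluster mode, which is the gap this issue describes.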