GitHub user yaooqinn opened a pull request:

    https://github.com/apache/spark/pull/19663

    [SPARK-21888][Hive] Add hadoop/hive/hdfs configuration files in SPARK_CONF_DIR to the distributed archive

    ## What changes were proposed in this pull request?
    When I ran self-contained SQL apps, such as
    ```scala
    import org.apache.spark.sql.SparkSession
    
    object ShowHiveTables {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession
          .builder()
          .appName("Show Hive Tables")
          .enableHiveSupport()
          .getOrCreate()
        spark.sql("show tables").show()
        spark.stop()
      }
    }
    ```
    in **yarn cluster** mode with `hive-site.xml` correctly placed in `$SPARK_HOME/conf`, they failed to connect to the right Hive metastore because `hive-site.xml` was not on the AM/Driver's classpath.
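    For illustration, a minimal cluster-mode submission of the app above looks roughly like this (the jar path is hypothetical); `hive-site.xml` sits in `$SPARK_HOME/conf` on the submitting machine but never reaches the AM/Driver:
    ```bash
    # hive-site.xml is present locally in $SPARK_HOME/conf, yet the AM/Driver
    # launched by YARN does not see it on its classpath, so the app typically
    # falls back to a default/embedded metastore instead of the intended one.
    $SPARK_HOME/bin/spark-submit \
      --class ShowHiveTables \
      --master yarn \
      --deploy-mode cluster \
      /path/to/show-hive-tables.jar
    ```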
    
    Although submitting them with `--files/--jars local/path/to/hive-site.xml`, or putting the file in `$HADOOP_CONF_DIR`/`$YARN_CONF_DIR`, makes these apps work as well in cluster mode as in client mode, the official doc, http://spark.apache.org/docs/latest/sql-programming-guide.html#hive-tables, says:
    > Configuration of Hive is done by placing your hive-site.xml, core-site.xml (for security configuration), and hdfs-site.xml (for HDFS configuration) file in conf/.
    
    We may want to respect these configuration files in cluster mode too, or modify the Hive tables documentation to cover cluster mode.
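    A sketch of the `--files` workaround mentioned above (the jar path is hypothetical); shipping `hive-site.xml` explicitly places it in the YARN container's working directory, where the AM/Driver can pick it up:
    ```bash
    # Explicitly distribute hive-site.xml so the AM/Driver in cluster mode
    # sees the correct metastore configuration.
    $SPARK_HOME/bin/spark-submit \
      --class ShowHiveTables \
      --master yarn \
      --deploy-mode cluster \
      --files $SPARK_HOME/conf/hive-site.xml \
      /path/to/show-hive-tables.jar
    ```
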
    ## How was this patch tested?
    
    cc @cloud-fan @gatorsmile 


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/yaooqinn/spark SPARK-21888

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19663.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19663
    
----
commit 696dbd6e5b5af89752c3869264f70ceddb868baf
Author: Kent Yao <yaooq...@hotmail.com>
Date:   2017-11-06T04:56:04Z

    add hadoop/hive/hdfs configuration files in SPARK_CONF_DIR to distribute archive

----


---
