Would Hive configurations like the following also work with this approach?

    sqlContext.setConf("hive.default.fileformat", "Orc")
    sqlContext.setConf("hive.exec.orc.memory.pool", "1.0")
    sqlContext.setConf("hive.optimize.sort.dynamic.partition", "true")
    sqlContext.setConf("hive.exec.reducers.max", "2000")
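
For reference, here is a minimal sketch of how I understand the Spark 2.x equivalent would look (this is my assumption, not something confirmed in the thread): `spark.conf.set` on a Hive-enabled `SparkSession` should accept Hive properties the same way it accepts `spark.sql.*` ones.

    import org.apache.spark.sql.SparkSession

    // Build a session with Hive support so Hive configs are meaningful.
    val spark = SparkSession.builder()
      .appName("HiveConfExample") // hypothetical app name
      .enableHiveSupport()
      .getOrCreate()

    // spark.conf.set takes the same key/value form for Hive properties
    // as it does for Spark SQL properties.
    spark.conf.set("hive.default.fileformat", "Orc")
    spark.conf.set("hive.optimize.sort.dynamic.partition", "true")
    spark.conf.set("hive.exec.reducers.max", "2000")
    spark.conf.set("spark.sql.shuffle.partitions", 2000)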

On Mon, Feb 27, 2017 at 9:26 AM, neil90 <neilp1...@icloud.com> wrote:

> All you need to do is -
>
> spark.conf.set("spark.sql.shuffle.partitions", 2000)
> spark.conf.set("spark.sql.orc.filterPushdown", True)
> ...etc
>
>
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-set-hive-configs-in-Spark-2-1-tp28429p28431.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>