Try sparkSession.conf.set instead.
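For example, a minimal sketch assuming a Hive-enabled SparkSession named spark; the 2000 value is only illustrative and just needs to exceed the 1344 partitions reported in the error:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .enableHiveSupport()
  .getOrCreate()

// Raise the Hive limit above the 1344 partitions reported in the error,
// before running the dynamic-partition insert.
spark.conf.set("hive.exec.max.dynamic.partitions", "2000")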

> On 28. Jul 2017, at 12:19, Chetan Khatri <chetan.opensou...@gmail.com> wrote:
> 
> Hey Dev/User,
> 
> I am working with Spark 2.0.1 and dynamic partitioning with Hive, and I am facing
> the issue below:
> 
> org.apache.hadoop.hive.ql.metadata.HiveException:
> Number of dynamic partitions created is 1344, which is more than 1000.
> To solve this try to set hive.exec.max.dynamic.partitions to at least 1344.
> 
> I tried the options below, but they failed:
> 
> val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
> 
> spark.sqlContext.setConf("hive.exec.max.dynamic.partitions", "2000")
> 
> Please help with an alternate workaround!
> 
> Thanks
