Hi,

I am using Spark 0.14. I am trying to insert data into a Hive table (in ORC format) from a DataFrame.

partitionedTestDF.write
  .format("org.apache.spark.sql.hive.orc.DefaultSource")
  .mode(org.apache.spark.sql.SaveMode.Append)
  .partitionBy("zone", "z", "year", "month")
  .saveAsTable("testorc")

When this job is submitted via spark-submit, I get:
Exception in thread "main" java.lang.RuntimeException: Tables created with SQLContext must be TEMPORARY. Use a HiveContext instead

But the job works fine in spark-shell. What could be wrong?
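For context, this error usually appears when the code runs against a plain SQLContext: spark-shell works because it pre-creates a Hive-enabled context and exposes it as `sqlContext`, while a spark-submit application has to construct one itself. A minimal sketch of the setup the submitted application would likely need (Spark 1.x APIs, inferred from the error message; the app name is a placeholder):

```scala
// Sketch, assuming Spark 1.x: saveAsTable on a persistent (non-TEMPORARY)
// table requires a Hive-enabled context backed by the Hive metastore.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf().setAppName("OrcWriter")  // placeholder app name
val sc = new SparkContext(conf)

// spark-shell creates this automatically; spark-submit apps must do it
// explicitly. A plain org.apache.spark.sql.SQLContext here would raise
// "Tables created with SQLContext must be TEMPORARY".
val sqlContext = new HiveContext(sc)
```

With the HiveContext in place, the DataFrame is created from it and the same `write ... saveAsTable("testorc")` call can persist to the metastore.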

BR,
Patcharee

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
