As the error message says, were you using a |SQLContext| instead of a
|HiveContext| to create the DataFrame?
In the Spark shell, although the variable name is |sqlContext|, the type of
that variable is actually |org.apache.spark.sql.hive.HiveContext|, which
can communicate with the Hive metastore.
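In a standalone application you have to construct the |HiveContext| yourself; only the shell does it for you. A minimal sketch (assuming Spark 1.x with the spark-hive module on the classpath; the app name is just an example):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setAppName("OrcInsertExample"))
// HiveContext extends SQLContext and adds Hive metastore support,
// which is required for persistent (non-temporary) tables.
val hiveContext = new HiveContext(sc)
// Use hiveContext (not a plain SQLContext) to create the DataFrame
// that you later saveAsTable.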
Cheng
On 6/13/15 12:36 PM, pth001 wrote:
Hi,
I am using spark 0.14. I am trying to insert data into a Hive table (in
ORC format) from a DataFrame.
partitionedTestDF.write
  .format("org.apache.spark.sql.hive.orc.DefaultSource")
  .mode(org.apache.spark.sql.SaveMode.Append)
  .partitionBy("zone", "z", "year", "month")
  .saveAsTable("testorc")
When this job is submitted via spark-submit, I get:
Exception in thread "main" java.lang.RuntimeException: Tables created
with SQLContext must be TEMPORARY. Use a HiveContext instead
But the job works fine in spark-shell. What could be wrong?
BR,
Patcharee
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org