Re: Dataframe Write : Tables created with SQLContext must be TEMPORARY. Use a HiveContext instead.

2015-06-13 Thread pth001

I got it. Thanks!
Patcharee

On 13/06/15 23:00, Will Briggs wrote:

The context that is created by spark-shell is actually an instance of 
HiveContext. If you want to use it programmatically in your driver, you need to 
make sure that your context is a HiveContext, and not a SQLContext.

https://spark.apache.org/docs/latest/sql-programming-guide.html#hive-tables
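In a standalone driver that looks roughly like the following sketch (object and app names are hypothetical; it assumes a Spark 1.x build with Hive support on the classpath):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Hypothetical driver: the point is constructing a HiveContext
// (not a plain SQLContext) before calling saveAsTable.
object OrcWriterApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("OrcWriterApp")
    val sc = new SparkContext(conf)
    // HiveContext can talk to the Hive metastore, so saveAsTable
    // creates a persistent (non-TEMPORARY) table.
    val hiveContext = new HiveContext(sc)
    // ... build partitionedTestDF from hiveContext, then write it
    // with .partitionBy(...).saveAsTable(...) as in your snippet ...
  }
}
```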

Hope this helps,
Will

On June 13, 2015, at 3:36 PM, pth001 patcharee.thong...@uni.no wrote:

Hi,

I am using Spark 0.14. I am trying to insert data into a Hive table (in ORC
format) from a DataFrame.

partitionedTestDF.write.format("org.apache.spark.sql.hive.orc.DefaultSource")
  .mode(org.apache.spark.sql.SaveMode.Append)
  .partitionBy("zone", "z", "year", "month")
  .saveAsTable("testorc")

When this job is submitted via spark-submit I get:
Exception in thread "main" java.lang.RuntimeException: Tables created
with SQLContext must be TEMPORARY. Use a HiveContext instead

But the job works fine in spark-shell. What could be wrong?

BR,
Patcharee

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org







Re: Dataframe Write : Tables created with SQLContext must be TEMPORARY. Use a HiveContext instead.

2015-06-13 Thread Cheng Lian
As the error message says, were you using a SQLContext instead of a
HiveContext to create the DataFrame?


In the Spark shell, although the variable name is sqlContext, the type of
that variable is actually org.apache.spark.sql.hive.HiveContext, which
has the ability to communicate with the Hive metastore.
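A quick way to confirm this is to inspect the runtime class of the pre-built context from inside spark-shell (assumes a Spark 1.x shell built with Hive support, where sqlContext is predefined):

```scala
// On a Hive-enabled build this prints
// org.apache.spark.sql.hive.HiveContext rather than
// org.apache.spark.sql.SQLContext.
println(sqlContext.getClass.getName)
```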


Cheng

On 6/13/15 12:36 PM, pth001 wrote:


Hi,

I am using Spark 0.14. I am trying to insert data into a Hive table (in ORC
format) from a DataFrame.


partitionedTestDF.write.format("org.apache.spark.sql.hive.orc.DefaultSource")
  .mode(org.apache.spark.sql.SaveMode.Append)
  .partitionBy("zone", "z", "year", "month")
  .saveAsTable("testorc")



When this job is submitted via spark-submit I get:
Exception in thread "main" java.lang.RuntimeException: Tables created
with SQLContext must be TEMPORARY. Use a HiveContext instead


But the job works fine in spark-shell. What could be wrong?

BR,
Patcharee





Dataframe Write : Tables created with SQLContext must be TEMPORARY. Use a HiveContext instead.

2015-06-13 Thread pth001

Hi,

I am using Spark 0.14. I am trying to insert data into a Hive table (in ORC
format) from a DataFrame.


partitionedTestDF.write.format("org.apache.spark.sql.hive.orc.DefaultSource")
  .mode(org.apache.spark.sql.SaveMode.Append)
  .partitionBy("zone", "z", "year", "month")
  .saveAsTable("testorc")

When this job is submitted via spark-submit I get:
Exception in thread "main" java.lang.RuntimeException: Tables created
with SQLContext must be TEMPORARY. Use a HiveContext instead


But the job works fine in spark-shell. What could be wrong?

BR,
Patcharee
