You need to compile Spark with -Phive and use a HiveContext for saveAsTable to
work.  You can use saveAsParquetFile with a plain SQLContext, however.
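
For what it's worth, here is a minimal sketch against the Spark 1.1-era Python
API; the toy rows, schema, context names, and Parquet path are illustrative
stand-ins rather than your actual data:

from pyspark import SparkContext
from pyspark.sql import HiveContext, SQLContext
from pyspark.sql import StructType, StructField, StringType

sc = SparkContext(appName="SaveAsTableExample")

# Toy rows standing in for the 'transactions' RDD in your message.
rows = sc.parallelize([("g1", "i1"), ("g2", "i2")])
schema = StructType([StructField("GUID", StringType(), True),
                     StructField("INSTANCE", StringType(), True)])

# Option 1: saveAsTable needs a HiveContext (and a build with -Phive).
hiveCtx = HiveContext(sc)
hiveCtx.applySchema(rows, schema).saveAsTable("transactions")

# Option 2: a plain SQLContext can still persist the data as Parquet files,
# which you can later load with parquetFile() and registerTempTable().
sqlCtx = SQLContext(sc)
sqlCtx.applySchema(rows, schema).saveAsParquetFile("/tmp/transactions.parquet")

With the HiveContext route, the managed table should end up under the Hive
warehouse directory configured by your hive-site.xml
(hive.metastore.warehouse.dir, /user/hive/warehouse by default), not anywhere
tied to the Thrift server itself.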

On Mon, Nov 3, 2014 at 3:05 AM, Addanki, Santosh Kumar <
santosh.kumar.adda...@sap.com> wrote:

>  Hi,
>
> I have a SchemaRDD created like below:
>
> schemaTransactions = sqlContext.applySchema(transactions,schema);
>
> When I try to save the SchemaRDD as a table using
>
> schemaTransactions.saveAsTable("transactions"), I get the error below:
>
> Py4JJavaError: An error occurred while calling o70.saveAsTable.
>
> : java.lang.AssertionError: assertion failed: No plan for
> InsertIntoCreatedTable None, transactions
>
> SparkLogicalPlan (ExistingRdd
> [GUID#21,INSTANCE#22,TRANSID#23,CONTEXT_ID#24,DIALOG_STEP#25,REPORT#26,ACCOUNT#27,MANDT#28,ACTION#29,TASKTYPE#30,TCODE#31,F12#32,F13#33,STARTDATE#34,STARTTIME#35,F16#36,RESPTIME#37,F18#38,F19#39,F20#40,F21#41],
> MapPartitionsRDD[11] at mapPartitions at SQLContext.scala:522)
>
> Also, I have copied my hive-site.xml to the Spark conf folder and started
> the Thrift server. So where does saveAsTable store the table? In Hive?
>
> Best Regards
>
> Santosh
>
