Currently, saveAsTable creates a Hive managed (internal) table by default; see here:
<http://spark.apache.org/docs/latest/sql-programming-guide.html#saving-to-persistent-tables>


If you want to save it as an external table, write the data out with
saveAsParquetFile (df.write.parquet in Spark 1.4+) and then create an
external Hive table over that Parquet directory.
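The two steps above can be sketched as follows. This is a sketch, not a definitive recipe: the table name, column list, and path are illustrative, and it assumes a HiveContext (Spark 1.5-era API) is in scope. The DDL-building helper is hypothetical, introduced only to make the statement easy to check.

```scala
// Hypothetical helper: builds the CREATE EXTERNAL TABLE statement for a
// directory of Parquet files. An EXTERNAL table means dropping the table
// leaves the underlying files in place.
def externalParquetDDL(table: String, columns: String, location: String): String =
  s"CREATE EXTERNAL TABLE $table ($columns) STORED AS PARQUET LOCATION '$location'"

// With a SQLContext/HiveContext in scope (names and path are illustrative):
//   df.write.parquet("/tmp/events_parquet")                // 1. dump RDD/DataFrame as Parquet
//   sqlContext.sql(externalParquetDDL(                     // 2. register external table over it
//     "events_ext", "id BIGINT, name STRING", "/tmp/events_parquet"))
```

Because the table is external, the Parquet files survive a later DROP TABLE, which is usually the point of this pattern.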

On Mon, Dec 7, 2015 at 3:13 PM, Fengdong Yu <fengdo...@everstring.com>
wrote:

> If your RDD is JSON format, that’s easy.
>
> val df = sqlContext.read.json(rdd)
> df.saveAsTable("your_table_name")
>
>
>
> > On Dec 7, 2015, at 5:28 PM, Divya Gehlot <divya.htco...@gmail.com>
> wrote:
> >
> > Hi,
> > I am a newbie to Spark.
> > Could somebody guide me on how I can persist my Spark RDD results in Hive
> using the saveAsTable API?
> > I would appreciate it if you could provide an example for a Hive external
> table.
> >
> > Thanks in advance.
> >
> >
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
