Thanks, guys. Prashanth's idea worked for me. Much appreciated!

On Sun, Feb 11, 2018 at 10:20 AM, prashanth t <tprashanth...@gmail.com>
wrote:

> Hi Lian,
>
> Please add the command below before creating the table:
> "USE <database_name>"
> By default, saveAsTable uses Hive's default database. You might not have
> access to it, which is what's causing the problem.
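>
> For example, a minimal sketch of that approach (the database name "mydb"
> is just an illustration, and siDS is the Dataset from the message below):
>
> spark.sql("CREATE DATABASE IF NOT EXISTS mydb")  // create a database you can write to
> spark.sql("USE mydb")                            // make it the current database
> siDS.write.saveAsTable("siDS")                   // the managed table is now created in mydb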
>
>
>
> Thanks
> Prashanth Thipparthi
>
>
>
>
>
> On 11 Feb 2018 10:45 pm, "Lian Jiang" <jiangok2...@gmail.com> wrote:
>
> I started spark-shell with the command below:
>
> spark-shell --master yarn --conf spark.sql.warehouse.dir="/user/spark"
>
> In spark-shell, the statement below creates a managed table under the
> /user/spark HDFS folder:
>
> spark.sql("CREATE TABLE t5 (i int) USING PARQUET")
>
> However, the statements below still use a spark-warehouse directory in the
> local folder, e.g. {currentfolder}/spark-warehouse.
>
> case class SymbolInfo(name: String, sector: String)
>
> val siDS = Seq(
>   SymbolInfo("AAPL", "IT"),
>   SymbolInfo("GOOG", "IT")
> ).toDS()
>
> siDS.write.saveAsTable("siDS")
>
> How can I make saveAsTable respect spark.sql.warehouse.dir when creating
> a managed table? Appreciate any help!
>
>
>
