See this thread http://search-hadoop.com/m/q3RTt0NFls1XATV02
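
In case that link goes stale: as far as I recall, HiveContext.table() in 1.4 does not
accept a database-qualified name, so the usual workaround is to switch the current
database first and then refer to the table unqualified. Rough sketch only, not tested
against your setup ("other_db"/"mytable" are the names from your message, and sc is
assumed to be an existing SparkContext, e.g. from spark-shell):

    import org.apache.spark.sql.hive.HiveContext

    val hiveContext = new HiveContext(sc)

    // Make "other_db" the current database for this session...
    hiveContext.sql("USE other_db")

    // ...then look the table up without the "other_db." prefix.
    val dataframe = hiveContext.table("mytable")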

Cheers

On Tue, Jul 7, 2015 at 11:07 AM, Arun Luthra <arun.lut...@gmail.com> wrote:

>
> https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.hive.HiveContext
>
> I'm getting org.apache.spark.sql.catalyst.analysis.NoSuchTableException
> from:
>
> val dataframe = hiveContext.table("other_db.mytable")
>
> Do I have to change the current database to access it? Is it possible to
> do this? I'm guessing that the "database.table" syntax that I used in
> hiveContext.table is not recognized.
>
> I have no problems accessing tables in the database called "default".
>
> I can list tables in "other_db" with hiveContext.tableNames("other_db").
>
> Using Spark 1.4.0.
>
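
For what it's worth, going through SQL directly is another way around it, since the
SQL parser does accept fully qualified names. Again just a sketch, using the
database/table names from the message above and the same hiveContext:

    // Qualified names work through the SQL parser even though
    // hiveContext.table("other_db.mytable") throws NoSuchTableException.
    val dataframe = hiveContext.sql("SELECT * FROM other_db.mytable")

    // Sanity check: confirm the table shows up when listing that database.
    hiveContext.tableNames("other_db").foreach(println)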
