https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.hive.HiveContext

I'm getting an org.apache.spark.sql.catalyst.analysis.NoSuchTableException from:

val dataframe = hiveContext.table("other_db.mytable")
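For context, this is roughly the whole setup, just a minimal sketch assuming an already-configured SparkContext named sc and that other_db.mytable exists in the Hive metastore:

import org.apache.spark.sql.hive.HiveContext

// Minimal sketch, assuming an existing SparkContext `sc`
val hiveContext = new HiveContext(sc)

// This is the call that throws NoSuchTableException:
val dataframe = hiveContext.table("other_db.mytable")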

Do I have to change the current database to access it, and if so, how? My guess is that the "database.table" syntax I passed to hiveContext.table is not recognized.
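In case it matters, these are the two workarounds I would otherwise fall back on (just a sketch using the same names as above; I haven't confirmed either is the intended approach):

// (a) switch the current database first, then use the bare table name
hiveContext.sql("USE other_db")
val viaUse = hiveContext.table("mytable")

// (b) fully qualify the table inside a SQL statement instead
val viaSql = hiveContext.sql("SELECT * FROM other_db.mytable")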

I have no problems accessing tables in the database called "default".

I can list the tables in "other_db" with hiveContext.tableNames("other_db").
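For example (same names as above; I'd expect "mytable" to show up in this list):

val names = hiveContext.tableNames("other_db")
println(names.mkString(", "))
println(names.contains("mytable"))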

Using Spark 1.4.0.
