https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.hive.HiveContext
I'm getting org.apache.spark.sql.catalyst.analysis.NoSuchTableException from:

val dataframe = hiveContext.table("other_db.mytable")
Do I have to change the current database to access it, or is it possible to refer to the table by a database-qualified name?
See this thread http://search-hadoop.com/m/q3RTt0NFls1XATV02
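In short, a sketch of the usual workaround (assuming Spark 1.x, a HiveContext already created as hiveContext, and the database/table names from your message; in that version, table() did not accept a db-qualified name):

```scala
// Sketch only: assumes an existing HiveContext named `hiveContext`
// and a Hive database `other_db` containing `mytable` (from the thread).

// Option 1: switch the current database, then look the table up by name.
hiveContext.sql("USE other_db")
val df1 = hiveContext.table("mytable")

// Option 2: stay in the current database and qualify the table in SQL.
val df2 = hiveContext.sql("SELECT * FROM other_db.mytable")
```

Either way the result is a DataFrame; only the lookup mechanism differs.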
Cheers
On Tue, Jul 7, 2015 at 11:07 AM, Arun Luthra arun.lut...@gmail.com wrote: