Hi,

I am working on an application that moves data from multiple Hive tables into
a single external Hive table using joins.

Spark SQL is able to insert data into the table, but it loses the connection
to the metastore after the insert. I still have 3 more queries to run that
insert data into other external Hive tables. Has anyone come across this
issue?
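
For reference, the job is structured roughly like the sketch below (Spark 1.x
with a HiveContext is assumed; the target tables other than daastest.analytics
and the plain SELECTs are placeholders, not the real statements):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object LoadExternalTables {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("LoadExternalTables"))
    val hiveContext = new HiveContext(sc)

    // Four INSERT ... SELECT statements, one per target external table.
    // The real statements join several source tables; plain SELECTs are
    // used here only to keep the sketch self-contained.
    val inserts = Seq(
      "insert into table daastest.analytics select * from daastest.staging1",
      "insert into table daastest.target2 select * from daastest.staging2",
      "insert into table daastest.target3 select * from daastest.staging3",
      "insert into table daastest.target4 select * from daastest.staging4"
    )

    // The first statement completes; the MetaStoreClient warning and the
    // TApplicationException below appear before the remaining three run.
    inserts.foreach(q => hiveContext.sql(q))

    sc.stop()
  }
}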

*Logs:*

16/07/31 15:53:58 WARN ExecutorAllocationManager: Unable to reach the
cluster manager to request 2 total executors!
16/07/31 15:53:59 WARN SparkContext: Requesting executors is only supported
in coarse-grained mode
*16/07/31 15:54:00 WARN RetryingMetaStoreClient: MetaStoreClient lost
connection. Attempting to reconnect.*
*org.apache.thrift.TApplicationException: Invalid method name:
'alter_table_with_cascade'*
*        at
org.apache.thrift.TApplicationException.read(TApplicationException.java:111)*

The above log appears after data has been inserted into one table, and the
job is aborted.

Query:

hiveContext.sql("insert into daastest.analytics <select query joining the
other source tables>");
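
Written out, the call above looks roughly like the following; the source
tables, aliases, and join keys are placeholders, not the actual schema:

hiveContext.sql(
  """insert into table daastest.analytics
    |select a.customer_id, a.event_date, b.amount
    |from daastest.source_a a
    |join daastest.source_b b on a.customer_id = b.customer_id""".stripMargin)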


Thanks,
Asmath.
