Hi,

I'm using Spark on top of Hive.
Since I want to keep the old tables, I store the DataFrame into a
temporary table in Hive, and when the job finishes successfully I rename
the table.
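The tmp-table-then-rename flow can be sketched roughly like this. The helper names and the timestamp format are assumptions inferred from the table name in the error below (the actual Spark calls, e.g. df.saveAsTable(tmp) followed by sqlContext.sql(...) on Spark 1.4, are only indicated in comments):

```python
from datetime import datetime

def tmp_table_name(base, ts=None):
    """Build a timestamped temp-table name like <base>_29092015_111704_tmp
    (format assumed from the table name in the error message below)."""
    ts = ts or datetime.now()
    return "{}_{}_tmp".format(base, ts.strftime("%d%m%Y_%H%M%S"))

def rename_statement(tmp_name, final_name):
    """HiveQL to promote the temp table once the write has succeeded."""
    return "ALTER TABLE {} RENAME TO {}".format(tmp_name, final_name)

# Illustrative usage from a Spark 1.4 job (not runnable here):
#   tmp = tmp_table_name("my_table")
#   df.saveAsTable(tmp)
#   sqlContext.sql(rename_statement(tmp, "my_table"))
```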

In the last few days I've upgraded to Spark 1.4.1, and since I'm using
AWS EMR I got Hive 1.0.
Now when I try to rename the table I get the following error:

Caused by: InvalidOperationException(message:Unable to access old location
hdfs://ip-10-140-189-94.ec2.internal:8020/user/hive/warehouse/<table_name>_29092015_111704_tmp
for table default.<table_name>_29092015_111704_tmp)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$alter_table_with_environment_context_result$alter_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:34066)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$alter_table_with_environment_context_result$alter_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:34052)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$alter_table_with_environment_context_result.read(ThriftHiveMetastore.java:33994)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_alter_table_with_environment_context(ThriftHiveMetastore.java:1163)
    at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.alter_table_with_environment_context(ThriftHiveMetastore.java:1147)


I suspect this is the bug:
https://issues.apache.org/jira/browse/HIVE-10719, but it's strange
because the rename works from the Hive CLI.
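Since the rename does succeed from the Hive CLI, one possible workaround would be to shell out to the CLI from the driver instead of going through Spark's metastore client. This is only a sketch, assuming the `hive` binary is on the PATH of the driver node (the injectable `run` parameter is just there so the helper can be exercised without a cluster):

```python
import subprocess

def rename_via_hive_cli(tmp_name, final_name, run=subprocess.check_call):
    """Issue the table rename through the Hive CLI rather than the
    metastore Thrift API, since the CLI path handles the rename.
    Assumes the `hive` binary is available on the driver node."""
    stmt = "ALTER TABLE {} RENAME TO {}".format(tmp_name, final_name)
    run(["hive", "-e", stmt])  # raises CalledProcessError on failure
    return stmt
```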

Has anyone encountered this?
Is there a workaround?

Thanks,
Ophir
