Vandana Yadav created CARBONDATA-2210:
-----------------------------------------

             Summary: Not able to rename a partitioned table on cluster
                 Key: CARBONDATA-2210
                 URL: https://issues.apache.org/jira/browse/CARBONDATA-2210
             Project: CarbonData
          Issue Type: Bug
          Components: data-query
    Affects Versions: 1.4.0
         Environment: spark 2.2.1
            Reporter: Vandana Yadav


Not able to rename a partitioned table on a cluster.

Steps to reproduce:

1) Create a Hive table and load data into it:

CREATE TABLE uniqdata_hive (CUST_ID int, CUST_NAME string, ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint, BIGINT_COLUMN2 bigint, DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10), Double_COLUMN1 double, Double_COLUMN2 double, INTEGER_COLUMN1 int) ROW FORMAT DELIMITED FIELDS TERMINATED BY ','

LOAD DATA LOCAL INPATH '/opt/Carbon/CarbonData/TestData/Data/uniqdata/2000_UniqData.csv' INTO TABLE uniqdata_hive

2) Create a partitioned carbon table and insert data into it:

CREATE TABLE uniqdata_int (CUST_NAME string, ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint, BIGINT_COLUMN2 bigint, DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10), Double_COLUMN1 double, Double_COLUMN2 double, INTEGER_COLUMN1 int) PARTITIONED BY (cust_id int) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES ("TABLE_BLOCKSIZE"="256 MB")

insert into uniqdata_int partition(cust_id) select * from uniqdata_hive limit 2

3) Rename the table:

alter table uniqdata_int rename to uniqdata_bigint1
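
For context, these queries were issued through SparkSession.sql from a JUnit harness (see the SelectQuery/SessionManager frames in the stack traces below). A minimal stand-alone driver sketch, assuming Spark 2.2.1 with CarbonData on the classpath; the session setup here is illustrative, not the harness actually used:

import org.apache.spark.sql.SparkSession;

public class RenameRepro {
    public static void main(String[] args) {
        // Illustrative session; the actual cluster configuration is not
        // shown in this report.
        SparkSession spark = SparkSession.builder()
                .appName("CARBONDATA-2210 repro")
                .getOrCreate();
        // Step 3 above: on the cluster this fails with
        // ProcessMetaDataException (see logs below).
        spark.sql("alter table uniqdata_int rename to uniqdata_bigint1");
    }
}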

Expected Result: The table should be renamed successfully.

Actual Result: The rename fails with:

operation failed for default.uniqdata_int: Alter table rename table operation failed: Folder rename failed for table default.uniqdata_int
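
The HDFSCarbonFile stack trace in the logs below points at the underlying cause: the HDFS NameNode rejects the folder rename because the destination directory already exists and is not empty. A minimal sketch of the rename semantics involved, using the standard Hadoop FileContext API (illustrative only, not CarbonData code; assumes fs.defaultFS points at hdfs://hadoop-master:54311 as in the logs):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileContext;
import org.apache.hadoop.fs.Options;
import org.apache.hadoop.fs.Path;

public class RenameSemantics {
    public static void main(String[] args) throws Exception {
        FileContext fc = FileContext.getFileContext(new Configuration());
        // Store paths as they appear in the logs below.
        Path src = new Path("/opt/CarbonStore/default/uniqdata_int");
        Path dst = new Path("/opt/CarbonStore/default/uniqdata_bigint1");
        // If dst exists as a non-empty directory, the NameNode throws
        // "rename destination directory is not empty" -- the same error
        // surfaced through rename2 in the stack trace below.
        fc.rename(src, dst, Options.Rename.NONE);
    }
}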

 

Logs:
[exec] 18/02/27 16:38:16 INFO SelectQuery: Executing Query: alter table 
uniqdata_int rename to uniqdata_bigint1
     [exec] 18/02/27 16:38:16 INFO CarbonSparkSqlParser: Parsing command: alter 
table uniqdata_int rename to uniqdata_bigint1
     [exec] 18/02/27 16:38:16 INFO CarbonLateDecodeRule: main skip 
CarbonOptimizer
     [exec] 18/02/27 16:38:16 INFO CarbonLateDecodeRule: main Skip 
CarbonOptimizer
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_table : db=default 
tbl=uniqdata_int
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_table : db=default tbl=uniqdata_int     
     [exec] 18/02/27 16:38:16 INFO CatalystSqlParser: Parsing command: int
     [exec] 18/02/27 16:38:16 INFO CatalystSqlParser: Parsing command: 
array<string>
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_table : db=default 
tbl=uniqdata_bigint1
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_table : db=default tbl=uniqdata_bigint1 
     [exec] 18/02/27 16:38:16 AUDIT CarbonAlterTableRenameCommand: 
[hadoop-master][root][Thread-1]Rename table request has been received for 
default.uniqdata_int
     [exec] 18/02/27 16:38:16 INFO CarbonAlterTableRenameCommand: main Rename 
table request has been received for default.uniqdata_int
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_table : db=default 
tbl=uniqdata_int
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_table : db=default tbl=uniqdata_int     
     [exec] 18/02/27 16:38:16 INFO CatalystSqlParser: Parsing command: int
     [exec] 18/02/27 16:38:16 INFO CatalystSqlParser: Parsing command: 
array<string>
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_table : db=default 
tbl=uniqdata_int
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_table : db=default tbl=uniqdata_int     
     [exec] 18/02/27 16:38:16 INFO CatalystSqlParser: Parsing command: int
     [exec] 18/02/27 16:38:16 INFO CatalystSqlParser: Parsing command: 
array<string>
     [exec] 18/02/27 16:38:16 INFO HdfsFileLock: main HDFS lock 
path:hdfs://hadoop-master:54311/opt/CarbonStore/default/uniqdata_int/meta.lock
     [exec] 18/02/27 16:38:16 INFO CarbonLockUtil: main Trying to acquire lock: 
meta.lockfor table: default_uniqdata_int
     [exec] 18/02/27 16:38:16 INFO CarbonLockUtil: main Successfully acquired 
the lock meta.lockfor table: default_uniqdata_int
     [exec] 18/02/27 16:38:16 INFO HdfsFileLock: main HDFS lock 
path:hdfs://hadoop-master:54311/opt/CarbonStore/default/uniqdata_int/compaction.lock
     [exec] 18/02/27 16:38:16 INFO CarbonLockUtil: main Trying to acquire lock: 
compaction.lockfor table: default_uniqdata_int
     [exec] 18/02/27 16:38:16 INFO CarbonLockUtil: main Successfully acquired 
the lock compaction.lockfor table: default_uniqdata_int
     [exec] 18/02/27 16:38:16 INFO HdfsFileLock: main HDFS lock 
path:hdfs://hadoop-master:54311/opt/CarbonStore/default/uniqdata_int/delete_segment.lock
     [exec] 18/02/27 16:38:16 INFO CarbonLockUtil: main Trying to acquire lock: 
delete_segment.lockfor table: default_uniqdata_int
     [exec] 18/02/27 16:38:16 INFO CarbonLockUtil: main Successfully acquired 
the lock delete_segment.lockfor table: default_uniqdata_int
     [exec] 18/02/27 16:38:16 INFO HdfsFileLock: main HDFS lock 
path:hdfs://hadoop-master:54311/opt/CarbonStore/default/uniqdata_int/clean_files.lock
     [exec] 18/02/27 16:38:16 INFO CarbonLockUtil: main Trying to acquire lock: 
clean_files.lockfor table: default_uniqdata_int
     [exec] 18/02/27 16:38:16 INFO CarbonLockUtil: main Successfully acquired 
the lock clean_files.lockfor table: default_uniqdata_int
     [exec] 18/02/27 16:38:16 INFO HdfsFileLock: main HDFS lock 
path:hdfs://hadoop-master:54311/opt/CarbonStore/default/uniqdata_int/droptable.lock
     [exec] 18/02/27 16:38:16 INFO CarbonLockUtil: main Trying to acquire lock: 
droptable.lockfor table: default_uniqdata_int
     [exec] 18/02/27 16:38:16 INFO CarbonLockUtil: main Successfully acquired 
the lock droptable.lockfor table: default_uniqdata_int
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_table : db=default 
tbl=uniqdata_int
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_table : db=default tbl=uniqdata_int     
     [exec] 18/02/27 16:38:16 INFO CatalystSqlParser: Parsing command: int
     [exec] 18/02/27 16:38:16 INFO CatalystSqlParser: Parsing command: 
array<string>
     [exec] 18/02/27 16:38:16 INFO CarbonLRUCache: main Removed entry from 
InMemory lru cache :: 
default/uniqdata_int_/0/100000100001_batchno0-0-1519729675075.carbonindex
     [exec] 18/02/27 16:38:16 INFO CarbonLRUCache: main Removed entry from 
InMemory lru cache :: 
default/uniqdata_int_/0/100000100002_batchno0-0-1519729675075.carbonindex
     [exec] 18/02/27 16:38:16 INFO CarbonSparkSqlParser: Parsing command: 
`default`.`uniqdata_int`
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_database: default
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_database: default       
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_table : db=default 
tbl=uniqdata_int
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_table : db=default tbl=uniqdata_int     
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_table : db=default 
tbl=uniqdata_int
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_table : db=default tbl=uniqdata_int     
     [exec] 18/02/27 16:38:16 INFO CatalystSqlParser: Parsing command: int
     [exec] 18/02/27 16:38:16 INFO CatalystSqlParser: Parsing command: 
array<string>
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_database: default
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_database: default       
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_table : db=default 
tbl=uniqdata_int
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_table : db=default tbl=uniqdata_int     
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_table : db=default 
tbl=uniqdata_int
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_table : db=default tbl=uniqdata_int     
     [exec] 18/02/27 16:38:16 INFO CatalystSqlParser: Parsing command: int
     [exec] 18/02/27 16:38:16 INFO CatalystSqlParser: Parsing command: 
array<string>
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_database: default
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_database: default       
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_database: default
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_database: default       
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_tables: db=default 
pat=*
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_tables: db=default pat=*        
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=Driver.run 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=TimeToSubmit 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=compile 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=parse 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO ParseDriver: Parsing command: ALTER TABLE 
default.uniqdata_int RENAME TO default.uniqdata_bigint1
     [exec] 18/02/27 16:38:16 INFO ParseDriver: Parse Completed
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=parse 
start=1519729696538 end=1519729696539 duration=1 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=semanticAnalyze 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_table : db=default 
tbl=uniqdata_int
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_table : db=default tbl=uniqdata_int     
     [exec] 18/02/27 16:38:16 INFO Driver: Semantic Analysis Completed
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=semanticAnalyze 
start=1519729696539 end=1519729696542 duration=3 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO Driver: Returning Hive schema: 
Schema(fieldSchemas:null, properties:null)
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=compile 
start=1519729696538 end=1519729696542 duration=4 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO Driver: Concurrency mode is disabled, not 
creating a lock manager
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=Driver.execute 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO Driver: Starting 
command(queryId=root_20180227163816_c6a748b9-74de-4876-8686-13c92269add0): 
ALTER TABLE default.uniqdata_int RENAME TO default.uniqdata_bigint1
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=TimeToSubmit 
start=1519729696538 end=1519729696542 duration=4 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=runTasks 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=task.DDL.Stage-0 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO Driver: Starting task [Stage-0:DDL] in 
serial mode
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_table : db=default 
tbl=uniqdata_int
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_table : db=default tbl=uniqdata_int     
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: alter_table: db=default 
tbl=uniqdata_int newtbl=uniqdata_bigint1
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=alter_table: db=default tbl=uniqdata_int newtbl=uniqdata_bigint1    
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=runTasks 
start=1519729696542 end=1519729696568 duration=26 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=Driver.execute 
start=1519729696542 end=1519729696568 duration=26 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO Driver: OK
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=releaseLocks 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=releaseLocks 
start=1519729696569 end=1519729696569 duration=0 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=Driver.run 
start=1519729696538 end=1519729696569 duration=31 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=releaseLocks 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=releaseLocks 
start=1519729696569 end=1519729696569 duration=0 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=Driver.run 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=TimeToSubmit 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=compile 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=parse 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO ParseDriver: Parsing command: ALTER TABLE 
default.uniqdata_bigint1 SET SERDEPROPERTIES('tableName'='uniqdata_bigint1', 
'dbName'='default', 
'tablePath'='hdfs://hadoop-master:54311/opt/CarbonStore/default/uniqdata_bigint1')
     [exec] 18/02/27 16:38:16 INFO ParseDriver: Parse Completed
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=parse 
start=1519729696569 end=1519729696570 duration=1 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=semanticAnalyze 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_table : db=default 
tbl=uniqdata_bigint1
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_table : db=default tbl=uniqdata_bigint1 
     [exec] 18/02/27 16:38:16 INFO Driver: Semantic Analysis Completed
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=semanticAnalyze 
start=1519729696570 end=1519729696573 duration=3 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO Driver: Returning Hive schema: 
Schema(fieldSchemas:null, properties:null)
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=compile 
start=1519729696569 end=1519729696573 duration=4 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO Driver: Concurrency mode is disabled, not 
creating a lock manager
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=Driver.execute 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO Driver: Starting 
command(queryId=root_20180227163816_4175ddb2-e8a0-4538-bab4-5688fe7b2a10): 
ALTER TABLE default.uniqdata_bigint1 SET 
SERDEPROPERTIES('tableName'='uniqdata_bigint1', 'dbName'='default', 
'tablePath'='hdfs://hadoop-master:54311/opt/CarbonStore/default/uniqdata_bigint1')
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=TimeToSubmit 
start=1519729696569 end=1519729696573 duration=4 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=runTasks 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=task.DDL.Stage-0 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO Driver: Starting task [Stage-0:DDL] in 
serial mode
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: get_table : db=default 
tbl=uniqdata_bigint1
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=get_table : db=default tbl=uniqdata_bigint1 
     [exec] 18/02/27 16:38:16 INFO HiveMetaStore: 0: alter_table: db=default 
tbl=uniqdata_bigint1 newtbl=uniqdata_bigint1
     [exec] 18/02/27 16:38:16 INFO audit: ugi=root      ip=unknown-ip-addr      
cmd=alter_table: db=default tbl=uniqdata_bigint1 newtbl=uniqdata_bigint1        
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=runTasks 
start=1519729696573 end=1519729696593 duration=20 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=Driver.execute 
start=1519729696573 end=1519729696593 duration=20 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO Driver: OK
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=releaseLocks 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=releaseLocks 
start=1519729696594 end=1519729696594 duration=0 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=Driver.run 
start=1519729696569 end=1519729696594 duration=25 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: <PERFLOG method=releaseLocks 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 INFO PerfLogger: </PERFLOG method=releaseLocks 
start=1519729696594 end=1519729696594 duration=0 
from=org.apache.hadoop.hive.ql.Driver>
     [exec] 18/02/27 16:38:16 ERROR HDFSCarbonFile: main Exception occured: 
rename destination directory is not empty: 
/opt/CarbonStore/default/uniqdata_bigint1
     [exec]     at 
org.apache.hadoop.hdfs.server.namenode.FSDirRenameOp.validateOverwrite(FSDirRenameOp.java:529)
     [exec]     at 
org.apache.hadoop.hdfs.server.namenode.FSDirRenameOp.unprotectedRenameTo(FSDirRenameOp.java:364)
     [exec]     at 
org.apache.hadoop.hdfs.server.namenode.FSDirRenameOp.renameTo(FSDirRenameOp.java:282)
     [exec]     at 
org.apache.hadoop.hdfs.server.namenode.FSDirRenameOp.renameToInt(FSDirRenameOp.java:247)
     [exec]     at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.renameTo(FSNamesystem.java:3675)
     [exec]     at 
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.rename2(NameNodeRpcServer.java:913)
     [exec]     at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.rename2(ClientNamenodeProtocolServerSideTranslatorPB.java:587)
     [exec]     at 
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
     [exec]     at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
     [exec]     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
     [exec]     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
     [exec]     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
     [exec]     at java.security.AccessController.doPrivileged(Native Method)
     [exec]     at javax.security.auth.Subject.doAs(Subject.java:422)
     [exec]     at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
     [exec]     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
     [exec] 
     [exec] 18/02/27 16:38:16 ERROR CarbonAlterTableRenameCommand: main Rename 
table failed: Folder rename failed for table default.uniqdata_int
     [exec] java.lang.RuntimeException: Folder rename failed for table 
default.uniqdata_int
     [exec]     at scala.sys.package$.error(package.scala:27)
     [exec]     at 
org.apache.spark.sql.execution.command.schema.CarbonAlterTableRenameCommand.processMetadata(CarbonAlterTableRenameCommand.scala:138)
     [exec]     at 
org.apache.spark.sql.execution.command.MetadataCommand.run(package.scala:68)
     [exec]     at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
     [exec]     at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
     [exec]     at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:67)
     [exec]     at org.apache.spark.sql.Dataset.<init>(Dataset.scala:183)
     [exec]     at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:68)
     [exec]     at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:632)
     [exec]     at com.huawei.spark.SessionManager.sql(SessionManager.java:42)
     [exec]     at 
com.huawei.querymanagement.QueryManagement.sql(QueryManagement.java:62)
     [exec]     at 
com.huawei.querymanagement.SelectQuery.testQuery(SelectQuery.java:70)
     [exec]     at sun.reflect.GeneratedMethodAccessor88.invoke(Unknown Source)
     [exec]     at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     [exec]     at java.lang.reflect.Method.invoke(Method.java:498)
     [exec]     at 
org.junit.internal.runners.TestMethod.invoke(TestMethod.java:59)
     [exec]     at 
org.junit.internal.runners.MethodRoadie.runTestMethod(MethodRoadie.java:98)
     [exec]     at 
org.junit.internal.runners.MethodRoadie$2.run(MethodRoadie.java:79)
     [exec]     at 
org.junit.internal.runners.MethodRoadie.runBeforesThenTestThenAfters(MethodRoadie.java:87)
     [exec]     at 
org.junit.internal.runners.MethodRoadie.runTest(MethodRoadie.java:77)
     [exec]     at 
org.junit.internal.runners.MethodRoadie.run(MethodRoadie.java:42)
     [exec]     at 
org.junit.internal.runners.JUnit4ClassRunner.invokeTestMethod(JUnit4ClassRunner.java:88)
     [exec]     at 
org.junit.internal.runners.JUnit4ClassRunner.runMethods(JUnit4ClassRunner.java:51)
     [exec]     at 
org.junit.runners.Parameterized$TestClassRunnerForParameters.run(Parameterized.java:98)
     [exec]     at 
org.junit.internal.runners.CompositeRunner.runChildren(CompositeRunner.java:33)
     [exec]     at 
org.junit.runners.Parameterized.access$000(Parameterized.java:55)
     [exec]     at org.junit.runners.Parameterized$1.run(Parameterized.java:131)
     [exec]     at 
org.junit.internal.runners.ClassRoadie.runUnprotected(ClassRoadie.java:27)
     [exec]     at 
org.junit.internal.runners.ClassRoadie.runProtected(ClassRoadie.java:37)
     [exec]     at org.junit.runners.Parameterized.run(Parameterized.java:129)
     [exec]     at 
org.junit.internal.runners.CompositeRunner.runChildren(CompositeRunner.java:33)
     [exec]     at 
org.junit.internal.runners.CompositeRunner.run(CompositeRunner.java:28)
     [exec]     at org.junit.runner.JUnitCore.run(JUnitCore.java:130)
     [exec]     at org.junit.runner.JUnitCore.run(JUnitCore.java:109)
     [exec]     at org.junit.runner.JUnitCore.run(JUnitCore.java:100)
     [exec]     at org.junit.runner.JUnitCore.runClasses(JUnitCore.java:60)
     [exec]     at 
com.huawei.querymanagement.SelectQuerySuite.main(SelectQuerySuite.java:18)
     [exec]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     [exec]     at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
     [exec]     at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     [exec]     at java.lang.reflect.Method.invoke(Method.java:498)
     [exec]     at 
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:775)
     [exec]     at 
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
     [exec]     at 
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
     [exec]     at 
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
     [exec]     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
     [exec] 18/02/27 16:38:16 INFO AlterTableUtil$: main Alter table lock 
released successfully: meta.lock
     [exec] 18/02/27 16:38:16 INFO AlterTableUtil$: main Alter table lock 
released successfully: compaction.lock
     [exec] 18/02/27 16:38:16 INFO AlterTableUtil$: main Alter table lock 
released successfully: delete_segment.lock
     [exec] 18/02/27 16:38:16 INFO AlterTableUtil$: main Alter table lock 
released successfully: clean_files.lock
     [exec] 18/02/27 16:38:16 INFO AlterTableUtil$: main Alter table lock 
released successfully: droptable.lock
     [exec] 18/02/27 16:38:16 ERROR SelectQuery: An exception has occurred: 
     [exec] org.apache.carbondata.spark.exception.ProcessMetaDataException: 
operation failed for default.uniqdata_int: Alter table rename table operation 
failed: Folder rename failed for table default.uniqdata_int
     [exec]     at 
org.apache.spark.sql.execution.command.MetadataProcessOpeation$class.throwMetadataException(package.scala:52)
     [exec]     at 
org.apache.spark.sql.execution.command.MetadataCommand.throwMetadataException(package.scala:66)
     [exec]     at 
org.apache.spark.sql.execution.command.schema.CarbonAlterTableRenameCommand.processMetadata(CarbonAlterTableRenameCommand.scala:174)
     [exec]     at 
org.apache.spark.sql.execution.command.MetadataCommand.run(package.scala:68)
     [exec]     at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
     [exec]     at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
     [exec]     at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:67)
     [exec]     at org.apache.spark.sql.Dataset.<init>(Dataset.scala:183)
     [exec]     at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:68)
     [exec]     at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:632)
     [exec]     at com.huawei.spark.SessionManager.sql(SessionManager.java:42)
     [exec]     at 
com.huawei.querymanagement.QueryManagement.sql(QueryManagement.java:62)
     [exec]     at 
com.huawei.querymanagement.SelectQuery.testQuery(SelectQuery.java:70)
     [exec]     at sun.reflect.GeneratedMethodAccessor88.invoke(Unknown Source)
     [exec]     at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     [exec]     at java.lang.reflect.Method.invoke(Method.java:498)
     [exec]     at 
org.junit.internal.runners.TestMethod.invoke(TestMethod.java:59)
     [exec]     at 
org.junit.internal.runners.MethodRoadie.runTestMethod(MethodRoadie.java:98)
     [exec]     at 
org.junit.internal.runners.MethodRoadie$2.run(MethodRoadie.java:79)
     [exec]     at 
org.junit.internal.runners.MethodRoadie.runBeforesThenTestThenAfters(MethodRoadie.java:87)
     [exec]     at 
org.junit.internal.runners.MethodRoadie.runTest(MethodRoadie.java:77)
     [exec]     at 
org.junit.internal.runners.MethodRoadie.run(MethodRoadie.java:42)
     [exec]     at 
org.junit.internal.runners.JUnit4ClassRunner.invokeTestMethod(JUnit4ClassRunner.java:88)
     [exec]     at 
org.junit.internal.runners.JUnit4ClassRunner.runMethods(JUnit4ClassRunner.java:51)
     [exec]     at 
org.junit.runners.Parameterized$TestClassRunnerForParameters.run(Parameterized.java:98)
     [exec]     at 
org.junit.internal.runners.CompositeRunner.runChildren(CompositeRunner.java:33)
     [exec]     at 
org.junit.runners.Parameterized.access$000(Parameterized.java:55)
     [exec]     at org.junit.runners.Parameterized$1.run(Parameterized.java:131)
     [exec]     at 
org.junit.internal.runners.ClassRoadie.runUnprotected(ClassRoadie.java:27)
     [exec]     at 
org.junit.internal.runners.ClassRoadie.runProtected(ClassRoadie.java:37)
     [exec]     at org.junit.runners.Parameterized.run(Parameterized.java:129)
     [exec]     at 
org.junit.internal.runners.CompositeRunner.runChildren(CompositeRunner.java:33)
     [exec]     at 
org.junit.internal.runners.CompositeRunner.run(CompositeRunner.java:28)
     [exec]     at org.junit.runner.JUnitCore.run(JUnitCore.java:130)
     [exec]     at org.junit.runner.JUnitCore.run(JUnitCore.java:109)
     [exec]     at org.junit.runner.JUnitCore.run(JUnitCore.java:100)
     [exec]     at org.junit.runner.JUnitCore.runClasses(JUnitCore.java:60)
     [exec]     at 
com.huawei.querymanagement.SelectQuerySuite.main(SelectQuerySuite.java:18)
     [exec]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     [exec]     at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
     [exec]     at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     [exec]     at java.lang.reflect.Method.invoke(Method.java:498)
     [exec]     at 
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:775)
     [exec]     at 
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
     [exec]     at 
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
     [exec]     at 
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
     [exec]     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
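
Note the sequence in the log: the Hive metastore rename (alter_table: db=default tbl=uniqdata_int newtbl=uniqdata_bigint1) and the follow-up SET SERDEPROPERTIES both complete with OK before the HDFS folder rename is attempted; by then the destination directory /opt/CarbonStore/default/uniqdata_bigint1 evidently already exists and is non-empty (per the NameNode error), so the rename is rejected and the command aborts after releasing its locks. A simple existence pre-check that would surface the conflict before attempting the rename (hypothetical, not CarbonData code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DestinationCheck {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        Path dst = new Path("/opt/CarbonStore/default/uniqdata_bigint1");
        // A non-empty listing here matches the "rename destination
        // directory is not empty" failure seen in the logs above.
        if (fs.exists(dst) && fs.listStatus(dst).length > 0) {
            System.out.println("Rename target already exists and is not empty: " + dst);
        }
    }
}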
 



