Eugene Koifman created HIVE-17900:
-------------------------------------

             Summary: analyze stats on columns triggered by Compactor generates malformed SQL with > 1 partition column
                 Key: HIVE-17900
                 URL: https://issues.apache.org/jira/browse/HIVE-17900
             Project: Hive
          Issue Type: Bug
          Components: Transactions
            Reporter: Eugene Koifman


{noformat}
2017-10-16 09:01:51,255 ERROR [haddl0007.mycenterpointenergy.com-51]: ql.Driver 
(SessionState.java:printError(993)) - FAILED: ParseException line 1:70 
mismatched input 'dates' expecting ) near ''201608'' in analyze statement
org.apache.hadoop.hive.ql.parse.ParseException: line 1:70 mismatched input 
'dates' expecting ) near ''201608'' in analyze statement
        at 
org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:205)
        at 
org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:438)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:321)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1221)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1262)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1158)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1148)
        at 
org.apache.hadoop.hive.ql.txn.compactor.Worker$StatsUpdater.gatherStats(Worker.java:294)
        at 
org.apache.hadoop.hive.ql.txn.compactor.CompactorMR.run(CompactorMR.java:265)
        at org.apache.hadoop.hive.ql.txn.compactor.Worker.run(Worker.java:168)

2017-10-16 09:01:51,255 INFO  [haddl0007.mycenterpointenergy.com-51]: 
log.PerfLogger (PerfLogger.java:PerfLogEnd(177)) - </PERFLOG method=compile 
start=1508162511253 end=1508162511255 duration=2 
from=org.apache.hadoop.hive.ql.Driver>
2017-10-16 09:01:51,255 INFO  [haddl0007.mycenterpointenergy.com-51]: ql.Driver 
(Driver.java:compile(559)) - We are resetting the hadoop caller context to
2017-10-16 09:01:51,255 INFO  [haddl0007.mycenterpointenergy.com-51]: 
log.PerfLogger (PerfLogger.java:PerfLogBegin(149)) - <PERFLOG 
method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2017-10-16 09:01:51,255 INFO  [haddl0007.mycenterpointenergy.com-51]: 
log.PerfLogger (PerfLogger.java:PerfLogEnd(177)) - </PERFLOG 
method=releaseLocks start=1508162511255 end=1508162511255 duration=0 
from=org.apache.hadoop.hive.ql.Driver>
2017-10-16 09:01:51,256 INFO  [haddl0007.mycenterpointenergy.com-51]: 
tez.TezSessionPoolManager (TezSessionPoolManager.java:close(183)) - Closing tez 
session default? false
2017-10-16 09:01:51,256 INFO  [haddl0007.mycenterpointenergy.com-51]: 
tez.TezSessionState (TezSessionState.java:close(294)) - Closing Tez Session
2017-10-16 09:01:51,256 INFO  [haddl0007.mycenterpointenergy.com-51]: 
client.TezClient (TezClient.java:stop(518)) - Shutting down Tez Session, 
sessionName=HIVE-ae652f03-72c7-4ca8-a2d8-05dcc7392f4f, 
applicationId=application_1507779664083_0159
2017-10-16 09:01:51,279 ERROR [haddl0007.mycenterpointenergy.com-51]: 
compactor.Worker (Worker.java:run(191)) - Caught exception while trying to 
compact 
id:3723,dbname:mobiusad,tableName:zces_img_data_small_pt,partName:month=201608/dates=9,state:^@,type:MAJOR,properties:null,runAs:null,tooManyAborts:false,highestTxnId:0.
  Marking failed to avoid repeated failures, java.io.IOException: Could not 
update stats for table mobiusad.zces_img_data_small_pt/month=201608/dates=9 due 
to: (40000,FAILED: ParseException line 1:70 mismatched input 'dates' expecting 
) near ''201608'' in analyze statement,42000line 1:70 mismatched input 'dates' 
expecting ) near ''201608'' in analyze statement)
        at 
org.apache.hadoop.hive.ql.txn.compactor.Worker$StatsUpdater.gatherStats(Worker.java:296)
        at 
org.apache.hadoop.hive.ql.txn.compactor.CompactorMR.run(CompactorMR.java:265)
        at org.apache.hadoop.hive.ql.txn.compactor.Worker.run(Worker.java:168)
{noformat}
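For reference, the ParseException ("mismatched input 'dates' expecting ) near ''201608''") is consistent with the StatsUpdater concatenating partition key/value pairs without a comma delimiter, producing {{partition(month='201608' dates='9')}} instead of {{partition(month='201608', dates='9')}}. The following is a minimal illustrative sketch (the class and method names are hypothetical, not the actual Worker$StatsUpdater code) of assembling the clause correctly for any number of partition columns:

{noformat}
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.StringJoiner;

// Hypothetical sketch of building the PARTITION clause for the ANALYZE
// statement the compactor issues. The key point: key='value' pairs must be
// comma-separated; joining them with no delimiter reproduces the
// ParseException seen in the log once there is more than one partition column.
public class PartitionSpecBuilder {
    static String partitionClause(Map<String, String> partSpec) {
        // StringJoiner inserts ", " between entries and wraps the result
        // in "partition(" ... ")".
        StringJoiner sj = new StringJoiner(", ", "partition(", ")");
        for (Map.Entry<String, String> e : partSpec.entrySet()) {
            // Single-quote each value; column names are emitted as-is.
            sj.add(e.getKey() + "='" + e.getValue() + "'");
        }
        return sj.toString();
    }

    public static void main(String[] args) {
        // Partition spec from the failing compaction in the log above.
        Map<String, String> spec = new LinkedHashMap<>();
        spec.put("month", "201608");
        spec.put("dates", "9");
        System.out.println("analyze table mobiusad.zces_img_data_small_pt "
            + partitionClause(spec)
            + " compute statistics for columns");
    }
}
{noformat}

With a single partition column the two join strategies are indistinguishable, which would explain why the bug only surfaces with > 1 partition column.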




--
This message was sent by Atlassian JIRA
(v6.4.14#64029)