[ https://issues.apache.org/jira/browse/HIVE-25109?focusedWorklogId=599190&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-599190 ]

ASF GitHub Bot logged work on HIVE-25109:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 19/May/21 12:41
            Start Date: 19/May/21 12:41
    Worklog Time Spent: 10m 
      Work Description: kasakrisz commented on a change in pull request #2268:
URL: https://github.com/apache/hive/pull/2268#discussion_r635199267



##########
File path: ql/src/java/org/apache/hadoop/hive/ql/parse/CalcitePlanner.java
##########
@@ -5031,7 +5038,7 @@ private RelNode genLogicalPlan(QB qb, boolean outerMostQB,
 
       // Build Rel for Constraint checks
       Pair<RelNode, RowResolver> constraintPair =
-          genConstraintFilterLogicalPlan(qb, srcRel, outerNameToPosMap, outerRR);
+          genConstraintFilterLogicalPlan(qb, selPair, outerNameToPosMap, outerRR);

Review comment:
       I went through the code where `selectRel` gets its value and found that it cannot be null:
   it comes from `internalGenSelectLogicalPlan`, which creates it in one of the following ways:
   ```
   outputRel = genUDTFPlan(genericUDTF, genericUDTFName, udtfTableAlias, udtfColAliases, qb,
   ...
           RelNode udtf = HiveTableFunctionScan.create(cluster, traitSet, list, rexNode, null, retType,
             null);
   outputRel = genSelectRelNode(columnList, outputRR, srcRel);
   ...
         HiveRelNode selRel = HiveProject.create(

   outputRel = new HiveAggregate(cluster, cluster.traitSetOf(HiveRelNode.CONVENTION),
   ```
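
   As a rough, self-contained illustration of this argument (all class and method names below are made-up stand-ins for `RelNode`, `RowResolver`, `internalGenSelectLogicalPlan` and `genConstraintFilterLogicalPlan`, not the actual Hive code): every branch assigns a non-null output node before the pair is returned, so the pair passed on to the constraint-filter step never carries a null `RelNode`.
   ```java
   import java.util.Objects;

   // Made-up stand-ins for RelNode and RowResolver; not Hive classes.
   class Node { final String kind; Node(String kind) { this.kind = kind; } }
   class Resolver { }

   // Minimal pair, mirroring the (RelNode, RowResolver) pair in the diff above.
   class Pair<L, R> {
     final L left; final R right;
     Pair(L left, R right) { this.left = left; this.right = right; }
   }

   public class SelPairSketch {
     // Shaped like internalGenSelectLogicalPlan: every branch assigns a
     // non-null output node (UDTF scan, plain select, or aggregate).
     static Pair<Node, Resolver> genSelectLogicalPlan(boolean udtf, boolean aggregate) {
       Node outputRel;
       if (udtf) {
         outputRel = new Node("TableFunctionScan");
       } else if (aggregate) {
         outputRel = new Node("Aggregate");
       } else {
         outputRel = new Node("Project");
       }
       return new Pair<>(outputRel, new Resolver());
     }

     // The constraint-filter step can therefore take the whole pair and rely on
     // left being non-null; the explicit check is only here for illustration.
     static Node genConstraintFilterPlan(Pair<Node, Resolver> selPair) {
       Objects.requireNonNull(selPair.left, "select node is non-null by construction");
       return new Node("Filter(" + selPair.left.kind + ")");
     }

     public static void main(String[] args) {
       System.out.println(genConstraintFilterPlan(genSelectLogicalPlan(false, true)).kind);
     }
   }
   ```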




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
-------------------

    Worklog Id:     (was: 599190)
    Time Spent: 40m  (was: 0.5h)

> CBO fails when updating table has constraints defined
> -----------------------------------------------------
>
>                 Key: HIVE-25109
>                 URL: https://issues.apache.org/jira/browse/HIVE-25109
>             Project: Hive
>          Issue Type: Bug
>          Components: CBO, Logical Optimizer
>            Reporter: Krisztian Kasa
>            Assignee: Krisztian Kasa
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 40m
>  Remaining Estimate: 0h
>
> {code}
> create table acid_uami_n0(i int,
>                  de decimal(5,2) constraint nn1 not null enforced,
>                  vc varchar(128) constraint ch2 CHECK (de >= cast(i as decimal(5,2))) enforced)
>                  clustered by (i) into 2 buckets stored as orc TBLPROPERTIES ('transactional'='true');
> -- update
> explain cbo
> update acid_uami_n0 set de = 893.14 where de = 103.00;
> {code}
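>
> For context, the UPDATE above is rewritten into an INSERT before CBO runs (see the `UpdateDeleteSemanticAnalyzer.reparseAndSuperAnalyze` frame in the trace below). A rough sketch of the rewritten form is given here; the exact column list, casts and sort clause are an approximation, not copied from the analyzer:
> {code}
> -- approximate rewritten statement (sketch, not the analyzer's exact output)
> insert into table acid_uami_n0
> select ROW__ID, i, 893.14, vc
> from acid_uami_n0
> where de = 103.00
> sort by ROW__ID;
> {code}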
> hive.log
> {code}
> 2021-05-13T06:08:05,547 ERROR [061f4d3b-9cbd-464f-80db-f0cd443dc3d7 main] parse.UpdateDeleteSemanticAnalyzer: CBO failed, skipping CBO. org.apache.hadoop.hive.ql.optimizer.calcite.CalciteSemanticException: Result Schema didn't match Optimized Op Tree Schema
>         at org.apache.hadoop.hive.ql.optimizer.calcite.translator.PlanModifierForASTConv.renameTopLevelSelectInResultSchema(PlanModifierForASTConv.java:217) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.optimizer.calcite.translator.PlanModifierForASTConv.convertOpTree(PlanModifierForASTConv.java:105) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.optimizer.calcite.translator.ASTConverter.convert(ASTConverter.java:119) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.parse.CalcitePlanner.getOptimizedAST(CalcitePlanner.java:1410) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:572) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:12488) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:449) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.parse.RewriteSemanticAnalyzer.analyzeInternal(RewriteSemanticAnalyzer.java:67) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:316) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.parse.UpdateDeleteSemanticAnalyzer.reparseAndSuperAnalyze(UpdateDeleteSemanticAnalyzer.java:208) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.parse.UpdateDeleteSemanticAnalyzer.analyzeUpdate(UpdateDeleteSemanticAnalyzer.java:63) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.parse.UpdateDeleteSemanticAnalyzer.analyze(UpdateDeleteSemanticAnalyzer.java:53) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.parse.RewriteSemanticAnalyzer.analyzeInternal(RewriteSemanticAnalyzer.java:72) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:316) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:171) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:316) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.Compiler.analyze(Compiler.java:223) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.Compiler.compile(Compiler.java:104) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:492) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:445) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:409) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:403) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.reexec.ReExecDriver.compileAndRespond(ReExecDriver.java:125) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:229) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258) [hive-cli-4.0.0-SNAPSHOT.jar:?]
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd1(CliDriver.java:203) [hive-cli-4.0.0-SNAPSHOT.jar:?]
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:129) [hive-cli-4.0.0-SNAPSHOT.jar:?]
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:424) [hive-cli-4.0.0-SNAPSHOT.jar:?]
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:355) [hive-cli-4.0.0-SNAPSHOT.jar:?]
>         at org.apache.hadoop.hive.ql.QTestUtil.executeClientInternal(QTestUtil.java:744) [hive-it-util-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:714) [hive-it-util-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.cli.control.CoreCliDriver.runTest(CoreCliDriver.java:170) [hive-it-util-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.cli.control.CliAdapter.runTest(CliAdapter.java:157) [hive-it-util-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver(TestMiniLlapLocalCliDriver.java:62) [test-classes/:?]
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
>         at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
>         at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) [junit-4.13.jar:4.13]
>         at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) [junit-4.13.jar:4.13]
>         at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) [junit-4.13.jar:4.13]
>         at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17) [junit-4.13.jar:4.13]
>         at org.apache.hadoop.hive.cli.control.CliAdapter$2$1.evaluate(CliAdapter.java:135) [hive-it-util-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
>         at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306) [junit-4.13.jar:4.13]
>         at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100) [junit-4.13.jar:4.13]
>         at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366) [junit-4.13.jar:4.13]
>         at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103) [junit-4.13.jar:4.13]
>         at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63) [junit-4.13.jar:4.13]
>         at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331) [junit-4.13.jar:4.13]
>         at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79) [junit-4.13.jar:4.13]
>         at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329) [junit-4.13.jar:4.13]
>         at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66) [junit-4.13.jar:4.13]
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
