[ 
https://issues.apache.org/jira/browse/HIVE-22173?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17642144#comment-17642144
 ] 

Stamatis Zampetakis commented on HIVE-22173:
--------------------------------------------

I will simplify the query a bit to demonstrate the problem. Consider the 
following setup:
{code:sql}
CREATE TABLE customer(orders array<string>);

EXPLAIN SELECT *
FROM customer
lateral view explode(orders) v as c1
lateral view explode(orders) v as c2
lateral view explode(orders) v as c3
lateral view explode(orders) v as c4
;
{code}

The structure of the operator plan [before logical 
optimizations|https://github.com/apache/hive/blob/b053c61e6d9ffe36f3197a3efe88732bef726ecf/ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java#L12904]
 is shown in  [^op_plan_4_lateral_views.pdf] .

The plan is composed of 4 sub-graphs (one for each lateral view appearing in 
the query) connected together in a linear fashion. Observe that from each LVF 
node there are exactly two paths that merge into an LVJ node. 

Recursive algorithms follow a DFS traversal pattern, some allowing a node to be 
visited again (e.g., when enumerating all possible paths in a DAG) and some 
not. Algorithms of this kind appear in various places in Hive; two that cause 
problems in this case are shown below:
* PreOrderOnceWalker uses DFS and allows a node to be visited again
* The JSON plan serialization (ExplainTask#getJSONPlan) uses DFS and allows a 
node to be visited again

By allowing a node to be visited again we are more or less enumerating all 
paths from the root to every other node in the DAG. Observe that for this 
particular plan with the lateral views there are 2^N paths from the TS (root) to 
the FS (leaf), where N is the number of lateral views in the query. When N 
is small (e.g., 4 in this case) the query finishes normally, but as soon as it 
grows a bit (15 to 20) the query gets stuck (or dies with OOM) at every 
place where there is recursion (DFS traversal) with repetition.
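The blowup is easy to reproduce outside Hive. The sketch below (illustrative Java, not Hive code; all names are made up for the example) builds a chain of N "diamonds" mimicking the LVF -> {two branches} -> LVJ shape of each lateral view, and counts root-to-leaf paths with a DFS that allows revisits:
{code:java}
import java.util.HashMap;
import java.util.Map;

public class PathBlowup {
    // Node ids: 0 is the root (TS); each diamond adds left, right, join.
    // The last join node is the leaf (FS).
    static Map<Integer, int[]> buildDiamondChain(int n) {
        Map<Integer, int[]> children = new HashMap<>();
        int next = 1, prev = 0;
        for (int i = 0; i < n; i++) {
            int left = next++, right = next++, join = next++;
            children.put(prev, new int[]{left, right});
            children.put(left, new int[]{join});
            children.put(right, new int[]{join});
            prev = join;
        }
        children.put(prev, new int[]{});
        return children;
    }

    // DFS that allows a node to be visited again: it effectively
    // enumerates every root-to-leaf path, so work doubles per diamond.
    static long countPaths(Map<Integer, int[]> g, int node) {
        int[] kids = g.get(node);
        if (kids.length == 0) return 1;
        long total = 0;
        for (int k : kids) total += countPaths(g, k);
        return total;
    }

    public static void main(String[] args) {
        for (int n : new int[]{4, 10, 20}) {
            System.out.println(n + " lateral views -> "
                + countPaths(buildDiamondChain(n), 0) + " paths");
        }
    }
}
{code}
For N=4 this yields 16 paths; for N=20 it is already 1,048,576, which matches the behavior described above.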

Next steps/potential solutions:
* Check if it is possible to change the shape of the operator tree;
* Check if the DFS algorithms that allow nodes to be visited again can be 
changed/fixed;
* Detect potentially expensive traversals and selectively disable optimizations.
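The second bullet can be sketched as follows (hypothetical code, not Hive's actual walker API): a DFS that records visited nodes dispatches each operator exactly once, so the same diamond-chain plan is walked in time linear in the number of nodes instead of 2^N:
{code:java}
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class VisitOnceWalk {
    // Same chain of n diamonds as before: fork -> {left, right} -> join.
    static Map<Integer, int[]> buildDiamondChain(int n) {
        Map<Integer, int[]> g = new HashMap<>();
        int next = 1, prev = 0;
        for (int i = 0; i < n; i++) {
            int l = next++, r = next++, j = next++;
            g.put(prev, new int[]{l, r});
            g.put(l, new int[]{j});
            g.put(r, new int[]{j});
            prev = j;
        }
        g.put(prev, new int[]{});
        return g;
    }

    // DFS with a visited set: a node reached via a second path is skipped,
    // so each node is dispatched exactly once.
    static int walk(Map<Integer, int[]> g, int node, Set<Integer> visited) {
        if (!visited.add(node)) return 0;   // already walked: skip re-descent
        int dispatched = 1;                 // stand-in for rule dispatch work
        for (int k : g.get(node)) dispatched += walk(g, k, visited);
        return dispatched;
    }

    public static void main(String[] args) {
        // 1 root + 3 nodes per diamond = 61 dispatches for 20 lateral views,
        // versus 2^20 root-to-leaf paths when revisits are allowed.
        System.out.println(walk(buildDiamondChain(20), 0, new HashSet<>()));
    }
}
{code}
Whether a visited set is semantically safe depends on the rule being dispatched; some rules may rely on seeing a node once per incoming path, which is why the first and third bullets remain on the table.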

> HiveServer2: Query with multiple lateral view hung forever during compile 
> stage
> -------------------------------------------------------------------------------
>
>                 Key: HIVE-22173
>                 URL: https://issues.apache.org/jira/browse/HIVE-22173
>             Project: Hive
>          Issue Type: Bug
>          Components: HiveServer2
>    Affects Versions: 3.1.1, 4.0.0-alpha-1
>         Environment: Hive-3.1.1, Java-8
>            Reporter: Rajkumar Singh
>            Priority: Critical
>         Attachments: op_plan_4_lateral_views.pdf, thread-progress.log
>
>
> Steps To Repro:
> {code:java}
> -- create table 
> CREATE EXTERNAL TABLE `jsontable`( 
> `json_string` string) 
> ROW FORMAT SERDE 
> 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' 
> STORED AS INPUTFORMAT 
> 'org.apache.hadoop.mapred.TextInputFormat' 
> OUTPUTFORMAT 
> 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' ;
> -- Run explain of the query
> explain SELECT
> *
> FROM jsontable
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.addr.city'), "\\[|\\]|\"", ""),',')) t1 as c1
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.addr.country'), "\\[|\\]|\"", ""),',')) t2 as c2
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.addr'), "\\[|\\]|\"", ""),',')) t3 as c3
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.addr.postalCode'), "\\[|\\]|\"", ""),',')) t4 as c4
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.addr.state'), "\\[|\\]|\"", ""),',')) t5 as c5
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.addr.streetAddressLine'), "\\[|\\]|\"", ""),',')) t6 as c6
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.dummyfield'), "\\[|\\]|\"", ""),',')) t7 as c7
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.dummyfield'), "\\[|\\]|\"", ""),',')) t8 as c8
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.dummyfield.name.suffix'), "\\[|\\]|\"", ""),',')) t9 as c9
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.id.extension'), "\\[|\\]|\"", ""),',')) t10 as c10
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.id'), "\\[|\\]|\"", ""),',')) t11 as c11
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.id.root'), "\\[|\\]|\"", ""),',')) t12 as c12
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.telecom.'), "\\[|\\]|\"", ""),',')) t13 as c13
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.dummyfield1.use'), "\\[|\\]|\"", ""),',')) t14 as c14
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield.dummyfield1.value'), "\\[|\\]|\"", ""),',')) t15 as c15
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield1.dummyfield1.code'), "\\[|\\]|\"", ""),',')) t16 as c16
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield1.dummyfield1.value'), "\\[|\\]|\"", ""),',')) t17 as c17
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield2.city'), "\\[|\\]|\"", ""),',')) t18 as c18
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield2.city'), "\\[|\\]|\"", ""),',')) t19 as c19
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield2.country'), "\\[|\\]|\"", ""),',')) t20 as c20
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield2.country'), "\\[|\\]|\"", ""),',')) t21 as c21
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield'), "\\[|\\]|\"", ""),',')) t22 as c22
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield2.postalCode'), "\\[|\\]|\"", ""),',')) t23 as c23
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield2.postalCode'), "\\[|\\]|\"", ""),',')) t24 as c24
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield2.state'), "\\[|\\]|\"", ""),',')) t25 as c25
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield2.state'), "\\[|\\]|\"", ""),',')) t26 as c26
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield2'), "\\[|\\]|\"", ""),',')) t27 as c27
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield2.streetAddressLine'), "\\[|\\]|\"", ""),',')) t28 as c28
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield2.use'), "\\[|\\]|\"", ""),',')) t29 as c29
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield3'), "\\[|\\]|\"", ""),',')) t30 as c30
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield3'), "\\[|\\]|\"", ""),',')) t31 as c31
> lateral view 
> explode(split(regexp_replace(get_json_object(jsontable.json_string, 
> '$.jsonfield4'), "\\[|\\]|\"", ""),',')) t32 as c32
> ;
> -- it will hang forever
> {code}
> -- HS2 jstack:
> {code:java}
> "8ed37c3a-be03-4f74-9afd-419d05609b9c HiveServer2-Handler-Pool: Thread-85" 
> #85 prio=5 os_prio=0 tid=0x00007f3bd873f800 nid=0x90b94 runnable 
> [0x00007f3baa6e2000]   
> java.lang.Thread.State: RUNNABLE at 
> java.util.regex.Pattern$Curly.match0(Pattern.java:4272) at 
> java.util.regex.Pattern$Curly.match(Pattern.java:4234) at 
> java.util.regex.Pattern$Slice.match(Pattern.java:3972) at 
> java.util.regex.Pattern$GroupHead.match(Pattern.java:4658) at 
> java.util.regex.Matcher.match(Matcher.java:1270) at 
> java.util.regex.Matcher.matches(Matcher.java:604) at 
> org.apache.hadoop.hive.ql.lib.RuleRegExp.costPatternWithWildCardChar(RuleRegExp.java:236)
>  at org.apache.hadoop.hive.ql.lib.RuleRegExp.cost(RuleRegExp.java:279) at 
> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:72)
>  at 
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:105)
>  at 
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:89)
>  at 
> org.apache.hadoop.hive.ql.lib.PreOrderOnceWalker.walk(PreOrderOnceWalker.java:43)
>  at 
> org.apache.hadoop.hive.ql.lib.PreOrderOnceWalker.walk(PreOrderOnceWalker.java:54)
>  [... the same PreOrderOnceWalker.walk frame repeated ~110 more times ...]
>  at 
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:120)
>  at 
> org.apache.hadoop.hive.ql.ppd.SyntheticJoinPredicate.transform(SyntheticJoinPredicate.java:106)
>  at 
> org.apache.hadoop.hive.ql.optimizer.Optimizer.optimize(Optimizer.java:250) at 
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:12423)
>  at 
> org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:360)
>  at 
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:289)
>  at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:664) at 
> org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1869) at 
> org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1816) at 
> org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1811) at 
> org.apache.hadoop.hive.ql.reexec.ReExecDriver.compileAndRespond(ReExecDriver.java:126)
>  at 
> org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:197)
>  at 
> org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:262)
>  at org.apache.hive.service.cli.operation.Operation.run(Operation.java:247) 
> at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:575)
>  at 
> org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:561)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498) at 
> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
>  at 
> org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
>  at 
> org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
>  at java.security.AccessController.doPrivileged(Native Method) at 
> javax.security.auth.Subject.doAs(Subject.java:422) at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
>  at 
> org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
>  at com.sun.proxy.$Proxy45.executeStatementAsync(Unknown Source) at 
> org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:315)
>  at 
> org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:566)
>  at 
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1557)
>  at 
> org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1542)
>  at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39) at 
> org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) at 
> org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
>  at 
> org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
>  at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>  at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>  at java.lang.Thread.run(Thread.java:745)
> {code}
>  
> Attached are 10 jstacks taken at 30-second intervals.
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
