[ https://issues.apache.org/jira/browse/HIVE-12551?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15047815#comment-15047815 ]

Feng Yuan commented on HIVE-12551:
----------------------------------

Hi [~prasanth_j], I pulled the latest source from branch-1 on GitHub and tried my HQL, but I still hit the issue:
which: no hbase in 
(/opt/java/:/opt/hadoop/hadoop-2.6.0/bin:/opt/Python-2.7/bin:/opt/java/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/opt/Ice-3.3/bin:/opt/dell/srvadmin/bin:/opt/hadoop/hadoop-2.6.0/bin:/opt/hadoop/hadoop-2.6.0/sbin:/opt/hadoop/bin)

Logging initialized using configuration in jar:file:/opt/hadoop/yuanfeng/apache-hive-1.3.0-SNAPSHOT-bin/lib/hive-common-1.3.0-SNAPSHOT.jar!/hive-log4j.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/hadoop-2.6.0/lib/tez-0.5.2/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Query ID = hadoop_20151208181528_d346dc26-18fb-4267-93e4-34c79f6759bd
Total jobs = 8
Stage-6 is selected by condition resolver.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/hadoop-2.6.0/lib/tez-0.5.2/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Execution log at: /tmp/hadoop/hadoop_20151208181528_d346dc26-18fb-4267-93e4-34c79f6759bd.log
2015-12-08 18:15:44     Starting to launch local task to process map join;      maximum memory = 1908932608
2015-12-08 18:15:46     Dump the side-table for tag: 1 with group count: 1 into file: file:/tmp/hadoop/71586bb3-ea53-4edf-94d0-cc17094c60e9/hive_2015-12-08_18-15-28_261_2150907413884186817-1/-local-10015/HashTable-Stage-2/MapJoin-mapfile21--.hashtable
2015-12-08 18:15:46     Uploaded 1 File to: file:/tmp/hadoop/71586bb3-ea53-4edf-94d0-cc17094c60e9/hive_2015-12-08_18-15-28_261_2150907413884186817-1/-local-10015/HashTable-Stage-2/MapJoin-mapfile21--.hashtable (293 bytes)
2015-12-08 18:15:46     End of local task; Time Taken: 1.632 sec.
Execution completed successfully
MapredLocal task succeeded
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/hadoop-2.6.0/lib/tez-0.5.2/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Execution log at: /tmp/hadoop/hadoop_20151208181528_d346dc26-18fb-4267-93e4-34c79f6759bd.log
2015-12-08 18:15:52     Starting to launch local task to process map join;      maximum memory = 1908932608
2015-12-08 18:15:53     Dump the side-table for tag: 1 with group count: 1 into file: file:/tmp/hadoop/71586bb3-ea53-4edf-94d0-cc17094c60e9/hive_2015-12-08_18-15-28_261_2150907413884186817-1/-local-10019/HashTable-Stage-12/MapJoin-mapfile61--.hashtable
2015-12-08 18:15:53     Uploaded 1 File to: file:/tmp/hadoop/71586bb3-ea53-4edf-94d0-cc17094c60e9/hive_2015-12-08_18-15-28_261_2150907413884186817-1/-local-10019/HashTable-Stage-12/MapJoin-mapfile61--.hashtable (293 bytes)
2015-12-08 18:15:53     End of local task; Time Taken: 1.333 sec.
Execution completed successfully
MapredLocal task succeeded
Launching Job 1 out of 8
Number of reduce tasks not specified. Estimated from input data size: 24
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Starting Job = job_1449545133527_0033, Tracking URL = http://bjhlg-24p2-113-hadoop03:8088/proxy/application_1449545133527_0033/
Kill Command = /opt/hadoop/hadoop-2.6.0/bin/hadoop job  -kill job_1449545133527_0033
Hadoop job information for Stage-6: number of mappers: 8; number of reducers: 24
2015-12-08 18:16:14,666 Stage-6 map = 0%,  reduce = 0%
2015-12-08 18:16:28,249 Stage-6 map = 13%,  reduce = 0%, Cumulative CPU 19.93 sec
2015-12-08 18:16:32,476 Stage-6 map = 15%,  reduce = 0%, Cumulative CPU 33.7 sec
2015-12-08 18:16:34,591 Stage-6 map = 19%,  reduce = 0%, Cumulative CPU 33.7 sec
2015-12-08 18:16:36,726 Stage-6 map = 40%,  reduce = 0%, Cumulative CPU 108.43 sec
2015-12-08 18:16:37,789 Stage-6 map = 44%,  reduce = 0%, Cumulative CPU 109.72 sec
2015-12-08 18:16:38,833 Stage-6 map = 50%,  reduce = 0%, Cumulative CPU 112.06 sec
2015-12-08 18:16:42,003 Stage-6 map = 53%,  reduce = 0%, Cumulative CPU 120.92 sec
2015-12-08 18:16:43,070 Stage-6 map = 57%,  reduce = 0%, Cumulative CPU 134.87 sec
2015-12-08 18:16:45,196 Stage-6 map = 100%,  reduce = 100%, Cumulative CPU 112.06 sec
MapReduce Total cumulative CPU time: 1 minutes 52 seconds 60 msec
Ended Job = job_1449545133527_0033 with errors
Error during job, obtaining debugging information...
Examining task ID: task_1449545133527_0033_m_000000 (and more) from job job_1449545133527_0033
Examining task ID: task_1449545133527_0033_m_000004 (and more) from job job_1449545133527_0033

Task with the most failures(4): 
-----
Task ID:
  task_1449545133527_0033_m_000006

URL:
  http://0.0.0.0:8088/taskdetails.jsp?jobid=job_1449545133527_0033&tipid=task_1449545133527_0033_m_000006
-----
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hive.com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 18
Serialization trace:
tableDesc (org.apache.hadoop.hive.ql.plan.PartitionDesc)
aliasToPartnInfo (org.apache.hadoop.hive.ql.plan.MapWork)
        at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:423)
        at org.apache.hadoop.hive.ql.exec.Utilities.getMapWork(Utilities.java:286)
        at org.apache.hadoop.hive.ql.io.HiveInputFormat.init(HiveInputFormat.java:263)
        at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:478)
        at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:471)
        at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:648)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 18
Serialization trace:
tableDesc (org.apache.hadoop.hive.ql.plan.PartitionDesc)
aliasToPartnInfo (org.apache.hadoop.hive.ql.plan.MapWork)
        at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:119)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:99)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:139)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:672)
        at org.apache.hadoop.hive.ql.exec.Utilities.deserializeObjectByKryo(Utilities.java:1025)
        at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:933)
        at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:947)
        at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:390)
        ... 13 more


FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched: 
Stage-Stage-6: Map: 8  Reduce: 24   Cumulative CPU: 112.06 sec   HDFS Read: 204015117 HDFS Write: 0 FAIL
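
From what I can tell, the "Encountered unregistered class ID" message comes from Kryo's class resolver: the side that serialized the plan wrote a numeric registration ID into the stream, and the side deserializing it has no class registered under that ID. Below is a minimal standalone sketch of that failure mode, using plain Kryo rather than Hive's shaded org.apache.hive.com.esotericsoftware classes, and a made-up Payload class purely for illustration (this is not Hive's serialization code):
{code}
import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.io.Input;
import com.esotericsoftware.kryo.io.Output;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;

// Illustration only: Payload is a hypothetical stand-in for a plan class
// such as PartitionDesc/TableDesc.
public class KryoRegistrationMismatch {

    public static class Payload {
        public String name = "example";
    }

    public static void main(String[] args) {
        // Writer side: because Payload is registered, only its numeric
        // registration ID (not the class name) is written to the stream.
        Kryo writer = new Kryo();
        writer.setRegistrationRequired(false);
        writer.register(Payload.class);

        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        Output output = new Output(bytes);
        writer.writeClassAndObject(output, new Payload());
        output.close();

        // Reader side: it has no registration for Payload, so the ID read from
        // the stream cannot be resolved and Kryo fails with
        // "Encountered unregistered class ID: <n>".
        Kryo reader = new Kryo();
        reader.setRegistrationRequired(false);

        Input input = new Input(new ByteArrayInputStream(bytes.toByteArray()));
        Object plan = reader.readClassAndObject(input); // throws KryoException here
        System.out.println(plan);
    }
}
{code}
If that is the mechanism here, it would suggest that the side serializing the plan and the hive-exec code deserializing it in the task do not agree on their Kryo registrations.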

This is my HQL:
select  A.state_date, 
           A.customer, 
           A.channel_2,
           A.id,
           A.pid,
           A.type,
           A.pv,
           A.uv,
           A.visits,
           if(C.stay_visits is null,0,C.stay_visits) as stay_visits,
           A.stay_time,
           if(B.bounce is null,0,B.bounce) as bounce
 from
     (select a.state_date, 
            a.customer, 
            b.url as channel_2,
            b.id,
            b.pid,
            b.type,
            count(1) as pv,
            count(distinct a.gid) uv,
            count(distinct a.session_id) as visits,
            sum(a.stay_time) as stay_time
       from       
               ( select state_date, 
                           customer, 
                           gid,
                           session_id,
                           ep,
                           stay_time
                    from bdi_fact.mid_pageview_dt0
                    where l_date ='$v_date'
                  )a
                  join
                  (select l_date as state_date ,
                          url,
                          id,
                          pid,
                          type,
                          cid
                   from bdi_fact.frequency_channel
                   where l_date ='$v_date'
                   and type ='2'
                   and dr='0'
                  )b
                   on  a.customer=b.cid  
                   where a.ep  rlike b.url
                   group by a.state_date, a.customer, b.url,b.id,b.pid,b.type
       )A
      
        left outer join
       (   select 
                   c.state_date ,
                   c.customer ,
                   d.url as channel_2,
                   d.id,
                   sum(pagedepth) as bounce
            from
                  ( select 
                              t1.state_date ,
                              t1.customer ,
                              t1.session_id,
                              t1.ep,
                              t2.pagedepth
                    from           
                         ( select 
                                     state_date ,
                                     customer ,
                                     session_id,
                                     exit_url as ep
                          from ods.mid_session_enter_exit_dt0
                          where l_date ='$v_date'
                          )t1
                         join
                          ( select 
                                    state_date ,
                                    customer ,
                                    session_id,
                                    pagedepth
                            from ods.mid_session_action_dt0
                            where l_date ='$v_date'
                            and  pagedepth='1'
                          )t2
                         on t1.customer=t2.customer
                         and t1.session_id=t2.session_id
                   )c
                   join
                   (select *
                   from bdi_fact.frequency_channel
                   where l_date ='$v_date'
                   and type ='2'
                   and dr='0'
                   )d
                   on c.customer=d.cid
                   where c.ep  rlike d.url
                   group by  c.state_date,c.customer,d.url,d.id
             )B
             on 
             A.customer=B.customer
             and A.channel_2=B.channel_2 
             and A.id=B.id
      left outer join
     ( 
             select e.state_date, 
            e.customer, 
            f.url as channel_2,
            f.id,
            f.pid,
            f.type,
            count(distinct e.session_id) as stay_visits
       from       
               ( select state_date, 
                           customer, 
                           gid,
                           session_id,
                           ep,
                           stay_time
                    from bdi_fact.mid_pageview_dt0
                    where l_date ='$v_date'
                  )e
                  join
                  (select l_date as state_date,
                          url,
                          id,
                          pid,
                          type,
                          cid
                   from bdi_fact.frequency_channel
                   where l_date ='$v_date'
                   and type ='2'
                   and dr='0'
                  )f
                   on  e.customer=f.cid  
                   where e.ep  rlike f.url
                   and e.stay_time is not null
                   and e.stay_time <>'0'
                   group by e.state_date, e.customer, f.url,f.id,f.pid,f.type
           )C
        on
        A.customer=C.customer
        and   A.channel_2=C.channel_2
        and   A.id=C.id
        and   A.pid=C.pid
        and   A.type=C.type 
 where A.customer='Cdianyingwang' and A.channel_2='http://www.1905.com/film/filmnews/jk/' and A.id='127';

Do you know why? Thanks!

> Fix several kryo exceptions in branch-1
> ---------------------------------------
>
>                 Key: HIVE-12551
>                 URL: https://issues.apache.org/jira/browse/HIVE-12551
>             Project: Hive
>          Issue Type: Bug
>    Affects Versions: 1.3.0
>            Reporter: Prasanth Jayachandran
>            Assignee: Prasanth Jayachandran
>              Labels: serialization
>             Fix For: 1.3.0
>
>         Attachments: HIVE-12551.1.patch
>
>
> HIVE-11519, HIVE-12174 and the following exception are all caused by 
> unregistered classes or serializers. HIVE-12175 should have fixed these 
> issues for master branch.
> {code}
> Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: java.lang.NullPointerException
> Serialization trace:
> chidren (org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc)
> expr (org.apache.hadoop.hive.ql.exec.vector.udf.VectorUDFAdaptor)
> childExpressions (org.apache.hadoop.hive.ql.exec.vector.expressions.gen.FilterStringColumnBetween)
> conditionEvaluator (org.apache.hadoop.hive.ql.exec.vector.VectorFilterOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:367)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:276)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:139)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:672)
>       at org.apache.hadoop.hive.ql.exec.Utilities.deserializeObjectByKryo(Utilities.java:1087)
>       at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:976)
>       at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:990)
>       at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:426)
>       ... 27 more
> Caused by: java.lang.NullPointerException
>       at java.util.Arrays$ArrayList.size(Arrays.java:3818)
>       at java.util.AbstractList.add(AbstractList.java:108)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>       ... 57 more
> {code}
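
For reference, that NullPointerException is the classic Kryo failure for java.util.Arrays$ArrayList: the class has no no-arg constructor, so when Kryo builds the instance without running a constructor its backing array stays null, and the first add() inside CollectionSerializer.read() dies in Arrays$ArrayList.size(). The usual remedy in Kryo-based code is to register a dedicated serializer for that private list type. A minimal sketch of both the failure and the remedy follows; it uses plain (unshaded) Kryo, objenesis' StdInstantiatorStrategy, and the ArraysAsListSerializer from the kryo-serializers project purely for illustration, not Hive's actual registration code:
{code}
import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.io.Input;
import com.esotericsoftware.kryo.io.Output;
import de.javakaffee.kryoserializers.ArraysAsListSerializer;
import org.objenesis.strategy.StdInstantiatorStrategy;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.Arrays;
import java.util.List;

// Standalone illustration only; not Hive's registration code.
public class ArraysAsListRoundTrip {

    private static byte[] write(Kryo kryo, Object obj) {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        Output output = new Output(bytes);
        kryo.writeClassAndObject(output, obj);
        output.close();
        return bytes.toByteArray();
    }

    private static Object read(Kryo kryo, byte[] data) {
        return kryo.readClassAndObject(new Input(new ByteArrayInputStream(data)));
    }

    public static void main(String[] args) {
        List<String> original = Arrays.asList("a", "b");

        // Without a dedicated serializer: the Arrays$ArrayList instance is created
        // without running a constructor, its backing array is null, and the first
        // add() inside CollectionSerializer.read() fails in Arrays$ArrayList.size().
        Kryo broken = new Kryo();
        broken.setRegistrationRequired(false);
        broken.setInstantiatorStrategy(new StdInstantiatorStrategy());
        try {
            read(broken, write(broken, original));
        } catch (Exception e) {
            System.out.println("round trip without serializer failed: " + e);
        }

        // With a serializer registered for Arrays.asList()'s private list class,
        // the list is rebuilt through Arrays.asList and round-trips cleanly.
        Kryo fixed = new Kryo();
        fixed.setRegistrationRequired(false);
        fixed.setInstantiatorStrategy(new StdInstantiatorStrategy());
        fixed.register(Arrays.asList("").getClass(), new ArraysAsListSerializer());
        System.out.println("round trip with serializer: " + read(fixed, write(fixed, original)));
    }
}
{code}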


