Please find the error log attached below.

On Tue, May 5, 2015 at 11:36 PM, Jason Dere <jd...@hortonworks.com> wrote:
> Looks like you are running into
> https://issues.apache.org/jira/browse/HIVE-8321, fixed in Hive 0.14.
> You might be stuck having to use Kryo; what are the issues you are having
> with Kryo?
>
> Thanks,
> Jason
>
> On May 5, 2015, at 4:28 AM, Bhagwan S. Soni <bhgwnsson...@gmail.com> wrote:
>
> Bottom of the log:
>
>     at java.beans.Encoder.writeObject(Encoder.java:74)
>     at java.beans.XMLEncoder.writeObject(XMLEncoder.java:327)
>     at java.beans.Encoder.writeExpression(Encoder.java:330)
>     at java.beans.XMLEncoder.writeExpression(XMLEncoder.java:454)
>     at java.beans.DefaultPersistenceDelegate.doProperty(DefaultPersistenceDelegate.java:194)
>     at java.beans.DefaultPersistenceDelegate.initBean(DefaultPersistenceDelegate.java:256)
>     ... 98 more
> Caused by: java.lang.NullPointerException
>     at java.lang.StringBuilder.<init>(StringBuilder.java:109)
>     at org.apache.hadoop.hive.serde2.typeinfo.BaseCharTypeInfo.getQualifiedName(BaseCharTypeInfo.java:49)
>     at org.apache.hadoop.hive.serde2.typeinfo.BaseCharTypeInfo.getQualifiedName(BaseCharTypeInfo.java:45)
>     at org.apache.hadoop.hive.serde2.typeinfo.VarcharTypeInfo.getTypeName(VarcharTypeInfo.java:37)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:75)
>     at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:606)
>     at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:279)
>     at java.beans.Statement.invokeInternal(Statement.java:292)
>     at java.beans.Statement.access$000(Statement.java:58)
>     at java.beans.Statement$2.run(Statement.java:185)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.beans.Statement.invoke(Statement.java:182)
>     at java.beans.Expression.getValue(Expression.java:153)
>     at java.beans.DefaultPersistenceDelegate.doProperty(DefaultPersistenceDelegate.java:193)
>     at java.beans.DefaultPersistenceDelegate.initBean(DefaultPersistenceDelegate.java:256)
>     ... 111 more
>
> Job Submission failed with exception 'java.lang.RuntimeException(java.lang.RuntimeException: Cannot serialize object)'
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
>
> On Tue, May 5, 2015 at 3:10 PM, Jason Dere <jd...@hortonworks.com> wrote:
>
>> kryo/javaXML are the only available options. What are the errors you see
>> with each setting?
>>
>> On May 1, 2015, at 9:41 AM, Bhagwan S. Soni <bhgwnsson...@gmail.com> wrote:
>>
>> Hi Hive Users,
>>
>> I'm using Cloudera's Hive 0.13, which uses the Kryo plan serialization
>> format by default:
>>
>> <property>
>>   <name>hive.plan.serialization.format</name>
>>   <value>kryo</value>
>> </property>
>>
>> Since I'm facing issues with Kryo, can anyone help me identify the other
>> available options for the Hive plan serialization format?
>>
>> I know of one option, javaXML, but in my case it is not working.
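[For reference, a minimal sketch of the two ways to switch the setting discussed above. This is illustrative only: kryo and javaXML are the only two accepted values, and as Jason notes, javaXML plans hit an NPE (HIVE-8321) on builds older than Hive 0.14, so the per-session override below may fail on 0.13.]

```
-- Per-session override, from the Hive CLI:
set hive.plan.serialization.format=javaXML;
-- ...run the affected query, then switch back to the default:
set hive.plan.serialization.format=kryo;

<!-- Or cluster-wide, in hive-site.xml: -->
<property>
  <name>hive.plan.serialization.format</name>
  <value>javaXML</value>
</property>
```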
2015-04-21 07:30:36,717 WARN [main] conf.HiveConf (HiveConf.java:initialize(1491)) - DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.2.1-1.cdh5.2.1.p0.12/jars/hive-common-0.13.1-cdh5.2.1.jar!/hive-log4j.properties
HJOBNAME=mdhdy001
LASTLOADDATE=17000101
hiveconf:LASTLOADDATE=17000101
RUNNING_MODE=
Total jobs = 5
Launching Job 1 out of 5
Launching Job 2 out of 5
Number of reduce tasks not specified. Defaulting to jobconf value of: 10
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Number of reduce tasks not specified. Defaulting to jobconf value of: 10
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Starting Job = job_1429549389861_0673, Tracking URL = http://dkhc2603.dcsg.com:8088/proxy/application_1429549389861_0673/
Kill Command = /opt/cloudera/parcels/CDH-5.2.1-1.cdh5.2.1.p0.12/lib/hadoop/bin/hadoop job -kill job_1429549389861_0673
Starting Job = job_1429549389861_0674, Tracking URL = http://dkhc2603.dcsg.com:8088/proxy/application_1429549389861_0674/
Kill Command = /opt/cloudera/parcels/CDH-5.2.1-1.cdh5.2.1.p0.12/lib/hadoop/bin/hadoop job -kill job_1429549389861_0674
Hadoop job information for Stage-1: number of mappers: 5; number of reducers: 10
2015-04-21 07:31:00,430 Stage-1 map = 0%, reduce = 0%
Hadoop job information for Stage-13: number of mappers: 5; number of reducers: 10
2015-04-21 07:31:02,687 Stage-13 map = 0%, reduce = 0%
2015-04-21 07:31:09,462 Stage-1 map = 20%, reduce = 0%, Cumulative CPU 2.23 sec
2015-04-21 07:31:10,539 Stage-1 map = 60%, reduce = 0%, Cumulative CPU 8.38 sec
2015-04-21 07:31:11,420 Stage-13 map = 20%, reduce = 0%, Cumulative CPU 2.54 sec
2015-04-21 07:31:11,614 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 14.87 sec
2015-04-21 07:31:13,528 Stage-13 map = 60%, reduce = 0%, Cumulative CPU 10.72 sec
2015-04-21 07:31:14,594 Stage-13 map = 80%, reduce = 0%, Cumulative CPU 15.47 sec
2015-04-21 07:31:15,664 Stage-13 map = 100%, reduce = 0%, Cumulative CPU 17.26 sec
2015-04-21 07:31:22,274 Stage-1 map = 100%, reduce = 20%, Cumulative CPU 18.18 sec
2015-04-21 07:31:23,347 Stage-1 map = 100%, reduce = 40%, Cumulative CPU 27.4 sec
2015-04-21 07:31:24,609 Stage-1 map = 100%, reduce = 80%, Cumulative CPU 36.08 sec
2015-04-21 07:31:25,268 Stage-13 map = 100%, reduce = 10%, Cumulative CPU 17.26 sec
2015-04-21 07:31:25,659 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 45.74 sec
2015-04-21 07:31:26,351 Stage-13 map = 100%, reduce = 20%, Cumulative CPU 27.81 sec
MapReduce Total cumulative CPU time: 45 seconds 740 msec
Ended Job = job_1429549389861_0673
2015-04-21 07:31:27,424 Stage-13 map = 100%, reduce = 30%, Cumulative CPU 34.15 sec
2015-04-21 07:31:28,483 Stage-13 map = 100%, reduce = 90%, Cumulative CPU 71.15 sec
2015-04-21 07:31:29,555 Stage-13 map = 100%, reduce = 100%, Cumulative CPU 77.81 sec
MapReduce Total cumulative CPU time: 1 minutes 17 seconds 810 msec
Ended Job = job_1429549389861_0674
2015-04-21 07:31:36,035 WARN [main] conf.Configuration (Configuration.java:loadProperty(2510)) - file:/tmp/srv-hdp-mkt-d/hive_2015-04-21_07-30-38_195_8833192956835410748-1/-local-10014/jobconf.xml:an attempt to override final parameter: hadoop.ssl.require.client.cert; Ignoring.
2015-04-21 07:31:36,042 WARN [main] conf.Configuration (Configuration.java:loadProperty(2510)) - file:/tmp/srv-hdp-mkt-d/hive_2015-04-21_07-30-38_195_8833192956835410748-1/-local-10014/jobconf.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring.
2015-04-21 07:31:36,044 WARN [main] conf.Configuration (Configuration.java:loadProperty(2510)) - file:/tmp/srv-hdp-mkt-d/hive_2015-04-21_07-30-38_195_8833192956835410748-1/-local-10014/jobconf.xml:an attempt to override final parameter: hadoop.ssl.client.conf; Ignoring.
2015-04-21 07:31:36,046 WARN [main] conf.Configuration (Configuration.java:loadProperty(2510)) - file:/tmp/srv-hdp-mkt-d/hive_2015-04-21_07-30-38_195_8833192956835410748-1/-local-10014/jobconf.xml:an attempt to override final parameter: hadoop.ssl.keystores.factory.class; Ignoring.
2015-04-21 07:31:36,051 WARN [main] conf.Configuration (Configuration.java:loadProperty(2510)) - file:/tmp/srv-hdp-mkt-d/hive_2015-04-21_07-30-38_195_8833192956835410748-1/-local-10014/jobconf.xml:an attempt to override final parameter: hadoop.ssl.server.conf; Ignoring.
2015-04-21 07:31:36,076 WARN [main] conf.Configuration (Configuration.java:loadProperty(2510)) - file:/tmp/srv-hdp-mkt-d/hive_2015-04-21_07-30-38_195_8833192956835410748-1/-local-10014/jobconf.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring.
2015-04-21 07:31:36,401 WARN [main] conf.HiveConf (HiveConf.java:initialize(1491)) - DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
Execution log at: /tmp/srv-hdp-mkt-d/srv-hdp-mkt-d_20150421073030_226accd4-9af1-4a4b-8260-bcac790c67c0.log
2015-04-21 07:31:36	Starting to launch local task to process map join; maximum memory = 257949696
2015-04-21 07:31:38	Dump the side-table into file: file:/tmp/srv-hdp-mkt-d/hive_2015-04-21_07-30-38_195_8833192956835410748-1/-local-10007/HashTable-Stage-4/MapJoin-mapfile01--.hashtable
2015-04-21 07:31:39	Uploaded 1 File to: file:/tmp/srv-hdp-mkt-d/hive_2015-04-21_07-30-38_195_8833192956835410748-1/-local-10007/HashTable-Stage-4/MapJoin-mapfile01--.hashtable (281677 bytes)
2015-04-21 07:31:40	Dump the side-table into file: file:/tmp/srv-hdp-mkt-d/hive_2015-04-21_07-30-38_195_8833192956835410748-1/-local-10007/HashTable-Stage-4/MapJoin-mapfile10--.hashtable
2015-04-21 07:31:40	Uploaded 1 File to: file:/tmp/srv-hdp-mkt-d/hive_2015-04-21_07-30-38_195_8833192956835410748-1/-local-10007/HashTable-Stage-4/MapJoin-mapfile10--.hashtable (1039767 bytes)
2015-04-21 07:31:40	End of local task; Time Taken: 3.988 sec.
Execution completed successfully
MapredLocal task succeeded
Launching Job 3 out of 5
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1429549389861_0679, Tracking URL = http://dkhc2603.dcsg.com:8088/proxy/application_1429549389861_0679/
Kill Command = /opt/cloudera/parcels/CDH-5.2.1-1.cdh5.2.1.p0.12/lib/hadoop/bin/hadoop job -kill job_1429549389861_0679
Hadoop job information for Stage-4: number of mappers: 15; number of reducers: 0
2015-04-21 07:32:00,788 Stage-4 map = 0%, reduce = 0%
2015-04-21 07:32:29,011 Stage-4 map = 100%, reduce = 0%
Ended Job = job_1429549389861_0679 with errors
Error during job, obtaining debugging information...
Examining task ID: task_1429549389861_0679_m_000006 (and more) from job job_1429549389861_0679
Examining task ID: task_1429549389861_0679_m_000011 (and more) from job job_1429549389861_0679
Examining task ID: task_1429549389861_0679_m_000002 (and more) from job job_1429549389861_0679
Examining task ID: task_1429549389861_0679_m_000008 (and more) from job job_1429549389861_0679
Examining task ID: task_1429549389861_0679_m_000002 (and more) from job job_1429549389861_0679

Task with the most failures(4):
-----
Task ID:
  task_1429549389861_0679_m_000008

URL:
  http://0.0.0.0:8088/taskdetails.jsp?jobid=job_1429549389861_0679&tipid=task_1429549389861_0679_m_000008
-----
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hive.com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 107
Serialization trace:
rowSchema (org.apache.hadoop.hive.ql.exec.MapJoinOperator)
parentOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
parentOperators (org.apache.hadoop.hive.ql.exec.MapJoinOperator)
parentOperators (org.apache.hadoop.hive.ql.exec.FilterOperator)
parentOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
parentOperators (org.apache.hadoop.hive.ql.exec.UnionOperator)
childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)
	at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:364)
	at org.apache.hadoop.hive.ql.exec.Utilities.getMapWork(Utilities.java:275)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat.init(HiveInputFormat.java:254)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:440)
	at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:433)
	at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:587)
	at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: Encountered unregistered class ID: 107
Serialization trace:
rowSchema (org.apache.hadoop.hive.ql.exec.MapJoinOperator)
parentOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
parentOperators (org.apache.hadoop.hive.ql.exec.MapJoinOperator)
parentOperators (org.apache.hadoop.hive.ql.exec.FilterOperator)
parentOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
parentOperators (org.apache.hadoop.hive.ql.exec.UnionOperator)
childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)
	at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:119)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:99)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:139)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:672)
	at org.apache.hadoop.hive.ql.exec.Utilities.deserializeObjectByKryo(Utilities.java:918)
	at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:826)
	at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:840)
	at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:333)
	... 13 more

FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-1: Map: 5  Reduce: 10   Cumulative CPU: 45.74 sec   HDFS Read: 1414406  HDFS Write: 1532     SUCCESS
Stage-Stage-13: Map: 5  Reduce: 10   Cumulative CPU: 77.81 sec   HDFS Read: 1414406  HDFS Write: 1175433  SUCCESS
Stage-Stage-4: Map: 15   HDFS Read: 0  HDFS Write: 0  FAIL
Total MapReduce CPU Time Spent: 2 minutes 3 seconds 550 msec
Error (2). Execution Failed.
2015-04-21 07:32:32 ERROR (2) in run_hive