水, could you give more details on the CDH version and the correct Jackson version? Others can benefit from your work.
On Mon, May 23, 2016 at 9:29 AM, 水。。。海 <549066...@qq.com> wrote:

> I solved the exception by replacing the jackson* jars in HBase.
>
> ------------------ Original ------------------
> From: "yapujia" <yapu...@microsoft.com>
> Date: Fri, May 20, 2016 04:56 PM
> To: "dev@kylin.apache.org" <dev@kylin.apache.org>
> Subject: RE: RE: RE: kylin question
>
> At which step do you get this exception? Can you provide more details?
> Also, I recommend using Ambari HDP to deploy your environment.
>
> Thanks.
>
> -----Original Message-----
> From: 水。。。海 [mailto:549066...@qq.com]
> Sent: Friday, May 20, 2016 4:48 PM
> To: dev <dev@kylin.apache.org>
> Subject: Re: RE: RE: kylin question
>
> The Jackson version in Hadoop is 1.8.8. I tried 1.8.8 and 1.9.13; both
> have problems.
>
> ------------------ Original ------------------
> From: "Yapu Jia" <yapu...@microsoft.com>
> Date: Fri, May 20, 2016 04:41 PM
> To: "dev@kylin.apache.org" <dev@kylin.apache.org>
> Subject: RE: RE: kylin question
>
> Hi,
>
> You can check your Jackson version. If your Jackson version does not
> match your Hadoop version, it can cause this exception.
>
> Thanks.
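The version check suggested above can be sketched as a small shell script. The lib directories below are assumptions and will differ per distribution; adjust them to your install layout:

```shell
# List the codehaus Jackson jars each component ships, so a version
# mismatch (e.g. Hadoop on 1.8.8 vs. a stray 1.9.13 copy) stands out.
# The fallback directories are assumptions; adjust to your layout.
for dir in "${HADOOP_HOME:-/usr/lib/hadoop}/lib" \
           "${HBASE_HOME:-/usr/lib/hbase}/lib" \
           "${KYLIN_HOME:-/usr/local/kylin}/lib"; do
  echo "== $dir"
  ls "$dir"/jackson-*.jar 2>/dev/null || echo "  (no jackson jars found)"
done

# Pull the version number out of a jar file name,
# e.g. jackson-mapper-asl-1.8.8.jar -> 1.8.8
jar_version() {
  basename "$1" .jar | sed 's/.*-\([0-9][0-9.]*\)$/\1/'
}
```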
> -----Original Message-----
> From: 水。。。海 [mailto:549066...@qq.com]
> Sent: Friday, May 20, 2016 4:29 PM
> To: dev <dev@kylin.apache.org>
> Subject: Re: RE: kylin question
>
> I put the hadoop-yarn* jars into $KYLIN_HOME/lib, which resolved that
> exception, but now I get another one:
>
> java.lang.NoSuchMethodError: org.codehaus.jackson.map.ObjectMapper.setSerializationInclusion(Lorg/codehaus/jackson/map/annotate/JsonSerialize$Inclusion;)Lorg/codehaus/jackson/map/ObjectMapper;
>     at org.apache.hadoop.yarn.webapp.YarnJacksonJaxbJsonProvider.configObjectMapper(YarnJacksonJaxbJsonProvider.java:59)
>     at org.apache.hadoop.yarn.util.timeline.TimelineUtils.<clinit>(TimelineUtils.java:47)
>     at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:151)
>     at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
>     at org.apache.hadoop.mapred.ResourceMgrDelegate.serviceInit(ResourceMgrDelegate.java:94)
>     at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
>     at org.apache.hadoop.mapred.ResourceMgrDelegate.<init>(ResourceMgrDelegate.java:88)
>     at org.apache.hadoop.mapred.YARNRunner.<init>(YARNRunner.java:111)
>     at org.apache.hadoop.mapred.YarnClientProtocolProvider.create(YarnClientProtocolProvider.java:34)
>     at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:95)
>     at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
>     at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
>     at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1255)
>     at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1251)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
>     at org.apache.hadoop.mapreduce.Job.connect(Job.java:1250)
>     at org.apache.hadoop.mapreduce.Job.submit(Job.java:1279)
>     at org.apache.kylin.engine.mr.common.AbstractHadoopJob.waitForCompletion(AbstractHadoopJob.java:147)
>     at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:96)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>     at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:118)
>     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:114)
>     at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:50)
>     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:114)
>     at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:124)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:745)
>
> I also put the jackson-* 1.9.13 jars into $KYLIN_HOME/bin, but I still
> get the exception.
>
> Thanks!
>
> ------------------ Original ------------------
> From: "Yapu Jia" <yapu...@microsoft.com>
> Date: Fri, May 20, 2016 02:56 PM
> To: "dev@kylin.apache.org" <dev@kylin.apache.org>
> Subject: RE: kylin question
>
> I think this issue may be caused by your mapred classpath not having
> the hadoop-yarn* jars.
>
> Thanks.
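When a NoSuchMethodError like the one above appears, it helps to find every jar on disk that contains the offending class, since an old copy earlier on the classpath can shadow the right one. A rough sketch (the usage paths are hypothetical, not from the thread):

```shell
# Print every jar among the given files that contains the class,
# given as a path like org/codehaus/jackson/map/ObjectMapper.class.
# Requires `unzip`; duplicate hits usually mean conflicting versions.
find_class_jars() {
  class_path="$1"; shift
  for jar in "$@"; do
    if unzip -l "$jar" 2>/dev/null | grep -q "$class_path"; then
      echo "$jar"
    fi
  done
}

# Hypothetical usage -- point the globs at your real lib directories:
# find_class_jars org/codehaus/jackson/map/ObjectMapper.class \
#   /usr/lib/hadoop/lib/*.jar /usr/lib/hbase/lib/*.jar
```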
> -----Original Message-----
> From: 水。。。海 [mailto:549066...@qq.com]
> Sent: Friday, May 20, 2016 2:28 PM
> To: dev <dev@kylin.apache.org>
> Subject: kylin question
>
> Hi:
> When I do "Quick play with a sample cube", I get an exception like this:
>
> org.apache.kylin.job.exception.ExecuteException: org.apache.kylin.job.exception.ExecuteException: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.util.timeline.TimelineUtils
>     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:124)
>     at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:124)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.kylin.job.exception.ExecuteException: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.util.timeline.TimelineUtils
>     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:124)
>     at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:50)
>     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:114)
>     ... 4 more
> Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.util.timeline.TimelineUtils
>     at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:151)
>     at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
>     at org.apache.hadoop.mapred.ResourceMgrDelegate.serviceInit(ResourceMgrDelegate.java:94)
>     at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
>     at org.apache.hadoop.mapred.ResourceMgrDelegate.<init>(ResourceMgrDelegate.java:88)
>     at org.apache.hadoop.mapred.YARNRunner.<init>(YARNRunner.java:111)
>     at org.apache.hadoop.mapred.YarnClientProtocolProvider.create(YarnClientProtocolProvider.java:34)
>     at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:95)
>     at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
>     at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
>     at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1255)
>     at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1251)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
>     at org.apache.hadoop.mapreduce.Job.connect(Job.java:1250)
>     at org.apache.hadoop.mapreduce.Job.submit(Job.java:1279)
>     at org.apache.kylin.engine.mr.common.AbstractHadoopJob.waitForCompletion(AbstractHadoopJob.java:147)
>     at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:96)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>     at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:118)
>     at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:114)
>     ... 6 more
>
> Software versions:
> apache-kylin-1.5.1-bin
> zookeeper-3.4.5-cdh5.3.9
> hadoop-2.5.0-cdh5.3.9
> hbase-1.0.3
> hive-0.13.1-cdh5.3.9
> jdk1.7.0_67
>
> As a user, I would like to have clear instruction manuals.
>
> Thanks!
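The fix the thread converges on (replacing the jackson* jars in HBase's lib with copies matching Hadoop's version) can be sketched roughly like this. The function and the example paths are illustrative, not an official Kylin procedure:

```shell
# Replace the codehaus Jackson jars in one lib directory with the copies
# from another (e.g. Hadoop's 1.8.8 jars into HBase's lib), moving the
# old jars to a backup directory so the change is reversible.
sync_jackson_jars() {
  src="$1"; dst="$2"; backup="$3"
  mkdir -p "$backup"
  for jar in "$dst"/jackson-*.jar; do
    [ -e "$jar" ] && mv "$jar" "$backup/"   # set the old copies aside
  done
  for jar in "$src"/jackson-*.jar; do
    [ -e "$jar" ] && cp "$jar" "$dst/"      # install the matching versions
  done
}

# Hypothetical usage (adjust paths, then restart HBase/Kylin):
# sync_jackson_jars /usr/lib/hadoop/lib /usr/lib/hbase/lib /tmp/jackson-backup
```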