System: Kylin 2.0, Spark cluster 1.5.2, HDP 2.3 (hadoop-2.7.1.2), lib: spark-assembly-1.5.2.2.3.4.7.4-hadoop2.7.1.2.3.4.7.jar
An error occurred when I built the cube with the Kylin Spark engine, even though the job had been added to the Kylin history web UI:

17/05/23 17:26:04 INFO CubeManager: Reloaded cube DFAD being CUBE[name=DFAD] having 0 segments
17/05/23 17:26:04 INFO CubeManager: Reloaded cube kylin_sales_cube being CUBE[name=kylin_sales_cube] having 1 segments
17/05/23 17:26:04 INFO CubeManager: Loaded 2 cubes, fail on 0 cubes
17/05/23 17:26:04 INFO MemoryStore: ensureFreeSpace(99864) called with curMem=0, maxMem=556038881
17/05/23 17:26:04 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 97.5 KB, free 530.2 MB)
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/execution/columnar/CachedBatch
    at org.apache.kylin.engine.spark.KylinKryoRegistrator.registerClasses(KylinKryoRegistrator.java:70)
    at org.apache.spark.serializer.KryoSerializer$$anonfun$newKryo$4.apply(KryoSerializer.scala:120)
    at org.apache.spark.serializer.KryoSerializer$$anonfun$newKryo$4.apply(KryoSerializer.scala:120)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.serializer.KryoSerializer.newKryo(KryoSerializer.scala:120)
    at org.apache.spark.serializer.KryoSerializerInstance.borrowKryo(KryoSerializer.scala:237)
    at org.apache.spark.serializer.KryoSerializerInstance.<init>(KryoSerializer.scala:222)
    at org.apache.spark.serializer.KryoSerializer.newInstance(KryoSerializer.scala:138)
    at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:201)
    at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:102)
    at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:85)
    at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
    at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:63)
    at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1327)
    at org.apache.spark.api.java.JavaSparkContext.broadcast(JavaSparkContext.scala:648)
    at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:166)
    at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
    at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:685)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.execution.columnar.CachedBatch
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 27 more
17/05/23 17:26:04 INFO ConnectionManager$HConnectionImplementation: Closing master protocol: MasterService
17/05/23 17:26:04 INFO SparkContext: Invoking stop() from shutdown hook
17/05/23 17:26:04 INFO ConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x15c204d83a0000b

Could you help me? Thank you!

--
View this message in context: http://apache-kylin.74782.x6.nabble.com/It-does-not-work-use-Kylin-2-0-spark-1-5-2-hadoop-2-7-tp8073.html
Sent from the Apache Kylin mailing list archive at Nabble.com.
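The NoClassDefFoundError above comes from Kylin's KylinKryoRegistrator trying to register org.apache.spark.sql.execution.columnar.CachedBatch. If I understand the Spark 1.5 -> 1.6 package reorganization correctly, that class lived at org.apache.spark.sql.columnar.CachedBatch in 1.5.x and only moved to the execution.columnar package in 1.6, which would explain why the 1.5.2 spark-assembly jar cannot resolve it. A small probe like the following (a sketch, not part of Kylin; run with the spark-assembly jar on the classpath) should show which of the two locations the deployed jar actually provides:

```java
// CachedBatchProbe: check which package location of Spark's CachedBatch
// class is resolvable on the current classpath. Run with the
// spark-assembly jar on the classpath, e.g.:
//   java -cp .:spark-assembly-1.5.2....jar CachedBatchProbe
// Package names below reflect my understanding of the Spark 1.5 -> 1.6
// reorganization; they are assumptions to verify, not Kylin code.
public class CachedBatchProbe {

    /** Returns true if the named class can be loaded from the classpath. */
    static boolean isLoadable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Location Kylin 2.0 references (Spark 1.6+):
        System.out.println("execution.columnar.CachedBatch loadable: "
            + isLoadable("org.apache.spark.sql.execution.columnar.CachedBatch"));
        // Location in Spark 1.5.x:
        System.out.println("columnar.CachedBatch loadable: "
            + isLoadable("org.apache.spark.sql.columnar.CachedBatch"));
    }
}
```

If only the second lookup succeeds, the jar is a pre-1.6 Spark build and the Kylin 2.0 Spark engine would need a newer Spark on the classpath.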