[ https://issues.apache.org/jira/browse/KYLIN-3871?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16791552#comment-16791552 ]
Shaofeng SHI commented on KYLIN-3871:
-------------------------------------

To run the integration test, the Hortonworks 2.4 sandbox is the only choice, I think. We didn't do that on a CDH sandbox.

> Kylin inside Cloudera CDH Quickstart Sandbox
> --------------------------------------------
>
> Key: KYLIN-3871
> URL: https://issues.apache.org/jira/browse/KYLIN-3871
> Project: Kylin
> Issue Type: Test
> Components: Job Engine, Real-time Streaming
> Affects Versions: v2.6.0
> Environment: Cloudera Quickstart Docker image:
> - OS: CentOS 6
> - Memory: 13 GB
> - Disk: 20 GB
> - Java: 1.8
> - Maven: 3.5.3
> Reporter: Yanwen Lin
> Priority: Blocker
>
> When running the integration test, I hit the following error. I know it is a Java version error: Kylin uses Java 1.8 while Cloudera ships Java 1.7. So I manually installed Java 1.8 and set JAVA_HOME to point to it for Spark 2.x (I also ran spark-submit --version to check this). But the error did not go away. I suspect that during the Spark job, some step switches the Java version back to 1.7 (not sure). Is there any way to force it not to fall back to Java 1.7, or any workaround?
> I have successfully finished the Maven install and the unit tests.
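The description above asks how to stop the Spark-on-YARN job from falling back to Java 1.7. One possible workaround, sketched under assumptions: Spark's standard `spark.yarn.appMasterEnv.JAVA_HOME` and `spark.executorEnv.JAVA_HOME` properties make the YARN ApplicationMaster and executor containers use a specific JDK instead of the cluster-wide default. The JDK install path below is a hypothetical example; adjust it to your sandbox.

```shell
# Assumed JDK 8 install path on the CDH sandbox; adjust to your layout.
JDK8=/usr/java/jdk1.8.0_141

# Point the driver-side launch at the new JDK...
export JAVA_HOME="$JDK8"
export PATH="$JAVA_HOME/bin:$PATH"

# ...and tell Spark-on-YARN to use it inside the AM and executor
# containers too (these are standard Spark configuration properties):
cat >> "$SPARK_HOME/conf/spark-defaults.conf" <<EOF
spark.yarn.appMasterEnv.JAVA_HOME  $JDK8
spark.executorEnv.JAVA_HOME        $JDK8
EOF
```

Without the two YARN-side properties, only the local spark-submit process sees the new JAVA_HOME; the containers that actually load `ByteUnit` still start under the cluster's Java 1.7, which matches the symptom reported here.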
> *Branch:* realtime-streaming
> *Executed command with problem:*
> mvn verify -fae -Dhdp.version=2.4.0.0-169 -P sandbox
> *Error stack:*
> Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/spark/network/util/ByteUnit : Unsupported major.minor version 52.0
>     at java.lang.ClassLoader.defineClass1(Native Method)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
>     at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>     at org.apache.spark.deploy.history.config$.<init>(config.scala:44)
>     at org.apache.spark.deploy.history.config$.<clinit>(config.scala)
>     at org.apache.spark.SparkConf$.<init>(SparkConf.scala:635)
>     at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
>     at org.apache.spark.SparkConf.set(SparkConf.scala:94)
>     at org.apache.spark.SparkConf$$anonfun$loadFromSystemProperties$3.apply(SparkConf.scala:76)
>     at org.apache.spark.SparkConf$$anonfun$loadFromSystemProperties$3.apply(SparkConf.scala:75)
>     at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
>     at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:221)
>     at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
>     at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
>     at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
>     at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
>     at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:75)
>     at org.apache.spark.SparkConf.<init>(SparkConf.scala:70)
>     at org.apache.spark.SparkConf.<init>(SparkConf.scala:57)
>     at org.apache.spark.deploy.yarn.ApplicationMaster.<init>(ApplicationMaster.scala:62)
>     at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:838)
>     at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:869)
>     at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)

-- This message was sent by Atlassian JIRA (v7.6.3#76005)
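For reference, the `Unsupported major.minor version 52.0` in the trace decodes as follows: a class file's major version minus 44 gives the Java release it targets, so major version 52 means the class was compiled for Java 8, while a Java 7 JVM only accepts up to major version 51. A minimal sketch of the arithmetic:

```shell
# Class-file major version -> Java release: subtract 44.
# 52 (the version in the error) targets Java 8; a Java 7 JVM tops out at 51.
major=52
echo "class major $major => needs Java $((major - 44))+"
```

On a real class file you can read the same number directly with `javap -verbose SomeClass.class | grep 'major version'`, which confirms whether a given jar was built for Java 8.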