Hi Terry

I think the issue you mentioned will be resolved by the following PR:
https://github.com/apache/spark/pull/3072

- Kousuke

(2014/11/03 10:42), Terry Siu wrote:
I just built the 1.2 snapshot, current as of commit 76386e1a23c, using:

$ ./make-distribution.sh --tgz --name my-spark --skip-java-test -DskipTests -Phadoop-2.4 -Phive -Phive-0.13.1 -Pyarn

I drop my Hive configuration files into the conf directory, launch spark-shell, and then create my HiveContext, hc. I then issue a "use <db>" command:

scala> hc.hql("use <db>")
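
For reference, hc here is just the standard HiveContext created from the shell's SparkContext, roughly:

scala> import org.apache.spark.sql.hive.HiveContext
scala> val hc = new HiveContext(sc)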

I then receive the following class-not-found error:

java.lang.NoClassDefFoundError: com/esotericsoftware/shaded/org/objenesis/strategy/InstantiatorStrategy
    at org.apache.hadoop.hive.ql.exec.Utilities.<clinit>(Utilities.java:925)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1224)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
    at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:315)
    at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:286)
    at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
    at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
    at org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
    at org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:30)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:424)
    at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:424)
    at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
    at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:103)
    at org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:111)
    at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:115)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
    at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
    at $iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
    at $iwC$$iwC$$iwC.<init>(<console>:48)
    at $iwC$$iwC.<init>(<console>:50)
    at $iwC.<init>(<console>:52)
    at <init>(<console>:54)
    at .<init>(<console>:58)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:8
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
    at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:628)
    at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:636)
    at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:641)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:968)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scal
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scal
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:353)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.esotericsoftware.shaded.org.objenesis.strategy.InstantiatorStrategy
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 63 more


The missing class does exist in the assembly jar, but under the package org.objenesis.strategy rather than the shaded name in the error. I was hoping the -Phive-0.13.1 profile would give better compatibility, since we are using CDH 5.2 with Hive 0.13.
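
As a quick sanity check from the shell, you can see which of the two names actually resolves at runtime (throwaway helper, class names taken from the stack trace above):

scala> def loadable(name: String) = scala.util.Try(Class.forName(name)).isSuccess
scala> loadable("com.esotericsoftware.shaded.org.objenesis.strategy.InstantiatorStrategy")
scala> loadable("org.objenesis.strategy.InstantiatorStrategy")

In this build I'd expect the first to come back false and the second true, matching where the class actually lives.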

I generated another build, this time with only -Phive, and when I issued the same command above it completed without any errors.

I'm wondering what the benefit is of including the -Phive-0.13.1 profile in the build, as it looks like there's some shaded-jar action going on.

Thanks,
-Terry
