Hi Alex,
 
Thanks for responding. I hope I can give you all the information you need.

I downloaded the 0.6.2 binary package and used the standard configuration. I only start zeppelin-daemon, and Zeppelin spins up the embedded Spark environment. The only change is the added PostgreSQL dependency (org.postgresql:postgresql:9.4.1211); everything else is the default configuration.
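For completeness, the dependency is added via the Spark interpreter's dependency setting in the Zeppelin UI rather than any config file. The exact artifact entry, in case it helps reproduce:

```
artifact: org.postgresql:postgresql:9.4.1211
excludes: (none)
```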
I also tried a few things to make the problem disappear, but nothing worked. I set "spark.serializer" -> "org.apache.spark.serializer.KryoSerializer" and "spark.io.compression.codec" -> "org.apache.spark.io.SnappyCompressionCodec", but nothing changed.
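For reference, I set these as properties on the Spark interpreter; in property form they look like this (values as listed in the Spark configuration docs):

```
spark.serializer             org.apache.spark.serializer.KryoSerializer
spark.io.compression.codec   org.apache.spark.io.SnappyCompressionCodec
```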
 
In my zeppelin-env.sh nothing is activated (everything is commented out). I can't find anything unusual in the Zeppelin server .log/.out files, but here is an excerpt from the interpreter log:
INFO [2016-11-16 13:04:04,841] ({pool-2-thread-27} SchedulerFactory.java[jobStarted]:131) - Job remoteInterpretJob_1479301444840 started by scheduler org.apache.zeppelin.spark.SparkInterpreter1982426211
 INFO [2016-11-16 13:04:04,841] ({pool-2-thread-27} Logging.scala[logInfo]:54) - Parsing command: SELECT soi.date AS date
        ,SUM(soi.price) AS price
FROM sales_example AS soi 
WHERE soi.date >= DATE_SUB((CAST(CURRENT_TIMESTAMP() AS DATE)), 7)
GROUP BY soi.date
ORDER BY soi.date ASC
 INFO [2016-11-16 13:04:04,885] ({pool-2-thread-27} Logging.scala[logInfo]:54) - Predicate isnotnull(order_date#198) generates partition filter: ((order_date.count#11222 - order_date.nullCount#11221) > 0)
 INFO [2016-11-16 13:04:04,886] ({pool-2-thread-27} Logging.scala[logInfo]:54) - Predicate (order_date#198 >= 17114) generates partition filter: (17114 <= order_date.upperBound#11219)
ERROR [2016-11-16 13:04:04,948] ({pool-2-thread-27} Logging.scala[logError]:91) - failed to compile: org.codehaus.janino.JaninoRuntimeException: Class 'org.apache.spark.sql.catalyst.expressions.codegen.GeneratedClass' was loaded through a different loader
/* 001 */ public SpecificOrdering generate(Object[] references) {
/* 002 */   return new SpecificOrdering(references);
/* 003 */ }
/* 004 */
/* 005 */ class SpecificOrdering extends org.apache.spark.sql.catalyst.expressions.codegen.BaseOrdering {
/* 006 */
/* 007 */   private Object[] references;
/* 008 */
/* 009 */
/* 010 */
/* 011 */   public SpecificOrdering(Object[] references) {
/* 012 */     this.references = references;
/* 013 */
/* 014 */   }
/* 015 */
/* 016 */   public int compare(InternalRow a, InternalRow b) {
/* 017 */     InternalRow i = null;  // Holds current row being evaluated.
/* 018 */
/* 019 */     i = a;
/* 020 */     boolean isNullA;
/* 021 */     int primitiveA;
/* 022 */     {
/* 023 */
/* 024 */       boolean isNull = i.isNullAt(0);
/* 025 */       int value = isNull ? -1 : (i.getInt(0));
/* 026 */       isNullA = isNull;
/* 027 */       primitiveA = value;
/* 028 */     }
/* 029 */     i = b;
/* 030 */     boolean isNullB;
/* 031 */     int primitiveB;
/* 032 */     {
/* 033 */
/* 034 */       boolean isNull = i.isNullAt(0);
/* 035 */       int value = isNull ? -1 : (i.getInt(0));
/* 036 */       isNullB = isNull;
/* 037 */       primitiveB = value;
/* 038 */     }
/* 039 */     if (isNullA && isNullB) {
/* 040 */       // Nothing
/* 041 */     } else if (isNullA) {
/* 042 */       return -1;
/* 043 */     } else if (isNullB) {
/* 044 */       return 1;
/* 045 */     } else {
/* 046 */       int comp = (primitiveA > primitiveB ? 1 : primitiveA < primitiveB ? -1 : 0);
/* 047 */       if (comp != 0) {
/* 048 */         return comp;
/* 049 */       }
/* 050 */     }
/* 051 */
/* 052 */
/* 053 */     i = a;
/* 054 */     boolean isNullA1;
/* 055 */     UTF8String primitiveA1;
/* 056 */     {
/* 057 */
/* 058 */       UTF8String value1 = i.getUTF8String(1);
/* 059 */       isNullA1 = false;
/* 060 */       primitiveA1 = value1;
/* 061 */     }
/* 062 */     i = b;
/* 063 */     boolean isNullB1;
/* 064 */     UTF8String primitiveB1;
/* 065 */     {
/* 066 */
/* 067 */       UTF8String value1 = i.getUTF8String(1);
/* 068 */       isNullB1 = false;
/* 069 */       primitiveB1 = value1;
/* 070 */     }
/* 071 */     if (isNullA1 && isNullB1) {
/* 072 */       // Nothing
/* 073 */     } else if (isNullA1) {
/* 074 */       return 1;
/* 075 */     } else if (isNullB1) {
/* 076 */       return -1;
/* 077 */     } else {
/* 078 */       int comp = primitiveA1.compare(primitiveB1);
/* 079 */       if (comp != 0) {
/* 080 */         return -comp;
/* 081 */       }
/* 082 */     }
/* 083 */
/* 084 */     return 0;
/* 085 */   }
/* 086 */ }
org.codehaus.janino.JaninoRuntimeException: Class 'org.apache.spark.sql.catalyst.expressions.codegen.GeneratedClass' was loaded through a different loader
    at org.codehaus.janino.SimpleCompiler$1.getDelegate(SimpleCompiler.java:337)
    at org.codehaus.janino.SimpleCompiler$1.accept(SimpleCompiler.java:291)
    at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5159)
    at org.codehaus.janino.UnitCompiler.access$16700(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$29.getSuperclass2(UnitCompiler.java:8154)
    at org.codehaus.janino.IClass.getSuperclass(IClass.java:406)
    at org.codehaus.janino.IClass.findMemberType(IClass.java:766)
    at org.codehaus.janino.IClass.findMemberType(IClass.java:733)
    at org.codehaus.janino.UnitCompiler.findMemberType(UnitCompiler.java:10116)
    at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:5300)
    at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:5207)
    at org.codehaus.janino.UnitCompiler.getType2(UnitCompiler.java:5188)
    at org.codehaus.janino.UnitCompiler.access$12600(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$16.visitReferenceType(UnitCompiler.java:5119)
    at org.codehaus.janino.Java$ReferenceType.accept(Java.java:2880)
    at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5159)
    at org.codehaus.janino.UnitCompiler.getType2(UnitCompiler.java:5414)
    at org.codehaus.janino.UnitCompiler.access$12400(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$16.visitArrayType(UnitCompiler.java:5117)
    at org.codehaus.janino.Java$ArrayType.accept(Java.java:2954)
    at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5159)
    at org.codehaus.janino.UnitCompiler.access$16700(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$31.getParameterTypes2(UnitCompiler.java:8533)
    at org.codehaus.janino.IClass$IInvocable.getParameterTypes(IClass.java:835)
    at org.codehaus.janino.IClass$IMethod.getDescriptor2(IClass.java:1063)
    at org.codehaus.janino.IClass$IInvocable.getDescriptor(IClass.java:849)
    at org.codehaus.janino.IClass.getIMethods(IClass.java:211)
    at org.codehaus.janino.IClass.getIMethods(IClass.java:199)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:409)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:393)
    at org.codehaus.janino.UnitCompiler.access$400(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:347)
    at org.codehaus.janino.Java$PackageMemberClassDeclaration.accept(Java.java:1139)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:354)
    at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:322)
    at org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:383)
    at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:315)
    at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:233)
    at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:192)
    at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:84)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:883)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:941)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:938)
    at org.spark_project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
    at org.spark_project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
    at org.spark_project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
    at org.spark_project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
    at org.spark_project.guava.cache.LocalCache.get(LocalCache.java:4000)
    at org.spark_project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
    at org.spark_project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:837)
    at org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$.create(GenerateOrdering.scala:146)
    at org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$.create(GenerateOrdering.scala:43)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:821)
    at org.apache.spark.sql.catalyst.expressions.codegen.LazilyGeneratedOrdering.<init>(GenerateOrdering.scala:160)
    at org.apache.spark.sql.catalyst.expressions.codegen.LazilyGeneratedOrdering.<init>(GenerateOrdering.scala:157)
    at org.apache.spark.sql.execution.TakeOrderedAndProjectExec.executeCollect(limit.scala:127)
    at org.apache.spark.sql.Dataset$$anonfun$org$apache$spark$sql$Dataset$$execute$1$1.apply(Dataset.scala:2183)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
    at org.apache.spark.sql.Dataset.withNewExecutionId(Dataset.scala:2532)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$execute$1(Dataset.scala:2182)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collect(Dataset.scala:2189)
    at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:1925)
    at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:1924)
    at org.apache.spark.sql.Dataset.withTypedCallback(Dataset.scala:2562)
    at org.apache.spark.sql.Dataset.head(Dataset.scala:1924)
    at org.apache.spark.sql.Dataset.take(Dataset.scala:2139)
    at sun.reflect.GeneratedMethodAccessor113.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.zeppelin.spark.ZeppelinContext.showDF(ZeppelinContext.java:216)
    at org.apache.zeppelin.spark.SparkSqlInterpreter.interpret(SparkSqlInterpreter.java:129)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
ERROR [2016-11-16 13:04:04,955] ({pool-2-thread-27} Job.java[run]:189) - Job failed
org.apache.zeppelin.interpreter.InterpreterException: java.lang.reflect.InvocationTargetException
    at org.apache.zeppelin.spark.ZeppelinContext.showDF(ZeppelinContext.java:220)
    at org.apache.zeppelin.spark.SparkSqlInterpreter.interpret(SparkSqlInterpreter.java:129)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.GeneratedMethodAccessor113.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.zeppelin.spark.ZeppelinContext.showDF(ZeppelinContext.java:216)
    ... 12 more
Caused by: java.util.concurrent.ExecutionException: java.lang.Exception: failed to compile: org.codehaus.janino.JaninoRuntimeException: Class 'org.apache.spark.sql.catalyst.expressions.codegen.GeneratedClass' was loaded through a different loader
/* 001 */ public SpecificOrdering generate(Object[] references) {
/* 002 */   return new SpecificOrdering(references);
/* 003 */ }
/* 004 */
/* 005 */ class SpecificOrdering extends org.apache.spark.sql.catalyst.expressions.codegen.BaseOrdering {
/* 006 */
/* 007 */   private Object[] references;
/* 008 */
/* 009 */
/* 010 */
/* 011 */   public SpecificOrdering(Object[] references) {
/* 012 */     this.references = references;
/* 013 */
/* 014 */   }
/* 015 */
/* 016 */   public int compare(InternalRow a, InternalRow b) {
/* 017 */     InternalRow i = null;  // Holds current row being evaluated.
/* 018 */
/* 019 */     i = a;
/* 020 */     boolean isNullA;
/* 021 */     int primitiveA;
/* 022 */     {
/* 023 */
/* 024 */       boolean isNull = i.isNullAt(0);
/* 025 */       int value = isNull ? -1 : (i.getInt(0));
/* 026 */       isNullA = isNull;
/* 027 */       primitiveA = value;
/* 028 */     }
/* 029 */     i = b;
/* 030 */     boolean isNullB;
/* 031 */     int primitiveB;
/* 032 */     {
/* 033 */
/* 034 */       boolean isNull = i.isNullAt(0);
/* 035 */       int value = isNull ? -1 : (i.getInt(0));
/* 036 */       isNullB = isNull;
/* 037 */       primitiveB = value;
/* 038 */     }
/* 039 */     if (isNullA && isNullB) {
/* 040 */       // Nothing
/* 041 */     } else if (isNullA) {
/* 042 */       return -1;
/* 043 */     } else if (isNullB) {
/* 044 */       return 1;
/* 045 */     } else {
/* 046 */       int comp = (primitiveA > primitiveB ? 1 : primitiveA < primitiveB ? -1 : 0);
/* 047 */       if (comp != 0) {
/* 048 */         return comp;
/* 049 */       }
/* 050 */     }
/* 051 */
/* 052 */
/* 053 */     i = a;
/* 054 */     boolean isNullA1;
/* 055 */     UTF8String primitiveA1;
/* 056 */     {
/* 057 */
/* 058 */       UTF8String value1 = i.getUTF8String(1);
/* 059 */       isNullA1 = false;
/* 060 */       primitiveA1 = value1;
/* 061 */     }
/* 062 */     i = b;
/* 063 */     boolean isNullB1;
/* 064 */     UTF8String primitiveB1;
/* 065 */     {
/* 066 */
/* 067 */       UTF8String value1 = i.getUTF8String(1);
/* 068 */       isNullB1 = false;
/* 069 */       primitiveB1 = value1;
/* 070 */     }
/* 071 */     if (isNullA1 && isNullB1) {
/* 072 */       // Nothing
/* 073 */     } else if (isNullA1) {
/* 074 */       return 1;
/* 075 */     } else if (isNullB1) {
/* 076 */       return -1;
/* 077 */     } else {
/* 078 */       int comp = primitiveA1.compare(primitiveB1);
/* 079 */       if (comp != 0) {
/* 080 */         return -comp;
/* 081 */       }
/* 082 */     }
/* 083 */
/* 084 */     return 0;
/* 085 */   }
/* 086 */ }
    at org.spark_project.guava.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:306)
    at org.spark_project.guava.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:293)
    at org.spark_project.guava.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
    at org.spark_project.guava.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:135)
    at org.spark_project.guava.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2410)
    at org.spark_project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2380)
    at org.spark_project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
    at org.spark_project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
    at org.spark_project.guava.cache.LocalCache.get(LocalCache.java:4000)
    at org.spark_project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
    at org.spark_project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:837)
    at org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$.create(GenerateOrdering.scala:146)
    at org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$.create(GenerateOrdering.scala:43)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:821)
    at org.apache.spark.sql.catalyst.expressions.codegen.LazilyGeneratedOrdering.<init>(GenerateOrdering.scala:160)
    at org.apache.spark.sql.catalyst.expressions.codegen.LazilyGeneratedOrdering.<init>(GenerateOrdering.scala:157)
    at org.apache.spark.sql.execution.TakeOrderedAndProjectExec.executeCollect(limit.scala:127)
    at org.apache.spark.sql.Dataset$$anonfun$org$apache$spark$sql$Dataset$$execute$1$1.apply(Dataset.scala:2183)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
    at org.apache.spark.sql.Dataset.withNewExecutionId(Dataset.scala:2532)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$execute$1(Dataset.scala:2182)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collect(Dataset.scala:2189)
    at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:1925)
    at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:1924)
    at org.apache.spark.sql.Dataset.withTypedCallback(Dataset.scala:2562)
    at org.apache.spark.sql.Dataset.head(Dataset.scala:1924)
    at org.apache.spark.sql.Dataset.take(Dataset.scala:2139)
    ... 16 more
Caused by: java.lang.Exception: failed to compile: org.codehaus.janino.JaninoRuntimeException: Class 'org.apache.spark.sql.catalyst.expressions.codegen.GeneratedClass' was loaded through a different loader
/* 001 */ public SpecificOrdering generate(Object[] references) {
/* 002 */   return new SpecificOrdering(references);
/* 003 */ }
/* 004 */
/* 005 */ class SpecificOrdering extends org.apache.spark.sql.catalyst.expressions.codegen.BaseOrdering {
/* 006 */
/* 007 */   private Object[] references;
/* 008 */
/* 009 */
/* 010 */
/* 011 */   public SpecificOrdering(Object[] references) {
/* 012 */     this.references = references;
/* 013 */
/* 014 */   }
/* 015 */
/* 016 */   public int compare(InternalRow a, InternalRow b) {
/* 017 */     InternalRow i = null;  // Holds current row being evaluated.
/* 018 */
/* 019 */     i = a;
/* 020 */     boolean isNullA;
/* 021 */     int primitiveA;
/* 022 */     {
/* 023 */
/* 024 */       boolean isNull = i.isNullAt(0);
/* 025 */       int value = isNull ? -1 : (i.getInt(0));
/* 026 */       isNullA = isNull;
/* 027 */       primitiveA = value;
/* 028 */     }
/* 029 */     i = b;
/* 030 */     boolean isNullB;
/* 031 */     int primitiveB;
/* 032 */     {
/* 033 */
/* 034 */       boolean isNull = i.isNullAt(0);
/* 035 */       int value = isNull ? -1 : (i.getInt(0));
/* 036 */       isNullB = isNull;
/* 037 */       primitiveB = value;
/* 038 */     }
/* 039 */     if (isNullA && isNullB) {
/* 040 */       // Nothing
/* 041 */     } else if (isNullA) {
/* 042 */       return -1;
/* 043 */     } else if (isNullB) {
/* 044 */       return 1;
/* 045 */     } else {
/* 046 */       int comp = (primitiveA > primitiveB ? 1 : primitiveA < primitiveB ? -1 : 0);
/* 047 */       if (comp != 0) {
/* 048 */         return comp;
/* 049 */       }
/* 050 */     }
/* 051 */
/* 052 */
/* 053 */     i = a;
/* 054 */     boolean isNullA1;
/* 055 */     UTF8String primitiveA1;
/* 056 */     {
/* 057 */
/* 058 */       UTF8String value1 = i.getUTF8String(1);
/* 059 */       isNullA1 = false;
/* 060 */       primitiveA1 = value1;
/* 061 */     }
/* 062 */     i = b;
/* 063 */     boolean isNullB1;
/* 064 */     UTF8String primitiveB1;
/* 065 */     {
/* 066 */
/* 067 */       UTF8String value1 = i.getUTF8String(1);
/* 068 */       isNullB1 = false;
/* 069 */       primitiveB1 = value1;
/* 070 */     }
/* 071 */     if (isNullA1 && isNullB1) {
/* 072 */       // Nothing
/* 073 */     } else if (isNullA1) {
/* 074 */       return 1;
/* 075 */     } else if (isNullB1) {
/* 076 */       return -1;
/* 077 */     } else {
/* 078 */       int comp = primitiveA1.compare(primitiveB1);
/* 079 */       if (comp != 0) {
/* 080 */         return -comp;
/* 081 */       }
/* 082 */     }
/* 083 */
/* 084 */     return 0;
/* 085 */   }
/* 086 */ }
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:889)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:941)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:938)
    at org.spark_project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
    at org.spark_project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
    ... 38 more
Caused by: org.codehaus.janino.JaninoRuntimeException: Class 'org.apache.spark.sql.catalyst.expressions.codegen.GeneratedClass' was loaded through a different loader
    at org.codehaus.janino.SimpleCompiler$1.getDelegate(SimpleCompiler.java:337)
    at org.codehaus.janino.SimpleCompiler$1.accept(SimpleCompiler.java:291)
    at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5159)
    at org.codehaus.janino.UnitCompiler.access$16700(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$29.getSuperclass2(UnitCompiler.java:8154)
    at org.codehaus.janino.IClass.getSuperclass(IClass.java:406)
    at org.codehaus.janino.IClass.findMemberType(IClass.java:766)
    at org.codehaus.janino.IClass.findMemberType(IClass.java:733)
    at org.codehaus.janino.UnitCompiler.findMemberType(UnitCompiler.java:10116)
    at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:5300)
    at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:5207)
    at org.codehaus.janino.UnitCompiler.getType2(UnitCompiler.java:5188)
    at org.codehaus.janino.UnitCompiler.access$12600(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$16.visitReferenceType(UnitCompiler.java:5119)
    at org.codehaus.janino.Java$ReferenceType.accept(Java.java:2880)
    at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5159)
    at org.codehaus.janino.UnitCompiler.getType2(UnitCompiler.java:5414)
    at org.codehaus.janino.UnitCompiler.access$12400(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$16.visitArrayType(UnitCompiler.java:5117)
    at org.codehaus.janino.Java$ArrayType.accept(Java.java:2954)
    at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5159)
    at org.codehaus.janino.UnitCompiler.access$16700(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$31.getParameterTypes2(UnitCompiler.java:8533)
    at org.codehaus.janino.IClass$IInvocable.getParameterTypes(IClass.java:835)
    at org.codehaus.janino.IClass$IMethod.getDescriptor2(IClass.java:1063)
    at org.codehaus.janino.IClass$IInvocable.getDescriptor(IClass.java:849)
    at org.codehaus.janino.IClass.getIMethods(IClass.java:211)
    at org.codehaus.janino.IClass.getIMethods(IClass.java:199)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:409)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:393)
    at org.codehaus.janino.UnitCompiler.access$400(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:347)
    at org.codehaus.janino.Java$PackageMemberClassDeclaration.accept(Java.java:1139)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:354)
    at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:322)
    at org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:383)
    at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:315)
    at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:233)
    at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:192)
    at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:84)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:883)
    ... 42 more
 INFO [2016-11-16 13:04:04,975] ({pool-2-thread-27} SchedulerFactory.java[jobFinished]:137) - Job remoteInterpretJob_1479301444840 finished by scheduler org.apache.zeppelin.spark.SparkInterpreter1982426211
 INFO [2016-11-16 13:04:04,989] ({pool-2-thread-51} SchedulerFactory.java[jobStarted]:131) - Job remoteInterpretJob_1479301444965 started by scheduler org.apache.zeppelin.spark.SparkInterpreter1982426211
 
 
Do you need anything else?
 

Best regards
Florian
 
Sent: Wednesday, 16 November 2016 at 12:40
From: "Alexander Bezzubov" <b...@apache.org>
To: users@zeppelin.apache.org
Subject: Re: Two different errors while executing Spark SQL queries against cached temp tables
Hi Florian,

sorry for the slow response. I guess the main reason there hasn't been much feedback is that the error you describe is hard to reproduce, since it does not happen reliably even in your local environment.

java.lang.NoSuchMethodException: org.apache.spark.io.LZ4CompressionCodec

This can be a sign of a Hadoop FS codec misconfiguration.

Could you share a bit more details on Zeppelin/Spark/Hadoop configuration that you use?

What is your SPARK_HOME? What is in your zeppelin-env.sh? Do you use an external Spark cluster? Is it a Spark standalone or yarn-client cluster configuration? You have shared the Spark interpreter logs, but just in case: is there anything strange in the Zeppelin server .log/.out?

Details like these would enable more people to chime in and help.

--

Alex

 
On Wed, Nov 16, 2016, 12:31 Florian Schulz <flo_schul...@web.de> wrote:
Hi,
 
Can anyone help me with this? It is very annoying, because I get this error very often (on my local machine and also on a second VM). I use Zeppelin 0.6.2 with Spark 2.0 and Scala 2.11.
 
 
Best regards
Florian
 
Sent: Monday, 14 November 2016 at 20:45
From: "Florian Schulz" <flo_schul...@web.de>
To: users@zeppelin.apache.org
Subject: Two different errors while executing Spark SQL queries against cached temp tables
Hi everyone,
 
I am having trouble executing Spark SQL queries against cached temp tables. I query different temp tables, and while doing aggregates etc. I often get one of these two errors back:
 
java.lang.NoSuchMethodException: org.apache.spark.io.LZ4CompressionCodec.<init>(org.apache.spark.SparkConf)
    at java.lang.Class.getConstructor0(Class.java:3082)
    at java.lang.Class.getConstructor(Class.java:1825)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:66)
    at org.apache.spark.sql.execution.SparkPlan.org$apache$spark$sql$execution$SparkPlan$$decodeUnsafeRows(SparkPlan.scala:265)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeTake$1.apply(SparkPlan.scala:351)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeTake$1.apply(SparkPlan.scala:350)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
    at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:350)
    at org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:39)
    at org.apache.spark.sql.Dataset$$anonfun$org$apache$spark$sql$Dataset$$execute$1$1.apply(Dataset.scala:2183)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
    at org.apache.spark.sql.Dataset.withNewExecutionId(Dataset.scala:2532)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$execute$1(Dataset.scala:2182)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collect(Dataset.scala:2189)
    at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:1925)
    at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:1924)
    at org.apache.spark.sql.Dataset.withTypedCallback(Dataset.scala:2562)
    at org.apache.spark.sql.Dataset.head(Dataset.scala:1924)
    at org.apache.spark.sql.Dataset.take(Dataset.scala:2139)
    at sun.reflect.GeneratedMethodAccessor322.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.zeppelin.spark.ZeppelinContext.showDF(ZeppelinContext.java:216)
    at org.apache.zeppelin.spark.SparkSqlInterpreter.interpret(SparkSqlInterpreter.java:129)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
  org.codehaus.janino.JaninoRuntimeException: Class 'org.apache.spark.sql.catalyst.expressions.codegen.GeneratedClass' was loaded through a different loader
    at org.codehaus.janino.SimpleCompiler$1.getDelegate(SimpleCompiler.java:337)
    at org.codehaus.janino.SimpleCompiler$1.accept(SimpleCompiler.java:291)
    at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5159)
    at org.codehaus.janino.UnitCompiler.access$16700(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$29.getSuperclass2(UnitCompiler.java:8154)
    at org.codehaus.janino.IClass.getSuperclass(IClass.java:406)
    at org.codehaus.janino.IClass.findMemberType(IClass.java:766)
    at org.codehaus.janino.IClass.findMemberType(IClass.java:733)
    at org.codehaus.janino.UnitCompiler.findMemberType(UnitCompiler.java:10116)
    at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:5300)
    at org.codehaus.janino.UnitCompiler.getReferenceType(UnitCompiler.java:5207)
    at org.codehaus.janino.UnitCompiler.getType2(UnitCompiler.java:5188)
    at org.codehaus.janino.UnitCompiler.access$12600(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$16.visitReferenceType(UnitCompiler.java:5119)
    at org.codehaus.janino.Java$ReferenceType.accept(Java.java:2880)
    at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5159)
    at org.codehaus.janino.UnitCompiler.getType2(UnitCompiler.java:5414)
    at org.codehaus.janino.UnitCompiler.access$12400(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$16.visitArrayType(UnitCompiler.java:5117)
    at org.codehaus.janino.Java$ArrayType.accept(Java.java:2954)
    at org.codehaus.janino.UnitCompiler.getType(UnitCompiler.java:5159)
    at org.codehaus.janino.UnitCompiler.access$16700(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$31.getParameterTypes2(UnitCompiler.java:8533)
    at org.codehaus.janino.IClass$IInvocable.getParameterTypes(IClass.java:835)
    at org.codehaus.janino.IClass$IMethod.getDescriptor2(IClass.java:1063)
    at org.codehaus.janino.IClass$IInvocable.getDescriptor(IClass.java:849)
    at org.codehaus.janino.IClass.getIMethods(IClass.java:211)
    at org.codehaus.janino.IClass.getIMethods(IClass.java:199)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:409)
    at org.codehaus.janino.UnitCompiler.compile2(UnitCompiler.java:393)
    at org.codehaus.janino.UnitCompiler.access$400(UnitCompiler.java:185)
    at org.codehaus.janino.UnitCompiler$2.visitPackageMemberClassDeclaration(UnitCompiler.java:347)
    at org.codehaus.janino.Java$PackageMemberClassDeclaration.accept(Java.java:1139)
    at org.codehaus.janino.UnitCompiler.compile(UnitCompiler.java:354)
    at org.codehaus.janino.UnitCompiler.compileUnit(UnitCompiler.java:322)
    at org.codehaus.janino.SimpleCompiler.compileToClassLoader(SimpleCompiler.java:383)
    at org.codehaus.janino.ClassBodyEvaluator.compileToClass(ClassBodyEvaluator.java:315)
    at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:233)
    at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:192)
    at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:84)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:883)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:941)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:938)
    at org.spark_project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
    at org.spark_project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
    at org.spark_project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
    at org.spark_project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
    at org.spark_project.guava.cache.LocalCache.get(LocalCache.java:4000)
    at org.spark_project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
    at org.spark_project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:837)
    at org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$.create(GenerateOrdering.scala:146)
    at org.apache.spark.sql.catalyst.expressions.codegen.GenerateOrdering$.create(GenerateOrdering.scala:43)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:821)
    at org.apache.spark.sql.catalyst.expressions.codegen.LazilyGeneratedOrdering.<init>(GenerateOrdering.scala:160)
    at org.apache.spark.sql.catalyst.expressions.codegen.LazilyGeneratedOrdering.<init>(GenerateOrdering.scala:157)
    at org.apache.spark.sql.execution.TakeOrderedAndProjectExec.executeCollect(limit.scala:127)
    at org.apache.spark.sql.Dataset$$anonfun$org$apache$spark$sql$Dataset$$execute$1$1.apply(Dataset.scala:2183)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
    at org.apache.spark.sql.Dataset.withNewExecutionId(Dataset.scala:2532)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$execute$1(Dataset.scala:2182)
    at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collect(Dataset.scala:2189)
    at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:1925)
    at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:1924)
    at org.apache.spark.sql.Dataset.withTypedCallback(Dataset.scala:2562)
    at org.apache.spark.sql.Dataset.head(Dataset.scala:1924)
    at org.apache.spark.sql.Dataset.take(Dataset.scala:2139)
    at sun.reflect.GeneratedMethodAccessor322.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.zeppelin.spark.ZeppelinContext.showDF(ZeppelinContext.java:216)
    at org.apache.zeppelin.spark.SparkSqlInterpreter.interpret(SparkSqlInterpreter.java:129)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
    at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
 
 
I use Apache Zeppelin 0.6.2 with Apache Spark 2.0. The errors appear completely at random: when I run the exact same Spark SQL statement a second time (only a few seconds after the first attempt), it works like a charm. Roughly every fourth statement fails. Do you have any idea what could cause this?
I only query something like:
 
SELECT columnA, COUNT(*) FROM temp_table GROUP BY columnA
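 
As a next step I am considering disabling whole-stage code generation to see whether it avoids the Janino compile path (I have not tested this yet, and I am not sure it covers the ordering codegen shown in the trace; the property name is from the Spark 2.0 configuration docs):
 
# Untested workaround attempt: set in the Zeppelin Spark interpreter
# properties, or pass via SPARK_SUBMIT_OPTIONS in zeppelin-env.sh
spark.sql.codegen.wholeStage  false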
 
 
Best regards
Florian
