Tested with the Cassandra interpreter and it works fine.

Tested with the Spark-Cassandra connector using SparkSQL and it works fine,
but sometimes I get the following exception:


java.lang.NoSuchMethodException: org.apache.spark.io.LZ4CompressionCodec.<init>(org.apache.spark.SparkConf)
    at java.lang.Class.getConstructor0(Class.java:3082)
    at java.lang.Class.getConstructor(Class.java:1825)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:66)
    at org.apache.spark.sql.execution.SparkPlan.org$apache$spark$sql$execution$SparkPlan$$decodeUnsafeRows(SparkPlan.scala:265)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeCollect$1.apply(SparkPlan.scala:291)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeCollect$1.apply(SparkPlan.scala:290)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
    at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:290)
    at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec$$anonfun$relationFuture$1$$anonfun$apply$1.apply(BroadcastExchangeExec.scala:75)
    at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec$$anonfun$relationFuture$1$$anonfun$apply$1.apply(BroadcastExchangeExec.scala:72)
    at org.apache.spark.sql.execution.SQLExecution$.withExecutionId(SQLExecution.scala:94)
    at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec$$anonfun$relationFuture$1.apply(BroadcastExchangeExec.scala:72)
    at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec$$anonfun$relationFuture$1.apply(BroadcastExchangeExec.scala:72)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

The error is pretty random: the exact same SparkSQL query sometimes works
and sometimes throws the exception.

Not sure it's a Zeppelin issue; I really think Spark 2.0.0 itself is quite
buggy.
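
In case anyone else hits this: one workaround that might be worth trying
(a guess on my part, not verified) is to switch Spark's block compression
codec away from LZ4, so the reflective lookup of LZ4CompressionCodec that
fails in the trace above is never attempted. Something like this in
conf/spark-defaults.conf (or the equivalent property in the Zeppelin Spark
interpreter settings):

    # Untested workaround: use a different built-in codec so Spark does
    # not try to reflectively instantiate LZ4CompressionCodec
    spark.io.compression.codec snappy

"lzf" should also be a valid value if snappy is not available on your
cluster.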



On Sat, Oct 15, 2016 at 9:31 AM, Luciano Resende <luckbr1...@gmail.com>
wrote:

> +1 (non-binding)
>
> On Wed, Oct 12, 2016 at 3:28 AM, Mina Lee <mina...@apache.org> wrote:
>
> > Hi dev,
> >
> > I propose the following RC to be released for the Apache Zeppelin 0.6.2
> > release.
> >
> > The commit id is 091086de9400dd1c02ca02acf4180b1bf1e9ede7, which
> > corresponds to the tag v0.6.2-rc2:
> > https://git-wip-us.apache.org/repos/asf?p=zeppelin.git;a=commit;h=091086de9400dd1c02ca02acf4180b1bf1e9ede7
> >
> > The release archives (tgz), signature, and checksums are here
> > https://dist.apache.org/repos/dist/dev/zeppelin/zeppelin-0.6.2-rc2/
> >
> > The release candidate consists of the following source distribution
> > archive:
> > zeppelin-0.6.2.tgz
> >
> > In addition, the following supplementary binary distributions are
> > provided for user convenience at the same location:
> > zeppelin-0.6.2-bin-all.tgz
> > zeppelin-0.6.2-netinst-all.tgz
> >
> > The maven artifacts are here
> > https://repository.apache.org/content/repositories/orgapachezeppelin-1020/org/apache/zeppelin/
> >
> > You can find the KEYS file here:
> > https://dist.apache.org/repos/dist/release/zeppelin/KEYS
> >
> > Release notes available at
> > https://issues.apache.org/jira/secure/ReleaseNote.jspa?version=12336543&styleName=Html&projectId=12316221
> >
> > The vote will be open for the next 72 hours (closing at 4am Oct 15 PDT).
> >
> > [ ] +1 approve
> > [ ] 0 no opinion
> > [ ] -1 disapprove (and reason why)
> >
>
>
>
> --
> Luciano Resende
> http://twitter.com/lresende1975
> http://lresende.blogspot.com/
>
