I had tried that earlier. I tried it again and I get a different set of
error messages.

Here is my command:

spark-submit \
  --class com.dtex.analysis.transform.GenUserSummaryView \
  --jars /Applications/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar,/Applications/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar,/Applications/spark-1.2.0-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar \
  analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar \
  <arguments>

I get the following error messages. I put the datanucleus jars on the
classpath using --jars, but it does not help either way.

Thanks,

arun


Spark assembly has been built with Hive, including Datanucleus jars on classpath

Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.VolatileObjectRef.zero()Lscala/runtime/VolatileObjectRef;
	at com.dtex.analysis.transform.GenUserSummaryView$.main(GenUserSummaryView.scala)
	at com.dtex.analysis.transform.GenUserSummaryView.main(GenUserSummaryView.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
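A NoSuchMethodError on scala.runtime.VolatileObjectRef.zero() like the one above is the classic signature of a Scala binary mismatch: the prebuilt spark-1.2.0-bin-hadoop2.4 distribution is compiled against Scala 2.10, while dtex-analysis_2.11-0.1.jar (note the _2.11 suffix) is built for Scala 2.11. A minimal build.sbt sketch of a fix along those lines, assuming this hypothetical layout rather than the project's actual build file, would pin the build to the 2.10 line so the artifact matches the assembly:

```scala
// build.sbt -- minimal sketch; name/version are placeholders,
// not the real project's build definition.
name := "dtex-analysis"
version := "0.1"

// Pin to the Scala line the prebuilt Spark 1.2.0 assembly was compiled with.
scalaVersion := "2.10.4"

// "provided": the Spark assembly on the cluster supplies these classes
// at runtime, so they are excluded from the application jar.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"
```

Rebuilding with `sbt package` would then emit dtex-analysis_2.10-0.1.jar, whose _2.10 suffix matches the Scala version inside the Spark assembly.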

On Tue, Jan 13, 2015 at 10:11 AM, Sean Owen <so...@cloudera.com> wrote:

> What about running spark-submit? Really, that's the only official way
> to run a Spark app, as opposed to running the program directly.
>
> On Tue, Jan 13, 2015 at 6:08 PM, Arun Lists <lists.a...@gmail.com> wrote:
> > Yes, I am running with Scala 2.11. Here is what I see when I do "scala
> > -version"
> >
> >> scala -version
> >
> > Scala code runner version 2.11.4 -- Copyright 2002-2013, LAMP/EPFL
> >
> >
> > On Tue, Jan 13, 2015 at 2:30 AM, Sean Owen <so...@cloudera.com> wrote:
> >>
> >> It sounds like possibly a Scala version mismatch? are you sure you're
> >> running with Scala 2.11 too?
> >>
> >> On Tue, Jan 13, 2015 at 6:58 AM, Arun Lists <lists.a...@gmail.com>
> wrote:
> >> > I have a Spark application that was assembled using sbt 0.13.7,
> >> > Scala 2.11, and Spark 1.2.0. I am running on Mac OS X Yosemite.
> >> >
> >> > In build.sbt, I use "provided" for the Spark dependencies. I can run
> >> > the application fine within sbt.
> >> >
> >> > I run into problems when I try to run it from the command line. Here
> >> > is the command I use:
> >> >
> >> > ADD_JARS=analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar scala -cp
> >> > /Applications/spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar:analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar
> >> > com.dtex.analysis.transform.GenUserSummaryView ...
> >> >
> >> > I get the following error messages below. Please advise what I can do
> >> > to resolve this issue. Thanks!
> >> >
> >> > arun
> >> >
> >> > 15/01/12 22:47:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> >> >
> >> > 15/01/12 22:47:18 WARN BlockManager: Putting block broadcast_0 failed
> >> >
> >> > java.lang.NoSuchMethodError: scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object;
> >> > 	at org.apache.spark.util.collection.SizeTracker$class.takeSample(SizeTracker.scala:84)
> >> > 	at org.apache.spark.util.collection.SizeTracker$class.resetSamples(SizeTracker.scala:61)
> >> > 	at org.apache.spark.util.collection.SizeTrackingVector.resetSamples(SizeTrackingVector.scala:25)
> >> > 	at org.apache.spark.util.collection.SizeTracker$class.$init$(SizeTracker.scala:51)
> >> > 	at org.apache.spark.util.collection.SizeTrackingVector.<init>(SizeTrackingVector.scala:25)
> >> > 	at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:236)
> >> > 	at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:136)
> >> > 	at org.apache.spark.storage.MemoryStore.putIterator(MemoryStore.scala:114)
> >> > 	at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:787)
> >> > 	at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:638)
> >> > 	at org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:992)
> >> > 	at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:98)
> >> > 	at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:84)
> >> > 	at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
> >> > 	at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
> >> > 	at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
> >> > 	at org.apache.spark.SparkContext.broadcast(SparkContext.scala:945)
> >> > 	at org.apache.spark.SparkContext.hadoopFile(SparkContext.scala:695)
> >> > 	at org.apache.spark.SparkContext.textFile(SparkContext.scala:540)
> >> > 	at com.dtex.analysis.transform.TransformUtils$anonfun$2.apply(TransformUtils.scala:97)
> >> > 	at com.dtex.analysis.transform.TransformUtils$anonfun$2.apply(TransformUtils.scala:97)
> >> > 	at scala.collection.TraversableLike$anonfun$map$1.apply(TraversableLike.scala:245)
> >> > 	at scala.collection.TraversableLike$anonfun$map$1.apply(TraversableLike.scala:245)
> >> > 	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
> >> > 	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
> >> > 	at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
> >> > 	at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
> >> > 	at com.dtex.analysis.transform.TransformUtils$.generateUserSummaryData(TransformUtils.scala:97)
> >> > 	at com.dtex.analysis.transform.GenUserSummaryView$.main(GenUserSummaryView.scala:77)
> >> > 	at com.dtex.analysis.transform.GenUserSummaryView.main(GenUserSummaryView.scala)
> >> > 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> > 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >> > 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> > 	at java.lang.reflect.Method.invoke(Method.java:483)
> >> > 	at scala.reflect.internal.util.ScalaClassLoader$anonfun$run$1.apply(ScalaClassLoader.scala:70)
> >> > 	at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
> >> > 	at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:101)
> >> > 	at scala.reflect.internal.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:70)
> >> > 	at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:101)
> >> > 	at scala.tools.nsc.CommonRunner$class.run(ObjectRunner.scala:22)
> >> > 	at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:39)
> >> > 	at scala.tools.nsc.CommonRunner$class.runAndCatch(ObjectRunner.scala:29)
> >> > 	at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:39)
> >> > 	at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:65)
> >> > 	at scala.tools.nsc.MainGenericRunner.run$1(MainGenericRunner.scala:87)
> >> > 	at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:98)
> >> > 	at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:103)
> >> > 	at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
> >> >
> >
> >
>
