On Sun, Jan 5, 2014 at 6:01 AM, Aaron Davidson wrote:
That sounds like a different issue. What is the type of myrdd (i.e., if you
just type myrdd into the shell)? It's possible it's defined as an
RDD[Nothing] and thus all operations try to typecast to Nothing, which
always fails. Perhaps declaring it initially with respect to your class
would help, so the element type is explicit rather than inferred as Nothing.
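Something like this, for example (a sketch only -- I'm assuming myrdd comes
from sc.objectFile, and the path is a placeholder; adjust to however you
actually create it):

  import org.apache.spark.rdd.RDD

  // Without the type parameter, the shell infers RDD[Nothing] here;
  // pinning the element type keeps later actions from casting to Nothing.
  val myrdd: RDD[mypackage.MyClass] =
    sc.objectFile[mypackage.MyClass]("/path/to/saved/rdd")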
While myrdd.count() works, a lot of other actions and transformations still
do not work in spark-shell. E.g., myrdd.first() gives this error:
java.lang.ClassCastException: mypackage.MyClass cannot be cast to
scala.runtime.Nothing$
Also, myrdd.map(r => r) returns:
org.apache.spark.rdd.RDD[*Nothing*]
Sorry, I had a typo. I can confirm that using ADD_JARS together with
SPARK_CLASSPATH works as expected in spark-shell.
It'd make sense to have the two combined as one option.
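For reference, the invocation that works (paths are placeholders; the typo
was presumably the missing space before ./spark-shell in the command quoted
further down the thread):

  MASTER=local[2] ADD_JARS=/path/to/my/jar SPARK_CLASSPATH=/path/to/my/jar ./spark-shell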
On Sun, Jan 5, 2014 at 3:51 AM, Aaron Davidson wrote:
Cool. To confirm, you said you can access the class and construct new
objects -- did you do this in the shell itself (i.e., on the driver), or on
the executors?
Specifically, one of the following two should fail in the shell:

  new mypackage.MyClass()
  sc.parallelize(0 until 10, 2).foreach(_ => new mypackage.MyClass())
On Sun, Jan 5, 2014 at 2:28 AM, Aaron Davidson wrote:
> Additionally, which version of Spark are you running?
0.8.1.
Unfortunately, this doesn't work either:
MASTER=local[2] ADD_JARS=/path/to/my/jar SPARK_CLASSPATH=/path/to/my/jar./spark-shell
Actually, I think adding it to SPARK_CLASSPATH is exactly right. The
exception is not on the executors, but in the driver -- it's happening when
the driver tries to read results that the executor is sending back to it.
So the executors know about mypackage.MyClass; they happily run and send
their results back, and it's the driver that fails when reading them.
Additionally, which version of Spark are you running?
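That also lines up with which operations fail (a sketch; myrdd stands for
the RDD from the thread, whatever its actual source):

  // Only a primitive comes back, or nothing at all, so the driver
  // never has to deserialize mypackage.MyClass:
  myrdd.count()              // works: only a Long is returned
  myrdd.foreach(_ => ())     // works: runs entirely on the executors

  // These ship elements back, so the driver must deserialize them:
  myrdd.first()              // ClassCastException in the driver
  myrdd.collect()            // same failure, for the whole result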
On Sat, Jan 4, 2014 at 6:27 PM, Aaron Davidson wrote:
I am not an expert on these classpath issues, but if you're using local
mode, you might also try to set SPARK_CLASSPATH to include the path to the
jar file as well. This should not really help, since "adding jars" is the
right way to get the jars to your executors (which is where the exception
appears to be coming from).
I should add that I can see in the log that the jar is being shipped to the
workers:
14/01/04 15:34:52 INFO Executor: Fetching
http://192.168.1.111:51031/jars/my.jar.jar with timestamp 131979092
14/01/04 15:34:52 INFO Utils: Fetching
http://192.168.1.111:51031/jars/my.jar.jar to
/var/folders/3g/j
Hi,
I'm trying to access my standalone Spark app from spark-shell. I tried
starting the shell with:
MASTER=local[2] ADD_JARS=/path/to/my/jar ./spark-shell
The log shows that the jar file was loaded. Also, I can access and create a
new instance of mypackage.MyClass.
The problem is that myRDD.collect() fails with a ClassCastException.
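Concretely, the failing session looks like this (myRDD's definition isn't
shown anywhere in the thread, so it's elided here; the exception is the one
reported for first() further up, presumably the same for collect()):

  scala> new mypackage.MyClass()   // fine: the class is visible in the shell
  scala> myRDD.collect()
  java.lang.ClassCastException: mypackage.MyClass cannot be cast to scala.runtime.Nothing$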