Hello,

On Mon, Nov 24, 2014 at 12:07 PM, aecc <alessandroa...@gmail.com> wrote:
> This is the stacktrace:
>
> org.apache.spark.SparkException: Job aborted due to stage failure: Task not
> serializable: java.io.NotSerializableException: $iwC$$iwC$$iwC$$iwC$AAA
>         - field (class "$iwC$$iwC$$iwC$$iwC", name: "aaa", type: "class
> $iwC$$iwC$$iwC$$iwC$AAA")

Ah. Looks to me like you're trying to run this in spark-shell, right?

I'm not 100% sure of how it works internally, but the Scala repl
works a little differently from regular Scala code in this regard.
Each line you type in the shell gets compiled into a nested wrapper
class (those $iwC classes in your stack trace), so a "val" declared
in the shell ends up as an instance variable of the wrapper, whereas
a "val" inside a method in a compiled class is a plain local
variable. When a closure references an instance variable, it has to
capture the enclosing instance too, so Spark tries to serialize the
whole wrapper and everything it holds. That's probably why you're
running into this.
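
Roughly, the difference looks like this (a simplified sketch, not
what the repl literally generates - AAA and transform are stand-ins
for your class):

    import org.apache.spark.SparkContext

    // Stand-in for your AAA class; assumed Serializable for the
    // "working" case below.
    class AAA extends Serializable {
      def transform(x: Int): Int = x + 1
    }

    // Rough stand-in for the shell's $iwC wrappers: a top-level "val"
    // typed into the repl ends up as a field on an instance like this.
    class ReplWrapper(sc: SparkContext) {
      val aaa = new AAA

      def broken(): Array[Int] =
        // "aaa" is really "this.aaa", so the closure captures "this" -
        // the whole non-serializable wrapper - and the task fails.
        sc.parallelize(1 to 10).map(x => aaa.transform(x)).collect()

      def working(): Array[Int] = {
        // Copying into a local val first means the closure captures
        // only the AAA instance, which serializes fine.
        val localAaa = aaa
        sc.parallelize(1 to 10).map(x => localAaa.transform(x)).collect()
      }
    }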

Try compiling your code and running it outside the shell to see how
it goes. I'm not sure whether there's a workaround for this when
trying things out in the shell - maybe declaring an `object` to hold
your constants, something like the sketch below? Never really tried,
so YMMV.
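
Along these lines (completely untested; Constants and threshold are
made-up names, just to show the shape of it):

    // Keep the constants in a serializable object instead of
    // top-level shell vals.
    object Constants extends Serializable {
      val threshold = 42
    }

    // The closure references the object member rather than a field
    // on the repl wrapper:
    sc.parallelize(1 to 100).filter(_ > Constants.threshold).collect()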

-- 
Marcelo
