Yes, I'm running this in the shell. In my compiled jar it works perfectly;
the issue is that I need to do this in the shell.

Any available workarounds?

I checked SQLContext; they use it the same way I would like to use my
class: they make the class Serializable and mark the context field
@transient. Does this affect the data pipeline in any way? I mean, will I
get performance issues because the class now gets serialized for some
reason that I still don't understand?
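To make the question concrete, here is a minimal, self-contained sketch of that pattern. `HeavyContext` is a hypothetical stand-in for a non-serializable resource like SparkContext (so it runs outside Spark); the point is that a `@transient` field is skipped during serialization, so making the class Serializable adds essentially nothing to what gets shipped.

```scala
import java.io._

// Hypothetical stand-in for a non-serializable resource like SparkContext.
class HeavyContext { def name: String = "ctx" }

// The SQLContext-style pattern: the class is Serializable, but the
// context field is @transient, so serialization skips it entirely.
class AAA(@transient val ctx: HeavyContext) extends Serializable {
  val constant = 42
}

object TransientDemo {
  // Round-trip an AAA through Java serialization, the mechanism Spark
  // uses when it ships closures to executors.
  def roundTrip(a: AAA): AAA = {
    val buf = new ByteArrayOutputStream()
    new ObjectOutputStream(buf).writeObject(a)
    new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
      .readObject().asInstanceOf[AAA]
  }

  def main(args: Array[String]): Unit = {
    val copy = roundTrip(new AAA(new HeavyContext))
    println(copy.constant)    // the ordinary field survives: 42
    println(copy.ctx == null) // the @transient field does not: true
  }
}
```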


2014-11-24 22:33 GMT+01:00 Marcelo Vanzin [via Apache Spark User List] <
ml-node+s1001560n19687...@n3.nabble.com>:

> Hello,
>
> On Mon, Nov 24, 2014 at 12:07 PM, aecc wrote:
> > This is the stacktrace:
> >
> > org.apache.spark.SparkException: Job aborted due to stage failure: Task
> not
> > serializable: java.io.NotSerializableException: $iwC$$iwC$$iwC$$iwC$AAA
> >         - field (class "$iwC$$iwC$$iwC$$iwC", name: "aaa", type: "class
> > $iwC$$iwC$$iwC$$iwC$AAA")
>
> Ah. Looks to me that you're trying to run this in spark-shell, right?
>
> I'm not 100% sure of how it works internally, but I think the Scala
> repl works a little differently than regular Scala code in this
> regard. When you declare a "val" in the shell it will behave
> differently than a "val" inside a method in a compiled Scala class -
> the former will behave like an instance variable, the latter like a
> local variable. So, this is probably why you're running into this.
>
> Try compiling your code and running it outside the shell to see how it
> goes. I'm not sure whether there's a workaround for this when trying
> things out in the shell - maybe declare an `object` to hold your
> constants? Never really tried, so YMMV.
>
> --
> Marcelo
>
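For what it's worth, a rough sketch of the `object` workaround Marcelo mentions, with hypothetical names and a plain List standing in for an RDD (this only illustrates the shape; whether it avoids the REPL wrapper capture would need to be verified in the shell):

```scala
// Constants live in a standalone object rather than a shell-level val,
// so a closure reading them does not capture the REPL's $iwC wrapper.
object Constants extends Serializable {
  val threshold = 10
}

object WorkaroundDemo {
  def main(args: Array[String]): Unit = {
    // With Spark this would be rdd.filter(...); a plain List stands in here.
    val kept = List(5, 15, 25).filter(_ > Constants.threshold)
    println(kept) // List(15, 25)
  }
}
```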



-- 
Alessandro Chacón
Aecc_ORG




Sent from the Apache Spark User List mailing list archive at Nabble.com.
