Marcelo Vanzin wrote:
> Do you expect to be able to use the spark context on the remote task?

Not at all; what I want to create is a wrapper around the SparkContext, to be
used only on the driver node.
I would like this "AAA" wrapper to hold several attributes, such as the
SparkContext and other configuration for my project.
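
Concretely, something like this is what I have in mind (only a rough sketch;
the names are made up):

    import org.apache.spark.SparkContext

    // Rough sketch of the "AAA" wrapper: it just holds the SparkContext plus
    // some project configuration, and is only ever used on the driver.
    class AAA(val sc: SparkContext, val config: Map[String, String])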

I tested using -Dsun.io.serialization.extendedDebugInfo=true

This is the stack trace:

org.apache.spark.SparkException: Job aborted due to stage failure: Task not serializable: java.io.NotSerializableException: $iwC$$iwC$$iwC$$iwC$AAA
        - field (class "$iwC$$iwC$$iwC$$iwC", name: "aaa", type: "class $iwC$$iwC$$iwC$$iwC$AAA")
        - object (class "$iwC$$iwC$$iwC$$iwC", $iwC$$iwC$$iwC$$iwC@24e57dcb)
        - field (class "$iwC$$iwC$$iwC", name: "$iw", type: "class $iwC$$iwC$$iwC$$iwC")
        - object (class "$iwC$$iwC$$iwC", $iwC$$iwC$$iwC@178cc62b)
        - field (class "$iwC$$iwC", name: "$iw", type: "class $iwC$$iwC$$iwC")
        - object (class "$iwC$$iwC", $iwC$$iwC@1e9f5eeb)
        - field (class "$iwC", name: "$iw", type: "class $iwC$$iwC")
        - object (class "$iwC", $iwC@37d8e87e)
        - field (class "$line18.$read", name: "$iw", type: "class $iwC")
        - object (class "$line18.$read", $line18.$read@124551f)
        - field (class "$iwC$$iwC$$iwC", name: "$VAL15", type: "class $line18.$read")
        - object (class "$iwC$$iwC$$iwC", $iwC$$iwC$$iwC@2e846e6b)
        - field (class "$iwC$$iwC$$iwC$$iwC", name: "$outer", type: "class $iwC$$iwC$$iwC")
        - object (class "$iwC$$iwC$$iwC$$iwC", $iwC$$iwC$$iwC$$iwC@4b31ba1b)
        - field (class "$iwC$$iwC$$iwC$$iwC$$anonfun$1", name: "$outer", type: "class $iwC$$iwC$$iwC$$iwC")
        - object (class "$iwC$$iwC$$iwC$$iwC$$anonfun$1", <function1>)
        - field (class "org.apache.spark.rdd.FilteredRDD", name: "f", type: "interface scala.Function1")
        - root object (class "org.apache.spark.rdd.FilteredRDD", FilteredRDD[3] at filter at <console>:20)
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1033)
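
For reference, the spark-shell session looks roughly like this (only a sketch;
the RDD and the filter predicate are made up, but the shape matches what the
trace reports):

    val aaa = new AAA(sc, Map("some.key" -> "some.value"))  // hypothetical config

    val rdd = sc.parallelize(1 to 100)
    // The filter closure refers to aaa, so Spark tries to serialize the REPL
    // line object that holds it, and AAA itself is not serializable.
    val filtered = rdd.filter(x => x > aaa.config.size)
    filtered.count()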

I don't really understand this stack trace, so any help you can give would be
appreciated.

Marking the field @transient didn't work either.
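
What I tried with transient is roughly this (again just a sketch, assuming the
annotation on the wrapper's SparkContext field is what is meant):

    import org.apache.spark.SparkContext

    // Marking the SparkContext field @transient so it is skipped when the
    // wrapper is serialized (assumed attempt; it did not make the error go away).
    class AAA(@transient val sc: SparkContext, val config: Map[String, String])
      extends Serializable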

Thanks a lot


