You are most likely referencing the session object (Tomcat's
StandardSessionFacade, which is not serializable) inside your map
function, so Spark tries to serialize it along with the closure. A quick
fix is to copy the values you need from the session into local,
serializable variables before the map, so the closure captures only
those values instead of the session itself.
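
A minimal sketch of the idea, without Spark. The SessionFacade class below is a hypothetical stand-in for Tomcat's StandardSessionFacade; the point is that a serializable lambda capturing a non-serializable object fails to serialize, while one capturing a plain String extracted beforehand succeeds:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Function;

public class ClosureCaptureDemo {

    // Stand-in for Tomcat's StandardSessionFacade: NOT Serializable.
    static class SessionFacade {
        String getUser() { return "alice"; }
    }

    // Spark requires its functions to be Serializable; mimic that here.
    interface SerializableFunction<T, R> extends Function<T, R>, Serializable {}

    static void serialize(Object o) throws IOException {
        try (ObjectOutputStream oos =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(o);
        }
    }

    public static void main(String[] args) throws Exception {
        SessionFacade session = new SessionFacade();

        // BAD: the closure captures the non-serializable session object.
        SerializableFunction<String, String> bad =
            row -> session.getUser() + ":" + row;
        try {
            serialize(bad);
            System.out.println("bad: serialized");
        } catch (NotSerializableException e) {
            System.out.println("bad: NotSerializableException");
        }

        // GOOD: copy what you need into a local, serializable value first,
        // so only the String is captured by the closure.
        final String user = session.getUser();
        SerializableFunction<String, String> good =
            row -> user + ":" + row;
        serialize(good);
        System.out.println("good: serialized");
    }
}
```

The same pattern applies to the real map call: read whatever you need out of the HttpSession into final local variables before building the JavaSchemaRDD map function.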

Thanks
Best Regards

On Sat, Nov 22, 2014 at 10:02 PM, vdiwakar.malladi <
vdiwakar.mall...@gmail.com> wrote:

> Hi
>
> I'm trying to load a Parquet file for querying from my web
> application. I was able to load it as a JavaSchemaRDD, but when I call
> a map function on the JavaSchemaRDD, I get the following exception.
>
> The class in which I'm using this code implements Serializable. Could
> anyone tell me the cause?
>
>
> org.apache.spark.SparkException: Task not serializable
>
> Caused by: org.apache.spark.SparkException: Task not serializable
>         at
>
> org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:166)
>         at
> org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:158)
>         at org.apache.spark.SparkContext.clean(SparkContext.scala:1242)
>         at org.apache.spark.rdd.RDD.map(RDD.scala:270)
>         at
> org.apache.spark.api.java.JavaRDDLike$class.map(JavaRDDLike.scala:75)
>         at
> org.apache.spark.sql.api.java.JavaSchemaRDD.map(JavaSchemaRDD.scala:42)
> ... 35 more
>
> Caused by: java.io.NotSerializableException:
> org.apache.catalina.session.StandardSessionFacade
>         at java.io.ObjectOutputStream.writeObject0(Unknown Source)
>         at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
>         at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
>         at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
>         at java.io.ObjectOutputStream.writeObject0(Unknown Source)
>
>
> Thanks in advance.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Getting-exception-on-JavaSchemaRDD-org-apache-spark-SparkException-Task-not-serializable-tp19558.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
