Re: Serialization issue with Spark

2016-03-25 Thread manasdebashiskar
You have not mentioned which task is not serializable.
Including the stack trace is usually a good idea when asking this kind of
question.

Usually Spark will tell you which class it cannot serialize.
If it is one of your own classes, try making it serializable, or mark the
offending field transient so that it only gets created on the executor.
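
For example, a minimal sketch of the transient approach (DatabaseClient here
is a made-up stand-in for any non-serializable dependency):

  public class EventProcessor implements java.io.Serializable {
      // Hypothetical non-serializable dependency (connection pool,
      // HTTP client, etc.).
      static class DatabaseClient {
          String lookup(String key) { return "value-for-" + key; }
      }

      // transient: excluded from serialization; rebuilt lazily on
      // whichever executor deserializes this object.
      private transient DatabaseClient client;

      private DatabaseClient client() {
          if (client == null) {
              client = new DatabaseClient(); // created on the executor
          }
          return client;
      }

      public String process(String record) {
          return client().lookup(record);
      }
  }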

...Manas






Re: Serialization issue with Spark

2016-03-23 Thread Dirceu Semighini Filho
Hello Hafsa,
The 'Task not serializable' exception usually means that you are trying to use
an object, defined in the driver, in code that runs on the workers.
Can you post the code that is generating this error here, so we can better
advise you?
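
A rough illustration of the pattern (all names invented for the example):

  import java.util.Arrays;
  import org.apache.spark.api.java.JavaRDD;
  import org.apache.spark.api.java.JavaSparkContext;

  public class ClosureExample {
      // Hypothetical helper that does not implement Serializable.
      static class Helper {
          String transform(String s) { return s.toUpperCase(); }
      }

      static JavaRDD<String> run(JavaSparkContext sc) {
          JavaRDD<String> lines = sc.parallelize(Arrays.asList("a", "b"));

          Helper helper = new Helper(); // defined on the driver

          // Throws 'Task not serializable': the lambda captures 'helper',
          // which Spark must serialize to ship the task to the workers.
          // JavaRDD<String> broken = lines.map(s -> helper.transform(s));

          // Works: the helper is constructed inside the closure, on the
          // worker, so nothing non-serializable is captured.
          return lines.map(s -> new Helper().transform(s));
      }
  }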

Cheers.

2016-03-23 14:14 GMT-03:00 Hafsa Asif <hafsa.a...@matchinguu.com>:

> Can anyone please help me in this issue?


Re: Serialization issue with Spark

2016-03-23 Thread Hafsa Asif
Can anyone please help me in this issue?






Re: Serialization issue with Spark

2016-03-22 Thread Ted Yu
Can you show a code snippet and the exception for 'Task not serializable'?

Please see the related JIRA:
  SPARK-10251
whose pull request contains code for registering classes with Kryo.
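
For reference, registration looks roughly like this (MyClass is a
placeholder for your own class):

  SparkConf conf = new SparkConf()
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // Register the classes Kryo will serialize, so it writes compact
      // identifiers instead of full class names.
      .registerKryoClasses(new Class<?>[] { MyClass.class });

Note that spark.serializer applies to data serialization; closures are still
serialized with Java serialization, so Kryo alone will not make 'Task not
serializable' go away.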

Cheers

On Tue, Mar 22, 2016 at 7:00 AM, Hafsa Asif <hafsa.a...@matchinguu.com>
wrote:

> [...]


Serialization issue with Spark

2016-03-22 Thread Hafsa Asif
Hello,
I am facing a serialization issue in Spark (1.4.1 - Java client) with the
Spring Framework. It is known that Spark needs serialization and requires
every class to implement java.io.Serializable. But the documentation
(http://spark.apache.org/docs/latest/tuning.html) mentions that this is not a
good approach and that it is better to use Kryo.
I am using Kryo in the Spark configuration like this:
  public @Bean DeepSparkContext sparkContext() {
      DeepSparkConfig conf = new DeepSparkConfig();
      conf.setAppName(this.environment.getProperty("APP_NAME"))
          .setMaster(master)
          .set("spark.executor.memory",
               this.environment.getProperty("SPARK_EXECUTOR_MEMORY"))
          .set("spark.cores.max",
               this.environment.getProperty("SPARK_CORES_MAX"))
          .set("spark.default.parallelism",
               this.environment.getProperty("SPARK_DEFAULT_PARALLELISM"));
      conf.set("spark.serializer",
               "org.apache.spark.serializer.KryoSerializer");
      return new DeepSparkContext(conf);
  }

but I am still getting the 'Task not serializable' exception in Spark. I also
do not want to make the Spark context 'static'.



