From: Ankur Srivastava
To: user@spark.apache.org
Subject: Re: org.apache.spark.SparkException: Task not serializable
For Scala, make your class Serializable, like this:
```
class YourClass extends Serializable {
}
```
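As an aside (not from the original thread): Spark ships task closures to executors with plain Java serialization, so the requirement can be demonstrated with the JDK alone. The class below is a hypothetical stand-in for the poster's class, sketched in Java rather than Scala:

```java
import java.io.*;

public class SerializableDemo {
    // Hypothetical stand-in for the poster's class. Because it implements
    // Serializable, Java serialization (the mechanism Spark uses to ship
    // tasks) can write it out; remove "implements Serializable" and
    // writeObject throws NotSerializableException instead.
    static class YourClass implements Serializable {
        final int id;
        YourClass(int id) { this.id = id; }
    }

    // Returns true if the object can be written by Java serialization.
    static boolean serializes(Object o) {
        try {
            new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(o);
            return true;
        } catch (IOException e) {  // NotSerializableException is an IOException
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(serializes(new YourClass(42)));  // true
        System.out.println(serializes(new Object()));       // false: not Serializable
    }
}
```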
On Sat, Mar 11, 2017 at 3:51 PM, 萝卜丝炒饭 <1427357...@qq.com> wrote:
> ...s idea.
>
> thanks
> Robin
>
> ---Original---
> *From:* "Mina Aslani" <aslanim...@gmail.com>
> *Date:* 2017/3/7 05:32:10
> *To:* "Ankur Srivastava" <ankur.srivast...@gmail.com>
> *Cc:* "user@spark.apache.org" <user@spark.apache.org>
> *Subject:* Re: org.apache.spark.SparkException: Task not serializable
Thank you Ankur for the quick response, really appreciate it! Making the class
serializable resolved the exception!

Best regards,
Mina
On Mon, Mar 6, 2017 at 4:20 PM, Ankur Srivastava wrote:
Hi,
I am trying to start with Spark and get the number of lines of a text file
on my mac, however I get

org.apache.spark.SparkException: Task not serializable

on

JavaRDD<String> logData = javaCtx.textFile(file);

Please see below for the sample of code and the stackTrace.
Any idea why this error occurs?
      }
    }
  }
  res.iterator
}

val cdd = lines.filter(testFun).mapPartitions(x => A(x, p_date_bc, p_ORA_bc))
// org.apache.spark.SparkException: Task not serializable

2. The other one is ok:

import org.apache.spark._
import SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.broadcast._

val ssc = new
Could anyone let me know the cause?

org.apache.spark.SparkException: Task not serializable
Caused by: org.apache.spark.SparkException: Task not serializable
        at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:166)
        at org.apache.spark.util.ClosureCleaner
Thanks for your prompt response.
I'm not using anything in my map function. Please see the code below. For
sample purposes, I would like to use 'select * from
'.
This code worked for me in standalone mode. But when I integrated it with my
web application, it throws the specified exception.
You are declaring an anonymous inner class here. It has a reference to the
containing class even if you don't use it. If the closure cleaner can't
determine it isn't used, this reference will cause everything in the outer
class to serialize. Try rewriting this as a named static inner class.
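The difference described above can be reproduced with plain JDK serialization, outside Spark. The classes below are illustrative stand-ins (not from the thread): `Outer` plays the role of the non-serializable enclosing class, such as a web controller, and `SerFn` stands in for Spark's serializable function interfaces:

```java
import java.io.*;

public class CaptureDemo {
    // A serializable function interface, standing in for Spark's Function.
    interface SerFn extends Serializable { int apply(int x); }

    // Hypothetical non-serializable outer class.
    static class Outer {
        int offset = 10;

        // Anonymous inner class: reading `offset` forces it to keep a hidden
        // reference to Outer.this, so serializing it drags Outer along and
        // fails with NotSerializableException.
        SerFn anonymous() {
            return new SerFn() {
                @Override public int apply(int x) { return x + offset; }
            };
        }
    }

    // Named static nested class: no hidden outer reference, so it serializes.
    static class AddTen implements SerFn {
        @Override public int apply(int x) { return x + 10; }
    }

    static boolean serializes(Object o) {
        try {
            new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(o);
            return true;
        } catch (IOException e) {  // NotSerializableException is an IOException
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(serializes(new Outer().anonymous())); // false
        System.out.println(serializes(new AddTen()));            // true
    }
}
```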
Thanks.
After writing it as a static inner class, that exception is no longer coming.
But now I am getting a snappy-related exception. I can see the corresponding
dependency in the spark assembly jar, yet I am still getting the exception.
Any quick suggestions on this?
Here is the stack trace.
Hi,
Probably I am missing a very simple principle, but something is wrong with
my filter:
I get an org.apache.spark.SparkException: Task not serializable exception.
Here is my filter function:

object OBJ {
  def f1(): Boolean = {
    var i = 1
    for (j <- 1 to 10) i = i + 1
    true
  }
}

rdd.filter(row => OBJ.f1())

And when I run it, I get the following exception