task not serializable on simple operations

2017-10-16 Thread Imran Rajjad
Is there a way to avoid implementing a separate Java class that implements the Serializable interface even for small, petty arithmetic operations? Below is code from the simple decision tree example: Double testMSE = predictionAndLabel.map(new Function<Tuple2<Double, Double>, Double>() { @Override
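With Java 8+, a serializable functional interface can stand in for the anonymous Function subclass, so a one-off arithmetic closure does not need its own named class. The sketch below is plain Java with no Spark dependency; the SerFunction interface and the sample data are illustrative stand-ins for Spark's org.apache.spark.api.java.function.Function and an RDD of (prediction, label) pairs:

```java
import java.io.Serializable;
import java.util.Arrays;
import java.util.List;

public class LambdaMseDemo {
    // Analogous to Spark's Function interface: a functional interface that
    // extends Serializable, so a lambda implementing it can be serialized as
    // long as it captures only serializable values.
    interface SerFunction<T, R> extends Serializable { R apply(T t); }

    // Mean squared error over (prediction, label) pairs, written as a lambda
    // instead of an anonymous Function subclass.
    static double meanSquaredError(List<double[]> predictionAndLabel) {
        SerFunction<double[], Double> squaredError =
            pair -> Math.pow(pair[0] - pair[1], 2);
        return predictionAndLabel.stream()
                .mapToDouble(squaredError::apply)
                .average()
                .orElse(0.0);
    }

    public static void main(String[] args) {
        List<double[]> data =
            Arrays.asList(new double[]{1.0, 2.0}, new double[]{3.0, 3.0});
        System.out.println(meanSquaredError(data)); // 0.5
    }
}
```

In Spark's Java API the same lambda can be passed directly to map(), since the Function interfaces there are themselves functional and Serializable.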

RE: Spark sql with Zeppelin, Task not serializable error when I try to cache the spark sql table

2017-05-31 Thread Mahesh Sawaiker
not serializable error when I try to cache the spark sql table Hello all, I am using Zeppelin 0.7.1 with Spark 2.1.0 I am getting org.apache.spark.SparkException: Task not serializable error when I try to cache the spark sql table. I am using a UDF on a column of table and want to cache the resultant table

Spark sql with Zeppelin, Task not serializable error when I try to cache the spark sql table

2017-05-31 Thread shyla deshpande
Hello all, I am using Zeppelin 0.7.1 with Spark 2.1.0 I am getting org.apache.spark.SparkException: Task not serializable error when I try to cache the spark sql table. I am using a UDF on a column of table and want to cache the resultant table . I can execute the paragraph successfully when

Re: org.apache.spark.SparkException: Task not serializable

2017-03-13 Thread Yong Zhang
Ankur Srivastava; user@spark.apache.org Subject: Re: org.apache.spark.SparkException: Task not serializable For scala, make your class Serializable, like this ``` class YourClass extends Serializable { } ``` On Sat, Mar 11, 2017 at 3:51 PM, 萝卜丝炒饭 <1427357...@qq.com<mailto:1427357...@qq.com>>
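The suggested fix can be checked outside Spark: by default Spark ships tasks with plain Java serialization, so a class that round-trips through ObjectOutputStream/ObjectInputStream will also pass Spark's closure check. A minimal sketch in Java (class and field names are illustrative):

```java
import java.io.*;

public class SerializableClassDemo {
    // Mirrors the advice in the thread: mark the class Serializable so the
    // default Java serializer can ship it inside a task closure.
    static class YourClass implements Serializable {
        final int value;
        YourClass(int value) { this.value = value; }
    }

    // Serialize and deserialize, as Spark does when sending a task to executors.
    static int roundTripValue(int v) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                oos.writeObject(new YourClass(v));
            }
            try (ObjectInputStream ois =
                     new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
                return ((YourClass) ois.readObject()).value;
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(roundTripValue(7)); // 7
    }
}
```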

Re: org.apache.spark.SparkException: Task not serializable

2017-03-11 Thread Yan Facai
s idea. > > thanks > Robin > > ---Original--- > *From:* "Mina Aslani"<aslanim...@gmail.com> > *Date:* 2017/3/7 05:32:10 > *To:* "Ankur Srivastava"<ankur.srivast...@gmail.com>; > *Cc:* "user@spark.apache.org"<user@spark.apache.org>

Re: org.apache.spark.SparkException: Task not serializable

2017-03-10 Thread 萝卜丝炒饭
Cc: "user@spark.apache.org"<user@spark.apache.org>; Subject: Re: org.apache.spark.SparkException: Task not serializable Thank you Ankur for the quick response, really appreciate it! Making the class serializable resolved the exception! Best regards,Mina On Mon, Mar 6, 2017 at 4:20 PM, Anku

Re: org.apache.spark.SparkException: Task not serializable

2017-03-06 Thread Mina Aslani
am trying to start with spark and get number of lines of a text file in my >> mac, however I get >> >> org.apache.spark.SparkException: Task not serializable error on >> >> JavaRDD<String> logData = javaCtx.textFile(file); >> >> Please see below for the s

Re: org.apache.spark.SparkException: Task not serializable

2017-03-06 Thread Ankur Srivastava
> I am trying to start with spark and get number of lines of a text file in my > mac, however I get > > org.apache.spark.SparkException: Task not serializable error on > > JavaRDD<String> logData = javaCtx.textFile(file); > > Please see below for the sample of code and the sta

org.apache.spark.SparkException: Task not serializable

2017-03-06 Thread Mina Aslani
Hi, I am trying to start with spark and get number of lines of a text file in my mac, however I get org.apache.spark.SparkException: Task not serializable error on JavaRDD<String> logData = javaCtx.textFile(file); Please see below for the sample of code and the stackTrace. Any idea why this error

task not serializable in case of groupByKey() + mapGroups + map?

2016-10-31 Thread Yang
map(xx=>{ val simpley = yyy.value 1 }) I'm seeing error: org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:298) at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureClean

Re: Spark 2.0 Structured Streaming: sc.parallelize in foreach sink cause Task not serializable error

2016-09-26 Thread Michael Armbrust
"playground","sstest") > println(v_str(0),v_str(1),v_str(2),v_str(3))} > override def close(errorOrNull: Throwable) = () > } > > val query = > line_count.writeStream.outputMode("complete").foreach(writer).start() > > query.

Spark 2.0 Structured Streaming: sc.parallelize in foreach sink cause Task not serializable error

2016-09-25 Thread Jianshi
problem? Or is there another way to save the result using foreach sink? Thanks very much. Best, Jianshi -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-2-0-Structured-Streaming-sc-parallelize-in-foreach-sink-cause-Task-not-serializable-error-tp27791.html Sent from the Apache Spark User List mailing list archive at Nabble.com. - To unsubscribe e-mail: user-unsubscr...@spark.apache.org

Re: Task not serializable: java.io.NotSerializableException: org.json4s.Serialization$$anon$1

2016-07-19 Thread RK Aduri
Did you check this: case class Example(name : String, age ; Int) has a semicolon; it should have been (age : Int). -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Task-not-serializable-java-io-NotSerializableException-org-json4s-Serialization-anon-1

Re: Task not serializable: java.io.NotSerializableException: org.json4s.Serialization$$anon$1

2016-07-19 Thread joshuata
r-list.1001560.n3.nabble.com/Task-not-serializable-java-io-NotSerializableException-org-json4s-Serialization-anon-1-tp8233p27359.html Sent from the Apache Spark User List mailing list archive at Nabble.com. - To unsubscribe e-mail

Spark Task not serializable with lag Window function

2016-05-18 Thread luca_guerra
I've noticed that after I use a Window function over a DataFrame if I call a map() with a function, Spark returns a "Task not serializable" Exception This is my code: val hc = new org.apache.spark.sql.hive.HiveContext(sc) import hc.implicits._ import org.apache.spark.sql.expressions.Win

Re: Renaming sc variable in sparkcontext throws task not serializable

2016-03-02 Thread Prashant Sharma
issue when using the spark-shell and >>> zeppelin. >>> If we assign the sparkcontext variable (sc) to a new variable and >>> reference >>> another variable in an RDD lambda expression we get a task not >>> serializable exception. >>> >>>

Re: Renaming sc variable in sparkcontext throws task not serializable

2016-03-02 Thread Jeff Zhang
n the sparkcontext variable (sc) to a new variable and reference > another variable in an RDD lambda expression we get a task not > serializable exception. > > The following three lines of code illustrate this : > > val temp = 10 > val newSC = sc > val newRDD = newSC.parallel

Renaming sc variable in sparkcontext throws task not serializable

2016-03-02 Thread Rahul Palamuttam
Hi All, We recently came across this issue when using the spark-shell and zeppelin. If we assign the sparkcontext variable (sc) to a new variable and reference another variable in an RDD lambda expression we get a task not serializable exception. The following three lines of code illustrate

Re: Troubleshooting "Task not serializable" in Spark/Scala environments

2015-09-22 Thread Huy Banh
ile) >>> // header >>> val header = logData.first >>> // filter out header >>> val sample = logData.filter(!_.contains(header)).map { >>> line => line.replaceAll("['\"]","").substring(0,line.length()-1) >>>

Troubleshooting "Task not serializable" in Spark/Scala environments

2015-09-21 Thread Balaji Vijayan
val sample = logData.filter(!_.contains(header)).map { line => line.replaceAll("['\"]","").substring(0,line.length()-1) }.takeSample(false,100,12L) Stack Trace: org.apache.spark.SparkException: Task not serializable org.apache.spark.util.ClosureCleaner$.ensur

Re: Troubleshooting "Task not serializable" in Spark/Scala environments

2015-09-21 Thread Alexis Gillain
ogData.filter(!_.contains(header)).map { >> line => line.replaceAll("['\"]","").substring(0,line.length()-1) >> }.takeSample(false,100,12L) >> >> Stack Trace: >> >> org.apache.spark.SparkException: Task not seri

Re: Troubleshooting "Task not serializable" in Spark/Scala environments

2015-09-21 Thread Ted Yu
le = logData.filter(!_.contains(header)).map { > line => line.replaceAll("['\"]","").substring(0,line.length()-1) > }.takeSample(false,100,12L) > > Stack Trace: > > org.apache.spark.SparkException: Task not serializable > > org.apache.spark.util.C

Re: Troubleshooting "Task not serializable" in Spark/Scala environments

2015-09-21 Thread Igor Berman
"['\"]","").substring(0,line.length()-1) > }.takeSample(false,100,12L) > > Stack Trace: > > org.apache.spark.SparkException: Task not serializable > > org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureClea

Re: Job aborted due to stage failure: Task not serializable:

2015-07-16 Thread Akhil Das
am using the below code with the Kryo serializer. When I run this code I get this error: Task not serializable in the commented line. 2) How are broadcast variables treated in executors? Are they local variables, or can they be used in any function defined as global variables? object StreamingLogInput

Job aborted due to stage failure: Task not serializable:

2015-07-15 Thread Naveen Dabas
I am using the below code with the Kryo serializer. When I run this code I get this error: Task not serializable in the commented line. 2) How are broadcast variables treated in executors? Are they local variables, or can they be used in any function defined as global variables? object

Spark stream test throw org.apache.spark.SparkException: Task not serializable when execute in spark shell

2015-06-24 Thread yuemeng (A)
hi all, here are two examples: one throws Task not serializable when executed in spark shell, the other one is ok. I am very puzzled; can anyone explain what's different about these two pieces of code and why the other is ok? 1. The one which throws Task not serializable: import org.apache.spark._ import

Re: Spark stream test throw org.apache.spark.SparkException: Task not serializable when execute in spark shell

2015-06-24 Thread Yana Kadiyska
I can't tell immediately, but you might be able to get more info with the hint provided here: http://stackoverflow.com/questions/27980781/spark-task-not-serializable-with-simple-accumulator (short version, set -Dsun.io.serialization.extendedDebugInfo=true) Also, unless you're simplifying your
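The mechanism behind that hint can be reproduced without Spark: Java serialization fails on the first non-serializable object it reaches and names it in the NotSerializableException, which is the same class name Spark surfaces (and -Dsun.io.serialization.extendedDebugInfo=true adds the full reference chain). A minimal sketch, with Heavy as an illustrative stand-in for any non-serializable field such as a SparkContext:

```java
import java.io.*;

public class NotSerializableDemo {
    // Not Serializable, standing in for SparkContext, a connection pool, etc.
    static class Heavy { }

    // The task itself is Serializable, but its non-serializable field
    // sinks the whole object graph at write time.
    static class Task implements Serializable {
        Heavy heavy = new Heavy();
    }

    // Returns the message of the NotSerializableException (the offending
    // class name), or null if the object serialized cleanly.
    static String failingClass(Object o) {
        try (ObjectOutputStream oos =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(o);
            return null;
        } catch (NotSerializableException e) {
            return e.getMessage();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // Prints the offending class, e.g. NotSerializableDemo$Heavy
        System.out.println(failingClass(new Task()));
    }
}
```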

Re: Wired Problem: Task not serializable[Spark Streaming]

2015-06-08 Thread bit1...@163.com
Could someone help explain what happens that leads to the Task not serializable issue? Thanks. bit1...@163.com From: bit1...@163.com Date: 2015-06-08 19:08 To: user Subject: Wired Problem: Task not serializable[Spark Streaming] Hi, With the following simple code, I got an exception

Re: Wired Problem: Task not serializable[Spark Streaming]

2015-06-08 Thread Michael Albert
, there is nowhere to which the return can jump. Hence it is not serializable. Good luck. -Mike From: bit1...@163.com bit1...@163.com To: user user@spark.apache.org Sent: Monday, June 8, 2015 10:01 PM Subject: Re: Wired Problem: Task not serializable[Spark Streaming]

[SQL][1.3.1][JAVA] UDF in java cause Task not serializable

2015-04-27 Thread Shuai Zheng
Hi All, Basically I try to define a simple UDF and use it in the query, but it gives me Task not serializable public void test() { RiskGroupModelDefinition model = registeredRiskGroupMap.get(this.modelId); RiskGroupModelDefinition edm = this.createEdm

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-22 Thread Jean-Pascal Billaud
status: FAILED, exitCode: 15, (reason: User class threw exception: Task not serializable) Exception in thread Driver org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.ClosureCleaner$.ensureSerializable( ClosureCleaner.scala:166

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-22 Thread Tathagata Das
this serialization exception and I am not too sure what Graph is unexpectedly null when DStream is being serialized means? 15/04/20 06:12:38 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: Task not serializable) Exception in thread Driver

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-21 Thread Jean-Pascal Billaud
:12:38 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: Task not serializable) Exception in thread Driver org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.ClosureCleaner$.ensureSerializable

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-21 Thread Jean-Pascal Billaud
not serializable) Exception in thread Driver org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.ClosureCleaner$.ensureSerializable( ClosureCleaner.scala:166) at org.apache.spark.util.ClosureCleaner$.clean( ClosureCleaner.scala:158

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-21 Thread Tathagata Das
sure what Graph is unexpectedly null when DStream is being serialized means? 15/04/20 06:12:38 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: Task not serializable) Exception in thread Driver org.apache.spark.SparkException: Task

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-21 Thread Tathagata Das
class threw exception: Task not serializable) Exception in thread Driver org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.ClosureCleaner$.ensureSerializable( ClosureCleaner.scala:166) at org.apache.spark.util.ClosureCleaner$.clean

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-20 Thread Jean-Pascal Billaud
is unexpectedly null when DStream is being serialized means? 15/04/20 06:12:38 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: Task not serializable) Exception in thread Driver org.apache.spark.SparkException: Task not serializable

Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-20 Thread Jean-Pascal Billaud
Hi, I am getting this serialization exception and I am not too sure what Graph is unexpectedly null when DStream is being serialized means? 15/04/20 06:12:38 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: Task not serializable) Exception

Task not serializable exception

2015-02-24 Thread Kartheek.R
Hi, I run into a Task not serializable exception with the following code. When I remove the threads and run, it works, but with threads I run into the Task not serializable exception. object SparkKart extends Serializable{ def parseVector(line: String): Vector[Double] = { DenseVector(line.split

Re: Task not serializable exception

2015-02-23 Thread Kartheek.R
} } -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Task-not-serializable-exception-tp21776p21778.html Sent from the Apache Spark User List mailing list archive at Nabble.com

Task not serializable exception

2015-02-23 Thread Kartheek.R
() { val dist1 = data.map(x => squaredDistance(x, kPoints(0))) } }) thread1.start I am facing a Task not serializable exception: Exception in thread Thread-32 org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.ClosureCleaner

Re: SparkException: Task not serializable - Jackson Json

2015-02-14 Thread mickdelaney
but only per partition and not for every row like above. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkException-Task-not-serializable-Jackson-Json-tp21347p21655.html Sent from the Apache Spark User List mailing list archive at Nabble.com
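The per-partition pattern recommended above (typically mapPartitions in Spark) can be sketched without Spark: build the expensive, non-serializable helper inside the partition function, so it is created once per partition on the executor and never serialized at all. Parser and the construction counter are illustrative stand-ins for something like Jackson's ObjectMapper:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class PerPartitionDemo {
    // Stand-in for a heavyweight, non-serializable helper such as Jackson's
    // ObjectMapper; the counter records how many times it gets constructed.
    static class Parser {
        static int constructions = 0;
        Parser() { constructions++; }
        int parse(String row) { return row.length(); }
    }

    // Simulates the body passed to mapPartitions: the Parser is built once
    // per partition instead of being serialized from the driver or rebuilt
    // for every row.
    static List<Integer> mapPartition(List<String> partition) {
        Parser parser = new Parser(); // one instance per partition
        List<Integer> out = new ArrayList<>();
        for (String row : partition) {
            out.add(parser.parse(row));
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(mapPartition(Arrays.asList("ab", "cde"))); // [2, 3]
        System.out.println(Parser.constructions); // 1
    }
}
```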

Re: SparkException: Task not serializable - Jackson Json

2015-02-13 Thread jamckelvey
I'm having the same problem with the same sample code. Any progress on this? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkException-Task-not-serializable-Jackson-Json-tp21347p21651.html Sent from the Apache Spark User List mailing list archive

Task not serializable problem in the multi-thread SQL query

2015-02-12 Thread lihu
} } }) } This throws a Task not serializable exception; if I do not use multiple threads, it works well. Since no object involved is non-serializable, what is the problem? java.lang.Error: org.apache.spark.SparkException: Task not serializable

Re: Task not serializable problem in the multi-thread SQL query

2015-02-12 Thread Michael Armbrust
(){ override def run(){ if( some condition){ sqlContext.sql(SELECT * from ...).collect().foreach(println) } else{ //some other query } } }) } this will throw a Task serializable Exception, if I

Re: Task not serializable problem in the multi-thread SQL query

2015-02-12 Thread lihu
().foreach(println) } else{ //some other query } } }) } This throws a Task not serializable exception; if I do not use multiple threads, it works well. Since no object involved is non-serializable, what is the problem

SparkException: Task not serializable - Jackson Json

2015-01-24 Thread mickdelaney
) } }/ Exception in thread main org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:166) at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:158) at org.apache.spark.SparkContext.clean(SparkContext.scala:1435

Getting exception on JavaSchemaRDD; org.apache.spark.SparkException: Task not serializable

2014-11-22 Thread vdiwakar.malladi
Could anyone let me know the cause? org.apache.spark.SparkException: Task not serializable Caused by: org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:166) at org.apache.spark.util.ClosureCleaner

Re: Getting exception on JavaSchemaRDD; org.apache.spark.SparkException: Task not serializable

2014-11-22 Thread Akhil Das
anyone let me know the cause. org.apache.spark.SparkException: Task not serializable Caused by: org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:166

Re: Getting exception on JavaSchemaRDD; org.apache.spark.SparkException: Task not serializable

2014-11-22 Thread vdiwakar.malladi
Thanks for your prompt response. I'm not using anything in my map function; please see the below code. For sample purposes, I am using 'select * from '. This code worked for me in standalone mode. But when I integrated it with my web application, it started throwing the specified exception.

Re: Getting exception on JavaSchemaRDD; org.apache.spark.SparkException: Task not serializable

2014-11-22 Thread Sean Owen
You are declaring an anonymous inner class here. It has a reference to the containing class even if you don't use it. If the closure cleaner can't determine it isn't used, this reference will cause everything in the outer class to serialize. Try rewriting this as a named static inner class . On
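The hidden reference described here is visible with reflection: javac compiles an anonymous inner class with a synthetic this$0 field pointing at the enclosing instance, while a named static nested class has no such field. A small sketch (class and field names are illustrative):

```java
import java.lang.reflect.Field;

public class InnerClassDemo {
    final int id = 42;

    // Anonymous inner class: because it reads the outer field id, javac gives
    // it a synthetic this$0 field holding the enclosing InnerClassDemo, which
    // Spark would then try to serialize along with the task.
    Runnable anonymous() {
        return new Runnable() {
            public void run() { System.out.println(id); }
        };
    }

    // Named static nested class: no hidden reference to the outer instance.
    static class Named implements Runnable {
        public void run() { }
    }

    // True if the object carries a synthetic reference to an enclosing instance.
    static boolean capturesOuter(Object o) {
        for (Field f : o.getClass().getDeclaredFields()) {
            if (f.getName().startsWith("this$")) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(capturesOuter(new InnerClassDemo().anonymous())); // true
        System.out.println(capturesOuter(new Named())); // false
    }
}
```

Rewriting the closure as a named static inner class (or making the outer class Serializable) removes the this$0 chain that drags the enclosing object into the task.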

Re: Getting exception on JavaSchemaRDD; org.apache.spark.SparkException: Task not serializable

2014-11-22 Thread vdiwakar.malladi
Thanks. After writing it as a static inner class, that exception no longer occurs. But now I am getting a snappy-related exception. I can see the corresponding dependency in the spark assembly jar, yet the exception persists. Any quick suggestion on this? Here is the stack trace.

Re: Accumulators : Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-10-27 Thread Akhil Das
if this works for you also. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Accumulators-Task-not-serializable-java-io-NotSerializableException-org-apache-spark-SparkContext-tp17262p17287.html Sent from the Apache Spark User List mailing list archive

Re: Accumulators : Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-10-26 Thread Akhil Das
) Now, if I remove the 'accum += 1', everything works fine. If I keep it, I get this weird error: Exception in thread main 14/10/25 21:58:56 INFO TaskSchedulerImpl: Cancelling stage 0 org.apache.spark.SparkException: Job aborted due to stage failure: Task not serializable

Re: Accumulators : Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-10-26 Thread octavian.ganea
Hi Akhil, Please see this related message. http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-td17263.html I am curious if this works for you also. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Accumulators-Task-not-serializable

Accumulators : Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-10-25 Thread octavian.ganea
: Cancelling stage 0 org.apache.spark.SparkException: Job aborted due to stage failure: Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages

Re: What's wrong with my spark filter? I get org.apache.spark.SparkException: Task not serializable

2014-10-19 Thread Ilya Ganelin
, I get an org.apache.spark.SparkException: Task not serializable exception. Here is my filter function: object OBJ { def f1(): Boolean = { var i = 1; for (j <- 1 to 10) i = i + 1; true; } } rdd.filter(row => OBJ.f1()) And when I run, I get the following exception

What's wrong with my spark filter? I get org.apache.spark.SparkException: Task not serializable

2014-10-17 Thread shahab
Hi, Probably I am missing a very simple principle, but something is wrong with my filter; I get an org.apache.spark.SparkException: Task not serializable exception. Here is my filter function: object OBJ { def f1(): Boolean = { var i = 1; for (j <- 1 to 10) i = i + 1; true

Re: What's wrong with my spark filter? I get org.apache.spark.SparkException: Task not serializable

2014-10-17 Thread Sourav Chandra
...@gmail.com wrote: Hi, Probably I am missing a very simple principle, but something is wrong with my filter; I get an org.apache.spark.SparkException: Task not serializable exception. Here is my filter function: object OBJ { def f1(): Boolean = { var i = 1; for (j <- 1 to 10) i = i

Re: Task not serializable

2014-09-10 Thread Sean Owen
this FileSystem instance I read those reference files and use that data in my processing logic. This is throwing task not serializable exceptions for 'UserGroupInformation' and 'FileSystem' classes. I also tried using 'SparkHadoopUtil' instead of 'UserGroupInformation'. But it didn't resolve

Re: Task not serializable

2014-09-10 Thread Sarath Chandra
. This is throwing task not serializable exceptions for 'UserGroupInformation' and 'FileSystem' classes. I also tried using 'SparkHadoopUtil' instead of 'UserGroupInformation'. But it didn't resolve the issue. Request you provide some pointers in this regard. Also I have a query - when we

Re: Task not serializable

2014-09-10 Thread Marcelo Vanzin
program basically loads a HDFS file and for each line in the file it applies several transformation functions available in various external libraries. When I execute this over spark, it is throwing me Task not serializable exceptions for each and every class being used from

Re: Task not serializable

2014-09-06 Thread Sean Owen
migrated the program from Java to Scala. The map-reduce program basically loads a HDFS file and for each line in the file it applies several transformation functions available in various external libraries. When I execute this over spark, it is throwing me Task not serializable exceptions for each

Re: Task not serializable

2014-09-06 Thread Sarath Chandra
, it is throwing me Task not serializable exceptions for each and every class being used from these external libraries. I included serialization for a few classes which are in my scope, but there are several other classes which are out of my scope, like org.apache.hadoop.io.Text

Task not serializable

2014-09-05 Thread Sarath Chandra
this over spark, it is throwing me Task not serializable exceptions for each and every class being used from these external libraries. I included serialization for a few classes which are in my scope, but there are several other classes which are out of my scope, like org.apache.hadoop.io.Text

Re: Task not serializable

2014-09-05 Thread Akhil Das
this over spark, it is throwing me Task not serializable exceptions for each and every class being used from these external libraries. I included serialization for a few classes which are in my scope, but there are several other classes which are out of my scope, like org.apache.hadoop.io.Text

Re: Task not serializable

2014-09-05 Thread Sarath Chandra
this over spark, it is throwing me Task not serializable exceptions for each and every class being used from these external libraries. I included serialization for a few classes which are in my scope, but there are several other classes which are out of my scope like

Re: Debugging Task not serializable

2014-08-15 Thread Juan Rodríguez Hortalá
. Thanks Best Regards On Mon, Jul 28, 2014 at 9:21 PM, Juan Rodríguez Hortalá juan.rodriguez.hort...@gmail.com wrote: Hi all, I was wondering if someone has conceived a method for debugging Task not serializable: java.io.NotSerializableException errors, apart from commenting

Re: Debugging Task not serializable

2014-07-30 Thread Juan Rodríguez Hortalá
juan.rodriguez.hort...@gmail.com wrote: Hi all, I was wondering if someone has conceived a method for debugging Task not serializable: java.io.NotSerializableException errors, apart from commenting and uncommenting parts of the program, or just turning everything into Serializable. I find this kind of error

Re: Debugging Task not serializable

2014-07-28 Thread Akhil Das
for debugging Task not serializable: java.io.NotSerializableException errors, apart from commenting and uncommenting parts of the program, or just turning everything into Serializable. I find this kind of error very hard to debug, as these are originated in the Spark runtime system. I'm using Spark

Re: Debugging Task not serializable

2014-07-28 Thread andy petrella
this exception. Thanks Best Regards On Mon, Jul 28, 2014 at 9:21 PM, Juan Rodríguez Hortalá juan.rodriguez.hort...@gmail.com wrote: Hi all, I was wondering if someone has conceived a method for debugging Task not serializable: java.io.NotSerializableException errors, apart from

Re: Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-07-24 Thread lihu
in the reduceByKey operation, but failed at the collect operation; this confused me. INFO DAGScheduler: Failed to run collect at KMeans.scala:235 [error] (run-main-0) org.apache.spark.SparkException: Job aborted: Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

Re: Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-07-24 Thread Tathagata Das
in the reduceByKey operation, but failed at the collect operation; this confused me. INFO DAGScheduler: Failed to run collect at KMeans.scala:235 [error] (run-main-0) org.apache.spark.SparkException: Job aborted: Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

Re: Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-07-21 Thread hsy...@gmail.com
) org.apache.spark.SparkException: Job aborted: Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext org.apache.spark.SparkException: Job aborted: Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-07-19 Thread lihu
the centerArrays ] it can succeed in the reduceByKey operation, but failed at the collect operation; this confused me. INFO DAGScheduler: Failed to run collect at KMeans.scala:235 [error] (run-main-0) org.apache.spark.SparkException: Job aborted: Task not serializable: java.io.NotSerializableException

Nested method in a class: Task not serializable?

2014-05-16 Thread Pierre B
Hi! I understand the usual Task not serializable issue that arises when accessing a field or a method that is out of scope of a closure. To fix it, I usually define a local copy of these fields/methods, which avoids the need to serialize the whole class: class MyClass(val myField: Any) { def
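The local-copy trick described here translates directly to Java lambdas and can be verified with plain serialization: a serializable lambda that reads an instance field captures this (dragging the whole enclosing object along), while one that reads a local copy captures only the value. A sketch, with MyClass and myField as illustrative stand-ins for the thread's Scala example:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class LocalCopyDemo {
    interface SerFn extends Serializable { int apply(int x); }

    // Deliberately NOT Serializable, standing in for a driver-side class.
    static class MyClass {
        final int myField = 10;

        SerFn capturingThis() {
            return x -> x + myField;   // reads this.myField, so it captures this
        }

        SerFn capturingLocal() {
            final int local = myField; // local copy: only an int is captured
            return x -> x + local;
        }
    }

    // A simplified stand-in for the check Spark runs before shipping a task.
    static boolean serializes(Object o) {
        try (ObjectOutputStream oos =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(o);
            return true;
        } catch (IOException e) {
            return false; // NotSerializableException on the captured MyClass
        }
    }

    public static void main(String[] args) {
        MyClass m = new MyClass();
        System.out.println(serializes(m.capturingThis()));  // false
        System.out.println(serializes(m.capturingLocal())); // true
        System.out.println(m.capturingLocal().apply(5));    // 15
    }
}
```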

Re: Task not serializable?

2014-05-15 Thread pedro
.1001560.n3.nabble.com/Re-Task-not-serializable-tp3507p5506.html Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Task not serializable: collect, take

2014-05-02 Thread SK
Thank you very much. Making the trait serializable worked. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Task-not-serializable-collect-take-tp5193p5236.html Sent from the Apache Spark User List mailing list archive at Nabble.com.

Task not serializable: collect, take

2014-05-01 Thread SK
Hi, I have the following code structure. It compiles ok, but at runtime it aborts with the error: Exception in thread main org.apache.spark.SparkException: Job aborted: Task not serializable: java.io.NotSerializableException: I am running in local (standalone) mode. trait A{ def input

Re: Task not serializable: collect, take

2014-05-01 Thread Marcelo Vanzin
Have you tried making A extend Serializable? On Thu, May 1, 2014 at 3:47 PM, SK skrishna...@gmail.com wrote: Hi, I have the following code structure. I compiles ok, but at runtime it aborts with the error: Exception in thread main org.apache.spark.SparkException: Job aborted: Task

Re: Task not serializable?

2014-03-31 Thread Daniel Liu
Hi, I am new to Spark and I encountered this error when I try to map RDD[A] => RDD[Array[Double]] and then collect the results. A is a custom class that extends Serializable. (Actually it's just a wrapper class which wraps a few variables that are all serializable.) I also tried KryoSerializer according