00:06:45 -0500
Subject: Re: Not Serializable exception when integrating SQL and Spark Streaming
To: bigdat...@live.com
CC: lian.cs@gmail.com; user@spark.apache.org
The various Spark contexts generally aren't serializable because you can't use
them on the executors anyway. We made SQLContext
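The failure mode described above can be reproduced with plain Java serialization and no Spark at all. The sketch below is illustrative only: `DriverContext` is a hypothetical stand-in for a non-serializable context object (as SparkContext and friends were at the time), and `trySerialize` mimics what a Java-serialization-based closure serializer does when shipping a function to executors.

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Hypothetical stand-in for a driver-side context: note that it
// deliberately does NOT extend Serializable.
class DriverContext {
  def lookup(x: Int): Int = x * 2
}

object ClosureSerializationDemo {
  // Attempt plain Java serialization, as a closure serializer would
  // when shipping a function to the executors.
  def trySerialize(obj: AnyRef): Boolean =
    try {
      val oos = new ObjectOutputStream(new ByteArrayOutputStream())
      oos.writeObject(obj)
      oos.close()
      true
    } catch {
      case _: NotSerializableException => false
    }

  def main(args: Array[String]): Unit = {
    val ctx = new DriverContext

    // Captures the non-serializable context: serialization fails,
    // because the closure drags the captured DriverContext along.
    val capturesContext: Int => Int = x => ctx.lookup(x)

    // Captures only plain data: serializes fine.
    val factor = 2
    val capturesData: Int => Int = x => x * factor

    println(s"captures context: ${trySerialize(capturesContext)}")
    println(s"captures data:    ${trySerialize(capturesData)}")
  }
}
```

The same mechanism is why referencing a context inside an RDD or DStream transformation blows up: the closure itself is serializable, but serialization recursively reaches the captured context.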
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Not-Serializable-exception-when-integrating-SQL-and-Spark-Streaming-tp20845.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: ...@spark.apache.org
Subject: Re: Not Serializable exception when integrating SQL and Spark Streaming
Generally you can use -Dsun.io.serialization.extendedDebugInfo=true to enable
serialization debugging information when serialization exceptions are raised.
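That flag is a JVM system property read by java.io.ObjectOutputStream, so it has to reach the JVMs that actually perform the serialization. A sketch of one way to wire it in, assuming a standard spark-submit deployment (the app name is made up; spark.executor.extraJavaOptions is the real configuration key): the driver JVM is already running by the time SparkConf is evaluated, so the driver-side copy of the flag is normally supplied at launch instead.

    import org.apache.spark.SparkConf

    // Sketch: enable extended serialization debug info on the executors.
    // The driver-side flag is passed at launch time instead, e.g.:
    //   spark-submit --driver-java-options \
    //     "-Dsun.io.serialization.extendedDebugInfo=true" ...
    val conf = new SparkConf()
      .setAppName("serialization-debug")   // hypothetical app name
      .set("spark.executor.extraJavaOptions",
           "-Dsun.io.serialization.extendedDebugInfo=true")

With the flag enabled, the NotSerializableException message includes a serialization stack showing which captured field pulled in the offending object.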
Date: Wed, 24 Dec 2014 16:23:30 +0800
From: lian.cs@gmail.com
To: bigdat...@live.com; user@spark.apache.org
Subject: Re: Not Serializable exception when integrating SQL and Spark Streaming
Generally you can use -Dsun.io.serialization.extendedDebugInfo=true to enable
serialization debugging information when serialization exceptions are raised.