00:06:45 -0500
Subject: Re: Not Serializable exception when integrating SQL and Spark Streaming
To: bigdat...@live.com
CC: lian.cs@gmail.com; user@spark.apache.org
The various Spark contexts generally aren't serializable because you can't use them on the executors anyway. We made SQLContext serializable.
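On the user side, the usual consequence of this design is that you keep the context out of anything that gets serialized, for example by marking the field transient (or by obtaining the context inside foreachRDD on the driver). A minimal plain-JDK sketch of the transient pattern, with no Spark dependency and made-up class names:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Stand-in for a non-serializable context such as JavaSQLContext.
class DriverContext { }

class DriverSide implements Serializable {
    // transient: the field is skipped during serialization, mirroring the
    // rule that contexts live only on the driver and never ship to executors.
    private transient DriverContext ctx = new DriverContext();

    static byte[] roundTrip(Serializable o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o); // succeeds: the transient field is not written
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        System.out.println(roundTrip(new DriverSide()).length);
    }
}
```

After deserialization the transient field is null, so code running on an executor must not touch it; any per-partition resources should be created inside the task itself.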
Generally you can use -Dsun.io.serialization.extendedDebugInfo=true to
enable serialization debugging information when serialization exceptions
are raised.
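For illustration, the failure mode can be reproduced with the JDK alone (no Spark): a Serializable object holding a reference to a non-serializable one. Running the example with -Dsun.io.serialization.extendedDebugInfo=true makes the JVM include, in the stack trace, the chain of fields that led to the offending object; the class names below are hypothetical.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Stand-in for a non-serializable object accidentally captured in a closure.
class ExecContext { }

class StreamingJob implements Serializable {
    private final ExecContext ctx = new ExecContext(); // the offending reference

    static void serialize(Object o) throws IOException {
        try (ObjectOutputStream oos =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(o); // throws NotSerializableException
        }
    }

    public static void main(String[] args) throws IOException {
        try {
            serialize(new StreamingJob());
        } catch (NotSerializableException e) {
            // The plain message names only the bad class; with
            // -Dsun.io.serialization.extendedDebugInfo=true the trace also
            // shows the field path (StreamingJob.ctx -> ExecContext).
            System.out.println("NotSerializableException: " + e.getMessage());
        }
    }
}
```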
On 12/24/14 1:32 PM, bigdata4u wrote:
I am trying to use SQL over Spark Streaming using Java, but I am getting a
Not Serializable exception.
Why is there a difference: SQLContext is Serializable but JavaSQLContext is not? Is Spark designed like this?
Thanks
Date: Wed, 24 Dec 2014 16:23:30 +0800
From: lian.cs@gmail.com
To: bigdat...@live.com; user@spark.apache.org
Subject: Re: Not Serializable exception when integrating SQL and Spark Streaming
Generally you can use -Dsun.io.serialization.extendedDebugInfo=true to
enable serialization debugging information when serialization exceptions
are raised.