Michael - it is already transient. This should probably be considered a bug in
the Scala compiler, but we can easily work around it by removing the use of
destructuring binding.
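The workaround Reynold describes can be sketched as follows. This is a hypothetical illustration, not the actual Spark patch; `FakeContext`, `Contexts.makeContexts`, and `Job` are invented names standing in for a driver-only context and the code that holds it:

```scala
import java.io.{ByteArrayOutputStream, ObjectOutputStream}

// Hypothetical stand-in for a driver-only context that is not serializable.
class FakeContext

object Contexts {
  def makeContexts(): (FakeContext, Int) = (new FakeContext, 42)
}

// Problematic form (the compiler issue discussed in this thread): with a
// destructuring binding the compiler stores the whole tuple in a synthetic
// field, and @transient is not propagated to it:
//   @transient val (ctx, n) = Contexts.makeContexts()
//
// Workaround: remove the destructuring binding so @transient applies to a
// real field of its own.
class Job extends Serializable {
  @transient lazy val ctx: FakeContext = Contexts.makeContexts()._1
  val n: Int = Contexts.makeContexts()._2

  // Java-serialize this object; the transient field is skipped, so this
  // succeeds even though FakeContext itself is not serializable.
  def serializedSize: Int = {
    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(this)
    oos.close()
    bos.size
  }
}
```

With the tuple extraction removed, serializing a `Job` no longer drags the context along.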
On Mon, Feb 16, 2015 at 10:41 AM, Michael Armbrust mich...@databricks.com
wrote:
I'd suggest marking the HiveContext as
I submitted a patch
https://github.com/apache/spark/pull/4628
On Mon, Feb 16, 2015 at 10:59 AM, Michael Armbrust mich...@databricks.com
wrote:
I was suggesting you mark the variable that is holding the HiveContext
'@transient' since the scala compiler is not correctly propagating this
I'd suggest marking the HiveContext as @transient since it's not valid to
use it on the slaves anyway.
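Michael's suggestion can be illustrated with a small sketch. `FakeHiveContext`, `Broken`, and `Marked` are invented names for illustration; the point is that `@transient` keeps a driver-only, non-serializable field out of Java serialization:

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Hypothetical stand-in for HiveContext: valid only on the driver,
// and not serializable.
class FakeHiveContext

// Without @transient, serializing the enclosing object drags the
// context along and fails with NotSerializableException.
class Broken extends Serializable {
  val ctx = new FakeHiveContext
}

// With @transient, the field is skipped during serialization, which is
// fine because the context is not valid on the slaves anyway.
class Marked extends Serializable {
  @transient val ctx = new FakeHiveContext
}

object SerializeCheck {
  // Returns true if the object survives Java serialization.
  def trySerialize(o: AnyRef): Boolean =
    try {
      val oos = new ObjectOutputStream(new ByteArrayOutputStream())
      oos.writeObject(o)
      oos.close()
      true
    } catch {
      case _: NotSerializableException => false
    }
}
```

After deserialization the transient field is null on the slave, which is the intended behavior here: the context must be re-obtained on the driver, never used remotely.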
On Mon, Feb 16, 2015 at 4:27 AM, Haopu Wang hw...@qilinsoft.com wrote:
While investigating this issue (described at the end of this email), I took a
look at HiveContext's code and found this change
I was suggesting you mark the variable that is holding the HiveContext
'@transient' since the scala compiler is not correctly propagating this
through the tuple extraction. This is only a workaround. We can also
remove the tuple extraction.
On Mon, Feb 16, 2015 at 10:47 AM, Reynold Xin
To: Michael Armbrust
Cc: Haopu Wang; dev@spark.apache.org
Subject: Re: HiveContext cannot be serialized
While investigating this issue (described at the end of this email), I took a
look at HiveContext's code and found this change
(https://github.com/apache/spark/commit/64945f868443fbc59cb34b34c16d782dda0fb63d#diff-ff50aea397a607b79df9bec6f2a841db):
- @transient protected[hive] lazy val hiveconf = new