Thanks for the help.
On Aug 21, 2014, at 10:56, Yin Huai wrote:
> If you want to filter the table name, you can use
>
> hc.sql("show tables").filter(row => !"test".equals(row.getString(0)))
>
> It seems that making functionRegistry transient can fix the error.
>
> On Wed, Aug 20, 2014 at 8:53 PM,
PR is https://github.com/apache/spark/pull/2074.
--
From: Yin Huai
Sent: 8/20/2014 10:56 PM
To: Vida Ha
Cc: tianyi; Fengyun RAO; user@spark.apache.org
Subject: Re: Got NotSerializableException when access broadcast variable
If you want to filter the table name, you can use
hc.sql("show tables").filter(row => !"test".equals(row.getString(0)))
It seems that making functionRegistry transient can fix the error.
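The transient suggestion can be sketched without Spark. Below is a minimal, self-contained Scala example; the `FunctionRegistry` and `Context` names are hypothetical stand-ins for HiveContext's internals, not the actual Spark classes. A serializable outer class holding a non-serializable member serializes cleanly once that member is marked `@transient lazy val`, because Java serialization skips the field and the lazy val simply re-creates it on first access after deserialization:

```scala
import java.io.{ByteArrayOutputStream, ObjectOutputStream}

// Hypothetical stand-in for a non-serializable member such as
// HiveContext's functionRegistry; note it does NOT extend Serializable.
class FunctionRegistry {
  def lookup(name: String): String = s"function:$name"
}

// A serializable owner. Because registry is @transient, Java serialization
// skips the field; the lazy val re-creates it on first access afterward.
class Context extends Serializable {
  @transient lazy val registry: FunctionRegistry = new FunctionRegistry
}

object TransientDemo {
  // Serialize with plain Java serialization; this throws
  // java.io.NotSerializableException if any reachable field is not serializable.
  def serialize(obj: AnyRef): Array[Byte] = {
    val buffer = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buffer)
    try out.writeObject(obj) finally out.close()
    buffer.toByteArray
  }

  def main(args: Array[String]): Unit = {
    // Succeeds only because registry is @transient; a plain
    // `val registry = new FunctionRegistry` would make this throw.
    val bytes = serialize(new Context)
    println(s"Context serialized to ${bytes.length} bytes")
  }
}
```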
On Wed, Aug 20, 2014 at 8:53 PM, Vida Ha wrote:
> Hi,
>
> I doubt the broadcast variable is your problem, since you are seeing:
Hi,
I doubt the broadcast variable is your problem, since you are seeing:
org.apache.spark.SparkException: Task not serializable
Caused by: java.io.NotSerializableException: org.apache.spark.sql.hive.HiveContext$$anon$3
We have a knowledgebase article that explains why this happens - it's a
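The mechanism can be shown with a minimal Spark-free sketch (all names below are hypothetical, chosen only to mirror the thread): a function literal that reads a field of its enclosing object captures a reference to that whole object, so serializing the closure fails if the owner is not serializable, just as the lambda in the shell captured the HiveContext. Copying the field into a local val first removes the capture:

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Hypothetical stand-in for HiveContext$$anon$3: a class that is NOT Serializable.
class NonSerializableOwner {
  val excluded = "test"

  // This closure reads this.excluded, so it captures the whole owner object.
  val capturingFilter: String => Boolean = name => !excluded.equals(name)
}

object ClosureDemo {
  // True if plain Java serialization of obj succeeds.
  def isSerializable(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(obj)
      true
    } catch {
      case _: NotSerializableException => false
    }

  def main(args: Array[String]): Unit = {
    val owner = new NonSerializableOwner

    // Fails: the closure holds a reference to the non-serializable owner.
    println(isSerializable(owner.capturingFilter))

    // Fix: copy the field into a local val; the closure then captures
    // only a String, which serializes fine.
    val excluded = owner.excluded
    val localFilter: String => Boolean = name => !excluded.equals(name)
    println(isSerializable(localFilter))
  }
}
```

This is also why extracting values into local vals before using them in RDD transformations is a common workaround for this class of error.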
Thanks for the help.
I ran this script again with "bin/spark-shell --conf spark.serializer=org.apache.spark.serializer.KryoSerializer"
In the console, I can see:
scala> sc.getConf.getAll.foreach(println)
(spark.tachyonStore.folderName,spark-eaabe986-03cb-41bd-bde5-993c7db3f048)
(spark.driver.host,1
Hi everyone!
I got an exception when I ran my script with spark-shell:
I added
SPARK_JAVA_OPTS="-Dsun.io.serialization.extendedDebugInfo=true"
in spark-env.sh to get the following stack trace:
org.apache.spark.SparkException: Task not serializable
at org.apache.spark.util.ClosureCleaner$.