Typo in previous email, pardon me.
Set spark.driver.maxResultSize to 1068 or higher.
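A minimal sketch of applying that suggestion (the app name is hypothetical, and "1068m" is an assumption following the figure in this thread; spark.driver.maxResultSize takes a size string such as "1g"):

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("example")
  // Cap on the total size of serialized results of all partitions
  // brought back to the driver (e.g. by collect()).
  .set("spark.driver.maxResultSize", "1068m")
val sc = new SparkContext(conf)
```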
On Thu, Apr 9, 2015 at 8:57 AM, Ted Yu yuzhih...@gmail.com wrote:
Please set spark.kryoserializer.buffer.max.mb to 1068 (or higher).
Cheers
On Thu, Apr 9, 2015 at 8:54 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote:
Yes, I had tried that.
Now I see this:
15/04/09 07:58:08 INFO scheduler.DAGScheduler: Job 0 failed: collect at
VISummaryDataProvider.scala:38, took 275.334991 s
15/04/09 07:58:08 ERROR yarn.ApplicationMaster: User class threw exception:
Job aborted due to stage failure: Total size of serialized
Pressed send early.
I had tried that, with these settings:
buffersize=128, maxbuffersize=1024
val conf = new SparkConf()
  .setAppName(detail)
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryoserializer.buffer.mb", arguments.get("buffersize").get)
My Spark (1.3.0) job is failing with
com.esotericsoftware.kryo.KryoException: Buffer overflow. Available: 0, required: 1
at com.esotericsoftware.kryo.io.Output.require(Output.java:138)
at
Please take a look at
https://code.google.com/p/kryo/source/browse/trunk/src/com/esotericsoftware/kryo/io/Output.java?r=236,
starting at line 27.
In Spark, you can control the max buffer size
with spark.kryoserializer.buffer.max.mb.
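A sketch of setting both Kryo buffer properties together, assuming the Spark 1.3.x property names from this thread (where sizes are given in MB) and the values mentioned above:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Initial per-object Kryo buffer, in MB.
  .set("spark.kryoserializer.buffer.mb", "128")
  // Upper bound the buffer may grow to, in MB; raising this is what
  // addresses the "Buffer overflow" KryoException above.
  .set("spark.kryoserializer.buffer.max.mb", "1068")
```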
Cheers