Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4947#discussion_r26041428

    --- Diff: core/src/main/scala/org/apache/spark/serializer/KryoSerializer.scala ---
    @@ -158,7 +158,13 @@ private[spark] class KryoSerializerInstance(ks: KryoSerializer) extends Serializ
       override def serialize[T: ClassTag](t: T): ByteBuffer = {
         output.clear()
    -    kryo.writeClassAndObject(output, t)
    +    try {
    +      kryo.writeClassAndObject(output, t)
    +    } catch {
    +      case e: KryoException if e.getMessage.startsWith("Buffer overflow") =>
    +        throw new SparkException("Serialization failed: Kryo buffer overflow. To avoid this, " +
    --- End diff --

The cause stack trace / message would be printed by `printStackTrace`. It would not become part of the message from this new `SparkException`. Net-net, I think it wouldn't hurt to just add additional info to the new `SparkException` message if it's deemed useful.
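The behavior srowen describes can be seen in a minimal sketch. This is plain Java (Scala shares the same JVM exception semantics), using generic `RuntimeException`s as stand-ins for `KryoException` and `SparkException`: wrapping an exception as the `cause` makes its message appear in the `printStackTrace` output (as a "Caused by:" section), but `getMessage` on the wrapper returns only the wrapper's own message.

```java
public class CauseChainDemo {
    public static void main(String[] args) {
        // Stand-in for the original KryoException.
        RuntimeException cause =
            new RuntimeException("Buffer overflow. Available: 0, required: 4");

        // Stand-in for the new SparkException, with the original as its cause.
        RuntimeException wrapper =
            new RuntimeException("Serialization failed: Kryo buffer overflow.", cause);

        // getMessage() returns only the wrapper's message; the cause's
        // message is NOT appended automatically.
        System.out.println(wrapper.getMessage());

        // printStackTrace() does include the cause, as a "Caused by:" section.
        wrapper.printStackTrace();
    }
}
```

This is why the comment suggests that any detail worth surfacing in the message itself (rather than only in the stack trace) must be concatenated into the new `SparkException`'s message string explicitly.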