Re: Why is Kryo not the default serializer?
The array issue was also discussed in the Apache Hive community. This problem seems like it can be resolved by upgrading to Kryo 3.x. Would upgrading to Kryo 3.x allow Kryo to become the default SerDes? https://issues.apache.org/jira/browse/HIVE-12174
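For reference, while Kryo is not the default, it can already be opted into through Spark's configuration. A minimal sketch of the relevant `spark-defaults.conf` entries (property and class names as documented for Spark 1.x):

```
# Switch the serializer from the default JavaSerializer to Kryo
spark.serializer                 org.apache.spark.serializer.KryoSerializer

# Optional: fail fast on unregistered classes instead of silently
# writing full class names into every serialized object
spark.kryo.registrationRequired  true
```

The same settings can also be applied programmatically via `SparkConf.set(...)` before the `SparkContext` is created.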
Re: Why is Kryo not the default serializer?
I have seen some failures in our workloads with Kryo; one I remember is a scenario with very large arrays. We could not get Kryo to work despite trying the different configuration properties. Switching to the Java serde was what worked.

Regards,
Sab

On Tue, Nov 10, 2015 at 11:43 AM, Hitoshi Ozawa wrote:
> If Kryo usage is recommended, why is Java serialization the default
> serializer instead of Kryo? Is there some limitation to using Kryo?
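For anyone hitting the same wall: with very large objects Kryo typically fails with a "Buffer overflow" KryoException, and the configuration properties usually tried are the serializer buffer sizes. A sketch of those entries for `spark-defaults.conf` (the values here are illustrative, not a recommendation):

```
# Initial per-core buffer Kryo serializes into
spark.kryoserializer.buffer      64k

# Upper bound the buffer may grow to; objects larger than this
# cannot be serialized by Kryo and raise a buffer-overflow error
spark.kryoserializer.buffer.max  512m
```

Note that `spark.kryoserializer.buffer.max` has a hard ceiling, so sufficiently large arrays can exceed it no matter how it is tuned, which matches the behavior described above.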
Why is Kryo not the default serializer?
If Kryo usage is recommended, why is Java serialization the default serializer instead of Kryo? Is there some limitation to using Kryo? I've read through the documentation, but it just seems Kryo is a better choice and should be made the default.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Why-is-Kryo-not-the-default-serializer-tp25338.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
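The default serializer the question refers to is built on standard `java.io` object serialization. A self-contained sketch of a round trip with it, just to make the baseline concrete (the `Point` class and sizes are illustrative, not from Spark):

```java
import java.io.*;

public class JavaSerdeDemo {
    // A minimal serializable type; Java serialization requires
    // implementing java.io.Serializable, unlike Kryo.
    static class Point implements Serializable {
        private static final long serialVersionUID = 1L;
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    public static void main(String[] args) throws Exception {
        Point p = new Point(3, 4);

        // Serialize: the stream carries a full class descriptor,
        // so even an 8-byte payload serializes to far more bytes.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(p);
        }
        byte[] bytes = bos.toByteArray();
        System.out.println("serialized size: " + bytes.length);

        // Deserialize and check the round trip.
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bytes))) {
            Point q = (Point) ois.readObject();
            System.out.println("round trip ok: " + (q.x == p.x && q.y == p.y));
        }
    }
}
```

The gap between the two int fields (8 bytes) and the actual stream size is the kind of per-object overhead that makes Kryo attractive; Kryo avoids repeating class metadata, especially when classes are registered up front.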