Re: Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-18 Thread Frank Austin Nothaft
(Preview contains only quoted text from the previous message, pointing to https://issues.apache.org/jira/browse/SPARK-3447.)

Re: Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-18 Thread mohan.gadm
Added some more info on this issue in the tracker SPARK-3447: https://issues.apache.org/jira/browse/SPARK-3447 - Thanks & Regards, Mohan

Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-18 Thread mohan.gadm
I am facing a similar issue to SPARK-3447 with the Spark Streaming API, Kryo serializer, and Avro messages. If the Avro message is simple, it is fine, but if the Avro message has unions/arrays it fails with the exception below: ERROR scheduler.JobScheduler: Error running job streaming job 1411043845000 ms.0
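
One workaround that comes up for this class of failure is to stop Kryo from reflecting over the Avro union/array fields and instead let Avro encode the record itself, length-prefixed so Kryo knows where the payload ends. The sketch below is illustrative only (MyAvroRecord stands in for the generated class) and is not the fix tracked in SPARK-3447:

    import java.io.ByteArrayOutputStream

    import com.esotericsoftware.kryo.{Kryo, Serializer}
    import com.esotericsoftware.kryo.io.{Input, Output}
    import org.apache.avro.io.{DecoderFactory, EncoderFactory}
    import org.apache.avro.specific.{SpecificDatumReader, SpecificDatumWriter, SpecificRecordBase}

    // Delegate (de)serialization of an Avro-generated class to Avro's binary encoding.
    class AvroKryoSerializer[T <: SpecificRecordBase](clazz: Class[T]) extends Serializer[T] {
      private val writer = new SpecificDatumWriter[T](clazz)
      private val reader = new SpecificDatumReader[T](clazz)

      override def write(kryo: Kryo, output: Output, record: T): Unit = {
        val bytes = new ByteArrayOutputStream()
        val encoder = EncoderFactory.get.binaryEncoder(bytes, null)
        writer.write(record, encoder)
        encoder.flush()
        // Length-prefix the Avro payload so read() knows exactly how much to consume.
        val payload = bytes.toByteArray
        output.writeInt(payload.length, true)
        output.writeBytes(payload)
      }

      override def read(kryo: Kryo, input: Input, tpe: Class[T]): T = {
        val payload = input.readBytes(input.readInt(true))
        val decoder = DecoderFactory.get.binaryDecoder(payload, null)
        reader.read(null.asInstanceOf[T], decoder)
      }
    }

    // In a KryoRegistrator (MyAvroRecord is hypothetical):
    //   kryo.register(classOf[MyAvroRecord], new AvroKryoSerializer(classOf[MyAvroRecord]))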

RE: spark kryo serializable exception

2014-08-18 Thread Sameer Tilak
Hi, I was able to set this parameter in my application to resolve this issue: set("spark.kryoserializer.buffer.mb", "256"). Please let me know if this helps. Date: Mon, 18 Aug 2014 21:50:02 +0800 From: dujinh...@hzduozhun.com To: user@spark.apache.org Subject: spark kryo se
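
For reference, a minimal sketch of what that setting looks like on a SparkConf (spark.kryoserializer.buffer.mb is the 1.x-era property name used in this thread; later releases renamed it):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .setAppName("kryo-buffer-example") // hypothetical app name
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // Raise the per-task Kryo output buffer so large records no longer overflow it.
      .set("spark.kryoserializer.buffer.mb", "256")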

spark kryo serializable exception

2014-08-18 Thread adu
Hi all, in an RDD map I invoke an object that is serialized by standard Java serialization, and get this exception: com.esotericsoftware.kryo.KryoException: Buffer overflow. Available: 0, required: 13 at com.esotericsoftware.kryo.io.Output.require(Output.java:138) at com.esotericsoftware.kryo.io.Output.writeAscii_

kryo out of buffer exception

2014-08-16 Thread Mohit Jaggi
Hi all, I was doing a groupBy and apparently some keys were very frequent, making the serializer fail with a buffer overflow exception. I did not need a groupBy, so I switched to combineByKey in this case, but I would like to know how to increase the Kryo buffer sizes to avoid this error. I hope there is
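
A sketch of the kind of rewrite described above, assuming a simple sum as a stand-in aggregation; combineByKey aggregates map-side, so a hot key never has to buffer all of its values the way groupByKey does:

    import org.apache.spark.SparkContext._  // PairRDDFunctions implicits on Spark 1.x
    import org.apache.spark.rdd.RDD

    def sumPerKey(pairs: RDD[(String, Int)]): RDD[(String, Long)] =
      pairs.combineByKey[Long](
        (v: Int) => v.toLong,            // createCombiner: first value seen for a key
        (acc: Long, v: Int) => acc + v,  // mergeValue: fold values within a partition
        (a: Long, b: Long) => a + b      // mergeCombiners: merge partial results across partitions
      )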

Re: Issue using kryo serialization

2014-08-01 Thread gpatcham
Any pointers to this issue? Thanks.

Re: Issue using kryo serialization

2014-07-31 Thread gpatcham
No, it doesn't implement Serializable. It's a third-party class.

Re: Issue using kryo serialization

2014-07-31 Thread ratabora
Does the class you're serializing implement Serializable?

Re: Issue using kryo serialization

2014-07-31 Thread gpatcham
Yes, I did enable that: conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer") conf.set("spark.kryo.registrator", "com.bigdata.MyRegistrator")

Re: Issue using kryo serialization

2014-07-31 Thread Andrew Ash
Did you enable Kryo and have it use your registrator using spark.serializer=org.apache.spark.serializer.KryoSerializer and spark.kryo.registrator=mypackage.MyRegistrator? It looks like the serializer being used is the default Java one: http://spark.apache.org/docs/latest/tuning.html#data
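
A minimal sketch of those two settings plus the registrator class they point at; the package, class, and registered type below are placeholders, not the poster's actual code:

    import com.esotericsoftware.kryo.Kryo
    import org.apache.spark.SparkConf
    import org.apache.spark.serializer.KryoRegistrator

    class MyRegistrator extends KryoRegistrator {
      override def registerClasses(kryo: Kryo): Unit = {
        // Register the (non-Serializable) third-party class so Kryo serializes it by field.
        kryo.register(classOf[java.util.HashMap[AnyRef, AnyRef]]) // stand-in class
      }
    }

    val conf = new SparkConf()
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.kryo.registrator", "mypackage.MyRegistrator")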

Issue using kryo serialization

2014-07-31 Thread gpatcham
I'm new to Spark programming, and here I'm trying to use a third-party class in a map with the Kryo serializer: val deviceApi = new DeviceApi() deviceApi.loadDataFromStream(this.getClass.getClassLoader.getResourceAsStream("20140730.json")) val properties = uaRDD1.map(line =>

Re: Kryo Issue on Spark 1.0.1, Mesos 0.18.2

2014-07-25 Thread Gary Malouf
g loaded. On Fri, Jul 25, 2014 at 2:27 PM, Gary Malouf wrote: > After upgrading to Spark 1.0.1 from 0.9.1 everything seemed to be going > well. Looking at the Mesos slave logs, I noticed: > > ERROR KryoSerializer: Failed to run spark.kryo.registrator > java.lang.Cla

Kryo Issue on Spark 1.0.1, Mesos 0.18.2

2014-07-25 Thread Gary Malouf
After upgrading to Spark 1.0.1 from 0.9.1 everything seemed to be going well. Looking at the Mesos slave logs, I noticed: ERROR KryoSerializer: Failed to run spark.kryo.registrator java.lang.ClassNotFoundException: com/mediacrossing/verrazano/kryo/MxDataRegistrator My spark-env.sh has the
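
A common cause is that the jar containing the registrator never reaches the Mesos executors, so one hedged sketch is to ship it explicitly from the driver (the jar path below is hypothetical):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.kryo.registrator", "com.mediacrossing.verrazano.kryo.MxDataRegistrator")
      // Ship the assembly so executors can load the registrator class at startup.
      .setJars(Seq("/path/to/my-assembly.jar"))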

Kryo NoSuchMethodError on Spark 1.0.0 standalone

2014-07-15 Thread jfowkes
Hi there, I've been successfully using the precompiled Spark 1.0.0 Java API on a small cluster in standalone mode. However, when I try to use the Kryo serializer by adding conf.set("spark.serializer","org.apache.spark.serializer.KryoSerializer"); as suggested, Spark crash

Error with Stream Kafka Kryo

2014-07-09 Thread richiesgr
Hi, my setup is standalone local mode, Spark 1.0.0 release version, Scala 2.10.4. I made a job that receives serialized objects from a Kafka broker. The objects are serialized using Kryo. The code: val sparkConf = new SparkConf().setMaster("local[4]").setAppName("SparkTe
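
A sketch of deserializing such messages on the executors, assuming the stream delivers raw bytes (e.g. via kafka.serializer.DefaultDecoder) and a hypothetical Event payload class; the Kryo registration must mirror whatever the producer used:

    import com.esotericsoftware.kryo.Kryo
    import com.esotericsoftware.kryo.io.Input
    import org.apache.spark.streaming.dstream.DStream

    case class Event(id: Long, name: String) // stand-in for the real message class

    def decode(raw: DStream[(String, Array[Byte])]): DStream[Event] =
      raw.mapPartitions { iter =>
        // Kryo instances are neither serializable nor thread-safe, so build one
        // per partition on the executor rather than capturing one from the driver.
        val kryo = new Kryo()
        kryo.register(classOf[Event])
        iter.map { case (_, bytes) => kryo.readObject(new Input(bytes), classOf[Event]) }
      }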

RE: Kryo is slower, and the size saving is minimal

2014-07-09 Thread innowireless TaeYun Kim
ginal Message- From: wxhsdp [mailto:wxh...@gmail.com] Sent: Wednesday, July 09, 2014 5:47 PM To: u...@spark.incubator.apache.org Subject: Re: Kryo is slower, and the size saving is minimal I'm not familiar with Kryo and my opinion may not be right. In my case, Kryo only saves about 5% of th

Re: Kryo is slower, and the size saving is minimal

2014-07-09 Thread wxhsdp
I'm not familiar with Kryo and my opinion may not be right. In my case, Kryo only saves about 5% of the original size when dealing with primitive types such as arrays. I'm not sure whether it is the common case.

Kryo is slower, and the size saving is minimal

2014-07-08 Thread innowireless TaeYun Kim
Hi, for my test case, using the Kryo serializer does not help. It is slower than the default Java serializer, and the size saving is minimal. I've registered almost all classes with the Kryo registrator. What is happening in my test case? Has anyone experienced a case like this?

using kryo for spark.closure.serializer with a registrator doesn't work

2014-05-04 Thread Soren Macbeth
Is this supposed to be supported? It doesn't work, at least in Mesos fine-grained mode. First it fails a bunch of times because it can't find my registrator class, since my assembly jar hasn't been fetched, like so: java.lang.ClassNotFoundException: pickles.kryo.PicklesRegistrator at java.
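
For reference, the configuration being discussed, as a sketch; spark.closure.serializer was a documented 1.x setting, and this snippet only shows the combination the poster reports as broken under Mesos fine-grained mode:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // Also serialize task closures with Kryo, in addition to shuffled/cached data.
      .set("spark.closure.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.kryo.registrator", "pickles.kryo.PicklesRegistrator")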

Re: Crazy Kryo Exception

2014-05-04 Thread Soren Macbeth
ay 2, 2014 at 3:35 PM, Soren Macbeth wrote: > >> Hallo, >> >> I'm getting this rather crazy Kryo exception trying to run my Spark job: >> >> Exception in thread "main" org.apache.spark.SparkException: Job aborted: >> E

Re: Crazy Kryo Exception

2014-05-03 Thread Soren Macbeth
So it seems that it is dying while trying to fetch results from my tasks to return back to the driver. Am I close? On Fri, May 2, 2014 at 3:35 PM, Soren Macbeth wrote: > Hallo, > > I'm getting this rather crazy Kryo exception trying to run my Spark job: > > Ex

Crazy Kryo Exception

2014-05-02 Thread Soren Macbeth
Hallo, I'm getting this rather crazy Kryo exception trying to run my Spark job: Exception in thread "main" org.apache.spark.SparkException: Job aborted: Exception while deserializing and fetching task: com.esotericsoftware.kryo.KryoException: java.lang.IllegalArgumentExcepti

GraphX, Kryo and BoundedPriorityQueue?

2014-04-23 Thread Ryan Compton
For me, PageRank fails when I use Kryo (it works fine if I don't). I found the same problem reported here: https://groups.google.com/forum/#!topic/spark-users/unngi3JdRk8. Has this been resolved? I'm not launching code from spark-shell. I tried registering GraphKryoRegistrator (instead
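
For reference, a minimal sketch of wiring in the registrator the poster mentions (GraphKryoRegistrator ships with GraphX in this Spark generation); whether it avoids the BoundedPriorityQueue error from the linked thread is exactly the open question:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // Registers GraphX's internal graph and message classes with Kryo.
      .set("spark.kryo.registrator", "org.apache.spark.graphx.GraphKryoRegistrator")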

Re: using Kryo with pyspark?

2014-04-14 Thread Matei Zaharia
Kryo won’t make a major impact on PySpark because it just stores data as byte[] objects, which are fast to serialize even with Java. But it may be worth a try — you would just set spark.serializer and not try to register any classes. What might make more impact is storing data as

using Kryo with pyspark?

2014-04-14 Thread Diana Carroll
I'm looking at the Tuning Guide suggestion to use Kryo instead of default serialization. My questions: Does pyspark use Java serialization by default, as Scala spark does? If so, then... can I use Kryo with pyspark instead? The instructions say I should register my classes with the

Re: Kryo serialization does not compress

2014-03-07 Thread pradeeps8
Hi Patrick, thanks for your reply. I am guessing even an array type will be registered automatically. Is this correct? Thanks, Pradeep

Re: Kryo serialization does not compress

2014-03-06 Thread Patrick Wendell
other question, it's possible that serializing doesn't provide a big space savings for your objects, especially if you are serializing mostly primitive types. It depends a bit on what the type of the object is. One thing is, it would be good to register all of the object types you plan to ser
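
A sketch of what registering "all of the object types" can look like in a registrator; the record class is hypothetical, and note that the array form of a class is a separate class as far as Kryo is concerned, so it needs its own registration:

    import com.esotericsoftware.kryo.Kryo
    import org.apache.spark.serializer.KryoRegistrator

    case class Measurement(id: Long, values: Array[Double]) // stand-in record type

    class EverythingRegistrator extends KryoRegistrator {
      override def registerClasses(kryo: Kryo): Unit = {
        kryo.register(classOf[Measurement])
        // Arrays of a registered class are not covered automatically.
        kryo.register(classOf[Array[Measurement]])
        kryo.register(classOf[Array[Double]])
      }
    }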

Re: Kryo serialization does not compress

2014-03-06 Thread pradeeps8
We are trying to use Kryo serialization, but with Kryo serialization on, the memory consumption does not change. We have tried this on multiple sets of data. We have also checked the logs of Kryo serialization and confirmed that Kryo is being used. Can somebody please help us with this? The

Re: Kryo Registration, class is not registered, but Log.TRACE() says otherwise

2014-02-28 Thread pondwater
Has no one ever registered generic classes in Scala? Is it possible?
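
It is possible; the main wrinkle is that JVM generics are erased, so Kryo only ever sees the raw class. A small sketch with a hypothetical generic container:

    import com.esotericsoftware.kryo.Kryo
    import org.apache.spark.serializer.KryoRegistrator

    case class Box[T](value: T) // hypothetical generic class

    class MyRegistrator extends KryoRegistrator {
      override def registerClasses(kryo: Kryo): Unit = {
        // Type parameters are erased at runtime, so this single registration of the
        // raw class covers Box[Int], Box[String], and every other instantiation.
        kryo.register(classOf[Box[Any]])
      }
    }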

Kryo serialization does not compress

2014-02-25 Thread pradeeps8
(511.5 MB) which uses Kryo serialization. Both consume almost equivalent storage (519.1 MB vs 511.5 MB respectively). Is this behavior expected? We were under the impression that Kryo serialization is efficient and were expecting it to compress further. Also, we have noticed that when we
