Re: Kryo serialization failed: Buffer overflow : Broadcast Join

2018-02-02 Thread Pralabh Kumar
I am using Spark 2.1.0. On Fri, Feb 2, 2018 at 5:08 PM, Pralabh Kumar wrote: > Hi, I am performing a broadcast join where my small table is 1 GB. I am getting the following error: > org.apache.spark.SparkException: ... Available: 0, required:
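This buffer overflow usually means the object being serialized is larger than Kryo's serialization buffer is allowed to grow. A minimal sketch of the usual mitigation, using the standard Spark property names; the sizes are illustrative, not values recommended in the thread:

```scala
import org.apache.spark.SparkConf

// Sketch: let Kryo's buffer grow large enough for a ~1 GB broadcast table.
// The sizes below are illustrative; tune them to the actual table size.
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryoserializer.buffer", "64m")      // initial per-core buffer
  .set("spark.kryoserializer.buffer.max", "1g")   // must exceed the largest object Kryo serializes
```

If the small table keeps growing, the other way out is to stop broadcasting it at all (drop the broadcast hint or lower spark.sql.autoBroadcastJoinThreshold) so the join falls back to a sort-merge join.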

Re: Kryo not registered class

2017-11-20 Thread Vadim Semenov
Try: Class.forName("[Lorg.apache.spark.sql.execution.datasources.PartitioningAwareFileIndex$SerializableFileStatus$SerializableBlockLocation;") On Sun, Nov 19, 2017 at 3:24 PM, Angel Francisco Orta < angel.francisco.o...@gmail.com> wrote: > Hello, I'm on Spark 2.1.0 with Scala and I'm

Re: Kryo On Spark 1.6.0

2017-01-14 Thread Yan Facai
For Scala, you could fix it by using: conf.registerKryoClasses(Array(Class.forName("scala.collection.mutable.WrappedArray$ofRef"))) By the way, if the class is an array of a Java primitive type, say byte[], then use: Class.forName("[B"); if it is an array of another class, say
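Putting the hints above together, a sketch of what the registration might look like; the array-class names are the ones quoted in this thread, the String[] example is illustrative:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .registerKryoClasses(Array(
    Class.forName("scala.collection.mutable.WrappedArray$ofRef"), // wrapper Spark uses for Array[AnyRef]
    Class.forName("[B"),                                          // byte[]
    Class.forName("[Ljava.lang.String;")                          // array of another class, e.g. String[]
  ))
```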

RE: Kryo On Spark 1.6.0 [Solution in this email]

2017-01-11 Thread Enrico DUrso
January 2017 15:12 To: Enrico DUrso Cc: user@spark.apache.org Subject: Re: Kryo On Spark 1.6.0 If you don't mind, could you please share the Scala solution with me? I tried to use Kryo but it seemed not to work at all. I hope to get a practical example. Thanks. On 10 January 2017, at 19:10, Enrico DUrso <enrico

Re: Kryo On Spark 1.6.0

2017-01-10 Thread Yang Cao
If you don't mind, could you please share the Scala solution with me? I tried to use Kryo but it seemed not to work at all. I hope to get a practical example. Thanks. > On 10 January 2017, at 19:10, Enrico DUrso wrote: > Hi, I am trying to use Kryo on Spark 1.6.0. > I am able to
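Since the thread asks for a practical Scala example, here is a minimal sketch of what enabling Kryo on Spark 1.6.x typically looks like; MyCaseClass and MyOtherClass are hypothetical stand-ins for the application's own types:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical application types standing in for whatever gets shuffled or cached.
case class MyCaseClass(id: Long, name: String)
class MyOtherClass(val payload: Array[Byte])

val conf = new SparkConf()
  .setAppName("kryo-example")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Optional: fail fast on anything that was not registered, instead of
  // silently writing full class names alongside every object.
  .set("spark.kryo.registrationRequired", "true")
  .registerKryoClasses(Array(
    classOf[MyCaseClass],
    classOf[MyOtherClass],
    Class.forName("scala.collection.mutable.WrappedArray$ofRef")
  ))

val sc = new SparkContext(conf)
```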

RE: Kryo On Spark 1.6.0

2017-01-10 Thread Enrico DUrso
according to how Spark works. How can I register all those classes? Cheers. From: Richard Startin [mailto:richardstar...@outlook.com] Sent: 10 January 2017 11:18 To: Enrico DUrso; user@spark.apache.org Subject: Re: Kryo On Spark 1.6.0 Hi Enrico, Only set spark.kryo.registrationRequired if you want

Re: Kryo On Spark 1.6.0

2017-01-10 Thread Richard Startin
Hi Enrico, Only set spark.kryo.registrationRequired if you want serialization of any class you have not explicitly registered to fail - see http://spark.apache.org/docs/latest/configuration.html.

Re: Kryo ClassCastException during Serialization/deserialization in Spark Streaming

2016-06-23 Thread swetha kasireddy
sampleMap is populated from inside a method that is getting called from updateStateByKey On Thu, Jun 23, 2016 at 1:13 PM, Ted Yu wrote: > Can you illustrate how sampleMap is populated ? > > Thanks > > On Thu, Jun 23, 2016 at 12:34 PM, SRK wrote:

Re: Kryo ClassCastException during Serialization/deserialization in Spark Streaming

2016-06-23 Thread Ted Yu
Can you illustrate how sampleMap is populated? Thanks. On Thu, Jun 23, 2016 at 12:34 PM, SRK wrote: > Hi, I keep getting the following error in my Spark Streaming job every now and then after it runs for around 10 hours. I have those 2 classes

Re: kryo

2016-05-12 Thread Ted Yu
...at org.joda.time.DateTimeZone.convertUTCToLocal(DateTimeZone.java:925) > Any ideas? > Thanks > From: Ted Yu [mailto:yuzhih...@gmail.com] > Sent: May-11-16 5:32 PM > To: Younes Naguib > Cc: user@spark.apache.org > Subject: Re: kr

RE: kryo

2016-05-12 Thread Younes Naguib
) at org.joda.time.DateTimeZone.convertUTCToLocal(DateTimeZone.java:925) Any ideas? Thanks From: Ted Yu [mailto:yuzhih...@gmail.com] Sent: May-11-16 5:32 PM To: Younes Naguib Cc: user@spark.apache.org Subject: Re: kryo Have you seen this thread ? http://search-hadoop.com/m/q3RTtpO0qI3cp06/JodaDateTimeSerializer+spark

Re: kryo

2016-05-11 Thread Ted Yu
Have you seen this thread? http://search-hadoop.com/m/q3RTtpO0qI3cp06/JodaDateTimeSerializer+spark=Re+NPE+when+using+Joda+DateTime On Wed, May 11, 2016 at 2:18 PM, Younes Naguib < younes.nag...@tritondigital.com> wrote: > Hi all, > > I'm trying to use spark.serializer. > I set it in the
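The thread linked above is about NPEs when Kryo handles Joda DateTime. A commonly cited workaround, sketched here under the assumption that the third-party de.javakaffee kryo-serializers library (which provides JodaDateTimeSerializer) is on the classpath, is to register a Joda-aware serializer through a custom registrator:

```scala
import com.esotericsoftware.kryo.Kryo
import de.javakaffee.kryoserializers.jodatime.JodaDateTimeSerializer // assumed extra dependency
import org.apache.spark.serializer.KryoRegistrator
import org.joda.time.DateTime

// Sketch: teach Kryo how to (de)serialize Joda DateTime instead of relying on
// the generic field serializer, which is what tends to produce the NPE.
class JodaKryoRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    kryo.register(classOf[DateTime], new JodaDateTimeSerializer())
  }
}

// Wire it up:
// conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
// conf.set("spark.kryo.registrator", classOf[JodaKryoRegistrator].getName)
```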

Re: Kryo serialization mismatch in spark sql windowing function

2016-04-06 Thread Soam Acharya
Hi Josh, Appreciate the response! Also, Steve - we meet again :) At any rate, here's the output (a lot of it anyway) of running spark-sql with the verbose option so that you can get a sense of the settings and the classpath. Does anything stand out? Using properties file:

Re: Kryo serialization mismatch in spark sql windowing function

2016-04-06 Thread Josh Rosen
Spark is compiled against a custom fork of Hive 1.2.1 which added shading of Protobuf and removed shading of Kryo. What I think is happening here is that stock Hive 1.2.1 is taking precedence, so the Kryo instance it's returning is an instance of the shaded/relocated Hive version rather

Re: Kryo serializer Exception during serialization: java.io.IOException: java.lang.IllegalArgumentException:

2016-01-08 Thread Shixiong(Ryan) Zhu
Could you disable `spark.kryo.registrationRequired`? Some classes may not be registered but they work well with Kryo's default serializer. On Fri, Jan 8, 2016 at 8:58 AM, Ted Yu wrote: > bq. try adding scala.collection.mutable.WrappedArray > > But the hint said registering

Re: Kryo serializer Exception during serialization: java.io.IOException: java.lang.IllegalArgumentException:

2016-01-08 Thread jiml
(The point of this post is to see if anyone has ideas about the errors at the end of the post.) In addition, the real way to test whether it's working is to force serialization. In Java: create an array of all your classes: // for the kryo serializer it wants to register all classes that need to be serialized Class[]
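A Scala sketch of the same test idea (the Java original is cut off above): register the classes, require registration, and then force an action that actually serializes the data so registration problems surface immediately. The Record class is a hypothetical placeholder:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

// Hypothetical class to exercise; replace with the classes your job really uses.
case class Record(id: Int, name: String)

val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("kryo-registration-check")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrationRequired", "true") // unregistered classes now fail loudly
  .registerKryoClasses(Array(classOf[Record], classOf[Array[Record]]))

val sc = new SparkContext(conf)

// Caching with a serialized storage level plus an action forces Kryo to run,
// so this is where missing registrations show up as exceptions.
val rdd = sc.parallelize(1 to 1000).map(i => Record(i, s"name-$i"))
rdd.persist(StorageLevel.MEMORY_ONLY_SER).count()
```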

Re: Kryo serializer Exception during serialization: java.io.IOException: java.lang.IllegalArgumentException:

2016-01-08 Thread Ted Yu
bq. try adding scala.collection.mutable.WrappedArray But the hint said to register scala.collection.mutable.WrappedArray$ofRef.class, right? On Fri, Jan 8, 2016 at 8:52 AM, jiml wrote: > (point of post is to see if anyone has ideas about errors at end of post)

Re: Kryo serialization fails when using SparkSQL and HiveContext

2015-12-14 Thread Michael Armbrust
You'll need to either turn off registration (spark.kryo.registrationRequired) or create a custom registrator (spark.kryo.registrator): http://spark.apache.org/docs/latest/configuration.html#compression-and-serialization On Mon, Dec 14, 2015 at 2:17 AM, Linh M. Tran wrote: >
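Spelled out, the two options look roughly like this; MyKryoRegistrator is a hypothetical class extending org.apache.spark.serializer.KryoRegistrator:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  // Option 1: stop requiring registration; Kryo falls back to writing full
  // class names for unregistered classes, which works but is less compact.
  .set("spark.kryo.registrationRequired", "false")

// Option 2: keep registration required and point Spark at a custom registrator
// that registers everything the HiveContext/SQL job ends up serializing.
// conf.set("spark.kryo.registrator", "com.example.MyKryoRegistrator") // hypothetical class
```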

Re: Kryo Serialization in Spark

2015-12-10 Thread manasdebashiskar
Are you sure you are using Kryo serialization? You are getting a Java serialization error. Are you setting up your SparkContext with Kryo serialization enabled?

Re: Kryo Serializer on Worker doesn't work by default.

2015-07-08 Thread Eugene Morozov
What I don't seem to get is how my code ends up on the Worker node. My understanding was that the jar file I use to start the job should automatically be copied to the Worker nodes and added to the classpath. That seems not to be the case. But if my jar is not copied to the Worker nodes, then how

Re: Kryo fails to serialise output

2015-07-03 Thread Will Briggs
Kryo serialization is used internally by Spark for spilling or shuffling intermediate results, not for writing out an RDD as an action. Look at Sandy Ryza's examples for some hints on how to do this: https://github.com/sryza/simplesparkavroapp Regards, Will On July 3, 2015, at 2:45 AM,

Re: Kryo serialization of classes in additional jars

2015-06-26 Thread patcharee
Hi, I am having this problem on Spark 1.4. Do you have any ideas how to solve it? I tried to use spark.executor.extraClassPath, but it did not help. BR, Patcharee On 4 May 2015 23:47, Imran Rashid wrote: Oh, this seems like a real pain. You should file a jira, I didn't see an open issue

Re: Kryo serialization of classes in additional jars

2015-05-13 Thread Akshat Aranya
I cherry-picked this commit into my local 1.2 branch. It fixed the problem with setting spark.serializer, but I get a similar problem with spark.closure.serializer org.apache.spark.SparkException: Failed to register classes with Kryo at

Re: Kryo serialization of classes in additional jars

2015-05-04 Thread Akshat Aranya
Actually, after some digging, I did find a JIRA for it: SPARK-5470. The fix for this has gone into master, but it isn't in 1.2. On Mon, May 4, 2015 at 2:47 PM, Imran Rashid iras...@cloudera.com wrote: Oh, this seems like a real pain. You should file a jira, I didn't see an open issue -- if
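For anyone stuck on a release without the SPARK-5470 fix, the workaround that comes up in this thread is to make the jar containing the Kryo-registered classes visible to the executor JVMs' own classloader instead of relying only on --jars. A hedged sketch, with an illustrative path that has to exist on every executor host; the class name is hypothetical, and the driver-side extraClassPath normally has to be supplied at submit time (spark-submit or spark-defaults.conf) rather than in code:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Illustrative path: the jar with the model classes, pre-deployed on each executor host.
  .set("spark.executor.extraClassPath", "/opt/jars/my-model-classes.jar")
  .registerKryoClasses(Array(Class.forName("com.example.MyModelClass"))) // hypothetical class
```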

RE: Kryo exception : Encountered unregistered class ID: 13994

2015-04-13 Thread mehdisinger
Hello, Thank you for your answer. I'm already registering my classes as you're suggesting... Regards From: tsingfu [via Apache Spark User List] [mailto:ml-node+s1001560n22468...@n3.nabble.com] Sent: Monday, 13 April 2015 03:48 To: Mehdi Singer Subject: Re: Kryo exception : Encountered

Re: Kryo exception : Encountered unregistered class ID: 13994

2015-04-13 Thread ๏̯͡๏
my classes as you're suggesting… Regards From: tsingfu [via Apache Spark User List] [mailto:ml-node+[hidden email]] Sent: Monday, 13 April 2015 03:48 To: Mehdi Singer Subject: Re: Kryo exception : Encountered unregistered class ID

Re: Kryo exception : Encountered unregistered class ID: 13994

2015-04-09 Thread Ted Yu
Is there a custom class involved in your application? I assume you have called sparkConf.registerKryoClasses() for such class(es). Cheers. On Thu, Apr 9, 2015 at 7:15 AM, mehdisinger mehdi.sin...@lampiris.be wrote: Hi, I'm facing an issue when I try to run my Spark application. I keep getting

Re: Kryo NPE with Array

2014-12-02 Thread Simone Franzini
I finally solved this issue. The problem was that: 1. I defined a case class with a Buffer[MyType] field. 2. I instantiated the class with the field set to the value given by an implicit conversion from a Java list, which is supposedly a Buffer. 3. However, the underlying type of that field was

RE: Kryo exception for CassandraSQLRow

2014-12-01 Thread Ashic Mahtab
Don't know if this'll solve it, but if you're on Spark 1.1, the Cassandra Connector version 1.1.0 final fixed the guava back compat issue. Maybe taking the guava exclusions might help? Date: Mon, 1 Dec 2014 10:48:25 +0100 Subject: Kryo exception for CassandraSQLRow From: shahab.mok...@gmail.com

Re: Kryo NPE with Array

2014-11-26 Thread Simone Franzini
I guess I already have the answer to what I have to do here, which is to configure the Kryo object with the strategy as above. Now the question becomes: how can I pass this custom Kryo configuration to the Spark Kryo serializer / Kryo registrator? I've had a look at the code but I am still fairly
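To answer the question as asked, a sketch of how the custom Kryo configuration from the earlier message can be passed in: Spark's hook for arbitrary Kryo setup is a KryoRegistrator named in spark.kryo.registrator. The exact instantiator-strategy API differs between Kryo versions, so treat the call below as illustrative:

```scala
import com.esotericsoftware.kryo.Kryo
import org.apache.spark.serializer.KryoRegistrator
import org.objenesis.strategy.StdInstantiatorStrategy

// Sketch: any Kryo-level tweak (instantiator strategy, custom serializers,
// registrations) can live in a registrator that Spark instantiates on each executor.
class MyKryoRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    kryo.setInstantiatorStrategy(new StdInstantiatorStrategy()) // build objects without a no-arg constructor
    // kryo.register(classOf[MyCaseClass])                      // plus the application's own classes
  }
}

// conf.set("spark.kryo.registrator", classOf[MyKryoRegistrator].getName)
```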

Re: Kryo UnsupportedOperationException

2014-09-25 Thread Ian O'Connell
I would guess the field serializer is having issues reconstructing the class; it's pretty much best effort. Is this an intermediate type? On Thu, Sep 25, 2014 at 2:12 PM, Sandy Ryza sandy.r...@cloudera.com wrote: We're running into an error (below) when trying to read spilled

Re: Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-19 Thread mohan.gadm
Thanks for the info, Frank. Twitter's chill avro serializer looks great. But how does Spark identify it as a serializer, given it's not extending KryoSerializer? (Sorry, Scala is an alien language for me.) - Thanks Regards, Mohan

Re: Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-19 Thread Frank Austin Nothaft
Hi Mohan, It’s a bit convoluted to follow in their source, but they essentially typedef KSerializer as being a KryoSerializer, and then their serializers all extend KSerializer. Spark should identify them properly as Kryo Serializers, but I haven’t tried it myself. Regards, Frank Austin

Re: Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-18 Thread mohan.gadm
Hi Frank, thanks for the info, that's great. But I'm not saying the Avro serializer is failing. Kryo is failing, but I'm using the Kryo serializer and registering Avro-generated classes with Kryo: sparkConf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
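For completeness, a sketch of the setup being described; the Avro-generated class names are hypothetical placeholders, and the chill-avro serializers mentioned earlier in the thread would be wired in through a custom registrator instead of the plain registration shown here:

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Hypothetical Avro-generated SpecificRecord classes; substitute your own.
  .registerKryoClasses(Array(
    Class.forName("com.example.avro.MyRecord"),
    Class.forName("com.example.avro.MyNestedRecord")
  ))
```

This plain registration is the configuration the thread reports failing for records containing arrays and unions, which is why the chill-avro serializers were suggested.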

Re: Kryo Issue on Spark 1.0.1, Mesos 0.18.2

2014-07-25 Thread Gary Malouf
Maybe this is me misunderstanding the Spark system property behavior, but I'm not clear why the class being loaded ends up having '/' rather than '.' in its fully qualified name. When I tested this out locally, the '/' characters were preventing the class from being loaded. On Fri, Jul 25, 2014 at 2:27

RE: Kryo is slower, and the size saving is minimal

2014-07-09 Thread innowireless TaeYun Kim
Message- From: wxhsdp [mailto:wxh...@gmail.com] Sent: Wednesday, July 09, 2014 5:47 PM To: u...@spark.incubator.apache.org Subject: Re: Kryo is slower, and the size saving is minimal I'm not familiar with Kryo and my opinion may not be right. In my case, Kryo only saves about 5% of the original

Re: Kryo serialization does not compress

2014-03-07 Thread pradeeps8
Hi Patrick, Thanks for your reply. I am guessing even an array type will be registered automatically. Is this correct? Thanks, Pradeep

Re: Kryo serialization does not compress

2014-03-06 Thread pradeeps8
We are trying to use Kryo serialization, but with Kryo serialization on, the memory consumption does not change. We have tried this on multiple sets of data. We have also checked the logs of Kryo serialization and have confirmed that Kryo is being used. Can somebody please help us with this? The
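One point worth checking in this situation: the serializer only affects data that is actually stored or shuffled in serialized form. With the default MEMORY_ONLY storage level, cached RDDs are kept as deserialized Java objects, so switching to Kryo changes nothing in the storage numbers. A sketch of a comparison that does exercise the serializer (sc is assumed to be an existing SparkContext; the RDD is illustrative):

```scala
import org.apache.spark.storage.StorageLevel

// With a serialized storage level the cached bytes go through the configured
// serializer, so a Kryo vs. Java comparison becomes visible in the storage UI.
val data = sc.parallelize(1 to 1000000).map(i => (i, s"value-$i"))
data.persist(StorageLevel.MEMORY_ONLY_SER)
data.count() // materialize the cache
```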