No idea; probably there is a way. I
remember stackoverflow has some examples of how to print all classes, but
how to print all the classes of the kryo classloader - no idea.
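For what it's worth, listing the jar/directory URLs a classloader searches often answers the question even when enumerating every class isn't possible. A JDK-only sketch (no Spark APIs; on Java 8, Spark's executor loaders are URLClassLoaders, so the URL walk below shows whether the fat jar is visible — class and method names here are illustrative, not from the thread):

```java
import java.net.URL;
import java.net.URLClassLoader;

public class DumpClassLoaders {
    // Walk a classloader and its parents; for each URLClassLoader,
    // print the jar/directory URLs it searches. This doesn't list every
    // loadable class, but it usually answers "is my fat jar visible
    // to this loader?".
    public static void dump(ClassLoader cl) {
        for (; cl != null; cl = cl.getParent()) {
            System.out.println(cl.getClass().getName());
            if (cl instanceof URLClassLoader) {
                for (URL u : ((URLClassLoader) cl).getURLs()) {
                    System.out.println("  " + u);
                }
            }
        }
    }

    public static void main(String[] args) {
        // In the failing deserializer you would instead pass
        // Thread.currentThread().getContextClassLoader().
        dump(DumpClassLoaders.class.getClassLoader());
    }
}
```

Note that on Java 9+ the application classloader is no longer a URLClassLoader, so the instanceof check simply prints nothing for it.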
On 8 September 2015 at 16:43, Nick Peterson wrote:
> Yes, the jar contains the class:
>
> $ jar -tf lumiata-evaluation-assembly-1.0.jar | grep 2028/Document/Document
> com/i2028/Documen
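As a side note, the `jar -tf … | grep` check above can also be done from code. A JDK-only sketch; it builds a throwaway in-memory jar so the example is self-contained — in practice you would open the real assembly jar with `new JarInputStream(new FileInputStream(path))`:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.jar.JarEntry;
import java.util.jar.JarInputStream;
import java.util.jar.JarOutputStream;

public class JarCheck {
    // Programmatic equivalent of `jar -tf app.jar | grep ...`:
    // scan the jar's entries for a given class-file path.
    public static boolean containsEntry(byte[] jarBytes, String entryName) throws Exception {
        try (JarInputStream in = new JarInputStream(new ByteArrayInputStream(jarBytes))) {
            for (JarEntry e; (e = in.getNextJarEntry()) != null; ) {
                if (e.getName().equals(entryName)) return true;
            }
        }
        return false;
    }

    // Tiny in-memory jar as a stand-in for the real fat jar.
    public static byte[] sampleJar() throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (JarOutputStream out = new JarOutputStream(buf)) {
            out.putNextEntry(new JarEntry("com/i2028/Document/Document.class"));
            out.closeEntry();
        }
        return buf.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(containsEntry(sampleJar(), "com/i2028/Document/Document.class")); // true
    }
}
```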
classloader kryo is using?
On Tue, Sep 8, 2015 at 6:34 AM Igor Berman wrote:
> java.lang.ClassNotFoundException: com.i2028.Document.Document
>
> 1. So, have you checked that the jar you create (the fat jar) contains this class?
>
> 2. might be there is some stale cache issue...not sure th
self is on the classpath, and __spark__.jar and __hadoop_conf__
>>>> are as well. When I do everything the same but switch the master to
>>>> local[*], the jar I submit IS added to the classpath.
>>>>
>>>> This seems like a likely culprit. What could c
>>> wrote:
>>>
>>>> as a starting point, attach your stacktrace...
>>>> ps: look for duplicates in your classpath, maybe you include another
>>>> jar with same class
>>>>
>>>> On 8 September 2015 at 06:38, Nicholas R. Peterson wrote:
I'm trying to run a Spark 1.4.1 job on my CDH5.4 cluster, through Yarn.
Serialization is set to use Kryo.
I have a large object which I send to the executors as a Broadcast. The
object seems to serialize just fine. When it attempts to deserialize,
though, Kryo throws a ClassNotFoundException... for a class that I include
in the fat jar that I spark-submit.
For the exception w.r.t. ManifestFactory, there is SPARK-6497, which is
open.
FYI
On Fri, Aug 28, 2015 at 8:25 AM, donhoff_h <165612...@qq.com> wrote:
Hi, all
I wrote a spark program which uses Kryo serialization. When I count an RDD
of type RDD[(String,String)], it reported an Exception like the following:
* Class is not registered: org.apache.spark.util.collection.CompactBuffer[]
* Note: To register this class use:
kryo.register
[mailto:gangele...@gmail.com]
Sent: Wednesday, April 15, 2015 10:59 AM
To: Imran Rashid
Cc: Akhil Das; user
Subject: Re: Execption while using kryo with broadcast
This worked with java serialization. I am using 1.2.0; you are right that if I
use 1.2.1 or 1.3.0 this issue will not occur.
I will test this and let
Hello,
I've got a problem using Spark with Geomesa. I'm not quite sure where the
error comes from, but I assume it's a problem with Spark.
A ClassNotFoundException is thrown with following content: "Failed to
register classes with Kryo".
Please have a look at https://github.com/apache/
What I don't seem to get is how my code ends up on the Worker node.
My understanding was that the jar file which I use to start the job should
automatically be copied to the Worker nodes and added to the classpath. It
seems not to be the case. But if my jar is not copied to the Worker nodes, then how
Hello.
I have an issue with CustomKryoRegistrator, which causes ClassNotFound on
Worker.
The issue is resolved if I call SparkConf.setJars with the path to the same jar I run.
It is a workaround, but it requires specifying the same jar file twice. The
first time I use it to actually run the job, and
actually correct?
Hope this helps..
Regards,
Gylfi.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Why-Kryo-Serializer-is-slower-than-Java-Serializer-in-TeraSort-tp23621p23659.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
That code doesn't appear to be registering classes with Kryo, which means the
fully-qualified classname is stored with every Kryo record. The Spark
documentation has more on this:
https://spark.apache.org/docs/latest/tuning.html#data-serialization
Regards,
Will
On July 5, 2015, at 2:
Looks like it spent more time writing/transferring the 40GB of shuffle
when you used kryo. And surprisingly, JavaSerializer has 700MB of shuffle?
Thanks
Best Regards
On Sun, Jul 5, 2015 at 12:01 PM, Gavin Liu
wrote:
Hi,
I am using TeraSort benchmark from ehiggs's branch
https://github.com/ehiggs/spark-terasort . Then I noticed that in
TeraSort.scala, it is using the Kryo Serializer. So I made a small change from
"org.apache.spark.serializer.Kr
Kryo serialization is used internally by Spark for spilling or shuffling
intermediate results, not for writing out an RDD as an action. Look at Sandy
Ryza's examples for some hints on how to do this:
https://github.com/sryza/simplesparkavroapp
Regards,
Will
On July 3, 2015, at 2:
I have a rather simple avro schema to serialize Tweets (message, username,
timestamp).
Kryo and twitter chill are used to do so.
For my dev environment the Spark context is configured as below
val conf: SparkConf = new SparkConf()
conf.setAppName("kryo_test")
conf.setMaster("local[4]")
This is a kryo issue. https://github.com/EsotericSoftware/kryo/issues/124.
It has to do with the lengths of the fieldnames. This issue is fixed in
Kryo 2.23.
What's weird is this doesn't break on Hive itself, only when using
SparkSQL. Attached is the full stacktrace. It might be how S
I cherry-picked this commit into my local 1.2 branch. It fixed the problem
with setting spark.serializer, but I get a similar problem with
spark.closure.serializer
org.apache.spark.SparkException: Failed to register classes with Kryo
at
org.apache.spark.serializer.KryoSerializer.newKryo
reduce method we get null instead of the right graph.
The problem seems to be caused by the Kryo serialization. In fact, when we
used java serialization the Tuples are deserialized correctly.
In addition, using two mapPartition/reduce steps, the first one for left graphs
and the second one for right graphs t
t2
// Always throws NullPointerException on right1
(left1.merge(left2), right1.merge(right2))
})
Tuples of graphs are serialized correctly (we use a custom serializer).
However, in the reduce method we get null instead of the right graph.
The problem seems to be caused by the Kryo Ser
es is pretty minimal, so as long as you
do this within reason, I think you're OK.
Imran
On Thu, Apr 30, 2015 at 12:34 AM, 邓刚[技术中心] wrote:
't downloaded later. I think you could work around with some
combination of pushing the jars to the cluster manually, and then using
spark.executor.extraClassPath
On Wed, Apr 29, 2015 at 6:42 PM, Akshat Aranya wrote:
> Hi,
>
> Is it possible to register kryo serialization for classes
(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
I verified that the same configuration works without using Kryo serialization.
On Fri, May 1, 2015 at 9:44 AM, Akshat Aranya wrote:
> I cherry-picked the fix for SPARK-5470 and the problem has gone away.
>
> On Fri, May 1,
>>> Hi,
>>>
>>> I'm getting a ClassNotFoundException at the executor when trying to
>>> register a class for Kryo serialization:
>>>
>>> java.lang.reflect.InvocationTargetException
>>> at sun.reflect.NativeConstructorAcc
bit more about Schema$MyRow ?
>
> On Fri, May 1, 2015 at 8:05 AM, Akshat Aranya wrote:
Hi,
I'm getting a ClassNotFoundException at the executor when trying to
register a class for Kryo serialization:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
Hi all
We know that spark supports Kryo serialization. Suppose there is a map
function which maps C to K,V (here C, K, V are instances of classes C, K, V);
when we register kryo serialization, should I register all three of these classes?
Best Wishes
Triones Deng
This e-mail may be confidential. If you are not the addressee designated in this e-mail
Hi,
Is it possible to register kryo serialization for classes contained in jars
that are added with "spark.jars"? In my experiment it doesn't seem to
work, likely because the class registration happens before the jar is
shipped to the executor and added to the classloader. Her
p into another hashmap, you should try that:
>
> Map matchData = RddForMarch.collectAsMap();
> Map tmp = new HashMap(matchData);
> final Broadcast dataMatchGlobal = jsc.broadcast(tmp);
>
> Can you please clarify:
> * Does it work w/ java serialization in the end? Or is this k
ork w/ java serialization in the end? Or is this kryo only?
* which Spark version you are using? (one of the relevant bugs was fixed in
1.2.1 and 1.3.0)
On Wed, Apr 15, 2015 at 9:06 AM, Jeetendra Gangele
wrote:
> This looks like known issue? check this out
>
> http://apache-spark-user-
this is a really strange exception ... I'm especially surprised that it
doesn't work w/ java serialization. Do you think you could try to boil it
down to a minimal example?
On Wed, Apr 15, 2015 at 8:58 AM, Jeetendra Gangele
wrote:
> Yes Without Kryo it did work out.when
Yes, without Kryo it did work out. When I remove the kryo registration it
worked.
On 15 April 2015 at 19:24, Jeetendra Gangele wrote:
> its not working with the combination of Broadcast.
> Without Kyro also not working.
>
>
> On 15 April 2015 at 19:20, Akhil Das wrote:
>
It's not working with the combination of Broadcast.
Without Kryo it's also not working.
On 15 April 2015 at 19:20, Akhil Das wrote:
> Is it working without kryo?
>
> Thanks
> Best Regards
>
> On Wed, Apr 15, 2015 at 6:38 PM, Jeetendra Gangele
> wrote:
>
>> Hi All I am
Is it working without kryo?
Thanks
Best Regards
On Wed, Apr 15, 2015 at 6:38 PM, Jeetendra Gangele
wrote:
Hi All, I am getting the below exception while using Kryo serialization with a
broadcast variable. I am broadcasting a hashmap with the below line.
Map matchData = RddForMarch.collectAsMap();
final Broadcast dataMatchGlobal = jsc.broadcast(matchData);
15/04/15 12:58:51 ERROR executor.Executor: Exception i
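The workaround suggested above in the thread (copy the collected map into a plain java.util.HashMap before broadcasting) can be sketched like this. The element types and data are assumptions, and the broadcast line is left as a comment since it needs a live JavaSparkContext:

```java
import java.util.HashMap;
import java.util.Map;

public class BroadcastCopy {
    public static void main(String[] args) {
        // Stand-in for RddForMarch.collectAsMap(); collectAsMap() may hand
        // back a Map implementation (e.g. a Scala wrapper) that Kryo
        // cannot reconstruct on the executor side.
        Map<String, Integer> matchData = new HashMap<>();
        matchData.put("key", 42);

        // Defensive copy into a plain HashMap before broadcasting.
        Map<String, Integer> tmp = new HashMap<>(matchData);

        // final Broadcast<Map<String, Integer>> dataMatchGlobal = jsc.broadcast(tmp);
        System.out.println(tmp); // {key=42}
    }
}
```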
Hello,
Thank you for your answer.
I'm already registering my classes as you're suggesting...
Regards
From: tsingfu [via Apache Spark User List]
[mailto:ml-node+s1001560n22468...@n3.nabble.com]
Sent: Monday, 13 April 2015 03:48
To: Mehdi Singer
Subject: Re: Kryo exception : E
and workers are running on different machines (cluster mode), all
with the exact same architecture/configuration
Can anyone help?
Regards
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Kryo-exception-Encountered-unregistered-class-ID-13994-tp22437.htm
sue?
>
> I'm running Spark version 1.1.0.
> My Master and workers are running on different machines (cluster mode), all
> with the exact same architecture/configuration
>
> Can anyone help?
>
> Regards
>
list.1001560.n3.nabble.com/Kryo-exception-Encountered-unregistered-class-ID-13994-tp22437.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
I have set Kryo Serializer as default serializer in SparkConf and Spark UI
confirms it too, but in the Spark logs I'm getting this exception,
java.io.OptionalDataException
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1370)
at java.io.ObjectInputStream.readO
You probably don't cause a shuffle (which requires serialization) unless
there is a join or group by.
It's possible that we need to pass the spark class loader to kryo when
creating a new instance (you can get it from Utils, I believe). We never
ran into this problem since this
rc/main/scala/org/apache/spark/sql/test/ExamplePointUDT.scala
> >
> example. But I'm having Kryo Serialization issues, here is stack trace:
>
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0
> in
> stage 6.0 failed 1 times, most recent fail
Hi,
I want to introduce custom type for SchemaRDD, I'm following this
<https://github.com/apache/spark/blob/branch-1.2/sql/core/src/main/scala/org/apache/spark/sql/test/ExamplePointUDT.scala>
example. But I'm having Kryo Serialization issues, her
a pre-built Spark; I'm not trying to compile Spark.
>>
>> The compile error appears when I try to register
>> HighlyCompressedMapStatus in my program:
>>
>> kryo.register(classOf[org.apache.spark.scheduler.HighlyCompressedMapStatus])
>>
>> If I don't register it, I get a runtime error sayi
g that it needs to be
registered (the error is only when I turn on kryo).
However the code is running smoothly with kryo turned off.
On Wed, Mar 11, 2015 at 5:38 PM, Imran Rashid wrote:
> I'm not sure what you mean. Are you asking how you can recompile all of
> spark and deploy i
code compiles fine, so I'm not sure what
problem you are running into -- we'd need a lot more info to help
On Tue, Mar 10, 2015 at 6:54 PM, Arun Luthra wrote:
> Does anyone know how to get the HighlyCompressedMapStatus to compile?
>
> I will try turning off kryo in 1.2.0 and hope thi
Does anyone know how to get the HighlyCompressedMapStatus to compile?
I will try turning off kryo in 1.2.0 and hope things don't break. I want
to benefit from the MapOutputTracker fix in 1.2.0.
On Tue, Mar 3, 2015 at 5:41 AM, Imran Rashid wrote:
> the scala syntax for arrays is Array
the scala syntax for arrays is Array[T], not T[], so you want to use
something like:
kryo.register(classOf[Array[org.roaringbitmap.RoaringArray$Element]])
kryo.register(classOf[Array[Short]])
nonetheless, spark should take care of this itself. I'll look into it
later today.
On Mon, Mar 2, 2015
I think this is a Java vs scala syntax issue. Will check.
On Thu, Feb 26, 2015 at 8:17 PM, Arun Luthra wrote:
> Problem is noted here: https://issues.apache.org/jira/browse/SPARK-5949
>
> I tried this as a workaround:
>
> import org.apache.spark.scheduler._
> import org.roaringbitmap._
>
> ...
>
Problem is noted here: https://issues.apache.org/jira/browse/SPARK-5949
I tried this as a workaround:
import org.apache.spark.scheduler._
import org.roaringbitmap._
...
kryo.register(classOf[org.roaringbitmap.RoaringBitmap])
kryo.register(classOf[org.roaringbitmap.RoaringArray])
kryo.r
I was able to get this working by extending KryoRegistrator and setting the
"spark.kryo.registrator" property.
On Thu, Feb 12, 2015 at 12:31 PM, Corey Nolet wrote:
> I'm trying to register a custom class that extends Kryo's Serializer
> interface. I can't tell exactly what Class the registerKryo
I'm trying to register a custom class that extends Kryo's Serializer
interface. I can't tell exactly what Class the registerKryoClasses()
function on the SparkConf is looking for.
How do I register the Serializer class?
Hi, I want to include Kryo serialization in a project if possible, and
first I'm trying to run FlumeEventCount with Kryo. If I comment out the setAll
method it runs correctly, but if I use the Kryo params it returns several errors.
15/02/11 11:42:16 ERROR SparkDeploySchedulerBackend: Asked to remov
Thanks for the notification!
For now, I'll use the Kryo serializer without registering classes until the
bug fix has been merged into the next version of Spark (I guess that will
be 1.3, right?).
arun
On Sun, Feb 1, 2015 at 10:58 PM, Shixiong Zhu wrote:
> It's a bug that has
sparkConf.registerKryoClasses(Array(
>
> summaryDataClass, summaryViewClass))
>
> ===
>
> I get the following error:
>
> Exception in thread "main" java.lang.reflect.InvocationTargetException
> ...
g.reflect.InvocationTargetException
...
Caused by: org.apache.spark.SparkException: Failed to load class to
register with Kryo
...
Caused by: java.lang.ClassNotFoundException:
com.dtex.analysis.transform.SummaryData
Note that the class in question SummaryData is in the same package as the
main program and hence in the same jar
on duplicate identical
>>> values?
>>>
o insert a map step
>> before the call to sortByKey():
>>
>> .map(t => (new CustomKey(t._1),t._2))
>>
>> This constructor is just:
>>
>> public CustomKey(CustomKey left) { this.set(left); }
>>
>> Why does this work? I've no idea.
>>
this work? I've no idea.
>
> The spark job is running in yarn-client mode with all the default
> configuration values set. Using the external shuffle service and disabling
> spill compression makes no difference.
>
> Is this
A search shows several historical threads for similar Kryo issues, but none
seem to have a definitive solution. Currently using Spark 1.2.0.
While collecting/broadcasting/grouping moderately sized data sets (~500MB -
1GB), I regularly see exceptions such as the one below.
I’ve tried increasing
I'm new to Spark and have run into issues using Kryo for serialization
instead of Java. I have my SparkConf configured as such:
val conf = new SparkConf().setMaster("local").setAppName("test")
  .set("spark.kryo.registrationRequired", "false")
Hello all,
I am using Spark 1.0.2 and I have a custom receiver that works well.
I tried adding Kryo serialization to SparkConf:
val spark = new SparkConf()
…..
.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
and I am getting a strange erro
Hi guys,
I get Kryo exceptions of the type "unregistered class id" and "cannot cast
to class" when the locality level of the tasks go beyond LOCAL.
However I get no Kryo exceptions during shuffling operations.
If the locality level never goes beyond LOCAL everything wor
quite often that when the locality level of a task goes
> further than LOCAL (NODE, RACK, etc), I get some of the following
> exceptions: "too many files open", "encountered unregistered class id",
> "cannot cast X to Y".
>
> I do not get any exceptions during shuffling (which means that kryo works
> well).
I'm running Spark 1.0.0 with the following characteristics:
- 18 executors with 30G each
- Yarn client mode
- ulimit is defined in 500k
- Input data: hdfs file with 1000 partitions and 10 GB of size
Please any hint would be app
Is the class com.dataken.spark.examples.MyRegistrator public? If not, change
it to public and give it a try.
Using

    <dependency>
      <groupId>com.esotericsoftware</groupId>
      <artifactId>kryo-shaded</artifactId>
      <version>3.0.0</version>
    </dependency>

instead of

    <dependency>
      <groupId>com.esotericsoftware.kryo</groupId>
      <artifactId>kryo</artifactId>
      <version>2.24.0</version>
    </dependency>

fixed this.
On 2014-12-03 18:15, Robin Keunen wrote:
Hi all,
I am having troubles using
Hi all,
I am having troubles using Kryo and being new to this kind of
serialization, I am not sure where to look. Can someone please help me? :-)
Here is my custom class:
public class DummyClass implements KryoSerializable {
private static final Logger LOGGER =
LoggerFactory.getLogger
instead
scala.collection.convert.Wrappers.JListWrapper, as noted in the exception
above. This type was not registered with Kryo and so that's why I got the
exception.
Registering the type did not solve the problem. However, an additional call
to .toBuffer did solve the problem, since the Buffer
Don't know if this'll solve it, but if you're on Spark 1.1, the Cassandra
Connector version 1.1.0 final fixed the guava back-compat issue. Maybe taking
the guava exclusions out might help?
Date: Mon, 1 Dec 2014 10:48:25 +0100
Subject: Kryo exception for CassandraSQLRow
From: shahab.m
I am using Cassandra-Spark connector to pull data from Cassandra, process
it and write it back to Cassandra.
Now I am getting the following exception, and apparently it is Kryo
serialisation related. Does anyone know the reason, and how this can be solved?
I also tried to register
I guess I already have the answer of what I have to do here, which is to
configure the kryo object with the strategy as above.
Now the question becomes: how can I pass this custom kryo configuration to
the spark kryo serializer / kryo registrator?
I've had a look at the code but I am still f
(ObjectField.java:125)
I have been running into similar issues when using avro classes, that I was
able to resolve by registering them with a Kryo serializer that uses
chill-avro. However, in this case the field is in a case class and it seems
that registering the class does not help.
I found this stack
The problem was I didn't use the correct class name; it should
be org.apache.spark.serializer.KryoSerializer
On Mon, Nov 24, 2014 at 11:12 PM, Daniel Haviv
wrote:
> Hi,
> I want to test Kryo serialization but when starting spark-shell I'm
> hitting
Hi,
I want to test Kryo serialization but when starting spark-shell I'm hitting
the following error:
java.lang.ClassNotFoundException: org.apache.spark.KryoSerializer
the kryo-2.21.jar is on the classpath so I'm not sure why it's not picking
it up.
Thanks for your help,
Daniel
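The fix in this thread comes down to the fully-qualified name: the serializer lives in the org.apache.spark.serializer package. A minimal config sketch (the registrator line is optional, and the class name on it is hypothetical):

```
# spark-defaults.conf, or the equivalent SparkConf.set(...) calls
spark.serializer        org.apache.spark.serializer.KryoSerializer
# note: NOT org.apache.spark.KryoSerializer

# only if you register classes through a custom registrator:
# spark.kryo.registrator  com.example.MyKryoRegistrator
```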
Hi,
If I look inside algebird Monoid implementation it uses
java.io.Serializable...
But when we use CMS/HLL in examples.streaming.TwitterAlgebirdCMS, I don't
see a KryoRegistrator for CMS and HLL monoid...
In these examples we will run with Kryo serialization on CMS and HLL or
they will be
How about this?
Class.forName("[Lorg.apache.spark.util.collection.CompactBuffer;")
On Tue, Sep 30, 2014 at 5:33 PM, Andras Barjak
wrote:
> Hi,
>
> what is the correct scala code to register an Array of this private spark
> class to Kryo?
>
> "java.lang.Illeg
Hi,
what is the correct scala code to register an Array of this private spark
class to Kryo?
"java.lang.IllegalArgumentException: Class is not registered:
org.apache.spark.util.collection.CompactBuffer[]
Note: To register this class use:
kryo.reg
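The `[L...;` form in that error is just the JVM's binary name for an object-array class, so `Class.forName` on that string and Scala's `classOf[Array[T]]` resolve to the same Class object. A JDK-only sketch using java.lang.String, since CompactBuffer is Spark-internal:

```java
public class ArrayClassNames {
    public static void main(String[] args) throws Exception {
        // "[L<fqcn>;" is the binary name of an array-of-<fqcn> class,
        // i.e. exactly the class the Kryo message asks you to register.
        Class<?> byName = Class.forName("[Ljava.lang.String;");
        System.out.println(byName == String[].class);   // true
        System.out.println(byName.getName());           // [Ljava.lang.String;

        // For the Spark case, either form should work:
        //   kryo.register(Class.forName("[Lorg.apache.spark.util.collection.CompactBuffer;"))
        //   kryo.register(classOf[Array[CompactBuffer]])   // Scala
    }
}
```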
We're running into an error (below) when trying to read spilled shuffle
data back in.
Has anybody encountered this before / is anybody familiar with what causes
these Kryo UnsupportedOperationExceptions?
any guidance appreciated,
Sandy
---
com.esotericsoftware.kryo.KryoException (com.esotericsoftware.kryo.KryoException:
Hi Mohan,
It’s a bit convoluted to follow in their source, but they essentially typedef
KSerializer as being a KryoSerializer, and then their serializers all extend
KSerializer. Spark should identify them properly as Kryo Serializers, but I
haven’t tried it myself.
Regards,
Frank Austin
You may want to look at Twitter's Chill serializers, as they may have a higher
performance Kryo serializer for Avro.
Regards,
Frank Austin Nothaft
fnoth...@berkeley.edu
fnoth...@eecs.berkeley.edu
202-340-0466
On Sep 18, 2014, at 8:43 AM, mohan.gadm wrote:
> Thanks for the info frank.
> s
Thanks for the info, Frank.
So your suggestion could be to use the Avro serializer; I just have to
configure it like Kryo for the same property? And is there any registering
process for this, or do I just specify the serializer?
Also, does it affect performance? What measures should be taken to avoid that?
(I'm using kryo
Hi Mohan,
It’s been a while since I’ve looked at this specifically, but I don’t think the
default Kryo serializer will properly serialize Avro. IIRC, there are
complications around the way that Avro handles nullable fields, which would be
consistent with the NPE you’re encountering here
Hi Frank, thanks for the info, that's great. But I'm not saying the Avro
serializer is failing; Kryo is failing.
I'm using the kryo serializer and registering Avro-generated classes with kryo:
sparkConf.set("spark.serializer",
"org.apache.spark.serializer.KryoSerializer")