>> [...] reuse objects similar to MapReduce
>> (HadoopRDD does this by actually using Hadoop's Writables, for instance),
>> but the general Spark APIs don't support this because mutable objects are
>> not friendly to caching or serializing.
>>
>>
>
> On Tue, Jul 8, 2014 at 9:27 AM, Konstantin Kudryavtsev <
> kudryavtsev.konstan...@gmail.com> wrote:
>
>> Hi all,
>>
>> I ran into the following exception during the map step:
>> java.lang.OutOfMemoryError (java.lang.OutOfMemoryError: GC overhead limit exceeded)
Hi all,
I ran into the following exception during the map step:
java.lang.OutOfMemoryError (java.lang.OutOfMemoryError: GC overhead limit exceeded)
java.lang.reflect.Array.newInstance(Array.java:70)
com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read
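
To make the point in the reply above concrete, here is a minimal sketch (not from the original thread; the path, app name, and Writable classes are placeholders) of the usual workaround when reading Hadoop Writables with Spark: HadoopRDD hands back the same mutable Writable instances for every record, so copy the data into immutable values before caching or shuffling.

import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.spark.{SparkConf, SparkContext}

object WritableCopySketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("writable-copy").setMaster("local[*]"))

    // sequenceFile goes through HadoopRDD, which re-uses the same
    // LongWritable/Text instances for every record in a partition.
    val raw = sc.sequenceFile("hdfs:///path/to/input",
      classOf[LongWritable], classOf[Text])

    // Copy the mutable Writables into immutable Scala values *before*
    // caching; caching the Writables directly would store many references
    // to the same (last-seen) object.
    val copied = raw.map { case (k, v) => (k.get, v.toString) }.cache()

    println(copied.count())
    sc.stop()
  }
}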