Issue was resolved by upgrading Spark to version 1.6.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-heap-space-out-of-memory-tp27050p27131.html
Sent from the Apache Spark User List mailing list archive at Nabble.com
> Batch processing time is around 0.8 s, and the stages show 1/1 completed.
> Is this what you mean by pending stages?
>
> I have taken a few heap dumps but I'm not sure what I am looking at for
> the problematic classes.
From: Shahbaz [mailto:shahzadh...@gmail.com]
Sent: May 30, 2016 3:25 PM
To: Dancuart, Christian
Cc: user
Subject: Re: Spark Streaming heap space out of memory
Hi Christian,
* What is the processing time of each of your batches? Is it exceeding 15
seconds?
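For context, the usual failure mode behind this question (a sketch, not from the thread): if the time to process each batch regularly exceeds the batch interval, unprocessed batches queue up in memory and the driver eventually exhausts its heap. Assuming a 15-second interval, the stability check amounts to:

```python
# Hypothetical batch timings; the 0.8 s figure reported above would be
# well under a 15 s interval.
batch_interval_s = 15.0
processing_times_s = [0.8, 1.1, 0.9]

# The job is stable only if every batch completes within the interval;
# otherwise batches back up and memory use grows without bound.
stable = all(t < batch_interval_s for t in processing_times_s)
print("stable" if stable else "falling behind")  # prints "stable"
```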
> 36: 480 82 [Lio.netty.buffer.PoolThreadCache$MemoryRegionCache$Entry;
> 37: 7569 834968 [I
> 38: 9626 770080 org.apache.spark.rdd.MapPartitionsRDD
> 39: 31748 761952 java.lang.Long
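The quoted lines above look like `jmap -histo <pid>` output, where each entry is `rank: instance_count total_bytes class_name`. A minimal sketch (not from the thread) of turning one such line into usable numbers, e.g. to sort classes by retained bytes across dumps:

```python
def parse_histo_line(line: str):
    """Parse one `jmap -histo` entry into (rank, instances, bytes, class name)."""
    rank, instances, nbytes, cls = line.split(None, 3)
    return int(rank.rstrip(":")), int(instances), int(nbytes), cls.strip()

sample = " 38:      9626     770080  org.apache.spark.rdd.MapPartitionsRDD"
print(parse_histo_line(sample))
# -> (38, 9626, 770080, 'org.apache.spark.rdd.MapPartitionsRDD')
```

Comparing two histograms taken a few minutes apart usually makes the leaking class obvious: its instance count keeps climbing while the others stay flat.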
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-heap-space-out-of-memory-tp27050.html
Sent from the Apache Spark User List mailing list archive at Nabble.com