Hi, Spark users.
I am getting an OutOfMemoryError (Java heap space) after switching
to StorageLevel.MEMORY_ONLY_SER.
MEMORY_AND_DISK_SER throws the same error.
I thought the DISK option would spill blocks that don't fit in memory to disk.
What could cause the OOM in this situation?
Is there any way to avoid it?
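For context, this is roughly how I persist the RDD. I also tried shrinking the
partitions first, since I read that Spark still has to unroll one partition at
a time in memory even with a DISK storage level. The input path and the
partition count are placeholders for my actual job:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

val sc = new SparkContext(new SparkConf().setAppName("cache-test"))
val rdd = sc.textFile("hdfs:///some/input")  // placeholder input path

val persisted = rdd
  .repartition(200)  // smaller partitions -> smaller unroll buffers; 200 is a guess
  .persist(StorageLevel.MEMORY_AND_DISK_SER)

persisted.count()  // force materialization so the blocks actually get cached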
I found unshaded Google Guava classes used internally in
spark-network-common while working with ElasticSearch.
The following link discusses the duplicate-dependency conflict caused by
these Guava classes and how I solved the build issue.
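In short, I relocated Guava inside my own application jar so it no longer
clashes with the unshaded copy pulled in via spark-network-common. A minimal
sketch of the shade rule, assuming an sbt build with the sbt-assembly plugin
(the target package name "myshaded" is arbitrary):

// build.sbt (requires sbt-assembly in project/plugins.sbt)
assemblyShadeRules in assembly := Seq(
  // Rewrite all references to Guava classes to a private package in my jar.
  ShadeRule.rename("com.google.common.**" -> "myshaded.com.google.common.@1").inAll
)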
Hi Sparkers.
I am very new to Spark, and I am occasionally getting an RpcTimeoutException
with the following error:
> 15/11/01 22:19:46 WARN HeartbeatReceiver: Removing executor 0 with no
> recent heartbeats: 321792 ms exceeds timeout 30 ms
> 15/11/01 22:19:46 ERROR TaskSchedulerImpl: Lost
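For what it's worth, I am considering raising the network timeout and keeping
the executor heartbeat interval well below it. This is the conf I plan to try;
the concrete values are guesses for my cluster, not recommendations:

import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("my-app")  // placeholder app name
  .set("spark.network.timeout", "300s")            // controls the heartbeat timeout
  .set("spark.executor.heartbeatInterval", "30s")  // must stay well below spark.network.timeout

Does that sound like the right knobs, or is the root cause usually elsewhere
(e.g. long GC pauses on the executor)?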