Re: Spark MLlib - java.lang.OutOfMemoryError: Java heap space

2017-04-24 Thread Selvam Raman
This is where the job goes out of memory:

17/04/24 10:09:22 INFO TaskSetManager: Finished task 122.0 in stage 1.0 (TID 356) in 4260 ms on ip-...-45.dev (124/234)
17/04/24 10:09:26 INFO BlockManagerInfo: Removed taskresult_361 on ip-10...-185.dev:36974 in memory (size: 5.2 MB, free: 8.5 GB)
17/04/24

Spark MLlib - java.lang.OutOfMemoryError: Java heap space

2017-04-24 Thread Selvam Raman
Hi,

I have 1 master and 4 slave nodes. Input data size is 14GB. Slave node config: 32GB RAM, 16 cores.

I am trying to train a word embedding model using Spark. It is going out of memory. To train 14GB of data, how much memory do I require? I have given 20GB per executor, but below shows it is using
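For readers hitting the same error: the `taskresult_` entries in the quoted log show task results being fetched back to the driver, so the driver JVM (not only the executors) may be the heap that overflows; MLlib's Word2Vec also materializes the full vocabulary-by-vectorSize model on the driver. Below is a minimal spark-submit sketch of the memory-related settings involved. The application file name is a hypothetical placeholder, and the specific values are illustrative assumptions for the 4-node, 32GB/16-core cluster described above, not settings confirmed by the thread.

```shell
# Hypothetical invocation -- app name and values are illustrative assumptions.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 4 \
  --executor-cores 8 \
  --executor-memory 20g \
  --driver-memory 16g \
  --conf spark.driver.maxResultSize=8g \
  word2vec_train.py

# --executor-memory : heap per executor JVM (keep headroom below the 32GB node RAM
#                     for the OS and YARN/Spark memory overhead)
# --driver-memory   : heap for the driver, where Word2Vec's model is assembled
# spark.driver.maxResultSize : cap on the total size of task results serialized
#                     back to the driver (default 1g); raising it avoids a
#                     different failure mode but requires matching driver heap
```

If the OutOfMemoryError appears in the driver's stack trace, increasing `--executor-memory` alone will not help; `--driver-memory` is the setting that matters there.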