Hi, dear developers. I am using Spark Streaming to read data from Kafka. The
program had already been running for about 120 hours, but today it failed
because of a driver OOM, as follows:


Container [pid=49133,containerID=container_1429773909253_0050_02_000001] is 
running beyond physical memory limits. Current usage: 2.5 GB of 2.5 GB physical 
memory used; 3.2 GB of 50 GB virtual memory used. Killing container.


I set --driver-memory to 2g. In my understanding, the driver is responsible
for job scheduling and monitoring (please correct me if I'm wrong), so why is
it using so much memory?
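
One thing I noticed while reading about this: on YARN, the container limit is
not just the JVM heap. If I understand correctly, the 2.5 GB limit is my 2g of
--driver-memory plus the default spark.yarn.driver.memoryOverhead (384 MB in
this version, I believe), rounded up by YARN. If some of the usage is
off-heap, I could raise the overhead at submit time, roughly like this (the
class name and jar are placeholders for my actual job):

spark-submit --master yarn-cluster \
  --driver-memory 2g \
  --conf spark.yarn.driver.memoryOverhead=1024 \
  --class com.example.MyStreamingJob my-streaming-job.jar

But I suspect that would only delay the kill if this is a real leak, which is
why I am asking.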


So I used jmap to inspect another instance of this program (which had already
been running for about 48 hours):

sudo /home/q/java7/jdk1.7.0_45/bin/jmap -histo:live 31256

In the resulting histogram, the java.util.HashMap$Entry and java.lang.Long
objects were using about 600 MB of memory!
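
To watch how this grows, I have been re-running the histogram periodically and
keeping the top entries, roughly like this (same pid as above; the log path is
just where I chose to write):

while true; do
  sudo /home/q/java7/jdk1.7.0_45/bin/jmap -histo:live 31256 | head -20 >> /tmp/driver-histo.log
  sleep 3600
done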


I also used jmap on another instance (which had been running for about 1
hour), and in that result the java.util.HashMap$Entry and java.lang.Long
objects were not using nearly as much memory. But I found that, as time goes
by, the java.util.HashMap$Entry and java.lang.Long objects occupy more and
more memory. Is this a memory leak in the driver, or is there some other
explanation?
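
If it would help, I can capture a full heap dump the next time usage gets high
and trace what is holding those HashMap entries (the dump path is just an
example):

sudo /home/q/java7/jdk1.7.0_45/bin/jmap -dump:live,format=b,file=/tmp/driver-heap.hprof 31256

and then open the .hprof file in a tool such as Eclipse MAT to follow the
reference chains.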
Thanks
Best Regards