Hi, I'm a Spark newbie.

We installed spark-1.0.2-bin-cdh4 on a 'super machine' with 256 GB of memory
and 48 cores.

I tried to allocate 64 GB of memory to a job, but for whatever reason Spark is
only using around 9 GB at most.

I submitted the Spark job with the following command:
"
/bin/spark-submit --class SimpleApp --master local[16] --executor-memory 64G
/var/tmp/simple-project_2.10-1.0.jar /data/lucene/ns.gz
"

When I run the 'top' command, I see only about 9 GB of memory used by the Spark
process:

PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
3047005 fran  30  10 8785m 703m  18m S 112.9  0.3  48:19.63 java


Any idea why this is happening? I've also tried to set the memory
programmatically using
" new SparkConf().set("spark.executor.memory", "64g") ", but that didn't
change anything either.
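
For reference, the driver program is shaped roughly like this (simplified; the
real job does more than the count shown here, but the SparkConf line is the
setting I mentioned):

import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // Try to request 64 GB programmatically, on top of the
    // --executor-memory flag passed to spark-submit
    val conf = new SparkConf()
      .setAppName("SimpleApp")
      .set("spark.executor.memory", "64g")
    val sc = new SparkContext(conf)

    // args(0) is the gzipped input file (e.g. /data/lucene/ns.gz);
    // a simple line count stands in for the real processing here
    val lines = sc.textFile(args(0))
    println("Line count: " + lines.count())

    sc.stop()
  }
}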

Is there some limitation when running in 'local' mode?

Thanks.



