Re: OutOfMemoryError with random forest and small training dataset

2015-02-12 Thread didmar
Ok, I would suggest setting SPARK_DRIVER_MEMORY in spark-env.sh to a larger amount of memory than the default 512m.
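A minimal sketch of that suggestion (the 4g figure is illustrative, not from the thread):

    # conf/spark-env.sh -- read by Spark's launch scripts before the driver JVM starts
    export SPARK_DRIVER_MEMORY=4g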

Re: OutOfMemoryError with random forest and small training dataset

2015-02-12 Thread Sean Owen
Looking at the script, I'm not sure whether --driver-memory is supposed to work in standalone client mode. It's too late to set the driver's memory if the driver is the process that's already running. The script does specially handle the case where the value comes from the environment config, though. Not sure, this might be on
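To illustrate the timing issue Sean describes: in standalone client mode the driver runs inside the spark-submit process itself, so its heap must be fixed before that JVM launches. A hedged sketch of the two routes (MyApp and my-app.jar are placeholder names, 4g is illustrative):

    # Route 1: environment variable, visible to the launch scripts
    # before the driver JVM forks
    export SPARK_DRIVER_MEMORY=4g
    spark-submit --class MyApp my-app.jar

    # Route 2: the command-line flag, which per this thread may be
    # parsed too late to resize an already-running driver JVM in
    # client mode
    spark-submit --driver-memory 4g --class MyApp my-app.jar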

Re: OutOfMemoryError with random forest and small training dataset

2015-02-12 Thread poiuytrez
Very interesting. It works. When I set SPARK_DRIVER_MEMORY=83971m in spark-env.sh or spark-defaults.conf it works. However, when I set the --driver-memory option with spark-submit, the memory is not allocated to the Spark master. (The web UI shows the correct value of spark.driver.memory
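For reference, the property-file route uses the spark.driver.memory key rather than the environment variable name; a sketch using the poster's value (adjust for your own machine):

    # conf/spark-defaults.conf
    spark.driver.memory 83971m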

Re: OutOfMemoryError with random forest and small training dataset

2015-02-11 Thread poiuytrez
cat ../hadoop/spark-install/conf/spark-env.sh
export SCALA_HOME=/home/hadoop/scala-install
export SPARK_WORKER_MEMORY=83971m
export SPARK_MASTER_IP=spark-m
export SPARK_DAEMON_MEMORY=15744m
export SPARK_WORKER_DIR=/hadoop/spark/work
export SPARK_LOCAL_DIRS=/hadoop/spark/tmp
export
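Note that the file above sets SPARK_WORKER_MEMORY and SPARK_DAEMON_MEMORY but no SPARK_DRIVER_MEMORY, which would leave the driver at the 512m default mentioned earlier in the thread. A hedged sketch of the one-line addition (value illustrative):

    # conf/spark-env.sh -- give the driver JVM more heap than the 512m default
    export SPARK_DRIVER_MEMORY=4g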
