I'm trying to configure the driver memory size. So far I have tried the
following (how I applied each attempt is sketched right after this list):

   - export JAVA_INTP_OPTS="-Xmx10g"

   - export SPARK_SUBMIT_OPTIONS="--driver-memory 10g --executor-memory 10g"

   - export ZEPPELIN_JAVA_OPTS="-Xmx30000m \
    -Dspark.serializer=org.apache.spark.serializer.KryoSerializer \
    -Dspark.jars=/home/jenkins/spark-assembly.jar \
    -Dspark.master=yarn-client \
    -Dspark.executor.memory=13g \
    -Dspark.yarn.executor.memoryOverhead=3000 \
    -Dspark.executor.cores=3 \
    -Dspark.driver.memory=20g \
    -Dspark.yarn.driver.memoryOverhead=3000 \
    -Dspark.sql.autoBroadcastJoinThreshold=500485760 \
    -Dspark.network.timeout=1000s \
    -Dspark.driver.maxResultSize=2g \
    -Dspark.akka.frameSize=400 \
    -Dspark.akka.askTimeout=30 \
    -Dspark.yarn.am.memory=16g \
    -Dspark.yarn.am.memoryOverhead=3000 \
    -Dspark.dynamicAllocation.enabled=true \
    -Dspark.shuffle.service.enabled=true \
    -Dspark.kryoserializer.buffer.max=600m"

   - Changing SparkInterpreter.java directly (the rebuild step is sketched
     after this list):

   conf.set("spark.executor.memory", "10g");
   conf.set("spark.executor.cores", "2");
   conf.set("spark.driver.memory", "10g");
   conf.set("spark.shuffle.io.numConnectionsPerPeer", "5");
   conf.set("spark.sql.autoBroadcastJoinThreshold", "200483647");
   conf.set("spark.network.timeout", "400s");
   conf.set("spark.driver.maxResultSize", "3g");
   conf.set("spark.sql.hive.convertMetastoreParquet", "false");
   conf.set("spark.kryoserializer.buffer.max", "200m");
   conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
   conf.set("spark.dynamicAllocation.enabled", "true");
   conf.set("spark.shuffle.service.enabled", "true");
   conf.set("spark.dynamicAllocation.minExecutors", "1");
   conf.set("spark.dynamicAllocation.maxExecutors", "30");
   conf.set("spark.dynamicAllocation.executorIdleTimeout", "60s");
   // conf.set("spark.sql.hive.metastore.version", "1.1.0");
   conf.set("spark.dynamicAllocation.cachedExecutorIdleTimeout", "100s");

   - Setting SPARK_HOME (see the sketch after this list), but with it the
     interpreter didn't even start; it failed with "Incompatible minimum and
     maximum heap sizes specified", which, if I read that JVM error
     correctly, means the effective -Xms ended up larger than -Xmx, so some
     of the options above must be conflicting.


No matter what I do, the log always shows: "INFO [2015-11-11 14:55:24,453]
({sparkDriver-akka.actor.default-dispatcher-14} Logging.scala[logInfo]:59)
- Registering block manager 192.168.12.121:45057 with 530.0 MB RAM,
BlockManagerId(driver, 192.168.12.121, 45057)", and my Spark UI shows the
same 530.0 MB. As far as I can tell, that figure matches the default ~1g
driver heap, so every one of these settings seems to be silently ignored.
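
To see which -Xmx the interpreter JVM actually received, I check the
running process (RemoteInterpreterServer is the interpreter's main class in
my build):

   # prints each JVM's command-line flags; look for -Xmx on the interpreter
   jps -v | grep RemoteInterpreterServer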

Has anyone faced this problem, or does anyone know what to do?


-- 


Sincerely yours,
Egor Pakhomov
