Hi,
    I set up a small cluster of 3 machines; each machine has 64 GB RAM and
11 cores, and I am using Spark 0.9.

   I have set spark-env.sh as follows:

   SPARK_MASTER_IP=192.168.35.2
   SPARK_MASTER_PORT=7077
   SPARK_MASTER_WEBUI_PORT=12306
   SPARK_WORKER_CORES=3
   SPARK_WORKER_MEMORY=20g
   SPARK_JAVA_OPTS+="-Dspark.executor.memory=5g"
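
   For reference, here is the same file with explicit exports. This is
   only a guess on my part: spark-env.sh is sourced by the launch scripts,
   so values that are not exported might not reach the daemon JVMs'
   environment.

   # spark-env.sh -- same settings with explicit exports (untested guess)
   export SPARK_MASTER_IP=192.168.35.2
   export SPARK_MASTER_PORT=7077
   export SPARK_MASTER_WEBUI_PORT=12306
   export SPARK_WORKER_CORES=3            # cores each worker should offer
   export SPARK_WORKER_MEMORY=20g         # memory each worker should offer
   # leading space so the option does not fuse with any earlier opts
   export SPARK_JAVA_OPTS+=" -Dspark.executor.memory=5g"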

   But I see the following in the master log:

   Spark Command: java -cp :/usr/local/spark-0.9.0/conf:/usr/local/spark-0.9.0/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.2.0.jar -Dspark.akka.logLifecycleEvents=true -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip 192.168.35.2 --port 7077 --webui-port 12306
   ========================================

   log4j:WARN No appenders could be found for logger (akka.event.slf4j.Slf4jLogger).
   log4j:WARN Please initialize the log4j system properly.
   log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
   14/05/07 08:30:31 INFO Master: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
   14/05/07 08:30:31 INFO Master: Starting Spark master at spark://192.168.35.2:7077
   14/05/07 08:30:31 INFO MasterWebUI: Started Master web UI at http://pug-master:12306
   14/05/07 08:30:31 INFO Master: I have been elected leader! New state: ALIVE
   14/05/07 08:30:34 INFO Master: Registering worker 192.168.35.2:52972 with 11 cores, 61.9 GB RAM
   14/05/07 08:30:34 INFO Master: Registering worker 192.168.35.2:43225 with 11 cores, 61.9 GB RAM


    And the following in my worker log:

   Spark Command: java -cp :/usr/local/spark-0.9.0/conf:/usr/local/spark-0.9.0/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop2.2.0.jar -Dspark.akka.logLifecycleEvents=true -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.worker.Worker spark://192.168.35.2:7077
   ========================================

   log4j:WARN No appenders could be found for logger (akka.event.slf4j.Slf4jLogger).
   log4j:WARN Please initialize the log4j system properly.
   log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
   14/05/07 08:30:34 INFO Worker: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
   14/05/07 08:30:34 INFO Worker: Starting Spark worker pug1:43225 with 11 cores, 61.9 GB RAM
   14/05/07 08:30:34 INFO Worker: Spark home: /usr/local/spark-0.9.0
   14/05/07 08:30:34 INFO WorkerWebUI: Started Worker web UI at http://pug1:8081
   14/05/07 08:30:34 INFO Worker: Connecting to master spark://192.168.35.2:7077...
   14/05/07 08:30:34 INFO Worker: Successfully registered with master spark://192.168.35.2:7077



   I have checked that I did not misspell any of the configuration names,
and I used rsync to copy spark-env.sh from the master to the workers, but
spark-env.sh does not seem to take effect: the workers register with 11
cores and 61.9 GB RAM instead of the 3 cores and 20g I configured. I do
not know what I have missed.
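
   A sanity check I plan to run on each worker (the paths assume the
   /usr/local/spark-0.9.0 layout shown in the logs above):

   # confirm the file exists where the scripts look for it
   ls -l /usr/local/spark-0.9.0/conf/spark-env.sh

   # source it in a fresh shell and print the values it sets
   bash -c '. /usr/local/spark-0.9.0/conf/spark-env.sh;
            echo "cores=$SPARK_WORKER_CORES mem=$SPARK_WORKER_MEMORY"'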
