Unsubscribing myself from the list.
Sophia
Email: sln-1...@163.com
(Signature customized by NetEase Mail Master)
In yarn-client mode, I submit a job from the client to YARN. My spark-env.sh file contains:
export HADOOP_HOME=/usr/lib/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
SPARK_EXECUTOR_INSTANCES=4
SPARK_EXECUTOR_CORES=1
SPARK_EXECUTOR_MEMORY=1G
SPARK_DRIVER_MEMORY=2G
When I start Spark on Cloudera CDH5 with the command "service spark-master start", it reports that the Spark master is dead and a pid file exists. What can I do to solve this problem?
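[Editor's note: this error usually means a pid file was left behind by an earlier, crashed master. A minimal sketch of the check, using a temp directory for illustration; on CDH the real pid file typically lives under /var/run/spark/, so the path below is an assumption to adapt:]

```shell
#!/bin/sh
# Sketch: detect and clear a stale pid file, the usual cause of
# "Spark master is dead and pid file exists". Demonstrated in a temp
# dir; in the real case the file is typically /var/run/spark/*.pid.
WORKDIR=$(mktemp -d)
PID_FILE="$WORKDIR/spark-master.pid"
echo 999999 > "$PID_FILE"   # a pid that is almost certainly not running

if [ -f "$PID_FILE" ] && ! kill -0 "$(cat "$PID_FILE")" 2>/dev/null; then
  rm -f "$PID_FILE"         # stale: the recorded process is gone
  echo "stale pid file removed"
fi
# afterwards: service spark-master start
```

If the recorded process is genuinely dead, removing the file and restarting the service normally clears the error; the master log should show why the original process died.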
In yarn-client mode, is Spark deployed on the YARN nodes? If it is deployed only on the client, can Spark still submit jobs to YARN?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/yarn-client-mode-question-tp6213.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
But,I don't understand this point,is it necessary to deploy slave node of
spark in the yarn node?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/yarn-client-mode-question-tp6213p6216.html
Thank you
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/yarn-client-mode-question-tp6213p6224.html
How did you deal with this problem? I have run into it these days. God bless me.
Best regards,
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-startup-time-out-tp1753p5738.html
How did you deal with this problem in the end? I have run into it as well.
Best regards,
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-startup-time-out-tp1753p5739.html
My configuration is as follows; the slave nodes have been configured, but I do not know what has gone wrong with Shark. Can you help me, sir?
shark-env.sh
export SPARK_USER_HOME=/root
export SPARK_MEM=2g
export SCALA_HOME=/root/scala-2.11.0-RC4
export SHARK_MASTER_MEM=1g
export
Hi
Why do I always encounter remoting errors:
akka.remote.RemoteTransportException and
java.util.concurrent.TimeoutException?
Best Regards,
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/build-shark-hadoop-CDH5-on-hadoop2-0-0-CDH4-tp5574p5629.html
I have built Shark with sbt, but sbt throws an exception:
[error] sbt.ResolveException: unresolved dependency:
org.apache.hadoop#hadoop-client;2.0.0: not found
What can I do to build it successfully?
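[Editor's note: there is no plain hadoop-client 2.0.0 artifact in the public Maven repositories; the Apache 2.0.x releases were tagged 2.0.0-alpha and similar, and CDH builds carry versions like 2.0.0-cdh4.7.0, so the resolver cannot find "2.0.0". A sketch of the usual workaround, assuming the build reads the Hadoop version from an environment variable the way Spark 0.9's own sbt build does (SPARK_HADOOP_VERSION); the exact variable Shark honors is an assumption to verify against its project/SharkBuild.scala:]

```shell
#!/bin/sh
# Assumption: the sbt build picks up the Hadoop version from an environment
# variable (Spark 0.9's build uses SPARK_HADOOP_VERSION). Point it at a
# version string that actually exists in a repository, e.g. a CDH4 build.
export SPARK_HADOOP_VERSION=2.0.0-cdh4.7.0   # illustrative CDH version string
echo "building against hadoop-client;$SPARK_HADOOP_VERSION"
# then: ./sbt/sbt assembly  (with the Cloudera Maven repo configured as a resolver)
```

CDH artifacts live in Cloudera's own Maven repository, so that repository must also be in the build's resolver list.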
[root@CHBM220 spark-0.9.1]# SPARK_JAR=.assembly/target/scala-2.10/spark-assembly_2.10-0.9.1-hadoop2.2.0.jar \
  ./bin/spark-class org.apache.spark.deploy.yarn.Client \
  --jar examples/target/scala-2.10/spark-examples_2.10-assembly-0.9.1.jar \
  --class org.apache.spark.examples.SparkPi \
  --args yarn-standalone
Hi, everyone,
Hi all,
I have made HADOOP_CONF_DIR and YARN_CONF_DIR point to the directory that
contains the (client-side) configuration files for the Hadoop cluster.
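[Editor's note: for reference, a minimal sketch of that environment setup; the paths are assumptions, so substitute wherever your cluster's *-site.xml files actually live:]

```shell
#!/bin/sh
# Point Spark's YARN client at the Hadoop client-side configuration.
# Paths are illustrative; HADOOP_CONF_DIR must contain core-site.xml,
# yarn-site.xml, etc. for the target cluster.
export HADOOP_HOME=/usr/lib/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export YARN_CONF_DIR=$HADOOP_CONF_DIR
echo "conf dir: $HADOOP_CONF_DIR"
```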
The command I run to launch the YARN Client is like this:
#
Hi all,
#./sbt/sbt assembly
Launching sbt from sbt/sbt-launch-0.12.4.jar
Invalid or corrupt jarfile sbt/sbt-launch-0.12.4.jar
Why can't I run sbt successfully?
Best regards,
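[Editor's note: "Invalid or corrupt jarfile" almost always means the launcher download failed partway, or an HTML error page was saved under the jar's name. Since a jar is a zip archive, its first two bytes should be the zip magic "PK"; a quick check along these lines, demonstrated on a deliberately corrupt file:]

```shell
#!/bin/sh
# Check whether a jar file is really a zip archive (first two bytes "PK").
# Demonstrated on a fake bad download in a temp dir; in the real case the
# path would be sbt/sbt-launch-0.12.4.jar.
WORKDIR=$(mktemp -d)
JAR="$WORKDIR/sbt-launch-0.12.4.jar"
echo "<html>404 Not Found</html>" > "$JAR"   # simulate a failed download

if [ "$(head -c 2 "$JAR" 2>/dev/null)" != "PK" ]; then
  CORRUPT=yes
  echo "corrupt launcher jar: delete it and re-download"
fi
```

If the check fails on the real file, delete it and fetch the launcher again (rerunning ./sbt/sbt from a fresh checkout re-downloads it).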
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/How-can-I-run-sbt-tp5429.html
Hi all,
[root@sophia spark-0.9.1]# SPARK_JAR=.assembly/target/scala-2.10/spark-assembly_2.10-0.9.1-hadoop2.2.0.jar \
  ./bin/spark-class org.apache.spark.deploy.yarn.Client \
  --jar examples/target/scala-2.10/spark-examples_2.10-assembly-0.9.1.jar \
  --class org.apache.spark.examples.SparkPi \
  --args yarn
I have modified it in spark-env.sh, but it turns out that it still does not work. I am so confused.
Best Regards
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/If-it-due-to-my-file-has-been-breakdown-tp5438p5442.html
Hey you guys,
What is the difference between Spark on YARN mode and standalone mode with respect to resource scheduling?
Wish you happy every day.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/different-in-spark-on-yarn-mode-and-standalone-mode-tp5300.html
Hi,
When I configured Spark and ran the shell command ./spark-shell, it told me:
WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
When it connected to the ResourceManager, it stopped. What should I do?
Awaiting your reply
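[Editor's note: two separate things may be going on here. The NativeCodeLoader message is only a warning; Hadoop falls back to its built-in Java implementations and Spark still works. If you want the warning gone, put Hadoop's native libraries on the loader path; a sketch under the assumption of a standard /usr/lib/hadoop install layout:]

```shell
#!/bin/sh
# Silence the native-hadoop warning by putting Hadoop's native libs on the
# loader path. /usr/lib/hadoop is an assumed install prefix.
export HADOOP_HOME=/usr/lib/hadoop
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
echo "native lib path: $LD_LIBRARY_PATH"
```

The hang while connecting to the ResourceManager is a different problem: it usually means HADOOP_CONF_DIR/YARN_CONF_DIR does not point at the correct yarn-site.xml, so the client keeps retrying an unreachable ResourceManager address.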