Try with: --master yarn-cluster
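
For instance, a sketch reusing the options and paths from your command below (untested):

./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-cluster \
  --driver-memory 512m --num-executors 2 --executor-memory 512m --executor-cores 2 \
  hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar 10

In yarn-cluster mode the driver runs inside a YARN container, so YARN can localize an hdfs:// application jar; in yarn-client mode the local client JVM has to load the main class itself, which is why it printed "Skip remote jar" and then failed with ClassNotFoundException.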

On Wed, Jun 22, 2016 at 4:30 PM, 另一片天 <958943...@qq.com> wrote:

> ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master
> yarn-client --driver-memory 512m --num-executors 2 --executor-memory 512m
> --executor-cores 2
> hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar
> 10
> Warning: Skip remote jar
> hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar.
> java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:348)
> at org.apache.spark.util.Utils$.classForName(Utils.scala:174)
> at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>
>
> ------------------ Original Message ------------------
> *From:* "Yash Sharma" <yash...@gmail.com>
> *Sent:* Wednesday, June 22, 2016, 2:28 PM
> *To:* "另一片天" <958943...@qq.com>
> *Cc:* "Saisai Shao" <sai.sai.s...@gmail.com>; "user" <user@spark.apache.org>
>
> *Subject:* Re: Could not find or load main class
> org.apache.spark.deploy.yarn.ExecutorLauncher
>
> Or better, try the master as yarn-cluster:
>
> ./bin/spark-submit \
> --class org.apache.spark.examples.SparkPi \
> --master yarn-cluster \
> --driver-memory 512m \
> --num-executors 2 \
> --executor-memory 512m \
> --executor-cores 2 \
> hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar
>
> On Wed, Jun 22, 2016 at 4:27 PM, 另一片天 <958943...@qq.com> wrote:
>
>> Is it able to run in local mode?
>>
>> What do you mean? Standalone mode?
>>
>>
>> ------------------ Original Message ------------------
>> *From:* "Yash Sharma" <yash...@gmail.com>
>> *Sent:* Wednesday, June 22, 2016, 2:18 PM
>> *To:* "Saisai Shao" <sai.sai.s...@gmail.com>
>> *Cc:* "另一片天" <958943...@qq.com>; "user" <user@spark.apache.org>
>> *Subject:* Re: Could not find or load main class
>> org.apache.spark.deploy.yarn.ExecutorLauncher
>>
>> Try providing the jar with the hdfs prefix. It's probably just that Spark
>> is not able to find the jar on all nodes.
>>
>> hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar
>>
>> Is it able to run in local mode?
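>>
>> For a quick check in local mode, something along these lines (a sketch; it
>> assumes the examples jar shipped in the lib/ directory of your local Spark
>> install):
>>
>> ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
>>   --master local[2] \
>>   lib/spark-examples-1.6.1-hadoop2.6.0.jar 10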
>>
>> On Wed, Jun 22, 2016 at 4:14 PM, Saisai Shao <sai.sai.s...@gmail.com>
>> wrote:
>>
>>> spark.yarn.jar (default: none): The location of the Spark jar file, in case
>>> overriding the default location is desired. By default, Spark on YARN will
>>> use a Spark jar installed locally, but the Spark jar can also be in a
>>> world-readable location on HDFS. This allows YARN to cache it on nodes so
>>> that it doesn't need to be distributed each time an application runs. To
>>> point to a jar on HDFS, for example, set this configuration to
>>> hdfs:///some/path.
>>>
>>> spark.yarn.jar is used for the Spark runtime system jar, i.e. the Spark
>>> assembly jar, not the application jar (here the examples assembly jar). In
>>> your case you pointed it at the examples assembly jar you uploaded to HDFS;
>>> the Spark system classes are not packed in that jar, so ExecutorLauncher
>>> cannot be found.
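>>>
>>> If it helps, a rough sketch of the intended setup (the assembly jar name
>>> below is an assumption based on a standard Spark 1.6.1 / Hadoop 2.6 binary
>>> distribution; adjust the paths to your install):
>>>
>>> # upload the Spark assembly jar (not the examples jar) to HDFS
>>> hdfs dfs -put lib/spark-assembly-1.6.1-hadoop2.6.0.jar \
>>>   hdfs://master:9000/user/shihj/spark_lib/
>>>
>>> # conf/spark-defaults.conf: point spark.yarn.jar at the assembly jar
>>> spark.yarn.jar hdfs://master:9000/user/shihj/spark_lib/spark-assembly-1.6.1-hadoop2.6.0.jar
>>>
>>> # the application (examples) jar is still passed to spark-submit as before
>>> ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-client \
>>>   --driver-memory 512m --num-executors 2 --executor-memory 512m --executor-cores 2 \
>>>   lib/spark-examples-1.6.1-hadoop2.6.0.jar 10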
>>>
>>> Thanks
>>> Saisai
>>>
>>> On Wed, Jun 22, 2016 at 2:10 PM, 另一片天 <958943...@qq.com> wrote:
>>>
>>>> shihj@master:/usr/local/spark/spark-1.6.1-bin-hadoop2.6$
>>>> ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master
>>>> yarn-client --driver-memory 512m --num-executors 2 --executor-memory 512m
>>>> --executor-cores 2
>>>> /user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar 10
>>>> Warning: Local jar
>>>> /user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar does not exist,
>>>> skipping.
>>>> java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
>>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>> at java.lang.Class.forName0(Native Method)
>>>> at java.lang.Class.forName(Class.java:348)
>>>> at org.apache.spark.util.Utils$.classForName(Utils.scala:174)
>>>> at
>>>> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
>>>> at
>>>> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>>>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>> I get this error immediately.
>>>> ------------------ Original Message ------------------
>>>> *From:* "Yash Sharma" <yash...@gmail.com>
>>>> *Sent:* Wednesday, June 22, 2016, 2:04 PM
>>>> *To:* "另一片天" <958943...@qq.com>
>>>> *Cc:* "user" <user@spark.apache.org>
>>>> *Subject:* Re: Could not find or load main class
>>>> org.apache.spark.deploy.yarn.ExecutorLauncher
>>>>
>>>> How about supplying the jar directly to spark-submit:
>>>>
>>>> ./bin/spark-submit \
>>>>> --class org.apache.spark.examples.SparkPi \
>>>>> --master yarn-client \
>>>>> --driver-memory 512m \
>>>>> --num-executors 2 \
>>>>> --executor-memory 512m \
>>>>> --executor-cores 2 \
>>>>> /user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar
>>>>
>>>>
>>>> On Wed, Jun 22, 2016 at 3:59 PM, 另一片天 <958943...@qq.com> wrote:
>>>>
>>>>> I configured this parameter in spark-defaults.conf:
>>>>> spark.yarn.jar
>>>>> hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar
>>>>>
>>>>> then I ran ./bin/spark-submit --class org.apache.spark.examples.SparkPi
>>>>> --master yarn-client --driver-memory 512m --num-executors 2
>>>>> --executor-memory 512m --executor-cores 2 10, and got:
>>>>>
>>>>>
>>>>>
>>>>>    - Error: Could not find or load main class
>>>>>    org.apache.spark.deploy.yarn.ExecutorLauncher
>>>>>
>>>>> But when I don't configure that parameter, there is no error. Why? Is that
>>>>> parameter only meant to avoid uploading the resource file (the jar package)?
>>>>>
>>>>
>>>>
>>>
>>
>
