Changing the JDK from 1.8.0_45 to 1.7.0_79 solved this issue for me.

I also saw https://issues.apache.org/jira/browse/SPARK-6388, but that turned out not to be the problem here.
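For anyone hitting the same NPE, one way to switch the JDK without uninstalling 1.8 is to point JAVA_HOME at a JDK 7 install before launching spark-submit. A minimal sketch (the JDK path below is hypothetical; adjust it to wherever JDK 1.7.0_79 lives on your machine):

```shell
# Hypothetical JDK 7 install location -- adjust for your environment.
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk
export PATH="$JAVA_HOME/bin:$PATH"

# spark-submit reads JAVA_HOME for the driver JVM; re-run the same job:
#   bin/spark-submit --class org.apache.spark.examples.SparkPi \
#     --master yarn-client --executor-memory 2G \
#     lib/spark-examples-1.3.1-hadoop2.4.0.jar 20000
echo "Using JAVA_HOME=$JAVA_HOME"
```

In yarn-client mode the AM and executors run in YARN containers, so if the cluster nodes default to JDK 8 you may also need to set `spark.yarn.appMasterEnv.JAVA_HOME` and `spark.executorEnv.JAVA_HOME` (via `--conf`) so the remote JVMs pick up JDK 7 as well.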

On Thu, Jul 2, 2015 at 1:30 PM, xiaohe lan <zombiexco...@gmail.com> wrote:

> Hi Experts,
>
> Hadoop version: 2.4
> Spark version: 1.3.1
>
> I am running the SparkPi example application.
>
> bin/spark-submit --class org.apache.spark.examples.SparkPi --master
> yarn-client --executor-memory 2G lib/spark-examples-1.3.1-hadoop2.4.0.jar
> 20000
>
> The same command sometimes emits a WARN from ReliableDeliverySupervisor and
> sometimes does not; some runs succeed even with the WARN:
>
> bin/spark-submit --class org.apache.spark.examples.SparkPi --master
> yarn-client --executor-memory 2G lib/spark-examples-1.3.1-hadoop2.4.0.jar
> 10000
> 15/07/02 04:38:20 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> Pi is roughly 3.141633956
>
> bin/spark-submit --class org.apache.spark.examples.SparkPi --master
> yarn-client --executor-memory 2G lib/spark-examples-1.3.1-hadoop2.4.0.jar
> 20000
> 15/07/02 05:17:42 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> 15/07/02 05:17:53 WARN ReliableDeliverySupervisor: Association with remote
> system [akka.tcp://sparkYarnAM@hostname:32544] has failed, address is now
> gated for [5000] ms. Reason is: [Disassociated].
> 15/07/02 05:18:01 ERROR YarnClientSchedulerBackend: Yarn application has
> already exited with state FINISHED!
> Exception in thread "main" java.lang.NullPointerException
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:544)
>         at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:28)
>         at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>         at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
>         at
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
>         at
> org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> bin/spark-submit --class org.apache.spark.examples.SparkPi --master
> yarn-client --executor-memory 2G lib/spark-examples-1.3.1-hadoop2.4.0.jar
> 10000
> 15/07/02 05:23:51 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> 15/07/02 05:24:09 WARN ReliableDeliverySupervisor: Association with remote
> system [akka.tcp://sparkYarnAM@hostname:15959] has failed, address is now
> gated for [5000] ms. Reason is: [Disassociated].
> Pi is roughly 3.141625776
>
> Also, the Spark UI is only available when I set --master to local.
>
> What could have caused these issues?
>
> Thanks,
> Xiaohe
>
