What Spark tarball are you using? You may want to try the one built for
Hadoop 2.6 (the one for Hadoop 2.4 may cause that issue, IIRC).
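
If you're not sure which build you have, one quick check (a hypothetical
snippet, nothing from this thread) is to print the Hadoop version the Spark
assembly actually bundles, e.g. by pasting this line into spark-shell:

    // Prints the Hadoop client version baked into the Spark assembly;
    // a 2.4.x result here would match the mismatch I'm suspecting.
    println(org.apache.hadoop.util.VersionInfo.getVersion)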

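As for why a hadoop-2.4 build trips over this: judging from the trace below,
the "e10" epoch field that newer YARN NodeManagers put into container IDs is
the culprit; the older ContainerId parsing expects a plain number in that
slot. A minimal standalone sketch of the failing parse (Scala; the object
name is mine):

    object ContainerIdParseDemo {
      def main(args: Array[String]): Unit = {
        val id = "container_e10_1430482716098_0607_02_000001"
        // Older ContainerId parsing splits on "_" and expects the field
        // right after "container" to be the cluster timestamp (a long).
        // "e10" is the epoch that newer NodeManagers add, so the parse
        // throws, exactly as in the log below.
        val parts = id.split("_")
        try {
          java.lang.Long.parseLong(parts(1))
        } catch {
          case e: NumberFormatException =>
            println(e.getMessage) // For input string: "e10"
        }
      }
    }
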
On Tue, May 5, 2015 at 6:54 PM, felicia <shsh...@tsmc.com> wrote:

> Hi all,
>
> We're trying to use Spark SQL on CDH 5.3.0 in cluster mode,
> and we get this error with both Java and Python:
>
>
> Application application_1430482716098_0607 failed 2 times due to AM
> Container for appattempt_1430482716098_0607_000002 exited with exitCode: 10
> due to: Exception from container-launch.
> Container id: container_e10_1430482716098_0607_02_000001
> Exit code: 10
> Stack trace: ExitCodeException exitCode=10:
>     at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
>     at org.apache.hadoop.util.Shell.run(Shell.java:455)
>     at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
>     at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:197)
>     at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:299)
>     at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>     at java.lang.Thread.run(Thread.java:745)
> Container exited with a non-zero exit code 10.
> Failing this attempt. Failing the application.
>
>
> The detailed log on the nodes shows:
>
> ERROR yarn.ApplicationMaster: Uncaught exception:
> java.lang.IllegalArgumentException: Invalid ContainerId: container_e10_1430482716098_0607_02_000001
>     at org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:182)
>     at org.apache.spark.deploy.yarn.YarnRMClient.getAttemptId(YarnRMClient.scala:93)
>     at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:83)
>     at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:576)
>     at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
>     at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>     at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
>     at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:574)
>     at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:597)
>     at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
> Caused by: java.lang.NumberFormatException: For input string: "e10"
>     at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>     at java.lang.Long.parseLong(Long.java:589)
>     at java.lang.Long.parseLong(Long.java:631)
>     at org.apache.hadoop.yarn.util.ConverterUtils.toApplicationAttemptId(ConverterUtils.java:137)
>     at org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:177)
>     ... 12 more
>
> We've already tried the solution described at the link below, but it
> doesn't seem to work:
> https://github.com/abhibasu/sparksql/wiki/SparkSQL-Configuration-in-CDH-5.3
>
> Please advise if there are any environment settings we should check or
> share to help clarify the problem. Thanks!
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/what-does-Container-exited-with-a-non-zero-exit-code-10-means-tp22778.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.


-- 
Marcelo
