This most likely happens because the Spark 1.3 build you downloaded was
compiled against an older version of the Hadoop libraries than the one
CDH ships, and those older libraries cannot parse the container IDs
that CDH generates.
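
To illustrate (this is not the actual Hadoop code, just a sketch of the
same split-and-parse logic): with YARN work-preserving recovery enabled,
an epoch field like "e04" is inserted into the container ID where older
parsers expect a numeric cluster timestamp, so the Long.parseLong call
in your stack trace blows up.

```shell
old_id="container_1427159778706_0002_01_000001"      # pre-recovery format
new_id="container_e04_1427159778706_0002_01_000001"  # epoch "e04" inserted

# Older ConverterUtils effectively parses the second "_"-separated field
# as a long; "e04" is not numeric, hence the NumberFormatException.
echo "$old_id" | cut -d_ -f2   # 1427159778706 -> parses fine
echo "$new_id" | cut -d_ -f2   # e04           -> not a number, parse fails
```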

You can try to work around this by manually adding the CDH jars to the
front of the classpath: set "spark.driver.extraClassPath" and
"spark.executor.extraClassPath" to "/usr/lib/hadoop/client/*" (or the
corresponding location if you're using parcels).
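
For example, something along these lines when launching spark-shell
(the parcel path is a typical default, adjust for your install):

```shell
# Prepend CDH's Hadoop client jars to both driver and executor classpaths.
# For parcel installs the jars usually live under
# /opt/cloudera/parcels/CDH/lib/hadoop/client/* instead.
spark-shell --master yarn-client \
  --conf spark.driver.extraClassPath='/usr/lib/hadoop/client/*' \
  --conf spark.executor.extraClassPath='/usr/lib/hadoop/client/*'
```

The same two properties can go in conf/spark-defaults.conf if you want
them applied to every application.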


On Mon, Mar 23, 2015 at 6:32 PM, Manoj Samel <manojsamelt...@gmail.com> wrote:
> Spark 1.3, CDH 5.3.2, Kerberos
>
> Setup works fine with base configuration, spark-shell can be used in yarn
> client mode etc.
>
> When work recovery feature is enabled via
> http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/admin_ha_yarn_work_preserving_recovery.html,
> the spark-shell fails with following log
>
> 15/03/24 01:20:16 ERROR yarn.ApplicationMaster: Uncaught exception:
> java.lang.IllegalArgumentException: Invalid ContainerId:
> container_e04_1427159778706_0002_01_000001
>         at
> org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:182)
>         at
> org.apache.spark.deploy.yarn.YarnRMClient.getAttemptId(YarnRMClient.scala:93)
>         at
> org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:83)
>         at
> org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:576)
>         at
> org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
>         at
> org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>         at
> org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
>         at
> org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:574)
>         at
> org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:597)
>         at
> org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
> Caused by: java.lang.NumberFormatException: For input string: "e04"
>         at
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>         at java.lang.Long.parseLong(Long.java:589)
>         at java.lang.Long.parseLong(Long.java:631)
>         at
> org.apache.hadoop.yarn.util.ConverterUtils.toApplicationAttemptId(ConverterUtils.java:137)
>         at
> org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:177)
>         ... 12 more
> 15/03/24 01:20:16 INFO yarn.ApplicationMaster: Final app status: FAILED,
> exitCode: 10, (reason: Uncaught exception: Invalid ContainerId:
> container_e04_1427159778706_0002_01_000001)
>
>



-- 
Marcelo
