Just to add to Sandy's comment, check your client configuration
(generally in /etc/spark/conf). If you're using CM, you may need to
run the "Deploy Client Configuration" command on the cluster to update
the configs to match the new version of CDH.
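One quick way to check Sandy's theory about stale jars is to ask the JVM which jar a class was actually loaded from. This is only a generic sketch (on the cluster you would pass org.apache.hadoop.yarn.util.ConverterUtils as the argument; java.lang.String is used as the default here only so the demo runs anywhere):

```java
// Prints the code source (jar/path) a given class was loaded from.
// Useful for spotting an old CDH 5.0 jar lingering on the classpath.
public class WhichJar {

    static String locate(String className) throws ClassNotFoundException {
        Class<?> c = Class.forName(className);
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        // JDK bootstrap classes report a null code source.
        return (src == null) ? "bootstrap classpath" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // On the cluster: java -cp "$(hadoop classpath)" WhichJar \
        //   org.apache.hadoop.yarn.util.ConverterUtils
        String name = args.length > 0 ? args[0] : "java.lang.String";
        System.out.println(name + " -> " + locate(name));
    }
}
```

If the printed location points at a 5.0-era jar rather than the CDH 5.3 parcel, that would explain the mismatch Sandy describes below.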

On Thu, Jan 8, 2015 at 11:38 AM, Sandy Ryza <sandy.r...@cloudera.com> wrote:
> Hi Mukesh,
>
> Those line numbers in ConverterUtils in the stack trace don't appear to line
> up with CDH 5.3:
> https://github.com/cloudera/hadoop-common/blob/cdh5-2.5.0_5.3.0/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ConverterUtils.java
>
> Is it possible you're still including the old jars on the classpath in some
> way?
>
> -Sandy
>
> On Thu, Jan 8, 2015 at 3:38 AM, Mukesh Jha <me.mukesh....@gmail.com> wrote:
>>
>> Hi Experts,
>>
>> I am running a Spark Streaming job on YARN.
>>
>> The job runs fine on CDH 5.0.0, but after the upgrade to 5.3.0 it
>> cannot fetch containers and fails with the errors below. It looks like
>> the container ID is malformed: a string appears in a place where a
>> number is expected.
>>
>>
>> java.lang.IllegalArgumentException: Invalid ContainerId: container_e01_1420481081140_0006_01_000001
>> Caused by: java.lang.NumberFormatException: For input string: "e01"
>>
>>
>>
>> Is this a bug? Has anyone seen something similar, and is there a known
>> fix?
>>
>>
>>
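The failure mode can be illustrated in isolation. The sketch below is hypothetical and is not the actual Hadoop ConverterUtils code; it just mimics an old-style parser that assumes IDs look like container_&lt;clusterTimestamp&gt;_&lt;appId&gt;_&lt;attemptId&gt;_&lt;containerId&gt;, with no epoch field, and so chokes on the "e01" segment exactly as in the trace below:

```java
// Hypothetical old-style container-ID parser (NOT the real Hadoop source).
// Newer YARN builds can emit an extra epoch field ("e01") after the
// "container_" prefix, which a strictly numeric parse cannot handle.
public class ContainerIdParseDemo {

    static void parseOldStyle(String id) {
        String[] parts = id.split("_");
        // parts[0] is the literal "container"; an old parser expects
        // parts[1] to be the numeric cluster timestamp, but here it is "e01".
        long clusterTimestamp = Long.parseLong(parts[1]); // throws on "e01"
        long appId = Long.parseLong(parts[2]);
        System.out.println("parsed ts=" + clusterTimestamp + " app=" + appId);
    }

    public static void main(String[] args) {
        try {
            parseOldStyle("container_e01_1420481081140_0006_01_000001");
        } catch (NumberFormatException e) {
            System.out.println("NumberFormatException: " + e.getMessage());
        }
    }
}
```

This matches the "For input string: \"e01\"" cause in the log, which is why stale pre-upgrade jars on the classpath are the prime suspect.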
>> 15/01/08 09:50:28 INFO yarn.ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
>> 15/01/08 09:50:29 ERROR yarn.ApplicationMaster: Uncaught exception:
>> java.lang.IllegalArgumentException: Invalid ContainerId: container_e01_1420481081140_0006_01_000001
>>     at org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:182)
>>     at org.apache.spark.deploy.yarn.YarnRMClientImpl.getAttemptId(YarnRMClientImpl.scala:79)
>>     at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:79)
>>     at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:515)
>>     at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
>>     at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>>     at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
>>     at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:513)
>>     at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
>> Caused by: java.lang.NumberFormatException: For input string: "e01"
>>     at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>>     at java.lang.Long.parseLong(Long.java:441)
>>     at java.lang.Long.parseLong(Long.java:483)
>>     at org.apache.hadoop.yarn.util.ConverterUtils.toApplicationAttemptId(ConverterUtils.java:137)
>>     at org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:177)
>>     ... 11 more
>> 15/01/08 09:50:29 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: Invalid ContainerId: container_e01_1420481081140_0006_01_000001)
>>
>>
>> --
>> Thanks & Regards,
>>
>> Mukesh Jha
>
>



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
