Re: SPARKonYARN failing on CDH 5.3.0 : container cannot be fetched because of NumberFormatException

2015-01-09 Thread Mukesh Jha
I am using the pre-built *spark-1.2.0-bin-hadoop2.4* from *[1]* to submit Spark
applications to YARN; I cannot find a pre-built Spark for *CDH-5.x*
versions. So, in my case, the org.apache.hadoop.yarn.util.ConverterUtils class
is coming from the spark-assembly-1.1.0-hadoop2.4.0.jar that ships with
the pre-built Spark, which is causing this issue.

How / where can I get Spark 1.2.0 built for CDH-5.3.0? I checked in the Maven
repo etc. with no luck.

*[1]* https://spark.apache.org/downloads.html
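
One way to confirm which jar actually provides ConverterUtils is to scan the candidate jars for the class entry. A hypothetical Python sketch of that check follows; the helper name and any jar paths passed to it are placeholders for illustration, not something from this thread:

```python
# Hypothetical helper: report which of the given jars contain a class.
# Useful for confirming that a bundled spark-assembly jar carries its own
# copy of org.apache.hadoop.yarn.util.ConverterUtils that can shadow the
# CDH-provided one.
import zipfile
from pathlib import Path

def jars_providing(jar_paths, class_name):
    """Return the names of the jars that contain the fully-qualified class."""
    entry = class_name.replace(".", "/") + ".class"
    providers = []
    for jar in jar_paths:
        with zipfile.ZipFile(jar) as zf:
            if entry in zf.namelist():
                providers.append(Path(jar).name)
    return providers
```

From inside a running JVM, `ConverterUtils.class.getProtectionDomain().getCodeSource()` gives the same answer for the copy that actually got loaded.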

On Fri, Jan 9, 2015 at 1:12 AM, Marcelo Vanzin van...@cloudera.com wrote:

 Just to add to Sandy's comment, check your client configuration
 (generally in /etc/spark/conf). If you're using CM, you may need to
 run the Deploy Client Configuration command on the cluster to update
 the configs to match the new version of CDH.

 On Thu, Jan 8, 2015 at 11:38 AM, Sandy Ryza sandy.r...@cloudera.com
 wrote:
  Hi Mukesh,
 
  Those line numbers in ConverterUtils in the stack trace don't appear to
  line up with CDH 5.3:
  https://github.com/cloudera/hadoop-common/blob/cdh5-2.5.0_5.3.0/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ConverterUtils.java
 
  Is it possible you're still including the old jars on the classpath in
  some way?
 
  -Sandy
 
  On Thu, Jan 8, 2015 at 3:38 AM, Mukesh Jha me.mukesh@gmail.com
 wrote:
 
  Hi Experts,
 
  I am running Spark on YARN.
 
  The spark-streaming job runs fine on CDH-5.0.0, but after the upgrade to
  5.3.0 it cannot fetch containers and fails with the errors below. It looks
  like the container ID is malformed: a string appears in a place where a
  number is expected.
 
 
 
  java.lang.IllegalArgumentException: Invalid ContainerId: container_e01_1420481081140_0006_01_01
  Caused by: java.lang.NumberFormatException: For input string: "e01"
 
  Is this a bug? Have you faced something similar, and do you have any ideas
  on how to fix it?
 
 
 
  15/01/08 09:50:28 INFO yarn.ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
  15/01/08 09:50:29 ERROR yarn.ApplicationMaster: Uncaught exception:
  java.lang.IllegalArgumentException: Invalid ContainerId: container_e01_1420481081140_0006_01_01
      at org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:182)
      at org.apache.spark.deploy.yarn.YarnRMClientImpl.getAttemptId(YarnRMClientImpl.scala:79)
      at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:79)
      at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:515)
      at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
      at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:415)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
      at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
      at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:513)
      at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
  Caused by: java.lang.NumberFormatException: For input string: "e01"
      at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
      at java.lang.Long.parseLong(Long.java:441)
      at java.lang.Long.parseLong(Long.java:483)
      at org.apache.hadoop.yarn.util.ConverterUtils.toApplicationAttemptId(ConverterUtils.java:137)
      at org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:177)
      ... 11 more
 
  15/01/08 09:50:29 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: Invalid ContainerId: container_e01_1420481081140_0006_01_01)
 
 
  --
  Thanks & Regards,
 
  Mukesh Jha
 
 



 --
 Marcelo




-- 


Thanks & Regards,

*Mukesh Jha me.mukesh@gmail.com*


Re: SPARKonYARN failing on CDH 5.3.0 : container cannot be fetched because of NumberFormatException

2015-01-09 Thread Sean Owen
Again, this is probably not the place for CDH-specific questions, and
this one is already answered at
http://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/CDH-5-3-0-container-cannot-be-fetched-because-of/m-p/23497#M478

On Fri, Jan 9, 2015 at 9:23 AM, Mukesh Jha me.mukesh@gmail.com wrote:
 I am using the pre-built spark-1.2.0-bin-hadoop2.4 from [1] to submit Spark
 applications to YARN; I cannot find a pre-built Spark for CDH-5.x
 versions. So, in my case, the org.apache.hadoop.yarn.util.ConverterUtils
 class is coming from the spark-assembly-1.1.0-hadoop2.4.0.jar that ships
 with the pre-built Spark, which is causing this issue.

 How / where can I get Spark 1.2.0 built for CDH-5.3.0? I checked in the
 Maven repo etc. with no luck.

 [1] https://spark.apache.org/downloads.html

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



SPARKonYARN failing on CDH 5.3.0 : container cannot be fetched because of NumberFormatException

2015-01-08 Thread Mukesh Jha
Hi Experts,

I am running Spark on YARN.

The spark-streaming job runs fine on CDH-5.0.0, but after the upgrade to
5.3.0 it cannot fetch containers and fails with the errors below. It looks
like the container ID is malformed: a string appears in a place where a
number is expected.



java.lang.IllegalArgumentException: Invalid ContainerId: container_e01_1420481081140_0006_01_01
Caused by: java.lang.NumberFormatException: For input string: "e01"

Is this a bug? Have you faced something similar, and do you have any ideas
on how to fix it?
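
For context, the "e01" segment is a container ID epoch that newer YARN releases insert after the "container" prefix, while the older parsing logic in ConverterUtils consumes the fields positionally and parses each one as a number. The following Python sketch illustrates that old parsing behavior; it is an illustration only, not the actual Hadoop code:

```python
# Rough sketch of the pre-epoch ContainerId parsing in
# org.apache.hadoop.yarn.util.ConverterUtils: fields after the "container"
# prefix are consumed in order and parsed as numbers, so an epoch segment
# like "e01" hits the equivalent of Long.parseLong("e01") and fails.

def parse_container_id_old(container_id: str):
    """Parse the pre-epoch format: container_<clusterTs>_<appId>_<attempt>_<id>."""
    fields = iter(container_id.split("_"))
    if next(fields) != "container":
        raise ValueError("Invalid ContainerId: " + container_id)
    cluster_ts = int(next(fields))   # sees "e01" for epoch-style IDs -> ValueError
    app_id = int(next(fields))
    attempt_id = int(next(fields))
    container_num = int(next(fields))
    return cluster_ts, app_id, attempt_id, container_num

# An old-style ID parses fine:
parse_container_id_old("container_1420481081140_0006_01_000001")

# An epoch-style ID fails on the "e01" segment, mirroring the
# NumberFormatException in the log below:
try:
    parse_container_id_old("container_e01_1420481081140_0006_01_000001")
except ValueError as exc:
    print(exc)
```

This is why the error surfaces only after the cluster upgrade: the cluster now hands out epoch-style container IDs, but the ConverterUtils bundled in the older Spark assembly still expects the old format.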



15/01/08 09:50:28 INFO yarn.ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
15/01/08 09:50:29 ERROR yarn.ApplicationMaster: Uncaught exception:
java.lang.IllegalArgumentException: Invalid ContainerId: container_e01_1420481081140_0006_01_01
    at org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:182)
    at org.apache.spark.deploy.yarn.YarnRMClientImpl.getAttemptId(YarnRMClientImpl.scala:79)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:79)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:515)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:513)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.lang.NumberFormatException: For input string: "e01"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    at java.lang.Long.parseLong(Long.java:441)
    at java.lang.Long.parseLong(Long.java:483)
    at org.apache.hadoop.yarn.util.ConverterUtils.toApplicationAttemptId(ConverterUtils.java:137)
    at org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:177)
    ... 11 more

15/01/08 09:50:29 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: Invalid ContainerId: container_e01_1420481081140_0006_01_01)

-- 
Thanks & Regards,

*Mukesh Jha me.mukesh@gmail.com*


Re: SPARKonYARN failing on CDH 5.3.0 : container cannot be fetched because of NumberFormatException

2015-01-08 Thread Sandy Ryza
Hi Mukesh,

Those line numbers in ConverterUtils in the stack trace don't appear to
line up with CDH 5.3:
https://github.com/cloudera/hadoop-common/blob/cdh5-2.5.0_5.3.0/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ConverterUtils.java

Is it possible you're still including the old jars on the classpath in some
way?

-Sandy

On Thu, Jan 8, 2015 at 3:38 AM, Mukesh Jha me.mukesh@gmail.com wrote:

 Hi Experts,

 I am running Spark on YARN.

 The spark-streaming job runs fine on CDH-5.0.0, but after the upgrade to
 5.3.0 it cannot fetch containers and fails with the errors below. It looks
 like the container ID is malformed: a string appears in a place where a
 number is expected.



 java.lang.IllegalArgumentException: Invalid ContainerId: container_e01_1420481081140_0006_01_01
 Caused by: java.lang.NumberFormatException: For input string: "e01"

 Is this a bug? Have you faced something similar, and do you have any ideas
 on how to fix it?



 15/01/08 09:50:28 INFO yarn.ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
 15/01/08 09:50:29 ERROR yarn.ApplicationMaster: Uncaught exception:
 java.lang.IllegalArgumentException: Invalid ContainerId: container_e01_1420481081140_0006_01_01
     at org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:182)
     at org.apache.spark.deploy.yarn.YarnRMClientImpl.getAttemptId(YarnRMClientImpl.scala:79)
     at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:79)
     at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:515)
     at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
     at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java:415)
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
     at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
     at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:513)
     at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
 Caused by: java.lang.NumberFormatException: For input string: "e01"
     at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
     at java.lang.Long.parseLong(Long.java:441)
     at java.lang.Long.parseLong(Long.java:483)
     at org.apache.hadoop.yarn.util.ConverterUtils.toApplicationAttemptId(ConverterUtils.java:137)
     at org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:177)
     ... 11 more

 15/01/08 09:50:29 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: Invalid ContainerId: container_e01_1420481081140_0006_01_01)

 --
 Thanks & Regards,

 *Mukesh Jha me.mukesh@gmail.com*



Re: SPARKonYARN failing on CDH 5.3.0 : container cannot be fetched because of NumberFormatException

2015-01-08 Thread Marcelo Vanzin
Just to add to Sandy's comment, check your client configuration
(generally in /etc/spark/conf). If you're using CM, you may need to
run the Deploy Client Configuration command on the cluster to update
the configs to match the new version of CDH.
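
Part of that check can be automated by looking through the client's Spark directories for assembly jars and reading the Spark and Hadoop versions out of the file names. The sketch below is a hypothetical illustration: the naming pattern is assumed from the jar mentioned earlier in the thread, and any directory list passed in is a placeholder:

```python
# Hypothetical sketch: find spark-assembly jars and the versions baked into
# their names, to spot a stale build (e.g. a hadoop2.4.0 assembly lingering
# on a CDH 5.3 cluster).
import re
from pathlib import Path

ASSEMBLY_RE = re.compile(r"spark-assembly-([\d.]+)-hadoop([\d.]+)\.jar")

def find_assemblies(search_dirs):
    """Map each spark-assembly jar path to its (spark_version, hadoop_version)."""
    found = {}
    for d in search_dirs:
        for jar in Path(d).glob("*.jar"):
            m = ASSEMBLY_RE.fullmatch(jar.name)
            if m:
                found[str(jar)] = (m.group(1), m.group(2))
    return found
```

A mismatch between the Hadoop version in the jar name and the cluster's CDH version is a hint that an old assembly is still being picked up.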

On Thu, Jan 8, 2015 at 11:38 AM, Sandy Ryza sandy.r...@cloudera.com wrote:
 Hi Mukesh,

 Those line numbers in ConverterUtils in the stack trace don't appear to line
 up with CDH 5.3:
 https://github.com/cloudera/hadoop-common/blob/cdh5-2.5.0_5.3.0/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ConverterUtils.java

 Is it possible you're still including the old jars on the classpath in some
 way?

 -Sandy

 On Thu, Jan 8, 2015 at 3:38 AM, Mukesh Jha me.mukesh@gmail.com wrote:

 Hi Experts,

 I am running spark inside YARN job.

 The spark-streaming job runs fine on CDH-5.0.0, but after the upgrade to
 5.3.0 it cannot fetch containers and fails with the errors below. It looks
 like the container ID is malformed: a string appears in a place where a
 number is expected.



 java.lang.IllegalArgumentException: Invalid ContainerId: container_e01_1420481081140_0006_01_01
 Caused by: java.lang.NumberFormatException: For input string: "e01"

 Is this a bug? Have you faced something similar, and do you have any ideas
 on how to fix it?



 15/01/08 09:50:28 INFO yarn.ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
 15/01/08 09:50:29 ERROR yarn.ApplicationMaster: Uncaught exception:
 java.lang.IllegalArgumentException: Invalid ContainerId: container_e01_1420481081140_0006_01_01
     at org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:182)
     at org.apache.spark.deploy.yarn.YarnRMClientImpl.getAttemptId(YarnRMClientImpl.scala:79)
     at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:79)
     at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:515)
     at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
     at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java:415)
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
     at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
     at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:513)
     at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
 Caused by: java.lang.NumberFormatException: For input string: "e01"
     at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
     at java.lang.Long.parseLong(Long.java:441)
     at java.lang.Long.parseLong(Long.java:483)
     at org.apache.hadoop.yarn.util.ConverterUtils.toApplicationAttemptId(ConverterUtils.java:137)
     at org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:177)
     ... 11 more

 15/01/08 09:50:29 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 10, (reason: Uncaught exception: Invalid ContainerId: container_e01_1420481081140_0006_01_01)


 --
 Thanks & Regards,

 Mukesh Jha





-- 
Marcelo
