RE: Job object toString() is throwing an exception

2014-11-25 Thread Rohith Sharma K S
Could you give the error message or stack trace?

From: Corey Nolet [mailto:cjno...@gmail.com]
Sent: 26 November 2014 07:54
To: user@hadoop.apache.org
Subject: Job object toString() is throwing an exception

I was playing around in the Spark shell and newing up an instance of Job that I 
could use to configure the InputFormat for a job. By default, the Scala shell 
prints the result of every command typed. It throws an exception when it prints 
the newly created Job instance: it looks like the Job sets a state when it's 
allocated, and toString() isn't happy with that state when it's called before 
the job has been submitted.
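
In case it helps anyone hitting the same thing, the only workaround I've found so 
far is to keep the shell from echoing the result at all, since it's only the echo 
that calls toString(). A rough sketch (assuming the REPL's :silent toggle behaves 
here the way it does in the plain Scala shell):

:silent    // toggle off result printing, so nothing calls toString() on the Job
val job = org.apache.hadoop.mapreduce.Job.getInstance(new org.apache.hadoop.conf.Configuration())
job.setInputFormatClass(classOf[org.apache.hadoop.mapreduce.lib.input.TextInputFormat])  // configure as usual
:silent    // toggle result printing back on when done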

I'm using Hadoop 2.5.1. I don't see any tickets for this against 2.6. Has anyone 
else run into this?


Re: Job object toString() is throwing an exception

2014-11-25 Thread Corey Nolet
Here's the stack trace. I was going to file a ticket for this, but I wanted to
check on the user list first to make sure there wasn't already a fix in the
works. It's triggered by the Scala shell calling toString() on the result of
each command typed in, and the exception prevents the Job instance from ever
being assigned.


scala> val job = new org.apache.hadoop.mapreduce.Job
warning: there were 1 deprecation warning(s); re-run with -deprecation for details
java.lang.IllegalStateException: Job in state DEFINE instead of RUNNING
    at org.apache.hadoop.mapreduce.Job.ensureState(Job.java:283)
    at org.apache.hadoop.mapreduce.Job.toString(Job.java:452)
    at scala.runtime.ScalaRunTime$.scala$runtime$ScalaRunTime$$inner$1(ScalaRunTime.scala:324)
    at scala.runtime.ScalaRunTime$.stringOf(ScalaRunTime.scala:329)
    at scala.runtime.ScalaRunTime$.replStringOf(ScalaRunTime.scala:337)
    at .<init>(<console>:10)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:789)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1062)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:615)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:646)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:610)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:814)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:859)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:771)
    at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:616)
    at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:624)
    at org.apache.spark.repl.SparkILoop.loop(SparkILoop.scala:629)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:954)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:902)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:902)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:902)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:997)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
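
Looking at the trace, the check appears to be in Job.toString() itself: it calls 
ensureState(JobState.RUNNING), and a freshly constructed Job sits in the DEFINE 
state until it's submitted, so anything that stringifies the object before 
submission will throw. A small illustration of the interaction (note the val line 
itself will trip the same exception if you paste it into the shell with result 
printing on):

import org.apache.hadoop.mapreduce.Job

val job = Job.getInstance()                  // a brand-new Job is in state DEFINE until submit()/waitForCompletion()
// job.toString()                            // throws IllegalStateException: Job in state DEFINE instead of RUNNING
// scala.runtime.ScalaRunTime.stringOf(job)  // roughly what the REPL calls to echo a result, hence the trace above
println(job.getJobName)                      // ordinary accessors still work fine while the job is in DEFINE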


