Exceptions not caught?
Hi, I'm running a Spark job and encountering an exception related to Thrift. I wanted to know where this is being thrown, but the stack trace is completely useless. So I started adding try/catches, to the point where my whole main method that does everything is surrounded with a try/catch. Even then, nothing is being caught. I still see this message though:

2014-10-23 15:39:50,845 ERROR [] Exception in task 1.0 in stage 1.0 (TID 1)
java.io.IOException: org.apache.thrift.protocol.TProtocolException: .

What is going on? Why isn't the exception just being handled by the try/catch? (BTW this is in Scala)

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Exceptions-not-caught-tp17157.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
Re: Exceptions not caught?
Can you show the stack trace? Also, how do you catch exceptions? Did you specify TProtocolException?

Cheers

On Thu, Oct 23, 2014 at 3:40 PM, ankits wrote:
> Hi, I'm running a spark job and encountering an exception related to thrift.
> [...]
Re: Exceptions not caught?
        at org.apache.spark.shuffle.hash.HashShuffleWriter$$anonfun$write$1.apply(HashShuffleWriter.scala:65)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at org.apache.spark.shuffle.hash.HashShuffleWriter.write(HashShuffleWriter.scala:65)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        at org.apache.spark.scheduler.Task.run(Task.scala:54)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
2014-10-23 15:51:10,791 ERROR [] Task 0 in stage 1.0 failed 1 times; aborting job

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Exceptions-not-caught-tp17157p17159.html
Re: Exceptions not caught?
Also, everything is running locally on my box, driver and workers.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Exceptions-not-caught-tp17157p17160.html
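Even when everything runs locally, Spark executes tasks on executor pool threads, and a try/catch only sees exceptions thrown on the thread it runs on. A minimal plain-Scala sketch (no Spark involved; names are made up for illustration) of the same effect:

```scala
// A try/catch on the calling thread never sees an exception thrown on
// another thread -- the same reason wrapping main() catches nothing
// even with driver and workers on one box.
object ThreadCatchSketch {
  def main(args: Array[String]): Unit = {
    var caught = false
    try {
      val worker = new Thread(new Runnable {
        // dies on its own thread; the default handler just prints the trace
        def run(): Unit = throw new RuntimeException("boom")
      })
      worker.start()
      worker.join() // we observe the thread ended, but no exception propagates
    } catch {
      case _: Throwable => caught = true // never reached
    }
    println(s"caught = $caught") // prints "caught = false"
  }
}
```

The worker thread's failure is only *reported* (a stack trace on stderr), exactly like the executor's ERROR line in the driver's logs.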
Re: Exceptions not caught?
eam.java:1547)
>         at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1508)
>         at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
>         at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
>         at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
>         at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:42)
>         at org.apache.spark.storage.DiskBlockObjectWriter.write(BlockObjectWriter.scala:195)
>         at org.apache.spark.shuffle.hash.HashShuffleWriter$$anonfun$write$1.apply(HashShuffleWriter.scala:67)
>         at org.apache.spark.shuffle.hash.HashShuffleWriter$$anonfun$write$1.apply(HashShuffleWriter.scala:65)
>         at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>         at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>         at org.apache.spark.shuffle.hash.HashShuffleWriter.write(HashShuffleWriter.scala:65)
>         at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
>         at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
>         at org.apache.spark.scheduler.Task.run(Task.scala:54)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
> 2014-10-23 15:51:10,791 ERROR [] Task 0 in stage 1.0 failed 1 times; aborting job

Can you check your class Y and fix the above ?
Re: Exceptions not caught?
> Can you check your class Y and fix the above ?

I can, but this is about catching the exception should it be thrown by any class in the Spark job. Why is the exception not being caught?

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Exceptions-not-caught-tp17157p17163.html
Re: Exceptions not caught?
On Thu, Oct 23, 2014 at 3:40 PM, ankits wrote:
> 2014-10-23 15:39:50,845 ERROR [] Exception in task 1.0 in stage 1.0 (TID 1)
> java.io.IOException: org.apache.thrift.protocol.TProtocolException:

This looks like an exception that's happening on an executor and just being reported in the driver's logs, so there's nothing to catch in the driver, which might explain why you're not catching anything.

--
Marcelo
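To be precise about what a driver-side try/catch can see: the ERROR line is only the executor's report, but once Spark gives up on the job, the *action* itself throws a SparkException on the driver, and a try/catch around the action can handle that. A sketch, assuming a local master and the Spark 1.x Scala API (exact message and wrapping vary by version and deploy mode):

```scala
import org.apache.spark.{SparkConf, SparkContext, SparkException}

// Sketch: the failure inside the task is not catchable on the driver,
// but the SparkException thrown by collect() after the job aborts is.
object CatchAtTheAction {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("catch-demo"))
    try {
      sc.parallelize(1 to 4)
        .map { i => if (i == 3) throw new RuntimeException("boom"); i }
        .collect() // the failure surfaces here, wrapped in SparkException
    } catch {
      case e: SparkException =>
        // inspect e.getMessage / e.getCause for the original task failure
        println("job failed: " + e.getMessage)
    } finally {
      sc.stop()
    }
  }
}
```

Note the catch wraps the action, not just the setup code in main; the original TProtocolException shows up only as the wrapped cause/message, never as a directly catchable exception type on the driver.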
Re: Exceptions not caught?
Hi,

Spark dispatches your tasks to the distributed (remote) executors. When a task is terminated due to an exception, the executor reports the reason (the exception) to the driver. So on the driver side you see the reason for a task failure that actually happened at the remote end... which is why you cannot catch anything on the driver side.

Best,

--
Nan Zhu

On Thursday, October 23, 2014 at 6:40 PM, ankits wrote:
> Hi, I'm running a spark job and encountering an exception related to thrift.
> [...]
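Given that, the place to actually handle the Thrift failure is inside the closure that the executor runs. A sketch, assuming a local master and the Spark 1.x Scala API; `decode` is a hypothetical stand-in for whatever Thrift deserialization the job does:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import scala.util.{Failure, Success, Try}

// Sketch: catch the exception where it is thrown -- inside the task
// closure running on the executor -- instead of around main().
object CatchInsideTheClosure {
  // Hypothetical decoder that fails on corrupt input, like the Thrift layer.
  def decode(s: String): Int =
    if (s.forall(_.isDigit)) s.toInt
    else throw new org.apache.thrift.protocol.TProtocolException("bad record: " + s)

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("closure-demo"))
    val parsed = sc.parallelize(Seq("1", "2", "oops", "4"))
      .flatMap { s =>
        Try(decode(s)) match {
          case Success(n) => Some(n)  // good record
          case Failure(e: org.apache.thrift.protocol.TProtocolException) =>
            None                      // skip (or log) the corrupt record
          case Failure(other) => throw other // unexpected errors should still fail the task
        }
      }
    println(parsed.collect().toSeq) // the bad record is skipped, not fatal
    sc.stop()
  }
}
```

This also tells you *where* the exception came from, which the driver-side log line does not: you can log the offending record and its stack trace from inside the closure before deciding to skip or rethrow.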