I'm not sure why Spark isn't showing the runtime exception in the logs, but I think I can point out why the stage is failing.
1. "lineMapToStockPriceInfoObjectRDD.map(new stockDataFilter(_).requirementsMet.get)"

   The ".get" will throw a runtime exception whenever "requirementsMet" is None. I would suggest rewriting it as either of these (both give an equivalent result):

     lineMapToStockPriceInfoObjectRDD.flatMap(new stockDataFilter(_).requirementsMet)
     lineMapToStockPriceInfoObjectRDD.filter(new stockDataFilter(_).isWithinTradingSession)

2. "case true => Some(s).get"

   This should give a compile error. You should remove the ".get":

     case true => Some(s)

-Nick

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/why-would-a-spark-Job-fail-without-throwing-run-time-exceptions-tp25002p25034.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
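P.S. To make the Option handling concrete, here is a minimal, self-contained sketch. Plain Scala collections stand in for the RDD (the Option semantics are the same), and Quote, the price > 0 check, and keepValid are made-up stand-ins for your actual classes, not Spark or your code:

```scala
case class Quote(symbol: String, price: Double)

// Stand-in for stockDataFilter: Some(quote) when it qualifies, None otherwise.
class StockDataFilter(q: Quote) {
  def requirementsMet: Option[Quote] = if (q.price > 0) Some(q) else None
}

object GetVsFlatMap {
  // flatMap drops the Nones and unwraps the Somes.
  def keepValid(quotes: List[Quote]): List[Quote] =
    quotes.flatMap(new StockDataFilter(_).requirementsMet)

  def main(args: Array[String]): Unit = {
    val quotes = List(Quote("A", 10.0), Quote("B", -1.0))

    // map + .get dies on the first None with
    //   java.util.NoSuchElementException: None.get
    // (in Spark that surfaces as a failed task/stage rather than an
    // obvious exception in the driver output):
    //   quotes.map(new StockDataFilter(_).requirementsMet.get)

    // The flatMap version just keeps the valid quotes:
    println(keepValid(quotes))  // List(Quote(A,10.0))
  }
}
```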