I have a Spark Streaming program that works fine if I execute it via 

sbt "runMain com.cray.examples.spark.streaming.cyber.StatefulDhcpServerHisto
-f /Users/spr/Documents/<...>/tmp/ -t 10"

but if I start it via

$S/bin/spark-submit --master local[12] --class StatefulNewDhcpServers 
target/scala-2.10/newd*jar -f /Users/spr/Documents/<...>/tmp/ -t 10

(where $S points to the base of the Spark installation), it prints the
output of print statements that run before ssc.start(), but nothing after that.
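
For reference, the driver has the usual Spark Streaming shape, roughly this
(simplified; the argument parsing and the actual DHCP histogram logic are elided):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StatefulNewDhcpServers {
  def main(args: Array[String]): Unit = {
    // simplified: the real code parses "-f <dir> -t <seconds>"
    val dir       = args(1)
    val batchSecs = args(3).toInt

    val conf = new SparkConf().setAppName("StatefulNewDhcpServers")
    val ssc  = new StreamingContext(conf, Seconds(batchSecs))
    ssc.checkpoint("/tmp/dhcp-checkpoint")  // needed for the stateful part

    val lines = ssc.textFileStream(dir)     // watch the -f directory for new files
    lines.count().print()                   // should print something every batch

    println("before start")                 // output up to here does appear
    ssc.start()
    ssc.awaitTermination()                  // block so the driver doesn't exit
  }
}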

I might well have screwed up something, but I'm getting no output anywhere,
AFAICT.  I have set spark.eventLog.enabled to true in my spark-defaults.conf
file, yet the Spark History Server at localhost:18080 says "no completed
applications found".  There must be some log output somewhere.
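
For completeness, the event-logging lines in my conf/spark-defaults.conf look
roughly like this (the log directory path here is illustrative; I point the
History Server at the same directory):

# write event logs so the History Server has something to read
spark.eventLog.enabled           true
# illustrative path; must exist and match the History Server's log directory
spark.eventLog.dir               file:///tmp/spark-events
spark.history.fs.logDirectory    file:///tmp/spark-events

Any ideas?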


