Hi,

I'm attempting to run the following simple standalone app on Mac OS X with Spark 1.0, using sbt:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val sparkConf = new SparkConf()
  .setAppName("ProcessEvents")
  .setMaster("local[*]")
  .setSparkHome("/Users/me/Downloads/spark")
val ssc = new StreamingContext(sparkConf, Seconds(10))
val lines = ssc.textFileStream("/Users/me/Downloads/test/")
lines.foreachRDD(rdd => rdd.foreach(println(_)))
ssc.start()
ssc.awaitTermination()

However, when running it with sbt run, I get quite a few errors:

23:27:42.182 [run-main] DEBUG org.apache.hadoop.conf.Configuration - java.io.IOException: config()
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:227)
        at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:214)

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0.0:0 failed 1 times, most recent failure: Exception failure in TID 0 on host localhost: java.lang.ClassNotFoundException: scala.None$
        java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        java.net.URLClassLoader$1.run(URLClassLoader.java:355)

Any ideas? Let me know what other info you need to figure this out.

Thanks!
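P.S. In case the build setup matters, my build.sbt is essentially along the lines of the sketch below (the project name and exact version strings here are approximate, not copied verbatim from my file):

name := "ProcessEvents"

version := "0.1"

// Spark 1.0.x is built against Scala 2.10
scalaVersion := "2.10.4"

// Pull in Spark Streaming (which transitively brings in spark-core)
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.0.0"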