How does one consume parameters passed to a Scala script via spark-shell -i?
1. If I use an object with a main() method, the println outputs nothing, as if main() were never called:

       import org.apache.spark.SparkContext
       import org.apache.spark.SparkContext._
       import org.apache.spark.SparkConf

       object Test {
         def main(args: Array[String]) {
           println("args(0): " + args(0))
         }
       }

       System.exit(0)

   Running it:

       spark-shell -i Test.scala pizza

   => no print output

2. If I use the Scala args value instead, the compiler complains that args cannot be found:

       import org.apache.spark.SparkContext
       import org.apache.spark.SparkContext._
       import org.apache.spark.SparkConf

       println("args(0): " + args(0))

       System.exit(0)

   Running it:

       spark-shell -i Test.scala pizza

   =>

       <console>:16: error: not found: value args
              println("args(0): " + args(0))

Thanks,
Alec
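P.S. To show the kind of thing I would settle for if positional arguments simply aren't supported with -i: below is a sketch that reads the value from an environment variable instead of a command-line argument. PIZZA_ARG is just a name I made up for the example; sys.env is plain Scala, so I assume it would work inside a script run this way.

       // Test.scala -- sketch assuming the parameter is exported as an
       // environment variable before launching spark-shell.
       // PIZZA_ARG is a made-up name used only for this example.
       import org.apache.spark.SparkContext
       import org.apache.spark.SparkContext._
       import org.apache.spark.SparkConf

       // sys.env is Scala's standard Map[String, String] of environment variables
       val topping = sys.env.getOrElse("PIZZA_ARG", "none")
       println("PIZZA_ARG: " + topping)

       System.exit(0)

   Which I would launch with something like:

       PIZZA_ARG=pizza spark-shell -i Test.scala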