I don’t see -i in the output of spark-shell --help. Moreover, in master I get an error:
$ bin/spark-shell -i test.scala
bad option: '-i'

iulian

On Tue, Jan 26, 2016 at 3:47 PM, fernandrez1987 <andres.fernan...@wellsfargo.com> wrote:

> spark-shell -i file.scala is not working for me in Spark 1.6.0, was this
> removed or what do I have to take into account? The script does not get run
> at all. What can be happening?
>
> <http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/script.png>
> <http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/shell-call.png>
> <http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/no-println.png>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/how-to-correctly-run-scala-script-using-spark-shell-through-stdin-spark-v1-0-0-tp12972p26071.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.

--
Iulian Dragos
------
Reactive Apps on the JVM
www.typesafe.com
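A sketch of two workarounds that are commonly suggested on this list, assuming a local Spark installation and a script named script.scala in the current directory (both names are illustrative, not from the original report):

```shell
# Assumption: SPARK_HOME points at a Spark installation and
# script.scala exists in the current directory.

# Workaround 1: feed the script to the REPL via stdin, as the
# thread's subject line ("through stdin") suggests:
bin/spark-shell < script.scala

# Workaround 2: start the shell interactively, then load the file
# with the standard Scala REPL command:
#   scala> :load script.scala
```

These commands cannot be verified here without a Spark installation, so treat them as a sketch to try, not a confirmed fix for the -i regression being discussed.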