I’ve just tried running this using a normal stdin redirect:

~/spark/bin/spark-shell < simple.scala

That worked: it started spark-shell, executed the script, then stopped the
shell.
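
For anyone trying this, a minimal script along the following lines should
work with that invocation. The contents below are just an illustrative
sketch, not the actual file -- any script that only touches the pre-defined
sc should behave the same way:

// simple.scala -- illustrative contents only
val data = sc.parallelize(1 to 100)   // sc is already defined by spark-shell
println(s"count = ${data.count()}")
// no explicit exit needed: the shell stops once stdin is exhausted

The shell interprets each line as if it had been typed interactively and
quits when it reaches end of input.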

Thanks,
Ewan

From: Iulian Dragoș [mailto:iulian.dra...@typesafe.com]
Sent: 26 January 2016 15:00
To: fernandrez1987 <andres.fernan...@wellsfargo.com>
Cc: user <user@spark.apache.org>
Subject: Re: how to correctly run scala script using spark-shell through stdin 
(spark v1.0.0)


I don’t see -i in the output of spark-shell --help. Moreover, in master I get 
an error:

$ bin/spark-shell -i test.scala

bad option: '-i'
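
(If -i is indeed no longer passed through, one possible workaround --
untested, just a sketch -- is to feed a :load command to the shell on stdin:

echo ':load test.scala' | bin/spark-shell

:load is the standard Scala REPL command for interpreting a file, so this
should amount to pasting the script in by hand.)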

iulian

On Tue, Jan 26, 2016 at 3:47 PM, fernandrez1987
<andres.fernan...@wellsfargo.com> wrote:
spark-shell -i file.scala is not working for me in Spark 1.6.0. Was this
removed, or is there something else I need to take into account? The script
does not get run at all. What could be happening?
Screenshots (via Nabble):
http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/script.png
http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/shell-call.png
http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/no-println.png







--
Iulian Dragos

------
Reactive Apps on the JVM
www.typesafe.com
