On Fri, Jan 29, 2016 at 5:22 PM, Iulian Dragoș <iulian.dra...@typesafe.com>
wrote:

> I found the issue in the 2.11 version of the REPL, PR will follow shortly.
>


https://github.com/apache/spark/pull/10984
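In the meantime, one common workaround for keeping the shell open after a script runs is to concatenate the script with stdin rather than redirecting it. This is a sketch, not something tested against this exact Spark version; `simple.scala` is a hypothetical file name. Because `cat` keeps reading from the terminal (`-` means stdin) after the file ends, spark-shell never sees end-of-file and drops back to the interactive prompt instead of exiting:

```shell
# Feed the script first, then keep reading from the terminal ("-" = stdin),
# so the REPL stays interactive after the script finishes instead of
# hitting EOF and quitting. (simple.scala is a hypothetical file name.)
cat simple.scala - | spark-shell
```

A plain `spark-shell < simple.scala` closes the shell precisely because stdin ends when the file does; the `-` keeps the input stream attached to the terminal.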



>
> The 2.10 version of Spark doesn't have this issue, so you could use that
> in the meantime.
>
> iulian
>
> On Wed, Jan 27, 2016 at 3:17 PM, <andres.fernan...@wellsfargo.com> wrote:
>
>> So far, I still cannot find a way to run a small Scala script right
>> after starting the shell and have the shell remain open. Is there a way
>> to do this?
>>
>> It feels like a simple/naive question, but I really couldn't find an answer.
>>
>>
>>
>> *From:* Fernandez, Andres
>> *Sent:* Tuesday, January 26, 2016 2:53 PM
>> *To:* 'Ewan Leith'; Iulian Dragoș
>> *Cc:* user
>> *Subject:* RE: how to correctly run scala script using spark-shell
>> through stdin (spark v1.0.0)
>>
>>
>>
>> True, thank you. Is there a way to keep the shell open (i.e., to avoid
>> the :quit statement)? Thank you both.
>>
>>
>>
>> Andres
>>
>>
>>
>> *From:* Ewan Leith [mailto:ewan.le...@realitymine.com
>> <ewan.le...@realitymine.com>]
>> *Sent:* Tuesday, January 26, 2016 1:50 PM
>> *To:* Iulian Dragoș; Fernandez, Andres
>> *Cc:* user
>> *Subject:* RE: how to correctly run scala script using spark-shell
>> through stdin (spark v1.0.0)
>>
>>
>>
>> I’ve just tried running this using a normal stdin redirect:
>>
>>
>>
>> ~/spark/bin/spark-shell < simple.scala
>>
>>
>>
>> That worked: it started spark-shell, executed the script, then stopped
>> the shell.
>>
>>
>>
>> Thanks,
>>
>> Ewan
>>
>>
>>
>> *From:* Iulian Dragoș [mailto:iulian.dra...@typesafe.com
>> <iulian.dra...@typesafe.com>]
>> *Sent:* 26 January 2016 15:00
>> *To:* fernandrez1987 <andres.fernan...@wellsfargo.com>
>> *Cc:* user <user@spark.apache.org>
>> *Subject:* Re: how to correctly run scala script using spark-shell
>> through stdin (spark v1.0.0)
>>
>>
>>
>> I don’t see -i in the output of spark-shell --help. Moreover, in master
>> I get an error:
>>
>> $ bin/spark-shell -i test.scala
>>
>> bad option: '-i'
>>
>> iulian
>>
>>
>>
>>
>> On Tue, Jan 26, 2016 at 3:47 PM, fernandrez1987
>> <andres.fernan...@wellsfargo.com> wrote:
>>
>> spark-shell -i file.scala is not working for me in Spark 1.6.0. Was this
>> option removed, or is there something else I need to take into account?
>> The script does not get run at all. What could be happening?
>> <
>> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/script.png
>> >
>>
>> <
>> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/shell-call.png
>> >
>>
>> <
>> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/no-println.png
>> >
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/how-to-correctly-run-scala-script-using-spark-shell-through-stdin-spark-v1-0-0-tp12972p26071.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>>
>>
>>
>
>
>


--
Iulian Dragos

------
Reactive Apps on the JVM
www.typesafe.com
