Re: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-29 Thread Iulian Dragoș
On Fri, Jan 29, 2016 at 5:22 PM, Iulian Dragoș <iulian.dra...@typesafe.com>
wrote:

> I found the issue in the 2.11 version of the REPL, PR will follow shortly.
>


https://github.com/apache/spark/pull/10984



>
> The 2.10 version of Spark doesn't have this issue, so you could use that
> in the meantime.
>
> iulian
>
> On Wed, Jan 27, 2016 at 3:17 PM, <andres.fernan...@wellsfargo.com> wrote:
>
>> So far I still cannot find a way of running a small Scala script right
>> after starting the shell and having the shell remain open. Is there a way
>> of doing this?
>>
>> Feels like a simple/naive question but really couldn’t find an answer.
>>
>>
>>
>> *From:* Fernandez, Andres
>> *Sent:* Tuesday, January 26, 2016 2:53 PM
>> *To:* 'Ewan Leith'; Iulian Dragoș
>> *Cc:* user
>> *Subject:* RE: how to correctly run scala script using spark-shell
>> through stdin (spark v1.0.0)
>>
>>
>>
>> True, thank you. Is there a way of keeping the shell open (i.e., avoiding
>> the :quit statement)? Thank you both.
>>
>>
>>
>> Andres
>>
>>
>>
>> *From:* Ewan Leith [mailto:ewan.le...@realitymine.com
>> <ewan.le...@realitymine.com>]
>> *Sent:* Tuesday, January 26, 2016 1:50 PM
>> *To:* Iulian Dragoș; Fernandez, Andres
>> *Cc:* user
>> *Subject:* RE: how to correctly run scala script using spark-shell
>> through stdin (spark v1.0.0)
>>
>>
>>
>> I’ve just tried running this using a normal stdin redirect:
>>
>>
>>
>> ~/spark/bin/spark-shell < simple.scala
>>
>>
>>
>> Which worked: it started spark-shell, executed the script, then stopped
>> the shell.
>>
>>
>>
>> Thanks,
>>
>> Ewan
>>
>>
>>
>> *From:* Iulian Dragoș [mailto:iulian.dra...@typesafe.com
>> <iulian.dra...@typesafe.com>]
>> *Sent:* 26 January 2016 15:00
>> *To:* fernandrez1987 <andres.fernan...@wellsfargo.com>
>> *Cc:* user <user@spark.apache.org>
>> *Subject:* Re: how to correctly run scala script using spark-shell
>> through stdin (spark v1.0.0)
>>
>>
>>
>> I don’t see -i in the output of spark-shell --help. Moreover, in master
>> I get an error:
>>
>> $ bin/spark-shell -i test.scala
>>
>> bad option: '-i'
>>
>> iulian
>>
>>
>>
>>
>> On Tue, Jan 26, 2016 at 3:47 PM, fernandrez1987 <
>> andres.fernan...@wellsfargo.com> wrote:
>>
>> spark-shell -i file.scala is not working for me in Spark 1.6.0, was this
>> removed or what do I have to take into account? The script does not get
>> run
>> at all. What can be happening?
>> <
>> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/script.png
>> >
>>
>> <
>> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/shell-call.png
>> >
>>
>> <
>> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/no-println.png
>> >
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/how-to-correctly-run-scala-script-using-spark-shell-through-stdin-spark-v1-0-0-tp12972p26071.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>>
>>
>>
>>
>>
>> --
>>
>>
>> --
>> Iulian Dragos
>>
>>
>>
>> --
>> Reactive Apps on the JVM
>> www.typesafe.com
>>
>>
>>
>
>
>
> --
>
> --
> Iulian Dragos
>
> --
> Reactive Apps on the JVM
> www.typesafe.com
>
>


-- 

--
Iulian Dragos

--
Reactive Apps on the JVM
www.typesafe.com


Re: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-29 Thread Iulian Dragoș
I found the issue in the 2.11 version of the REPL, PR will follow shortly.

The 2.10 version of Spark doesn't have this issue, so you could use that in
the meantime.

iulian

On Wed, Jan 27, 2016 at 3:17 PM, <andres.fernan...@wellsfargo.com> wrote:

> So far I still cannot find a way of running a small Scala script right
> after starting the shell and having the shell remain open. Is there a way
> of doing this?
>
> Feels like a simple/naive question but really couldn’t find an answer.
>
>
>
> *From:* Fernandez, Andres
> *Sent:* Tuesday, January 26, 2016 2:53 PM
> *To:* 'Ewan Leith'; Iulian Dragoș
> *Cc:* user
> *Subject:* RE: how to correctly run scala script using spark-shell
> through stdin (spark v1.0.0)
>
>
>
> True, thank you. Is there a way of keeping the shell open (i.e., avoiding
> the :quit statement)? Thank you both.
>
>
>
> Andres
>
>
>
> *From:* Ewan Leith [mailto:ewan.le...@realitymine.com
> <ewan.le...@realitymine.com>]
> *Sent:* Tuesday, January 26, 2016 1:50 PM
> *To:* Iulian Dragoș; Fernandez, Andres
> *Cc:* user
> *Subject:* RE: how to correctly run scala script using spark-shell
> through stdin (spark v1.0.0)
>
>
>
> I’ve just tried running this using a normal stdin redirect:
>
>
>
> ~/spark/bin/spark-shell < simple.scala
>
>
>
> Which worked: it started spark-shell, executed the script, then stopped the
> shell.
>
>
>
> Thanks,
>
> Ewan
>
>
>
> *From:* Iulian Dragoș [mailto:iulian.dra...@typesafe.com
> <iulian.dra...@typesafe.com>]
> *Sent:* 26 January 2016 15:00
> *To:* fernandrez1987 <andres.fernan...@wellsfargo.com>
> *Cc:* user <user@spark.apache.org>
> *Subject:* Re: how to correctly run scala script using spark-shell
> through stdin (spark v1.0.0)
>
>
>
> I don’t see -i in the output of spark-shell --help. Moreover, in master I
> get an error:
>
> $ bin/spark-shell -i test.scala
>
> bad option: '-i'
>
> iulian
>
>
>
>
> On Tue, Jan 26, 2016 at 3:47 PM, fernandrez1987 <
> andres.fernan...@wellsfargo.com> wrote:
>
> spark-shell -i file.scala is not working for me in Spark 1.6.0, was this
> removed or what do I have to take into account? The script does not get run
> at all. What can be happening?
> <
> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/script.png
> >
>
> <
> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/shell-call.png
> >
>
> <
> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/no-println.png
> >
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/how-to-correctly-run-scala-script-using-spark-shell-through-stdin-spark-v1-0-0-tp12972p26071.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>
>
>
>
>
> --
>
>
> --
> Iulian Dragos
>
>
>
> --
> Reactive Apps on the JVM
> www.typesafe.com
>
>
>



-- 

--
Iulian Dragos

--
Reactive Apps on the JVM
www.typesafe.com


RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-27 Thread Andres.Fernandez
So far I still cannot find a way of running a small Scala script right after
starting the shell and having the shell remain open. Is there a way of doing
this?
Feels like a simple/naive question but really couldn’t find an answer.
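
One way to get this (a sketch, assuming a Unix shell; the file name
init.scala is illustrative) is to concatenate the script with stdin, so
spark-shell runs the script first and then keeps reading from the terminal:

# The trailing "-" makes cat read from the terminal after init.scala,
# so the shell stays open for interactive input once the script has run.
cat init.scala - | $SPARK_HOME/bin/spark-shell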

From: Fernandez, Andres
Sent: Tuesday, January 26, 2016 2:53 PM
To: 'Ewan Leith'; Iulian Dragoș
Cc: user
Subject: RE: how to correctly run scala script using spark-shell through stdin 
(spark v1.0.0)

True, thank you. Is there a way of keeping the shell open (i.e., avoiding the
:quit statement)? Thank you both.

Andres

From: Ewan Leith [mailto:ewan.le...@realitymine.com]
Sent: Tuesday, January 26, 2016 1:50 PM
To: Iulian Dragoș; Fernandez, Andres
Cc: user
Subject: RE: how to correctly run scala script using spark-shell through stdin 
(spark v1.0.0)

I’ve just tried running this using a normal stdin redirect:

~/spark/bin/spark-shell < simple.scala

Which worked: it started spark-shell, executed the script, then stopped the
shell.

Thanks,
Ewan

From: Iulian Dragoș [mailto:iulian.dra...@typesafe.com]
Sent: 26 January 2016 15:00
To: fernandrez1987 <andres.fernan...@wellsfargo.com>
Cc: user <user@spark.apache.org>
Subject: Re: how to correctly run scala script using spark-shell through stdin 
(spark v1.0.0)


I don’t see -i in the output of spark-shell --help. Moreover, in master I get 
an error:

$ bin/spark-shell -i test.scala

bad option: '-i'

iulian

On Tue, Jan 26, 2016 at 3:47 PM, fernandrez1987 <andres.fernan...@wellsfargo.com> wrote:
spark-shell -i file.scala is not working for me in Spark 1.6.0, was this
removed or what do I have to take into account? The script does not get run
at all. What can be happening?
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/script.png>

<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/shell-call.png>

<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/no-println.png>



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/how-to-correctly-run-scala-script-using-spark-shell-through-stdin-spark-v1-0-0-tp12972p26071.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




--

--
Iulian Dragos

--
Reactive Apps on the JVM
www.typesafe.com



RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-26 Thread fernandrez1987
spark-shell -i file.scala is not working for me in Spark 1.6.0, was this
removed or what do I have to take into account? The script does not get run
at all. What can be happening?
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/script.png>

<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/shell-call.png>

<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/no-println.png>

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/how-to-correctly-run-scala-script-using-spark-shell-through-stdin-spark-v1-0-0-tp12972p26071.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Re: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-26 Thread Iulian Dragoș
I don’t see -i in the output of spark-shell --help. Moreover, in master I
get an error:

$ bin/spark-shell -i test.scala
bad option: '-i'

iulian

On Tue, Jan 26, 2016 at 3:47 PM, fernandrez1987 <
andres.fernan...@wellsfargo.com> wrote:

> spark-shell -i file.scala is not working for me in Spark 1.6.0, was this
> removed or what do I have to take into account? The script does not get run
> at all. What can be happening?
> <
> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/script.png
> >
>
> <
> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/shell-call.png
> >
>
> <
> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/no-println.png
> >
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/how-to-correctly-run-scala-script-using-spark-shell-through-stdin-spark-v1-0-0-tp12972p26071.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>
>


-- 

--
Iulian Dragos

--
Reactive Apps on the JVM
www.typesafe.com


Re: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-26 Thread Iulian Dragoș
On Tue, Jan 26, 2016 at 4:08 PM, <andres.fernan...@wellsfargo.com> wrote:

> Yes, no option -i. Thanks Iulian, but do you know how I can send three
> lines to be executed just after spark-shell has started? Please check
> http://apache-spark-user-list.1001560.n3.nabble.com/how-to-correctly-run-scala-script-using-spark-shell-through-stdin-spark-v1-0-0-td12972.html#a26071
> .
>

To be honest, I think this might be a regression I introduced, or at least,
it's something that works in the 2.10 version of Spark. By just looking at
the code, it should accept the same arguments as the Scala interpreter.
I'll look into it.
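
For comparison, a sketch of what the stock Scala REPL does with the same flag
(assuming a local Scala installation; test.scala is illustrative): it
interprets the file first and then stays at the prompt:

# scala -i preloads test.scala, then drops into the interactive loop
# instead of exiting.
scala -i test.scala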

iulian


>
>
> Thank you very much for your time.
>
>
>
> *From:* Iulian Dragoș [mailto:iulian.dra...@typesafe.com]
> *Sent:* Tuesday, January 26, 2016 12:00 PM
> *To:* Fernandez, Andres
> *Cc:* user
> *Subject:* Re: how to correctly run scala script using spark-shell
> through stdin (spark v1.0.0)
>
>
>
> I don’t see -i in the output of spark-shell --help. Moreover, in master I
> get an error:
>
> $ bin/spark-shell -i test.scala
>
> bad option: '-i'
>
> iulian
>
>
>
>
> On Tue, Jan 26, 2016 at 3:47 PM, fernandrez1987 <
> andres.fernan...@wellsfargo.com> wrote:
>
> spark-shell -i file.scala is not working for me in Spark 1.6.0, was this
> removed or what do I have to take into account? The script does not get run
> at all. What can be happening?
> <
> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/script.png
> >
>
> <
> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/shell-call.png
> >
>
> <
> http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/no-println.png
> >
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/how-to-correctly-run-scala-script-using-spark-shell-through-stdin-spark-v1-0-0-tp12972p26071.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>
>
>
>
>
> --
>
>
> --
> Iulian Dragos
>
>
>
> --
> Reactive Apps on the JVM
> www.typesafe.com
>
>
>



-- 

--
Iulian Dragos

--
Reactive Apps on the JVM
www.typesafe.com


RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-26 Thread Ewan Leith
I’ve just tried running this using a normal stdin redirect:

~/spark/bin/spark-shell < simple.scala

Which worked: it started spark-shell, executed the script, then stopped the
shell.
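
A minimal end-to-end sketch of this approach (file name and contents are
illustrative; sc is the SparkContext the shell creates):

# Write a tiny script, then feed it to spark-shell on stdin.
cat > simple.scala <<'EOF'
val count = sc.parallelize(1 to 100).count()
println("count = " + count)
EOF

~/spark/bin/spark-shell < simple.scala   # runs the script, then the shell exits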

Thanks,
Ewan

From: Iulian Dragoș [mailto:iulian.dra...@typesafe.com]
Sent: 26 January 2016 15:00
To: fernandrez1987 <andres.fernan...@wellsfargo.com>
Cc: user <user@spark.apache.org>
Subject: Re: how to correctly run scala script using spark-shell through stdin 
(spark v1.0.0)


I don’t see -i in the output of spark-shell --help. Moreover, in master I get 
an error:

$ bin/spark-shell -i test.scala

bad option: '-i'

iulian

On Tue, Jan 26, 2016 at 3:47 PM, fernandrez1987 <andres.fernan...@wellsfargo.com> wrote:
spark-shell -i file.scala is not working for me in Spark 1.6.0, was this
removed or what do I have to take into account? The script does not get run
at all. What can be happening?
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/script.png>

<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/shell-call.png>

<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/no-println.png>



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/how-to-correctly-run-scala-script-using-spark-shell-through-stdin-spark-v1-0-0-tp12972p26071.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




--

--
Iulian Dragos

--
Reactive Apps on the JVM
www.typesafe.com



RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2016-01-26 Thread Andres.Fernandez
True, thank you. Is there a way of keeping the shell open (i.e., avoiding the
:quit statement)? Thank you both.

Andres

From: Ewan Leith [mailto:ewan.le...@realitymine.com]
Sent: Tuesday, January 26, 2016 1:50 PM
To: Iulian Dragoș; Fernandez, Andres
Cc: user
Subject: RE: how to correctly run scala script using spark-shell through stdin 
(spark v1.0.0)

I’ve just tried running this using a normal stdin redirect:

~/spark/bin/spark-shell < simple.scala

> Which worked: it started spark-shell, executed the script, then stopped the
> shell.

Thanks,
Ewan

From: Iulian Dragoș [mailto:iulian.dra...@typesafe.com]
Sent: 26 January 2016 15:00
To: fernandrez1987 <andres.fernan...@wellsfargo.com>
Cc: user <user@spark.apache.org>
Subject: Re: how to correctly run scala script using spark-shell through stdin 
(spark v1.0.0)


I don’t see -i in the output of spark-shell --help. Moreover, in master I get 
an error:

$ bin/spark-shell -i test.scala

bad option: '-i'

iulian

On Tue, Jan 26, 2016 at 3:47 PM, fernandrez1987 <andres.fernan...@wellsfargo.com> wrote:
spark-shell -i file.scala is not working for me in Spark 1.6.0, was this
removed or what do I have to take into account? The script does not get run
at all. What can be happening?
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/script.png>

<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/shell-call.png>

<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/no-println.png>



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/how-to-correctly-run-scala-script-using-spark-shell-through-stdin-spark-v1-0-0-tp12972p26071.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




--

--
Iulian Dragos

--
Reactive Apps on the JVM
www.typesafe.com



RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2014-08-27 Thread Henry Hung
Update:

I use a shell script to execute spark-shell; inside my-script.sh:
$SPARK_HOME/bin/spark-shell < $HOME/test.scala > $HOME/test.log 2>&1 &

Although it correctly finishes the println("hallo world"), the strange thing
is that my-script.sh finished before spark-shell even finished executing the
script.
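
With the trailing & that early return is expected: the shell backgrounds
spark-shell and moves on. A sketch of a wrapper that waits (same paths as
above):

#!/bin/sh
# Run spark-shell in the background, then block until it exits, so
# my-script.sh only returns once the script has finished executing.
$SPARK_HOME/bin/spark-shell < "$HOME/test.scala" > "$HOME/test.log" 2>&1 &
wait $!
echo "spark-shell exited with status $?"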

Best regards,
Henry Hung

From: MA33 YTHung1
Sent: Thursday, August 28, 2014 10:01 AM
To: user@spark.apache.org
Subject: how to correctly run scala script using spark-shell through stdin 
(spark v1.0.0)

HI All,

Right now I'm trying to execute a script using this command:

nohup $SPARK_HOME/bin/spark-shell < $HOME/my-script.scala > $HOME/my-script.log 2>&1 &

my-script.scala has just one line of code:
println("hallo world")

But after waiting for a minute, I still don't receive the result from 
spark-shell.
And the output log only contains:

Spark assembly has been built with Hive, including Datanucleus jars on classpath
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in 
[jar:file:/data1/hadoop/spark-1.0.0-bin-hadoop2/lib/spark-assembly-1.0.0-hadoop2.2.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in 
[jar:file:/data1/hadoop/hbase-0.96.0-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
14/08/28 09:38:27 INFO spark.SecurityManager: Changing view acls to: hadoop
14/08/28 09:38:27 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(hadoop)
14/08/28 09:38:27 INFO spark.HttpServer: Starting HTTP Server
14/08/28 09:38:27 INFO server.Server: jetty-8.y.z-SNAPSHOT
14/08/28 09:38:27 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:49382

My question is:
What is the right way to execute spark script?
I really want to use spark-shell as a way to run cron job in the future, just 
like when I run R script.
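
A hedged sketch of such a cron entry (paths are illustrative, borrowed from
the log above; cron does not expand $SPARK_HOME, hence absolute paths):

# m h dom mon dow: run the script nightly at 02:00, reading it on stdin
0 2 * * * /data1/hadoop/spark-1.0.0-bin-hadoop2/bin/spark-shell < /home/hadoop/my-script.scala > /home/hadoop/my-script.log 2>&1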

Best regards,
Henry


The privileged confidential information contained in this email is intended for 
use only by the addressees as indicated by the original sender of this email. 
If you are not the addressee indicated in this email or are not responsible for 
delivery of the email to such a person, please kindly reply to the sender 
indicating this fact and delete all copies of it from your computer and network 
server immediately. Your cooperation is highly appreciated. It is advised that 
any unauthorized use of confidential information of Winbond is strictly 
prohibited; and any information in this email irrelevant to the official 
business of Winbond shall be deemed as neither given nor endorsed by Winbond.




RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

2014-08-27 Thread Matei Zaharia
You can use spark-shell -i file.scala to run that. However, that keeps the 
interpreter open at the end, so you need to make your file end with 
System.exit(0) (or even more robustly, do stuff in a try {} and add that in 
finally {}).
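
A sketch of a script laid out that way (contents illustrative; note that the
2016 messages earlier in this digest report -i broken on the Scala 2.11
build):

# Write the script with the try/finally described above, then run it with -i.
cat > file.scala <<'EOF'
try {
  val total = sc.parallelize(1 to 10).sum()
  println("total = " + total)
} finally {
  System.exit(0)   // close the interpreter even if the job fails
}
EOF

$SPARK_HOME/bin/spark-shell -i file.scala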

In general it would be better to compile apps and run them with spark-submit. 
The Scala shell isn't that easy for debugging, etc.
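
For the spark-submit route, a minimal sketch (the class and jar names are
hypothetical; the jar would come from e.g. sbt package):

$SPARK_HOME/bin/spark-submit \
  --class com.example.MyJob \
  --master local[2] \
  target/scala-2.10/my-job_2.10-0.1.jar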

Matei

On August 27, 2014 at 7:23:10 PM, Henry Hung (ythu...@winbond.com) wrote:

Update:

 

I use a shell script to execute spark-shell; inside my-script.sh:

$SPARK_HOME/bin/spark-shell < $HOME/test.scala > $HOME/test.log 2>&1 &

 

Although it correctly finishes the println(“hallo world”), the strange thing
is that my-script.sh finished before spark-shell even finished executing the
script.

 

Best regards,

Henry Hung

 

From: MA33 YTHung1
Sent: Thursday, August 28, 2014 10:01 AM
To: user@spark.apache.org
Subject: how to correctly run scala script using spark-shell through stdin 
(spark v1.0.0)

 

HI All,

 

Right now I’m trying to execute a script using this command:

 

nohup $SPARK_HOME/bin/spark-shell < $HOME/my-script.scala > $HOME/my-script.log 2>&1 &

 

my-script.scala has just one line of code:

println(“hallo world”)

 

But after waiting for a minute, I still don’t receive the result from 
spark-shell.

And the output log only contains:

 

Spark assembly has been built with Hive, including Datanucleus jars on classpath

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in 
[jar:file:/data1/hadoop/spark-1.0.0-bin-hadoop2/lib/spark-assembly-1.0.0-hadoop2.2.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in 
[jar:file:/data1/hadoop/hbase-0.96.0-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

14/08/28 09:38:27 INFO spark.SecurityManager: Changing view acls to: hadoop

14/08/28 09:38:27 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(hadoop)

14/08/28 09:38:27 INFO spark.HttpServer: Starting HTTP Server

14/08/28 09:38:27 INFO server.Server: jetty-8.y.z-SNAPSHOT

14/08/28 09:38:27 INFO server.AbstractConnector: Started 
SocketConnector@0.0.0.0:49382

 

My question is:

What is the right way to execute spark script?

I really want to use spark-shell as a way to run cron job in the future, just 
like when I run R script.

 

Best regards,

Henry

 

The privileged confidential information contained in this email is intended for 
use only by the addressees as indicated by the original sender of this email. 
If you are not the addressee indicated in this email or are not responsible for 
delivery of the email to such a person, please kindly reply to the sender 
indicating this fact and delete all copies of it from your computer and network 
server immediately. Your cooperation is highly appreciated. It is advised that 
any unauthorized use of confidential information of Winbond is strictly 
prohibited; and any information in this email irrelevant to the official 
business of Winbond shall be deemed as neither given nor endorsed by Winbond.

