Re: Problem submitting a script .py against a standalone cluster.

2015-07-30 Thread Anh Hong
You might want to run spark-submit with option --deploy-mode cluster
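Based on the command shown later in the thread, the suggested invocation would look roughly like the sketch below. One caveat worth hedging: in Spark 1.3, standalone mode does not support cluster deploy mode for Python applications, so for a .py script the driver stays on the submitting machine either way; the executors should still run on the cluster workers.

```shell
# Sketch only -- paths taken from the original post; --deploy-mode cluster
# is the suggestion from this reply. Note that standalone + Python in
# Spark 1.3 falls back to client mode for the driver.
/home/user/Spark/spark-1.3.0-bin-hadoop2.4/bin/spark-submit \
  --master spark://localhost:7077 \
  --deploy-mode cluster \
  Script.py
```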
 


On Thursday, July 30, 2015 7:24 PM, Marcelo Vanzin van...@cloudera.com wrote:

 Can you share the part of the code in your script where you create the 
SparkContext instance?
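The question above hints at a common cause (an assumption here, since the script itself was never posted): a master hardcoded in the script. A master set via SparkConf in code takes precedence over the --master flag passed to spark-submit, so setMaster("local") in the script would make the job run locally even when submitted to a cluster. A minimal sketch of a submit-friendly script, with hypothetical names:

```python
# Hypothetical sketch of Script.py; the real script was not shared.
from pyspark import SparkConf, SparkContext

# Do NOT call .setMaster("local") here: a master set in code overrides
# the --master flag given to spark-submit, forcing local execution.
conf = SparkConf().setAppName("Script")
sc = SparkContext(conf=conf)

# A trivial job to confirm work lands on the cluster executors.
rdd = sc.parallelize(range(100))
print(rdd.sum())
sc.stop()
```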
On Thu, Jul 30, 2015 at 7:19 PM, fordfarline fordfarl...@gmail.com wrote:

Hi All,

I'm having an issue when launching an app (Python) against a standalone
cluster: it runs locally instead, since it never reaches the cluster.
This is the first time I've tried the cluster; in local mode it works fine.

Here is what I did:

- /home/user/Spark/spark-1.3.0-bin-hadoop2.4/sbin/start-all.sh # Master and
worker are up at localhost:8080/4040
- /home/user/Spark/spark-1.3.0-bin-hadoop2.4/bin/spark-submit --master
spark://localhost:7077 Script.py
           * The script runs OK, but locally :( I can check it at
localhost:4040, but I don't see any job in the cluster UI.

The only warning is:
WARN Utils: Your hostname, localhost resolves to a loopback address:
127.0.0.1; using 192.168.1.132 instead (on interface eth0)

I set SPARK_LOCAL_IP=127.0.0.1 to solve this; at least the warning disappears,
but the script keeps executing locally, not on the cluster.
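One note on the step above: SPARK_LOCAL_IP only controls which address Spark binds to; it does not decide where the job runs, and binding to 127.0.0.1 can actually hide the master from every machine except itself. A hedged sketch of a spark-env.sh using the routable address from the warning instead (addresses assumed from the warning message, not verified for this setup):

```shell
# conf/spark-env.sh -- sketch only; use an interface the workers can
# actually reach. 192.168.1.132 is the address the warning reported.
export SPARK_LOCAL_IP=192.168.1.132
export SPARK_MASTER_IP=192.168.1.132
```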

I think it has something to do with my virtual server:
- Host server: Linux Mint
- The virtual server (Workstation 10) where Spark runs is Linux Mint as well.

Any ideas what I am doing wrong?

Thanks in advance for any suggestions; I'm going mad over it!!




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Problem-submiting-an-script-py-against-an-standalone-cluster-tp24091.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org





-- 
Marcelo


  

Re: Authentication Support with spark-submit cluster mode

2015-07-29 Thread Anh Hong
Hi Zhan,
I'm running a standalone Spark cluster and executing spark-submit from a
local host outside the cluster. Besides Kerberos, do you know of any other
existing method? Is there a JIRA open for this enhancement request?
Regards,
Anh.
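One workaround sometimes used in the absence of built-in standalone authentication (an assumption on my part, not an official Spark feature) is to keep the cluster ports private and run spark-submit on an ssh gateway host, so authentication is handled by ssh keys. Sketch with hypothetical host and path names:

```shell
# "gateway", "master", and the paths are hypothetical. Authentication
# happens at the ssh layer; spark-submit then runs inside the cluster
# network rather than from the untrusted local machine.
ssh user@gateway \
  '/opt/spark/bin/spark-submit --master spark://master:7077 \
     --deploy-mode cluster myjob.jar'
```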
 


On Wednesday, July 29, 2015 4:15 PM, Zhan Zhang zzh...@hortonworks.com wrote:

If you run it on YARN with Kerberos set up, you authenticate yourself with
kinit before launching the job.
Thanks,
Zhan Zhang
On Jul 28, 2015, at 8:51 PM, Anh Hong hongnhat...@yahoo.com.INVALID wrote:

Hi,
I'd like to run spark-submit remotely from a local machine to submit a job
to a Spark cluster (cluster mode).
What method do I use to authenticate myself to the cluster? For example, how
do I pass a user id, password, or private key to the cluster?

Any help is appreciated.





  

Authentication Support with spark-submit cluster mode

2015-07-28 Thread Anh Hong
Hi,
I'd like to run spark-submit remotely from a local machine to submit a job
to a Spark cluster (cluster mode).
What method do I use to authenticate myself to the cluster? For example, how
do I pass a user id, password, or private key to the cluster?

Any help is appreciated.

 

Does spark-submit support file transferring from local to cluster?

2015-07-28 Thread Anh Hong
Hi,
I'm using spark-submit cluster mode to submit a job from a local machine to a
Spark cluster. There are input files, output files, and job log files that I
need to transfer in and out between the local machine and the cluster. Are
there any recommended methods for transferring files? Is there any future
plan for Spark to support file transfer from cluster to local and vice versa?
Any help is appreciated.
Thanks.
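For the inbound direction, spark-submit does have a --files flag that ships listed files to each executor's working directory alongside the job. Outbound transfer (results, logs) is not something spark-submit does; output is normally written to shared storage such as HDFS and fetched separately. A hedged sketch with hypothetical file and host names:

```shell
# Sketch only. --files distributes input.txt to the executors' working
# directories; it does not bring results back to the local machine.
spark-submit --master spark://master:7077 --deploy-mode cluster \
  --files input.txt job.py

# Results are typically written to shared storage and pulled manually,
# e.g. from HDFS (path is hypothetical):
hdfs dfs -get /user/anh/output ./output
```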