When you run spark-submit in either client or cluster mode, you can use the 
--packages or --jars options to automatically copy your packages to the 
worker machines.
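For example (the Maven coordinate and file names below are only placeholders, 
not taken from this thread):

    # pull a dependency from Maven Central and distribute it to the executors
    spark-submit --master yarn --packages com.databricks:spark-csv_2.10:1.3.0 my_job.py

    # or ship local jars and zipped pure-Python modules yourself
    spark-submit --master yarn --jars /path/to/dep.jar --py-files deps.zip my_job.py

Note that --py-files only covers pure-Python code (.py, .zip, .egg); packages 
with native extensions still need to be installed on each worker.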
Thanks 

    On Monday, January 11, 2016 12:52 PM, Andy Davidson 
<a...@santacruzintegration.com> wrote:
 

 I use https://code.google.com/p/parallel-ssh/ to upgrade all my slaves
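For example, with a hosts file that lists one slave per line (the file name and 
package name here are just placeholders):

    # run the same pip install on every slave in parallel, showing each host's output
    pssh -h hosts.txt -i "sudo pip install some-package"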


From:  "taotao.li" <charles.up...@gmail.com>
Date:  Sunday, January 10, 2016 at 9:50 PM
To:  "user @spark" <user@spark.apache.org>
Subject:  pre-install 3-party Python package on spark cluster


I have a Spark cluster, from machine-1 to machine-100, and machine-1 acts as the 
master.
Then one day my program needs to use a third-party Python package which is not 
installed on every machine of the cluster.
So here comes my problem: to make that third-party Python package usable on the 
master and the slaves, should I manually ssh to every machine and use pip to 
install that package?
I believe there should be some deploy scripts or other tools to make this 
graceful, but I can't find anything after googling.
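By "manually" I mean something like the following (hostnames and the package 
name are just placeholders):

    # log in to every machine in turn and install the package with pip
    for i in $(seq 1 100); do
        ssh machine-$i "sudo pip install some-package"
    done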

