One option is to make the cluster nodes unreachable from any computer outside 
the cluster, and deploy applications from an edge node that is connected to 
the cluster. If you use Hadoop to manage the cluster, you may also want to 
look at Apache Knox, which provides a single secured gateway into the cluster.
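As a rough sketch of the pinning approach (the property names are from the Spark security docs linked below; the base port values and the 10.0.0.0/24 cluster subnet are arbitrary assumptions, not recommendations), you can fix the otherwise-random ports in spark-defaults.conf and whitelist only that window. Note that spark.port.maxRetries makes Spark probe successive ports on conflict, so each fixed port effectively becomes a small range that the firewall rules must cover:

```shell
# In spark-defaults.conf, pin the ports Spark would otherwise choose randomly
# (base values here are arbitrary examples; pick ports free in your cluster):
#   spark.driver.port        40000
#   spark.blockManager.port  40001
#   spark.port.maxRetries    16    # on conflict, Spark tries port..port+16
#
# Then whitelist only that window between cluster hosts (assumed subnet):
iptables -A INPUT -p tcp -s 10.0.0.0/24 --dport 40000:40017 -j ACCEPT
iptables -A INPUT -p tcp -s 10.0.0.0/24 --dport 4040 -j ACCEPT  # Spark UI
```

With this in place the default-deny policy can stay, since every Spark port falls inside an explicitly opened range.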

> On 02 Mar 2016, at 15:14, zgpinnadhari <gpinnadh...@zaloni.com> wrote:
> 
> Hi
> 
> We want to use spark in a secure cluster with iptables enabled.
> For this, we need a specific list of ports used by spark so that we can
> whitelist them.
> 
> From what I could learn from
> http://spark.apache.org/docs/latest/security.html#configuring-ports-for-network-security
> there are several ports chosen "randomly", which poses a challenge when
> coming up with specific iptables rules, as we cannot allow any-to-any.
> 
> What is the recommendation here?
> 
> Can we specify a port range somewhere from which spark can choose randomly?
> 
> What do other secure / hardened clusters do?
> 
> Thanks!
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Configuring-Ports-for-Network-Security-tp26376.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 
