and it will show up along with other slaves.
Thanks
Best Regards
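For context, adding a slave to a standalone cluster by hand usually just means starting a Worker process on the new instance and pointing it at the running master; once it registers, it appears in the master's web UI alongside the existing slaves. A minimal sketch, assuming a Spark 1.x install already on the new instance and a placeholder master address (exact script arguments vary by Spark version):

```shell
# Run on the new EC2 instance (Spark already installed there).
# The master URL is a placeholder; substitute your master's
# address and port (7077 is the standalone default).
./sbin/start-slave.sh spark://ec2-master-host:7077
```

The security group for the new instance must allow it to reach the master's service port, or the worker will fail to register.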
On Thu, May 28, 2015 at 12:29 PM, nizang ni...@windward.eu wrote:
hi,
I'm working on a Spark standalone system on EC2, and I'm having problems
resizing the cluster (meaning, adding or removing slaves).
In the basic EC2 scripts
(http://spark.apache.org/docs/latest/ec2-scripts.html), there's only a script
for launching the cluster, not for adding slaves.
Regards
On Tue, Oct 7, 2014 at 3:36 AM, Ankur Srivastava ankur.srivast...@gmail.com wrote:
Hi,
I have started a Spark Cluster on EC2 using the Spark Standalone cluster
manager, but Spark is trying to identify the worker threads using
hostnames which are not publicly accessible.
So when I try to submit jobs from Eclipse it fails. Is there some way
Spark can use IP addresses instead?

address either, I believe.
What are you trying to do here? Are you running Eclipse locally and
connecting to your EC2 cluster?
Thanks
Best Regards
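One common workaround for the hostname problem is to make each node bind to, and advertise, an address that is actually reachable, via `conf/spark-env.sh`. A sketch for standalone mode; the addresses shown are placeholders, not real values:

```shell
# conf/spark-env.sh on each node (placeholder addresses).
SPARK_LOCAL_IP=10.0.0.12      # IP address for Spark to bind to on this node
SPARK_PUBLIC_DNS=54.10.20.30  # address advertised to other nodes and in the web UI
```

Restart the master and workers after changing these so the new addresses are picked up.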