You refer to `aws_security_token`, but I'm not sure where you're specifying
it. Can you elaborate? Is it an environment variable?
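For reference, spark-ec2 talks to AWS through the boto library, which, as far as I know, picks credentials up from the environment. A hedged sketch — the variable names are boto's convention and every value below is a placeholder:

```shell
# Sketch, assuming spark-ec2's boto library reads credentials from these
# environment variables (names per boto's docs; values are placeholders).
export AWS_ACCESS_KEY_ID="AKIAEXAMPLEKEYID"
export AWS_SECRET_ACCESS_KEY="example-secret-access-key"
# The token used with temporary (STS) credentials:
export AWS_SECURITY_TOKEN="example-temporary-session-token"
# Then launch as usual, e.g.:
# ./ec2/spark-ec2 -k my-key -i /path/to/my-key.pem launch my-cluster
echo "credentials exported"
```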
On Mon, Jul 27, 2015 at 4:21 AM Jan Zikeš jan.zi...@centrum.cz wrote:
Hi,
I would like to ask if it is currently possible to use spark-ec2 script
together with
You're probably requesting more instances than allowed by your account, so
the error gets generated for the extra instances. Try launching a smaller
cluster.
On Wed, Apr 1, 2015 at 12:41 PM, Vadim Bichutskiy
vadim.bichuts...@gmail.com wrote:
Hi all,
I just tried launching a Spark cluster on
The free tier includes 750 hours of t2.micro instance time per month.
http://aws.amazon.com/free/
That's basically a month of hours, so it's all free if you run only one
instance at a time. If you run 4, you'll be able to run your 4-node
cluster free for about a week.
A t2.micro has 1GB of memory,
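The arithmetic above, as a quick check (the 750 free hours are shared across all running instances; integer division, so it rounds down):

```shell
# Free tier: 750 t2.micro hours per month, pooled across instances.
hours_each=$((750 / 4))      # 4 instances -> 187 hours per instance
days=$((hours_each / 24))    # about 7 full days of a 4-node cluster
echo "$hours_each hours, about $days days"   # prints: 187 hours, about 7 days
```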
Thank You Sean.
I was just trying to experiment with the performance of Spark applications
with various worker instances (I hope you remember that we discussed the
worker instances).
I thought it would be a good one to try in EC2. So, it doesn't work out,
does it?
Thank You
On Tue, Feb 24,
No, I think I am ok with the time it takes.
It's just that, as I increase the number of partitions along with the
number of workers, I want to see the improvement in the performance of
an application.
I just want to see this happen.
Any comments?
Thank You
On Tue, Feb 24, 2015 at 8:52
Hi,
I am sorry that I made a mistake about the AWS pricing. You can read the email
from Sean Owen, which better explains the strategies for running Spark on AWS.
For your question: it means that you just download Spark and unzip it. Then
run the Spark shell via ./bin/spark-shell or ./bin/pyspark. It is useful to get
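That local-mode route, sketched out — the release number and mirror URL below are examples from the 1.2.x era, so substitute whatever version you actually downloaded:

```shell
# Download a prebuilt Spark, unpack it, and start a local-mode shell.
# No cluster and no spark-ec2 involved; the only cost is the one machine.
wget https://archive.apache.org/dist/spark/spark-1.2.1/spark-1.2.1-bin-hadoop2.4.tgz
tar -xzf spark-1.2.1-bin-hadoop2.4.tgz
cd spark-1.2.1-bin-hadoop2.4
./bin/spark-shell     # Scala shell; use ./bin/pyspark for Python
```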
Thank You Akhil. Will look into it.
It's free, isn't it? I am still a student :)
On Tue, Feb 24, 2015 at 9:06 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:
If you sign up for Google Compute Cloud, you will get $300 in free credits for
3 months, and you can start a pretty good cluster for your
You can definitely, easily, try a 1-node standalone cluster for free.
Just don't be surprised when the CPU capping kicks in within about 5
minutes of any non-trivial computation and suddenly the instance is
very s-l-o-w.
I would consider just paying the ~$0.07/hour to play with an
m3.medium,
This should help you understand the cost of running a Spark cluster for a
short period of time:
http://www.ec2instances.info/
If you run an instance for even 1 second of a single hour you are charged
for that complete hour. So before you shut down your miniature cluster make
sure you really are
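On that note, a sketch of the teardown commands — the cluster name and region are placeholders, and my recollection is that spark-ec2 supports both a reversible stop and a permanent destroy:

```shell
# "stop" keeps EBS-backed state so you can "start" the cluster again later;
# "destroy" terminates the instances for good. Name and region are examples.
./ec2/spark-ec2 --region=us-east-1 stop my-cluster
./ec2/spark-ec2 --region=us-east-1 destroy my-cluster
```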
If you sign up for Google Compute Cloud, you will get $300 in free credits for
3 months, and you can start a pretty good cluster for your testing purposes.
:)
Thanks
Best Regards
On Tue, Feb 24, 2015 at 8:25 PM, Deep Pradhan pradhandeep1...@gmail.com
wrote:
Hi,
I have just signed up for Amazon AWS
Yes it is :)
Thanks
Best Regards
On Tue, Feb 24, 2015 at 9:09 PM, Deep Pradhan pradhandeep1...@gmail.com
wrote:
Thank You Akhil. Will look into it.
It's free, isn't it? I am still a student :)
On Tue, Feb 24, 2015 at 9:06 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:
If you sign up for
Kindly bear with my questions as I am new to this.
If you run spark on local mode on a ec2 machine
What does this mean? Is it that I launch the Spark cluster from my local
machine, i.e., by running the shell script that is there in spark/ec2?
On Tue, Feb 24, 2015 at 8:32 PM, gen tang
Hi,
As a real Spark cluster needs at least one master and one slave, you need
to launch two machines. Therefore the second machine is not free.
However, if you run Spark in local mode on one EC2 machine, it is free.
What AWS charges depends on how many machines you launch and which types
they are,
Thank You All.
I think I will look into paying ~$0.07/hr as Sean suggested.
On Tue, Feb 24, 2015 at 9:01 PM, gen tang gen.tan...@gmail.com wrote:
Hi,
I am sorry that I made a mistake about the AWS pricing. You can read the email
from Sean Owen, which better explains the strategies for running Spark on AWS.
I don't see anything that says you must explicitly restart them to load the
new settings, but usually there is some sort of signal trapped [or brute
force full restart] to get a configuration reload for most daemons. I'd
take a guess and use the $SPARK_HOME/sbin/{stop,start}-slaves.sh scripts on
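Concretely, that guess amounts to the standard standalone-mode scripts, run on the master and assuming $SPARK_HOME points at your install:

```shell
# Bounce the worker daemons so they re-read conf/spark-env.sh and friends.
"$SPARK_HOME"/sbin/stop-slaves.sh
"$SPARK_HOME"/sbin/start-slaves.sh
```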
Oh yeah, they picked up changes after restart, thanks!
On Thu, Feb 5, 2015 at 8:13 PM, Charles Feduke charles.fed...@gmail.com
wrote:
I don't see anything that says you must explicitly restart them to load
the new settings, but usually there is some sort of signal trapped [or
brute force full
We found the problem and already fixed it. Basically, spark-ec2 requires EC2
instances to have external IP addresses. You need to specify this in the AWS
console.
From: nicholas.cham...@gmail.com
Date: Tue, 27 Jan 2015 17:19:21 +
Subject: Re: spark 1.2 ec2 launch script hang
An absolute path means no ~; also verify that the path to the file is
correct. For some reason the Python code does not validate that the file
exists and will hang (this is the same reason why ~ hangs).
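The tilde point is worth spelling out, since it is the shell, not spark-ec2, that expands ~, and only when it is unquoted. A quoted path sails through literally, and because spark-ec2 never checks that the file exists, the launch just hangs:

```shell
# Unquoted ~ is expanded by the shell before spark-ec2 ever sees it;
# a quoted ~ is passed through as a literal character.
unquoted=~/spark-key.pem        # becomes /home/<you>/spark-key.pem
quoted="~/spark-key.pem"        # stays the literal string ~/spark-key.pem
echo "$unquoted"
echo "$quoted"                  # prints: ~/spark-key.pem
```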
On Mon, Jan 26, 2015 at 10:08 PM Pete Zybrick pzybr...@gmail.com wrote:
Try using an
For those who found that absolute vs. relative path for the pem file
mattered, what OS and shell are you using? What version of Spark are you
using?
~/ vs. absolute path shouldn’t matter. Your shell will expand the ~/ to the
absolute path before sending it to spark-ec2. (i.e. tilde expansion.)
Try using an absolute path to the pem file
On Jan 26, 2015, at 8:57 PM, ey-chih chow eyc...@hotmail.com wrote:
Hi,
I used the spark-ec2 script of spark 1.2 to launch a cluster. I have
modified the script according to
Hi Gilberto,
Could you please attach the driver logs as well, so that we can pinpoint what's
going wrong? Could you also add the flag
`--driver-memory 4g` while submitting your application and try that as well?
Best,
Burak
- Original Message -
From: Gilberto Lira g...@scanboo.com.br
The script should be there, in the spark/bin directory. What command did you
use to launch the cluster?
Matei
On Jul 14, 2014, at 1:12 PM, Josh Happoldt josh.happo...@trueffect.com wrote:
Hi All,
I've used the spark-ec2 scripts to build a simple 1.0.1 Standalone cluster on
EC2. It
Hmm.. you've gotten further than me. Which AMIs are you using?
On Sun, Jun 1, 2014 at 2:21 PM, superback andrew.matrix.c...@gmail.com
wrote:
Hi,
I am trying to run an example on Amazon EC2 and have successfully
set up one cluster with two nodes on EC2. However, when I was testing an
I haven't set up an AMI yet. I am just trying to run a simple job on the EC2
cluster. So, is setting up an AMI a prerequisite for running a simple Spark
example like org.apache.spark.examples.GroupByTest?
No, you don't have to set up your own AMI. Actually it's probably simpler
and less error prone if you let spark-ec2 manage that for you as you first
start to get comfortable with Spark. Just spin up a cluster without any
explicit mention of AMI and it will do the right thing.
On Sunday, June 1, 2014,