Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67666538
A couple of quick questions about this, just to confirm: do I always need to
specify the `--subnet-id` option when using the `--vpc-id` option? Do I have
to run
Github user mvj101 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67681180
Haven't worked with this in a while, and different versions of boto may
alter things, but:
1. You might not need to specify a subnet if you have a default VPC
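The default-VPC behavior described above can be sketched as follows. This is a hypothetical helper, not the actual spark_ec2.py code: boto's `run_instances()` call only needs an explicit subnet when the account has no default VPC, so the kwarg is simply omitted otherwise.

```python
# Hypothetical sketch of the behavior described above (illustrative names,
# not the real spark_ec2.py code).

def build_launch_kwargs(subnet_id=None, extra=None):
    """Return keyword arguments for a boto run_instances() call.

    If no subnet is given, the kwarg is omitted entirely so EC2 falls
    back to the account's default VPC (when one exists).
    """
    kwargs = dict(extra or {})
    if subnet_id is not None:
        kwargs["subnet_id"] = subnet_id
    return kwargs

print(build_launch_kwargs())                       # {} -> default VPC
print(build_launch_kwargs(subnet_id="subnet-1a"))  # explicit placement
```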
Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67562466
I've opened a PR to upgrade the Boto version, which fixes this issue: #3737
---
If your project is set up for it, you can reply to this email and have your
reply
Github user dreid93 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67428078
@jontg a coffee for you sir @ChangeTip
---
Github user dreid93 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67428485
@changetip does not appear to be picking up my mentions and sending the
appropriate tip. :/
---
Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67429749
It looks like this PR may have broken the ability to launch spot clusters:
```python
Traceback (most recent call last):
  File "./spark_ec2.py", line
```
Github user mvj101 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67432970
Oops, apologies for this breakage. I haven't worked with spot instances.
Feel free to revert this pull request and I or someone else can address that
corner case as time allows.
Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67434318
> I or someone else can address that corner case as time allows.
Let's just update boto. I'll submit a PR for this shortly.
---
Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67434751
Actually, I'm going to revert this for now. Looks like the `boto` update
will take a bit more work than I thought.
---
Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67434905
Ugh, this doesn't revert cleanly due to another patch that I merged. I've
got to go, so I'm just going to leave this for now. Someone else can deal with
this if it's
Github user mvj101 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67435661
Ok, I'll send a PR to revert in a few minutes.
Thanks,
Mike
---
Github user mvj101 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67437100
https://github.com/apache/spark/pull/3728
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67124660
[Test build #24488 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/24488/consoleFull)
for PR 2872 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67132172
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67132162
[Test build #24488 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/24488/consoleFull)
for PR 2872 at commit
Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67223750
Thanks for fixing up the style issue. This looks good to me, so I'll merge
this into `master`. Thanks for your patience with the slow review!
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/2872
---
Github user changetip commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67276643
Hi @amar-analytx, @dreid93 sent you a Bitcoin tip worth a coffee (4,526
bits/$1.50), and I'm here to deliver it - **[collect your
Github user dreid93 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67276649
@mvj101 a coffee for you sir @ChangeTip
---
Github user dreid93 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67276623
@amar-analytx here's a coffee for making the gist that @mvj101 based his
initial PR on. @ChangeTip
---
Github user dreid93 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67276691
@jontg may I buy you a coffee for your work helping people with this issue?
@ChangeTip
---
Github user dreid93 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67276753
@JoshRosen thanks for merging this in. Here's a coffee @ChangeTip
---
Github user JoshRosen commented on a diff in the pull request:
https://github.com/apache/spark/pull/2872#discussion_r21873371
--- Diff: ec2/spark_ec2.py ---
@@ -303,12 +307,17 @@ def launch_cluster(conn, opts, cluster_name):
user_data_content =
Github user JoshRosen commented on a diff in the pull request:
https://github.com/apache/spark/pull/2872#discussion_r21873430
--- Diff: ec2/spark_ec2.py ---
@@ -341,11 +355,11 @@ def launch_cluster(conn, opts, cluster_name):
if opts.ami is None:
opts.ami =
Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67101443
Since this is an often-requested feature, we should mention this in the EC2
documentation page:
https://github.com/apache/spark/blob/master/docs/ec2-scripts.md
---
Github user JoshRosen commented on a diff in the pull request:
https://github.com/apache/spark/pull/2872#discussion_r21873865
--- Diff: ec2/spark_ec2.py ---
@@ -162,6 +162,10 @@ def parse_args():
parser.add_option(
"--copy-aws-credentials", action="store_true",
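The diff above shows the PR adding flags inside `parse_args()`. A minimal optparse sketch of how such options might be wired up follows; the option names mirror the PR title, but the help strings and defaults are illustrative, not the actual spark_ec2.py code.

```python
# Illustrative optparse sketch of the flags discussed in this PR; only the
# option names are taken from the PR, everything else is assumed.
from optparse import OptionParser

def parse_args(argv):
    parser = OptionParser()
    parser.add_option("--vpc-id", default=None,
                      help="VPC to launch instances in")
    parser.add_option("--subnet-id", default=None,
                      help="Subnet to launch instances in")
    parser.add_option("--copy-aws-credentials", action="store_true",
                      default=False,
                      help="Copy AWS credentials to the cluster")
    opts, args = parser.parse_args(argv)
    return opts

# optparse maps "--vpc-id" to the attribute opts.vpc_id
opts = parse_args(["--vpc-id", "vpc-123", "--subnet-id", "subnet-456"])
print(opts.vpc_id, opts.subnet_id, opts.copy_aws_credentials)
```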
Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67101983
Overall, this looks good to me. I left a couple of nitpicky comments, but
besides that + documentation, I'd be happy to merge this.
To address a question
Github user mvj101 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67115653
Thanks, I believe I've updated the code according to your comments.
Mike
---
Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67118614
Even though we don't have Jenkins tests for the EC2 scripts, I'm just going
to have Jenkins run this so that I can avoid an inadvertent build break.
Jenkins,
Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67123620
Jenkins, this is ok to test.
---
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67123879
[Test build #24486 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/24486/consoleFull)
for PR 2872 at commit
Github user SparkQA commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67123939
[Test build #24486 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/24486/consoleFull)
for PR 2872 at commit
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67123940
Test FAILed.
Refer to this link for build results (access rights to CI server needed):
Github user mvj101 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-67124150
Fixing style issues now.
---
Github user tylerprete commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-66894914
@jontg thanks for the help. Turned on DNS and now everything is working.
---
Github user dreid93 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-66823256
Just a heads up / bump. I am buying a coffee (in bitcoin ;)) for everyone
who contributes to getting this merged in!
---
Github user jontg commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-66840618
@tylerprete That might occur if your VPC is not set up to auto-assign DNS
records. If you can, that is where I would suggest beginning an investigation.
---
Github user tylerprete commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-5398
@jontg I'm using this patch with your modifications (private_ip_address),
but I'm getting the following errors when the script tries to start the
master:
Github user jeffsteinmetz commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-66229322
The EC2 docs could also be updated to include these new switches.
---
Github user brdw commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-65801064
I'd love to see this as well. We have a strict VPC policy.
---
Github user changetip commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-65814859
Hi mvj101, dreid93 sent you a Bitcoin tip worth 1 lunch (21,255
bits/$8.00), and I'm here to deliver it - **[collect your tip at
Github user dreid93 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-65814791
I'll buy anyone willing to take care of this merge lunch via @ChangeTip :)
---
Github user evanv commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-65867098
Is VPC support slated for the next maintenance release? Support for VPCs is
definitely needed for a lot of us, and it'd be great if we didn't have to patch
it ourselves.
Github user Nypias commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-64077726
I would be interested in merging this patch as well :)
---
Github user jontg commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-64087720
We had a couple of issues with this patch; in particular, the script depends
on instances having a public DNS name or IP. I had to modify the script a
little to get our
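The private-address workaround mentioned above can be sketched like this. The helper name and the fallback logic here are assumptions for illustration, not the actual spark_ec2.py code; it mirrors the idea that in a VPC without auto-assigned public DNS, a boto instance's `public_dns_name` comes back empty and the `private_ip_address` must be used instead.

```python
# Hedged sketch of the workaround described above: prefer the public DNS
# name, but fall back to the private IP when the VPC assigns no public
# address. get_dns_name is hypothetical, not a spark_ec2.py function.

def get_dns_name(instance, private_ips=False):
    """Pick an address for an EC2 instance object (boto-style attributes)."""
    if private_ips:
        return instance.private_ip_address
    # In a VPC without auto-assigned public DNS, public_dns_name is empty.
    return instance.public_dns_name or instance.private_ip_address

class FakeInstance:
    """Stand-in for a boto Instance, for illustration only."""
    public_dns_name = ""
    private_ip_address = "10.0.0.5"

print(get_dns_name(FakeInstance()))  # falls back to the private IP
```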
Github user jontg commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-63700947
+1
---
GitHub user mvj101 opened a pull request:
https://github.com/apache/spark/pull/2872
[SPARK-3405] add subnet-id and vpc-id options to spark_ec2.py
Based on this gist:
https://gist.github.com/amar-analytx/0b62543621e1f246c0a2
We use security group ids instead of security
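The group-id point above can be sketched as follows. This is an illustrative helper, not the PR's code: when launching into a VPC, EC2 identifies security groups by `sg-xxxx` id (boto's `security_group_ids` parameter) rather than by name (`security_groups`), so the launch call must pick one form.

```python
# Illustrative sketch (assumed names, not the PR's code): build the
# security-group portion of a boto run_instances() call, using group ids
# for VPC launches and group names for EC2-Classic launches.

def security_group_kwargs(group_ids=None, group_names=None):
    """Build the security-group part of a boto run_instances() call."""
    if group_ids:  # VPC launches: identify groups by sg-xxxx id
        return {"security_group_ids": list(group_ids)}
    if group_names:  # EC2-Classic launches: identify groups by name
        return {"security_groups": list(group_names)}
    return {}

print(security_group_kwargs(group_ids=["sg-0123abcd"]))
```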
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-59884935
Can one of the admins verify this patch?
---
Github user dreid93 commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-59987812
Awesome! I am glad to see that this was a priority to someone with the
time. :+1:
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/2872#issuecomment-60013980
Can one of the admins verify this patch?
---