Re: Please keep s3://spark-related-packages/ alive

2018-03-01 Thread Nicholas Chammas
Marton, Thanks for the tip. (Too bad the docs referenced from the issue I opened with INFRA make no mention of mirrors.cgi.) Matei, A Requester Pays bucket is a good idea. I was trying to avoid

Re: Please keep s3://spark-related-packages/ alive

2018-02-28 Thread Marton, Elek
2. *Apache mirrors are inconvenient to use.* When you download something from an Apache mirror, you get a link like this one. Instead of automatically redirecting you to your download, though,
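As a rough sketch of the mirror-selection step being discussed here: the ASF's mirrors.cgi / closer.lua endpoint can return JSON instead of the HTML chooser page, which lets a script resolve a direct download URL. The as_json parameter, the "preferred" and "path_info" field names, and the example artifact path below are illustrative assumptions, not details taken from this thread.

# Sketch: resolve a direct URL on a suggested Apache mirror instead of
# landing on the interactive "choose a mirror" page.
# Assumes the closer.lua / mirrors.cgi JSON response carries a
# "preferred" mirror base URL and a "path_info" field.
import requests

def resolve_mirror_url(path):
    """Return a direct URL for `path` on the suggested Apache mirror."""
    resp = requests.get(
        "https://www.apache.org/dyn/closer.lua",
        params={"path": path, "as_json": 1},
        timeout=10,
    )
    resp.raise_for_status()
    info = resp.json()
    return info["preferred"] + info["path_info"]

# Example (hypothetical release path under the mirror tree):
print(resolve_mirror_url("spark/spark-2.2.1/spark-2.2.1-bin-hadoop2.7.tgz"))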

Re: Please keep s3://spark-related-packages/ alive

2018-02-27 Thread Matei Zaharia
For Flintrock, have you considered using a Requester Pays bucket? That way you’d get the availability of S3 without having to foot the bill for bandwidth yourself (which was the bulk of the cost for the old bucket). Matei > On Feb 27, 2018, at 4:35 PM, Nicholas Chammas
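As a hedged sketch of what the Requester Pays suggestion would look like from the downloader's side (the bucket and key names below are hypothetical placeholders, not an actual Spark bucket): with boto3, the caller opts into paying for the transfer by passing RequestPayer="requester".

# Sketch: download a release from a hypothetical Requester Pays bucket.
# The downloader, not the bucket owner, is billed for the bandwidth.
import boto3

s3 = boto3.client("s3")
s3.download_file(
    Bucket="example-spark-releases",          # hypothetical bucket name
    Key="spark-2.2.1-bin-hadoop2.7.tgz",      # hypothetical object key
    Filename="spark-2.2.1-bin-hadoop2.7.tgz",
    ExtraArgs={"RequestPayer": "requester"},  # opt in to requester-pays billing
)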

Re: Please keep s3://spark-related-packages/ alive

2018-02-27 Thread Nicholas Chammas
So is there no hope for this S3 bucket, or room to replace it with a bucket owned by some organization other than AMPLab (which is technically now defunct, I guess)? Sorry to persist, but I just have to ask. On Tue, Feb 27, 2018 at 10:36 AM Michael

Re: Please keep s3://spark-related-packages/ alive

2018-02-27 Thread Michael Heuer
On Tue, Feb 27, 2018 at 8:17 AM, Sean Owen wrote: > See http://apache-spark-developers-list.1001551.n3.nabble.com/What-is-d3kbcqa49mib13-cloudfront-net-td22427.html -- it was 'retired', yes. > > Agree with all that, though they're intended for occasional individual use > and

Re: Please keep s3://spark-related-packages/ alive

2018-02-27 Thread Sean Owen
See http://apache-spark-developers-list.1001551.n3.nabble.com/What-is-d3kbcqa49mib13-cloudfront-net-td22427.html -- it was 'retired', yes. Agree with all that, though they're intended for occasional individual use and not a case where performance and uptime matter. For that, I think you'd want to

Re: Please keep s3://spark-related-packages/ alive

2018-02-27 Thread Reynold Xin
This was actually an AMPLab bucket. On Feb 27, 2018, 6:04 PM +1300, Holden Karau wrote: > Thanks Nick, we deprecated this during the roll over to the new release > managers. I assume this bucket was maintained by someone at databricks so > maybe they can chime in. > >

Re: Please keep s3://spark-related-packages/ alive

2018-02-26 Thread Holden Karau
Thanks Nick, we deprecated this during the roll over to the new release managers. I assume this bucket was maintained by someone at databricks so maybe they can chime in. On Feb 26, 2018 8:57 PM, "Nicholas Chammas" wrote: If you go to the Downloads

Please keep s3://spark-related-packages/ alive

2018-02-26 Thread Nicholas Chammas
If you go to the Downloads page and download Spark 2.2.1, you’ll get a link to an Apache mirror. It didn’t use to be this way. As recently as Spark 2.2.0, downloads were served via CloudFront, which was backed by an S3