I don't think we have a firm contract around that. So far we've never
removed old artifacts, but the ASF has asked us at times to decrease the
size of the binaries we post. At some point in the future we may drop
older ones, since we keep adding new ones.

If downstream projects are depending on our artifacts, I'd say just hold
tight for now. If something does change, those projects might need to
build Spark on their own and host older Hadoop versions themselves, etc.
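In the meantime, downstream tools can at least fail fast instead of breaking mid-launch. Here's a rough sketch (not an official tool) that builds the expected artifact URL from the naming pattern discussed in this thread and checks it with an HTTP HEAD request; the bucket layout and the `artifact_url`/`artifact_exists` helpers are my own assumptions, not part of any release process:

```python
# Sketch: verify a Spark release tarball exists in the S3 bucket before
# depending on it (e.g. from spark-ec2 or similar tooling).
# The naming convention spark-<version>-bin-<build>.tgz is inferred from
# the packages discussed in this thread; it is not a documented contract.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

BUCKET = "https://s3.amazonaws.com/spark-related-packages"

def artifact_url(version, hadoop_build):
    # e.g. ("1.5.0", "hadoop1") -> .../spark-1.5.0-bin-hadoop1.tgz
    return "%s/spark-%s-bin-%s.tgz" % (BUCKET, version, hadoop_build)

def artifact_exists(url, timeout=10):
    # HEAD avoids downloading the (large) tarball just to check presence.
    req = Request(url, method="HEAD")
    try:
        return urlopen(req, timeout=timeout).status == 200
    except (HTTPError, URLError):
        return False
```

A launcher script could call `artifact_exists(artifact_url("1.5.1", "hadoop1"))` up front and print a clear error rather than failing partway through cluster setup.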

On Wed, Oct 7, 2015 at 9:59 AM, Nicholas Chammas <nicholas.cham...@gmail.com
> wrote:

> Thanks guys.
>
> Regarding this earlier question:
>
> More importantly, is there some rough specification for what packages we
> should be able to expect in this S3 bucket with every release?
>
> Is the implied answer that we should continue to expect the same set of
> artifacts for every release for the foreseeable future?
>
> Nick
>
> On Tue, Oct 6, 2015 at 1:13 AM Patrick Wendell <pwend...@gmail.com> wrote:
>
>> The missing artifacts are uploaded now. Things should propagate in the
>> next 24 hours. If there are still issues after that, ping this thread. Thanks!
>>
>> - Patrick
>>
>> On Mon, Oct 5, 2015 at 2:41 PM, Nicholas Chammas <
>> nicholas.cham...@gmail.com> wrote:
>>
>>> Thanks for looking into this Josh.
>>>
>>> On Mon, Oct 5, 2015 at 5:39 PM Josh Rosen <joshro...@databricks.com>
>>> wrote:
>>>
>>>> I'm working on a fix for this right now. I'm planning to re-run a
>>>> modified copy of the release packaging scripts which will emit only the
>>>> missing artifacts (so we won't upload new artifacts with different SHAs for
>>>> the builds which *did* succeed).
>>>>
>>>> I expect to have this finished in the next day or so; I'm currently
>>>> blocked by some infra downtime but expect that to be resolved soon.
>>>>
>>>> - Josh
>>>>
>>>> On Mon, Oct 5, 2015 at 8:46 AM, Nicholas Chammas <
>>>> nicholas.cham...@gmail.com> wrote:
>>>>
>>>>> Blaž said:
>>>>>
>>>>> Also missing is
>>>>> http://s3.amazonaws.com/spark-related-packages/spark-1.5.1-bin-hadoop1.tgz
>>>>> which breaks the spark-ec2 script.
>>>>>
>>>>> This is the package I am referring to in my original email.
>>>>>
>>>>> Nick said:
>>>>>
>>>>> It appears that almost every version of Spark up to and including
>>>>> 1.5.0 has included a --bin-hadoop1.tgz release (e.g.
>>>>> spark-1.5.0-bin-hadoop1.tgz). However, 1.5.1 has no such package.
>>>>>
>>>>> Nick
>>>>>
>>>>> On Mon, Oct 5, 2015 at 3:27 AM Blaž Šnuderl <snud...@gmail.com> wrote:
>>>>>
>>>>>> Also missing is
>>>>>> http://s3.amazonaws.com/spark-related-packages/spark-1.5.1-bin-hadoop1.tgz
>>>>>> which breaks the spark-ec2 script.
>>>>>>
>>>>>> On Mon, Oct 5, 2015 at 5:20 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>>>>>>
>>>>>>> hadoop1 package for Scala 2.10 wasn't in RC1 either:
>>>>>>>
>>>>>>> http://people.apache.org/~pwendell/spark-releases/spark-1.5.1-rc1-bin/
>>>>>>>
>>>>>>> On Sun, Oct 4, 2015 at 5:17 PM, Nicholas Chammas <
>>>>>>> nicholas.cham...@gmail.com> wrote:
>>>>>>>
>>>>>>>> I’m looking here:
>>>>>>>>
>>>>>>>> https://s3.amazonaws.com/spark-related-packages/
>>>>>>>>
>>>>>>>> I believe this is where one set of official packages is published.
>>>>>>>> Please correct me if this is not the case.
>>>>>>>>
>>>>>>>> It appears that almost every version of Spark up to and including
>>>>>>>> 1.5.0 has included a --bin-hadoop1.tgz release (e.g.
>>>>>>>> spark-1.5.0-bin-hadoop1.tgz).
>>>>>>>>
>>>>>>>> However, 1.5.1 has no such package. There is a
>>>>>>>> spark-1.5.1-bin-hadoop1-scala2.11.tgz package, but this is a
>>>>>>>> separate thing. (1.5.0 also has a hadoop1-scala2.11 package.)
>>>>>>>>
>>>>>>>> Was this intentional?
>>>>>>>>
>>>>>>>> More importantly, is there some rough specification for what
>>>>>>>> packages we should be able to expect in this S3 bucket with every 
>>>>>>>> release?
>>>>>>>>
>>>>>>>> This is important for those of us who depend on this publishing
>>>>>>>> venue (e.g. spark-ec2 and related tools).
>>>>>>>>
>>>>>>>> Nick
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>
>>
