oh, i see, pwendell did a patch to the release branch to make the release version the default for --spark-version

best,


matt

On 09/03/2014 01:30 PM, Shivaram Venkataraman wrote:
Actually the circular dependency doesn't depend on the spark-ec2 scripts
-- the scripts contain download links for many Spark versions, and you
can configure which one should be used.

Shivaram


On Wed, Sep 3, 2014 at 10:22 AM, Matthew Farrellee
<m...@redhat.com> wrote:

    that's not a bad idea. it would also break the circular dep in
    versions that results in spark X's ec2 script installing spark X-1
    by default.

    best,


    matt


    On 09/03/2014 01:17 PM, Shivaram Venkataraman wrote:

        The spark-ec2 repository isn't a part of Mesos. Back in the day,
        Spark was also hosted in the Mesos GitHub organization, so we
        put the scripts used by Spark under the same organization.

        FWIW I don't think these scripts belong in the Spark repository.
        They are helper scripts that set up EC2 clusters with different
        components like HDFS, Spark, Tachyon, etc. Also, one of the
        motivations for creating this repository was the ability to
        change these scripts without requiring a new Spark release or a
        new AMI.

        We can move the repository to a different GitHub organization
        like AMPLab if that makes sense.

        Thanks
        Shivaram


        On Wed, Sep 3, 2014 at 10:06 AM, Nicholas Chammas
        <nicholas.cham...@gmail.com> wrote:

            Spawned by this discussion
            <https://github.com/apache/spark/pull/1120#issuecomment-54305831>.

            See these 2 lines in spark_ec2.py:

                 - spark_ec2 L42
                 <https://github.com/apache/spark/blob/6a72a36940311fcb3429bd34c8818bc7d513115c/ec2/spark_ec2.py#L42>
                 - spark_ec2 L566
                 <https://github.com/apache/spark/blob/6a72a36940311fcb3429bd34c8818bc7d513115c/ec2/spark_ec2.py#L566>
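For context, the version coupling being discussed can be sketched roughly as follows. This is a hypothetical illustration of the hardcoded-default pattern, not the actual contents of spark_ec2.py; the names, versions, and function are made up:

```python
# Hypothetical sketch of the pattern under discussion, not real
# spark_ec2.py code. A hardcoded default means the script shipped with
# Spark X tends to install an older release (X-1) by default, since the
# constant is only bumped after a release exists to point at.
DEFAULT_SPARK_VERSION = "1.0.0"

# Versions the EC2 scripts know how to install (illustrative values).
SUPPORTED_SPARK_VERSIONS = {"0.9.1", "1.0.0"}


def resolve_spark_version(requested=None):
    """Return the Spark version the cluster will install.

    Falls back to the hardcoded default when the user passes nothing,
    which is the behavior the release-branch patch mentioned above
    would change by templating in the release version.
    """
    version = requested or DEFAULT_SPARK_VERSION
    if version not in SUPPORTED_SPARK_VERSIONS:
        raise ValueError("Unsupported Spark version: %s" % version)
    return version
```

Under this sketch, patching the release branch so `DEFAULT_SPARK_VERSION` equals the release version removes the off-by-one default without touching the separate scripts repository.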



            Why does the spark-ec2 script depend on stuff in the Mesos
            repo? Should
            they be moved to the Spark repo?

            Nick






---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
