spark-ec2 1.0.2 creates EC2 cluster at wrong version

2014-08-26 Thread Nicholas Chammas
I downloaded the source code release for 1.0.2 from
http://spark.apache.org/downloads.html and launched an EC2 cluster using
spark-ec2.

After the cluster finishes launching, I fire up the shell and check the
version:

scala> sc.version
res1: String = 1.0.1

The startup banner also shows the same thing. Hmm...

So I dig around and find that the spark_ec2.py script has the default Spark
version set to 1.0.1.

Derp.

  parser.add_option("-v", "--spark-version", default="1.0.1",
      help="Version of Spark to use: 'X.Y.Z' or a specific git hash")

Is there any way to fix the release? It’s a minor issue, but could be very
confusing. And how can we prevent this from happening again?

Nick


Re: spark-ec2 1.0.2 creates EC2 cluster at wrong version

2014-08-26 Thread Shivaram Venkataraman
This is a chicken-and-egg problem in some sense. We can't change the ec2
script until we have made the release and uploaded the binaries -- but once
that is done, we can't update the script that shipped with the release.

I think the model we have supported so far is that you can launch the latest
Spark version from the master branch on GitHub. I guess we could add a step
to the release process that updates the script but doesn't commit the
change? The release managers might be able to add more.
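
Something like this might work as part of the packaging step (the path and
version strings here are just for illustration, not actual release tooling):

  # Bump the default version in the packaged copy of the ec2 script
  # (GNU sed in-place edit); the change is never committed back.
  sed -i 's/default="1\.0\.1"/default="1.0.2"/' ec2/spark_ec2.py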

Thanks
Shivaram


Re: spark-ec2 1.0.2 creates EC2 cluster at wrong version

2014-08-26 Thread Matei Zaharia
This shouldn't be a chicken-and-egg problem, since the script fetches the AMI 
from a known URL. Seems like an issue in publishing this release.


Re: spark-ec2 1.0.2 creates EC2 cluster at wrong version

2014-08-26 Thread Tathagata Das
Yes, this was an oversight on my part. I have opened a JIRA for this.
https://issues.apache.org/jira/browse/SPARK-3242

For the time being, the workaround is to pass the version 1.0.2 explicitly
when invoking the script, as in the example below.
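
For example (the keypair, identity file, and cluster name below are
placeholders; --spark-version is the option shown in the snippet above):

  ./spark-ec2 -k my-keypair -i my-keypair.pem --spark-version=1.0.2 launch my-cluster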

TD

