I downloaded the 1.0.2 source release from
<http://spark.apache.org/downloads.html> and launched an EC2 cluster with
spark-ec2.
After the cluster finishes launching, I fire up the shell and check the
version:
scala> sc.version
res1: String = 1.0.1
The startup banner also shows the same thing. Hmm...
So I dig around and find that the spark_ec2.py script has the default Spark
version set to 1.0.1.
Derp.
parser.add_option("-v", "--spark-version", default="1.0.1",
    help="Version of Spark to use: 'X.Y.Z' or a specific git hash")
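In the meantime, the default can be overridden at launch time with the
--spark-version flag shown above (key names here are placeholders):

```shell
# Workaround until the release script is fixed: pass the version
# explicitly instead of relying on the baked-in default.
./spark-ec2 -k <keypair> -i <key-file> -v 1.0.2 launch my-cluster
```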
Is there any way to fix the release? It’s a minor issue, but could be very
confusing. And how can we prevent this from happening again?
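One possible guard against a repeat (a hypothetical sketch, not an existing
Spark release tool): a small check that extracts the --spark-version default
from spark_ec2.py and fails the release if it doesn't match the version being
cut.

```python
import re


def default_spark_version(script_text):
    """Extract the default value of the --spark-version option
    from the text of spark_ec2.py."""
    m = re.search(r'--spark-version"\s*,\s*default="([^"]+)"', script_text)
    if m is None:
        raise ValueError("could not find --spark-version default")
    return m.group(1)


def check_release(script_text, release_version):
    """Return True if the script's default matches the release version."""
    return default_spark_version(script_text) == release_version
```

Wired into the release process, check_release() would have flagged the 1.0.1
default before the 1.0.2 artifacts went out.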
Nick