https://github.com/apache/spark/blob/fd8d283eeb98e310b1e85ef8c3a8af9e547ab5e0/ec2/spark_ec2.py#L162-L164

Is there any reason we shouldn't update the default Hadoop major version in
spark-ec2 to 2?
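
For context, the linked lines define the option that controls this default. Assuming the flag name shown in that source, users can already override it at launch time; the question is only about what the default should be. A sketch of the override (cluster name, key pair, and identity file are placeholders):

```shell
# Launch a cluster with Hadoop 2 instead of the current default of 1.
# --hadoop-major-version is the option defined at the lines linked above;
# --key-pair/--identity-file are standard spark-ec2 credentials flags.
./spark-ec2 \
    --key-pair=my-key \
    --identity-file=my-key.pem \
    --hadoop-major-version=2 \
    launch my-cluster
```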

Nick