I’m not sure if I’m completely answering your question here, but I’m currently 
working (on OS X) against Hadoop 2.5, and I’ve been using the Spark 1.1 build 
for Hadoop 2.4 without any issues.
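
If it helps, here is a quick sanity check I’d run from spark-shell to confirm 
which Hadoop client version a given Spark package was built against (just a 
sketch; VersionInfo comes from hadoop-common, which spark-shell already has on 
its classpath):

    // Sketch: prints the Hadoop version this Spark build links against,
    // e.g. "2.4.0" for the pre-built-for-Hadoop-2.4 package.
    import org.apache.hadoop.util.VersionInfo
    println(VersionInfo.getVersion)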


On September 11, 2014 at 18:11:46, Haopu Wang (hw...@qilinsoft.com) wrote:

I see the binary packages include Hadoop 1, 2.3, and 2.4.  
Does Spark 1.1.0 support Hadoop 2.5.0 (release announcement at the address below)?  

http://hadoop.apache.org/releases.html#11+August%2C+2014%3A+Release+2.5.0+available
  

-----Original Message-----  
From: Patrick Wendell [mailto:pwend...@gmail.com]  
Sent: Friday, September 12, 2014 8:13 AM  
To: d...@spark.apache.org; user@spark.apache.org  
Subject: Announcing Spark 1.1.0!  

I am happy to announce the availability of Spark 1.1.0! Spark 1.1.0 is  
the second release on the API-compatible 1.X line. It is Spark's  
largest release ever, with contributions from 171 developers!  

This release brings operational and performance improvements in Spark  
core including a new implementation of the Spark shuffle designed for  
very large scale workloads. Spark 1.1 adds significant extensions to  
the newest Spark modules, MLlib and Spark SQL. Spark SQL introduces a  
JDBC server, byte code generation for fast expression evaluation, a  
public types API, JSON support, and other features and optimizations.  
MLlib introduces a new statistics library along with several new  
algorithms and optimizations. Spark 1.1 also builds out Spark's Python  
support and adds new components to the Spark Streaming module.  
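
[Editor's note: as a rough illustration of the JSON support mentioned above, 
here is a minimal Spark SQL sketch against the 1.1 Scala API; the file name, 
schema, and object name are placeholders and are not taken from the release 
notes.]

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object JsonQuickstart {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("JsonQuickstart"))
        val sqlContext = new SQLContext(sc)

        // jsonFile infers a schema from the JSON records (one object per line).
        val people = sqlContext.jsonFile("people.json")
        people.registerTempTable("people")

        // Query the inferred table; "name" and "age" are assumed fields.
        sqlContext.sql("SELECT name FROM people WHERE age > 21")
          .collect()
          .foreach(println)

        sc.stop()
      }
    }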

Visit the release notes [1] to read about the new features, or  
download [2] the release today.  

[1] http://spark.eu.apache.org/releases/spark-release-1-1-0.html  
[2] http://spark.eu.apache.org/downloads.html  

NOTE: SOME ASF DOWNLOAD MIRRORS WILL NOT CONTAIN THE RELEASE FOR SEVERAL HOURS. 
 

Please e-mail me directly about any typos in the release notes or the name 
listing.  

Thanks, and congratulations!  
- Patrick  

---------------------------------------------------------------------  
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org  
For additional commands, e-mail: user-h...@spark.apache.org  
