You can use the prebuilt version that is built against Hadoop 2.4; it is compatible with Hadoop 2.5, so there is no need to build Spark yourself.
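As a minimal sketch of how the prebuilt package would be pointed at an existing HDFS cluster (the paths below are hypothetical examples, not from the thread): Spark only needs `HADOOP_CONF_DIR` set so it can read `core-site.xml`/`hdfs-site.xml` and resolve `hdfs://` paths against your NameNode.

```shell
# Hypothetical install locations -- adjust to your environment.
export SPARK_HOME=/opt/spark-1.2.0-bin-hadoop2.4
export HADOOP_CONF_DIR=/etc/hadoop/conf   # directory holding core-site.xml and hdfs-site.xml

# With HADOOP_CONF_DIR set, spark-shell resolves HDFS paths via fs.defaultFS, e.g.:
#   $SPARK_HOME/bin/spark-shell
#   scala> sc.textFile("hdfs:///user/yourname/input.txt").count()
echo "SPARK_HOME=$SPARK_HOME"
echo "HADOOP_CONF_DIR=$HADOOP_CONF_DIR"
```

No rebuild is required for this to work; the prebuilt binaries speak the HDFS wire protocol, which is compatible across 2.4 and 2.5.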
From: Siddharth Ubale
Date: 2015-01-30 15:50
To: user@spark.apache.org
Subject: Hi: hadoop 2.5 for spark

Hi,

I am a beginner with Apache Spark. Can anyone let me know whether it is mandatory to build Spark against the Hadoop version I am using, or whether I can use a prebuilt package with my existing HDFS root folder? I am using Hadoop 2.5.0 and want to use Apache Spark 1.2.0 with it. I could see a prebuilt version for 2.4 and above in the downloads section of the Spark homepage.

Siddharth Ubale,
Synchronized Communications
#43, Velankani Tech Park, Block No. II, 3rd Floor, Electronic City Phase I, Bangalore – 560 100
Tel: +91 80 3202 4060
Web: www.syncoms.com
London | Bangalore | Orlando