Re: How to compile Spark with private build of Hadoop

2016-03-08 Thread Steve Loughran
On 8 Mar 2016, at 07:23, Lu, Yingqi wrote: Thank you for the quick reply. I am very new to Maven and always use the default settings. Can you please be a little more specific in the instructions? I think all the jar files from the Hadoop build are ...

Re: RE: How to compile Spark with private build of Hadoop

2016-03-08 Thread fightf...@163.com
Hi there, you may try to use Nexus to set up a local Maven repository. I think this link would be helpful: http://www.sonatype.org/nexus/2015/02/27/setup-local-nexus-repository-and-deploy-war-file-from-maven/ Once the repository is set up, you can use the maven-deploy-plugin to deploy your ...
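
A rough sketch of that suggestion: once a Nexus (or similar) repository is running, a privately built Hadoop jar can be pushed to it with the maven-deploy-plugin. The repository URL, repositoryId, and file path below are illustrative placeholders, not values from the original thread:

    # Deploy a locally built hadoop-common jar into a private snapshot repository.
    # The URL, repositoryId, and jar path are placeholders for your own setup.
    mvn deploy:deploy-file \
      -DgroupId=org.apache.hadoop \
      -DartifactId=hadoop-common \
      -Dversion=3.0.0-SNAPSHOT \
      -Dpackaging=jar \
      -Dfile=hadoop-common-3.0.0-SNAPSHOT.jar \
      -Durl=http://your-nexus-host:8081/nexus/content/repositories/snapshots \
      -DrepositoryId=my-nexus-snapshots

The repositoryId should match a <server> entry with credentials in ~/.m2/settings.xml.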

Re: How to compile Spark with private build of Hadoop

2016-03-08 Thread Saisai Shao
I think the first step is to publish your in-house-built Hadoop jars to your local Maven or Ivy repo, and then change the Spark build profiles, e.g. -Phadoop-2.x (you could use 2.7, or you may have to change the pom file if you hit jar conflicts) and -Dhadoop.version=3.0.0-SNAPSHOT, to build ...
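
A minimal sketch of those two steps, assuming the private Hadoop tree is itself a Maven build (so mvn install can publish its jars to the local ~/.m2 repository) and assuming the hadoop-2.7 profile is the closest match; the paths and profile name below are placeholders and may differ for your Spark version:

    # 1. From the private Hadoop source tree: install the in-house jars
    #    into the local Maven repository (~/.m2/repository).
    cd /path/to/hadoop-3.0.0-SNAPSHOT-src
    mvn install -DskipTests

    # 2. From the Spark source tree: build against that Hadoop version.
    cd /path/to/spark
    ./build/mvn -Phadoop-2.7 -Dhadoop.version=3.0.0-SNAPSHOT -DskipTests clean package

If the hadoop-2.x profile pulls in conflicting dependency versions, changing the pom file as mentioned above is the usual workaround.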

RE: How to compile Spark with private build of Hadoop

2016-03-08 Thread Lu, Yingqi
Thank you for the quick reply. I am very new to Maven and always use the default settings. Can you please be a little more specific in the instructions? I think all the jar files from the Hadoop build are located at Hadoop-3.0.0-SNAPSHOT/share/hadoop. Which ones do I need to use to compile Spark, and ...

Re: How to compile Spark with private build of Hadoop

2016-03-07 Thread fightf...@163.com
I think you can set up your own Maven repository and deploy your modified Hadoop binary jars under your own version number. Then you can add that repository to Spark's pom.xml and use mvn -Dhadoop.version= ...
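
A hedged sketch of the repository addition being described; the id and url are placeholders for wherever the modified Hadoop artifacts were deployed, and the <repositories> block could equally live in a profile in ~/.m2/settings.xml rather than in Spark's pom.xml:

    <!-- Illustrative placeholder only: make the privately deployed Hadoop
         artifacts resolvable when building Spark. -->
    <repositories>
      <repository>
        <id>my-private-repo</id>
        <url>http://your-nexus-host:8081/nexus/content/repositories/snapshots</url>
        <snapshots>
          <enabled>true</enabled>
        </snapshots>
      </repository>
    </repositories>

With that in place, the Spark build is pointed at the modified artifacts by passing the custom version number via -Dhadoop.version on the mvn command line, as described above.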