On 8 Mar 2016, at 07:23, Lu, Yingqi wrote:
> Thank you for the quick reply. I am very new to Maven and always use the
> default settings. Can you please be a little more specific in the instructions?
> I think all the jar files from the Hadoop build are [...]
Hi there,

You may try using Nexus to set up a local Maven repository; I think this link would be helpful:
http://www.sonatype.org/nexus/2015/02/27/setup-local-nexus-repository-and-deploy-war-file-from-maven/
Once the repository is set up, you can use the maven-deploy-plugin to deploy your [...]
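As a rough sketch of that deploy step (the repository id, URL, and jar name below are hypothetical placeholders, not from the thread — match them to your own Nexus instance and the corresponding `<server>` entry in ~/.m2/settings.xml):

```shell
# Publish one locally built Hadoop jar to a local Nexus snapshot repository
# using the maven-deploy-plugin's deploy-file goal. Repeat per jar you need.
mvn deploy:deploy-file \
  -Dfile=hadoop-common-3.0.0-SNAPSHOT.jar \
  -DgroupId=org.apache.hadoop \
  -DartifactId=hadoop-common \
  -Dversion=3.0.0-SNAPSHOT \
  -Dpackaging=jar \
  -DrepositoryId=local-nexus \
  -Durl=http://localhost:8081/nexus/content/repositories/snapshots
```

The coordinates (groupId/artifactId/version) should match what Spark's pom resolves, so keeping the upstream org.apache.hadoop coordinates with your own version string is the usual choice.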
I think the first step is to publish your in-house-built Hadoop jars to your local Maven or Ivy repo, and then change the Spark build profiles, e.g. -Phadoop-2.x (you can use 2.7, or you may have to change the pom file if you hit jar conflicts) with -Dhadoop.version=3.0.0-SNAPSHOT, to build [...]
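Putting those flags together, a build invocation might look like the following (a sketch only — the exact -Phadoop-* profile names vary between Spark releases, so check the profiles defined in Spark's pom.xml):

```shell
# Build Spark against an in-house Hadoop snapshot already published to a
# reachable Maven repository. The hadoop-2.7 profile name is an assumption.
./build/mvn -Phadoop-2.7 -Dhadoop.version=3.0.0-SNAPSHOT -DskipTests clean package
```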
Thank you for the quick reply. I am very new to Maven and always use the default settings. Can you please be a little more specific in the instructions? I think all the jar files from the Hadoop build are located at Hadoop-3.0.0-SNAPSHOT/share/hadoop. Which ones do I need to use to compile Spark, and [...]
I think you can set up your own Maven repository and deploy your modified Hadoop binary jars there under your modified version number. Then you can add your repository to the Spark pom.xml and use
mvn -Dhadoop.version= [...]
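For the pom.xml change, the added repository entry could look like this (the id and URL are illustrative assumptions for a default local Nexus install, not values from the thread):

```xml
<!-- Added under the <repositories> section of Spark's pom.xml so the build
     can resolve the custom Hadoop snapshot artifacts. -->
<repository>
  <id>local-nexus</id>
  <url>http://localhost:8081/nexus/content/repositories/snapshots</url>
  <snapshots>
    <enabled>true</enabled>
  </snapshots>
</repository>
```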
fightf...@163.com
From: Lu, Yingqi
Date: 2016-03-08 15:09
To: