You can use the --spark-version argument to spark-ec2 to specify a git hash
corresponding to the version you want to check out. If you made changes that
are not in the master repository, you can use --spark-git-repo to specify
the git repository to pull Spark down from, which contains the specified
commit.
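For example, a launch command combining the two flags might look like the
following sketch (the key pair, repo URL, and commit hash are placeholders,
not values from this thread):

```shell
# Sketch of a spark-ec2 launch against a custom fork; the key name,
# repo URL, and commit hash are placeholders for your own values.
./spark-ec2 --key-pair=mykey --identity-file=mykey.pem \
  --spark-git-repo=https://github.com/yourname/spark \
  --spark-version=<git-hash-of-your-commit> \
  launch my-spark-cluster
```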
Thanks Daniil! If I use --spark-git-repo, is there a way to specify the mvn
command-line parameters? Like the following:
mvn -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -DskipTests clean package
mvn -Pyarn -Phadoop-2.3 -Phbase-hadoop2 -Dhadoop.version=2.3.0 -DskipTests
clean package
I modified the pom files in my private repo to use those parameters as
defaults to solve the problem. But after the deployment, I found the
installed version is not the customized version but an official one. Can
anyone give a hint on how spark-ec2 works with Spark from private repos?
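For reference, one (hypothetical) way to bake those mvn flags into the build
is to mark the relevant profile as active by default and set the properties
in the pom; the snippet below is only a sketch, and the exact profile and
property names must match your own pom:

```xml
<!-- Sketch only: activates the yarn profile without passing -Pyarn,
     and sets the properties the command-line flags above would set. -->
<profile>
  <id>yarn</id>
  <activation>
    <activeByDefault>true</activeByDefault>
  </activation>
</profile>
<properties>
  <hadoop.version>2.3.0</hadoop.version>
  <skipTests>true</skipTests>
</properties>
```

Note also that, per the first message in this thread, spark-ec2 builds from
the given repo when --spark-version is a git hash; if a plain release
version is passed instead, it may fetch an official prebuilt package, which
could be one explanation for seeing the official version after deployment.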
--
Thanks for the help!
Hadoop version: 2.3.0
HBase version: 0.98.1
Using Python to read/write data from/to HBase.
The only change over the official Spark 1.1.0 is the pom file under examples.
Compilation:
spark: mvn -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -DskipTests clean package
Hi,
Can you post what the error looks like?
Sameer F.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Usage-of-spark-ec2-how-to-deploy-a-revised-version-of-spark-1-1-0-tp16943p16963.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.