It did hang for me too. High RAM consumption during the build. I had to free a
lot of RAM and add swap space just to get it built on my third attempt.
Everything else looks fine. You can download the prebuilt versions from the
Spark homepage to save yourself all this trouble.
Thanks,
Ritesh
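For reference, adding swap on Linux can look roughly like the sketch below. The size (4G) and path (/swapfile) are assumptions, and the commented commands need root; the last line just verifies that swap is active:

```shell
# Rough sketch of adding a swap file (assumed size/path; run as root):
#   fallocate -l 4G /swapfile
#   chmod 600 /swapfile
#   mkswap /swapfile
#   swapon /swapfile
# Then confirm swap shows up:
free -h
```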
I downloaded the latest Spark (1.3.) from GitHub. Then I tried to build it,
first for Scala 2.10 (and Hadoop 2.4):
build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
That resulted in a hang after printing a bunch of lines like
[INFO] Dependency-reduced POM written at
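If the hang is memory-related, one thing to try is giving Maven's JVM more headroom before the build. A minimal sketch, assuming the 2g heap and 512m code cache are reasonable starting points for your machine:

```shell
# Raise Maven's JVM heap and code cache before running build/mvn
# (2g / 512m are assumed starting values, tune for your hardware):
export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
# then rerun:
#   build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
```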
Have you run zinc during the build?
See build/mvn, which installs zinc.
Cheers
On Tue, Jun 2, 2015 at 12:26 PM, Ritesh Kumar Singh
riteshoneinamill...@gmail.com wrote:
Spark 1.3.1, Scala 2.11.6, Maven 3.3.3. I'm behind a proxy and have set my
proxy settings in my Maven settings.
Thanks,
On Tue, Jun 2, 2015 at 2:54 PM, Ted Yu yuzhih...@gmail.com wrote:
Can you give us some more information ?
Such as:
which Spark release you were building
what command you used
Scala version you used
Thanks
On Tue, Jun 2, 2015 at 2:50 PM, Mulugeta Mammo mulugeta.abe...@gmail.com
wrote:
Building Spark is throwing errors, any ideas?
[FATAL] Non-resolvable parent POM: Could not transfer artifact
org.apache:apache:pom:14 from/to central (
http://repo.maven.apache.org/maven2): Error transferring file:
repo.maven.apache.org from
I ran dev/change-version-to-2.11.sh first.
I used the following command but didn't reproduce the error below:
mvn -DskipTests -Phadoop-2.4 -Pyarn -Phive clean package
My env: Maven 3.3.1
Possibly the error was related to proxy settings.
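For anyone hitting the same transfer error behind a proxy, Maven reads its proxy configuration from ~/.m2/settings.xml. A rough sketch of the relevant block, with placeholder host/port values:

```xml
<!-- ~/.m2/settings.xml : host and port below are placeholders -->
<settings>
  <proxies>
    <proxy>
      <id>corp-proxy</id>
      <active>true</active>
      <protocol>http</protocol>
      <host>proxy.example.com</host>
      <port>8080</port>
      <nonProxyHosts>localhost|127.0.0.1</nonProxyHosts>
    </proxy>
  </proxies>
</settings>
```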
FYI
On Tue, Jun 2, 2015 at 3:14 PM, Mulugeta Mammo