Looks better now, Ted :)

 

[INFO] ------------------------------------------------------------------------

[INFO] Reactor Summary:

[INFO]

[INFO] Spark Project Parent POM ........................... SUCCESS [ 39.937 s]

[INFO] Spark Project Launcher ............................. SUCCESS [ 44.718 s]

[INFO] Spark Project Networking ........................... SUCCESS [ 11.294 s]

[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  4.720 s]

[INFO] Spark Project Unsafe ............................... SUCCESS [ 10.705 s]

[INFO] Spark Project Core ................................. SUCCESS [02:52 min]

[INFO] Spark Project Bagel ................................ SUCCESS [  5.937 s]

[INFO] Spark Project GraphX ............................... SUCCESS [ 15.977 s]

[INFO] Spark Project Streaming ............................ SUCCESS [ 36.453 s]

[INFO] Spark Project Catalyst ............................. SUCCESS [ 54.381 s]

[INFO] Spark Project SQL .................................. SUCCESS [01:07 min]

[INFO] Spark Project ML Library ........................... SUCCESS [01:22 min]

[INFO] Spark Project Tools ................................ SUCCESS [  2.493 s]

[INFO] Spark Project Hive ................................. SUCCESS [ 58.496 s]

[INFO] Spark Project REPL ................................. SUCCESS [  9.278 s]

[INFO] Spark Project YARN ................................. SUCCESS [ 12.424 s]

[INFO] Spark Project Assembly ............................. SUCCESS [01:51 min]

[INFO] Spark Project External Twitter ..................... SUCCESS [  7.604 s]

[INFO] Spark Project External Flume Sink .................. SUCCESS [  7.580 s]

[INFO] Spark Project External Flume ....................... SUCCESS [  9.526 s]

[INFO] Spark Project External Flume Assembly .............. SUCCESS [  3.163 s]

[INFO] Spark Project External MQTT ........................ SUCCESS [ 31.774 s]

[INFO] Spark Project External MQTT Assembly ............... SUCCESS [  8.698 s]

[INFO] Spark Project External ZeroMQ ...................... SUCCESS [  6.992 s]

[INFO] Spark Project External Kafka ....................... SUCCESS [ 11.487 s]

[INFO] Spark Project Examples ............................. SUCCESS [02:12 min]

[INFO] Spark Project External Kafka Assembly .............. SUCCESS [  9.046 s]

[INFO] Spark Project YARN Shuffle Service ................. SUCCESS [  6.097 s]

[INFO] ------------------------------------------------------------------------

[INFO] BUILD SUCCESS

[INFO] ------------------------------------------------------------------------

[INFO] Total time: 16:16 min

[INFO] Finished at: 2015-11-25T23:34:35+00:00

[INFO] Final Memory: 90M/1312M

[INFO] ------------------------------------------------------------------------

 

Mich Talebzadeh

 

Sybase ASE 15 Gold Medal Award 2008

A Winning Strategy: Running the most Critical Financial Data on ASE 15

http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf

Author of the book "A Practitioner’s Guide to Upgrading to Sybase ASE 15", 
ISBN 978-0-9563693-0-7.

Co-author of "Sybase Transact SQL Guidelines Best Practices", ISBN 
978-0-9759693-0-4

Publications due shortly:

Complex Event Processing in Heterogeneous Environments, ISBN: 978-0-9563693-3-8

Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume one 
out shortly

 

http://talebzadehmich.wordpress.com

 

NOTE: The information in this email is proprietary and confidential. This 
message is for the designated recipient only, if you are not the intended 
recipient, you should destroy it immediately. Any information in this message 
shall not be understood as given or endorsed by Peridale Technology Ltd, its 
subsidiaries or their employees, unless expressly so stated. It is the 
responsibility of the recipient to ensure that this email is virus free, 
therefore neither Peridale Ltd, its subsidiaries nor their employees accept any 
responsibility.

 

From: Mich Talebzadeh [mailto:m...@peridale.co.uk] 
Sent: 25 November 2015 23:08
To: 'Ted Yu' <yuzhih...@gmail.com>
Cc: 'user' <user@spark.apache.org>
Subject: RE: Building Spark without hive libraries

 

Yep.

 

The user hduser was using the wrong version of Maven:

 

hduser@rhes564::/usr/lib/spark> build/mvn -X -Pyarn -Phadoop-2.6 
-Dhadoop.version=2.6.0 -DskipTests clean package > log

Using `mvn` from path: /usr/local/apache-maven/apache-maven-3.3.1/bin/mvn

 

 

[WARNING] Rule 0: org.apache.maven.plugins.enforcer.RequireMavenVersion failed 
with message:

Detected Maven Version: 3.3.1 is not in the allowed range 3.3.3.
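For anyone hitting the same enforcer failure: the fix is simply to make sure 
build/mvn resolves the bundled Maven 3.3.3 rather than an older one found 
first on the PATH. A minimal sketch, assuming the Spark tree lives under 
/usr/lib/spark as in the logs above:

  # put the Maven 3.3.3 that Spark's build script downloaded first on the PATH
  export PATH=/usr/lib/spark/build/apache-maven-3.3.3/bin:$PATH

  mvn -version   # should now report Apache Maven 3.3.3

  # then rerun the build as before
  build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package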

 

Mich Talebzadeh

 

From: Ted Yu [mailto:yuzhih...@gmail.com] 
Sent: 25 November 2015 22:54
To: Mich Talebzadeh <m...@peridale.co.uk>
Cc: user <user@spark.apache.org>
Subject: Re: Building Spark without hive libraries

 

bq. I have to run this as root, otherwise the build does not progress

 

I build Spark as a non-root user and don't have this problem.

 

I suggest you dig a little to see what was stalling the build when it is run 
as a non-root user.
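A common culprit in this situation (an assumption, not something verified in 
this thread) is ownership: if parts of the source tree or the local Maven 
repository were ever written as root, a later non-root build can stall on 
files it cannot read or delete. A quick check along these lines:

  # look for root-owned leftovers in the tree (paths assume /usr/lib/spark)
  find /usr/lib/spark -user root | head

  # check the non-root user's local repository exists and is writable
  ls -ld ~hduser/.m2 ~hduser/.m2/repository

  # if needed, hand the tree back to the build user (run once, as root)
  # chown -R hduser /usr/lib/spark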

 

On Wed, Nov 25, 2015 at 2:48 PM, Mich Talebzadeh <m...@peridale.co.uk> wrote:

Thanks Ted.

 

I have the jar file scala-compiler-2.10.4.jar as well:

 

pwd

/

find ./ -name scala-compiler-2.10.4.jar

./usr/lib/spark/build/zinc-0.3.5.3/lib/scala-compiler-2.10.4.jar

./usr/lib/spark/build/apache-maven-3.3.3/lib/scala-compiler-2.10.4.jar

./root/.m2/repository/org/scala-lang/scala-compiler/2.10.4/scala-compiler-2.10.4.jar

 

It sounds as though, because I am running the Maven command as root, it cannot 
find that file.

 

Do I need to add it somewhere or set it up on the PATH/CLASSPATH?
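Rather than adding it to the PATH or CLASSPATH, one thing worth trying (a 
guess based on how zinc behaves, not something confirmed in this thread) is 
restarting the zinc compile server. Zinc keeps running between builds, so a 
server started under a different user or a different Maven install can hold a 
stale view of where the Scala jars live:

  # shut down any running zinc server (version/path may differ on your machine)
  /usr/lib/spark/build/zinc-0.3.5.3/bin/zinc -shutdown

  # then rerun the build so build/mvn starts a fresh zinc with the right paths
  build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package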

 

 


 

From: Ted Yu [mailto:yuzhih...@gmail.com] 
Sent: 25 November 2015 22:35


To: Mich Talebzadeh <m...@peridale.co.uk>
Cc: user <user@spark.apache.org>
Subject: Re: Building Spark without hive libraries

 

bq. [error] Required file not found: scala-compiler-2.10.4.jar

 

Can you search for the above jar?

 

I found two locally:

 

/home/hbase/.ivy2/cache/org.scala-lang/scala-compiler/jars/scala-compiler-2.10.4.jar

/home/hbase/.m2/repository/org/scala-lang/scala-compiler/2.10.4/scala-compiler-2.10.4.jar

 

On Wed, Nov 25, 2015 at 2:30 PM, Mich Talebzadeh <m...@peridale.co.uk> wrote:

Thanks Ted.

 

I ran Maven in debug mode as follows:

 

build/mvn -X -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean 
package > log

Using `mvn` from path: /usr/lib/spark/build/apache-maven-3.3.3/bin/mvn

 

I still cannot determine the cause of this error.

 

Thanks,

 

Mich

 


 

From: Ted Yu [mailto:yuzhih...@gmail.com] 
Sent: 25 November 2015 21:52
To: Mich Talebzadeh <m...@peridale.co.uk>
Cc: user <user@spark.apache.org>
Subject: Re: Building Spark without hive libraries

 

Take a look at install_zinc() in build/mvn
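install_zinc() is the helper in build/mvn that downloads and unpacks the zinc 
compile server under build/ the first time the script runs. If you want to see 
exactly where zinc and its scala-compiler jar come from, you can inspect it 
directly (paths below assume the /usr/lib/spark tree from the logs):

  # print the install_zinc definition from the build script
  sed -n '/^install_zinc()/,/^}/p' /usr/lib/spark/build/mvn

  # confirm which compiler jars the installed zinc actually ships with
  ls /usr/lib/spark/build/zinc-*/lib/scala-compiler*.jar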

 

Cheers

 

On Wed, Nov 25, 2015 at 1:30 PM, Mich Talebzadeh <m...@peridale.co.uk> wrote:

Hi,

 

I am trying to build Spark from source without the Hive libraries. I am getting:

 

[error] Required file not found: scala-compiler-2.10.4.jar

[error] See zinc -help for information about locating necessary files

 

I have to run this as root, otherwise the build does not progress. Any help is 
appreciated.
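A quick sanity check here (assuming the default build/ layout that build/mvn 
sets up) is to confirm that zinc's own copy of the compiler jar is on disk, 
since that is the file the error complains about:

  # list the scala jars bundled with the zinc that build/mvn installed
  ls /usr/lib/spark/build/zinc-*/lib/scala-*.jar

  # zinc's help text describes the options (e.g. -scala-path) for pointing it
  # at these files explicitly
  /usr/lib/spark/build/zinc-*/bin/zinc -help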

 

 

-bash-3.2#  ./make-distribution.sh --name "hadoop2-without-hive" --tgz 
"-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided"

+++ dirname ./make-distribution.sh

++ cd .

++ pwd

+ SPARK_HOME=/usr/lib/spark

+ DISTDIR=/usr/lib/spark/dist

+ SPARK_TACHYON=false

+ TACHYON_VERSION=0.7.1

+ TACHYON_TGZ=tachyon-0.7.1-bin.tar.gz

+ 
TACHYON_URL=https://github.com/amplab/tachyon/releases/download/v0.7.1/tachyon-0.7.1-bin.tar.gz

+ MAKE_TGZ=false

+ NAME=none

+ MVN=/usr/lib/spark/build/mvn

+ ((  4  ))

+ case $1 in

+ NAME=hadoop2-without-hive

+ shift

+ shift

+ ((  2  ))

+ case $1 in

+ MAKE_TGZ=true

+ shift

+ ((  1  ))

+ case $1 in

+ break

+ '[' -z /usr/java/latest ']'

+ '[' -z /usr/java/latest ']'

++ command -v git

+ '[' ']'

++ command -v /usr/lib/spark/build/mvn

+ '[' '!' /usr/lib/spark/build/mvn ']'

++ /usr/lib/spark/build/mvn help:evaluate -Dexpression=project.version 
-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided

++ grep -v INFO

++ tail -n 1

+ VERSION=1.5.2

++ /usr/lib/spark/build/mvn help:evaluate -Dexpression=scala.binary.version 
-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided

++ grep -v INFO

++ tail -n 1

+ SCALA_VERSION=2.10

++ /usr/lib/spark/build/mvn help:evaluate -Dexpression=hadoop.version 
-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided

++ grep -v INFO

++ tail -n 1

+ SPARK_HADOOP_VERSION=2.6.0

++ /usr/lib/spark/build/mvn help:evaluate -Dexpression=project.activeProfiles 
-pl sql/hive -Pyarn,hadoop-provided,hadoop-2.6,parquet-provided

++ grep -v INFO

++ fgrep --count '<id>hive</id>'

++ echo -n

+ SPARK_HIVE=0

+ '[' hadoop2-without-hive == none ']'

+ echo 'Spark version is 1.5.2'

Spark version is 1.5.2

+ '[' true == true ']'

+ echo 'Making spark-1.5.2-bin-hadoop2-without-hive.tgz'

Making spark-1.5.2-bin-hadoop2-without-hive.tgz

+ '[' false == true ']'

+ echo 'Tachyon Disabled'

Tachyon Disabled

+ cd /usr/lib/spark

+ export 'MAVEN_OPTS=-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m'

+ MAVEN_OPTS='-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m'

+ BUILD_COMMAND=("$MVN" clean package -DskipTests $@)

+ echo -e '\nBuilding with...'

 

Building with...

+ echo -e '$ /usr/lib/spark/build/mvn' clean package -DskipTests 
'-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided\n'

$ /usr/lib/spark/build/mvn clean package -DskipTests 
-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided

 

+ /usr/lib/spark/build/mvn clean package -DskipTests 
-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided

Using `mvn` from path: /usr/lib/spark/build/apache-maven-3.3.3/bin/mvn

[INFO] Scanning for projects...

[INFO] ------------------------------------------------------------------------

[INFO] Reactor Build Order:

[INFO]

[INFO] Spark Project Parent POM

[INFO] Spark Project Launcher

[INFO] Spark Project Networking

[INFO] Spark Project Shuffle Streaming Service

[INFO] Spark Project Unsafe

[INFO] Spark Project Core

[INFO] Spark Project Bagel

[INFO] Spark Project GraphX

[INFO] Spark Project Streaming

[INFO] Spark Project Catalyst

[INFO] Spark Project SQL

[INFO] Spark Project ML Library

[INFO] Spark Project Tools

[INFO] Spark Project Hive

[INFO] Spark Project REPL

[INFO] Spark Project YARN

[INFO] Spark Project Assembly

[INFO] Spark Project External Twitter

[INFO] Spark Project External Flume Sink

[INFO] Spark Project External Flume

[INFO] Spark Project External Flume Assembly

[INFO] Spark Project External MQTT

[INFO] Spark Project External MQTT Assembly

[INFO] Spark Project External ZeroMQ

[INFO] Spark Project External Kafka

[INFO] Spark Project Examples

[INFO] Spark Project External Kafka Assembly

[INFO] Spark Project YARN Shuffle Service

[INFO]

[INFO] ------------------------------------------------------------------------

[INFO] Building Spark Project Parent POM 1.5.2

[INFO] ------------------------------------------------------------------------

[INFO]

[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-parent_2.10 
---

[INFO] Deleting /usr/lib/spark/target

[INFO]

[INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @ 
spark-parent_2.10 ---

[INFO]

[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ 
spark-parent_2.10 ---

[INFO] Add Source directory: /usr/lib/spark/src/main/scala

[INFO] Add Test Source directory: /usr/lib/spark/src/test/scala

[INFO]

[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ 
spark-parent_2.10 ---

[INFO]

[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ 
spark-parent_2.10 ---

[INFO] No sources to compile

[INFO]

[INFO] --- maven-antrun-plugin:1.8:run (create-tmp-dir) @ spark-parent_2.10 ---

[INFO] Executing tasks

 

main:

    [mkdir] Created dir: /usr/lib/spark/target/tmp

[INFO] Executed tasks

[INFO]

[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ 
spark-parent_2.10 ---

[INFO] No sources to compile

[INFO]

[INFO] --- maven-dependency-plugin:2.10:build-classpath (default) @ 
spark-parent_2.10 ---

[INFO]

[INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-parent_2.10 ---

[INFO] Tests are skipped.

[INFO]

[INFO] --- maven-jar-plugin:2.6:test-jar (prepare-test-jar) @ spark-parent_2.10 
---

[INFO] Building jar: /usr/lib/spark/target/spark-parent_2.10-1.5.2-tests.jar

[INFO]

[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ 
spark-parent_2.10 ---

[INFO]

[INFO] --- maven-shade-plugin:2.4.1:shade (default) @ spark-parent_2.10 ---

[INFO] Including org.spark-project.spark:unused:jar:1.0.0 in the shaded jar.

[INFO] Replacing original artifact with shaded artifact.

[INFO]

[INFO] --- maven-source-plugin:2.4:jar-no-fork (create-source-jar) @ 
spark-parent_2.10 ---

[INFO]

[INFO] --- maven-source-plugin:2.4:test-jar-no-fork (create-source-jar) @ 
spark-parent_2.10 ---

[INFO]

[INFO] ------------------------------------------------------------------------

[INFO] Building Spark Project Launcher 1.5.2

[INFO] ------------------------------------------------------------------------

[INFO]

[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-launcher_2.10 
---

[INFO] Deleting /usr/lib/spark/launcher/target

[INFO]

[INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @ 
spark-launcher_2.10 ---

[INFO]

[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ 
spark-launcher_2.10 ---

[INFO] Add Source directory: /usr/lib/spark/launcher/src/main/scala

[INFO] Add Test Source directory: /usr/lib/spark/launcher/src/test/scala

[INFO]

[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ 
spark-launcher_2.10 ---

[INFO]

[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ 
spark-launcher_2.10 ---

[INFO] Using 'UTF-8' encoding to copy filtered resources.

[INFO] skip non existing resourceDirectory 
/usr/lib/spark/launcher/src/main/resources

[INFO] Copying 3 resources

[INFO]

[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ 
spark-launcher_2.10 ---

[INFO] Using zinc server for incremental compilation

[error] Required file not found: scala-compiler-2.10.4.jar

[error] See zinc -help for information about locating necessary files

[INFO] ------------------------------------------------------------------------

[INFO] Reactor Summary:

[INFO]

[INFO] Spark Project Parent POM ........................... SUCCESS [  2.816 s]

[INFO] Spark Project Launcher ............................. FAILURE [  2.885 s]

[INFO] Spark Project Networking ........................... SKIPPED

[INFO] Spark Project Shuffle Streaming Service ............ SKIPPED

[INFO] Spark Project Unsafe ............................... SKIPPED

[INFO] Spark Project Core ................................. SKIPPED

[INFO] Spark Project Bagel ................................ SKIPPED

[INFO] Spark Project GraphX ............................... SKIPPED

[INFO] Spark Project Streaming ............................ SKIPPED

[INFO] Spark Project Catalyst ............................. SKIPPED

[INFO] Spark Project SQL .................................. SKIPPED

[INFO] Spark Project ML Library ........................... SKIPPED

[INFO] Spark Project Tools ................................ SKIPPED

[INFO] Spark Project Hive ................................. SKIPPED

[INFO] Spark Project REPL ................................. SKIPPED

[INFO] Spark Project YARN ................................. SKIPPED

[INFO] Spark Project Assembly ............................. SKIPPED

[INFO] Spark Project External Twitter ..................... SKIPPED

[INFO] Spark Project External Flume Sink .................. SKIPPED

[INFO] Spark Project External Flume ....................... SKIPPED

[INFO] Spark Project External Flume Assembly .............. SKIPPED

[INFO] Spark Project External MQTT ........................ SKIPPED

[INFO] Spark Project External MQTT Assembly ............... SKIPPED

[INFO] Spark Project External ZeroMQ ...................... SKIPPED

[INFO] Spark Project External Kafka ....................... SKIPPED

[INFO] Spark Project Examples ............................. SKIPPED

[INFO] Spark Project External Kafka Assembly .............. SKIPPED

[INFO] Spark Project YARN Shuffle Service ................. SKIPPED

[INFO] ------------------------------------------------------------------------

[INFO] BUILD FAILURE

[INFO] ------------------------------------------------------------------------

[INFO] Total time: 6.643 s

[INFO] Finished at: 2015-11-25T18:20:09+00:00

[INFO] Final Memory: 66M/896M

[INFO] ------------------------------------------------------------------------

[ERROR] Failed to execute goal 
net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on 
project spark-launcher_2.10: Execution scala-compile-first of goal 
net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> 
[Help 1]

[ERROR]

[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.

[ERROR] Re-run Maven using the -X switch to enable full debug logging.

[ERROR]

[ERROR] For more information about the errors and possible solutions, please 
read the following articles:

[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException

[ERROR]

[ERROR] After correcting the problems, you can resume the build with the command

[ERROR]   mvn <goals> -rf :spark-launcher_2.10

 

 


 

 


