bq. [error] Required file not found: scala-compiler-2.10.4.jar

Can you search for the above jar?

I found two locally:

/home/hbase/.ivy2/cache/org.scala-lang/scala-compiler/jars/scala-compiler-2.10.4.jar
/home/hbase/.m2/repository/org/scala-lang/scala-compiler/2.10.4/scala-compiler-2.10.4.jar
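If zinc cannot locate the compiler jar, one workaround is to restart the bundled zinc server with an explicit Scala home whose lib/ directory contains that jar. A rough sketch (the zinc directory name under build/ and the Scala install path are assumptions; adjust both to your layout):

```shell
# Shut down any zinc server already running so new options take effect
build/zinc-0.3.5.3/bin/zinc -shutdown

# Restart zinc, pointing it at a Scala 2.10.4 install whose lib/ directory
# contains scala-compiler-2.10.4.jar (the path here is an assumption)
build/zinc-0.3.5.3/bin/zinc -start -scala-home /usr/share/scala-2.10.4
```

Alternatively, removing the build/zinc-* and build/scala-* directories and re-running build/mvn should force install_zinc() to download fresh copies.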

On Wed, Nov 25, 2015 at 2:30 PM, Mich Talebzadeh <m...@peridale.co.uk>
wrote:

> Thanks Ted.
>
>
>
> I ran maven in debug mode as follows
>
>
>
> *build/mvn -X -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean
> package > log*
>
> Using `mvn` from path: /usr/lib/spark/build/apache-maven-3.3.3/bin/mvn
>
>
>
> Still cannot determine the cause of this error.
>
>
>
> Thanks,
>
>
>
> Mich
>
>
>
> NOTE: The information in this email is proprietary and confidential. This
> message is for the designated recipient only; if you are not the intended
> recipient, you should destroy it immediately. Any information in this
> message shall not be understood as given or endorsed by Peridale Technology
> Ltd, its subsidiaries or their employees, unless expressly so stated. It is
> the responsibility of the recipient to ensure that this email is virus
> free; neither Peridale Ltd, its subsidiaries nor their employees
> accept any responsibility.
>
>
>
> *From:* Ted Yu [mailto:yuzhih...@gmail.com]
> *Sent:* 25 November 2015 21:52
> *To:* Mich Talebzadeh <m...@peridale.co.uk>
> *Cc:* user <user@spark.apache.org>
> *Subject:* Re: Building Spark without hive libraries
>
>
>
> Take a look at install_zinc() in build/mvn
>
>
>
> Cheers
>
>
>
> On Wed, Nov 25, 2015 at 1:30 PM, Mich Talebzadeh <m...@peridale.co.uk>
> wrote:
>
> Hi,
>
>
>
> I am trying to build Spark from source, without Hive. I am
> getting
>
>
>
> [error] Required file not found: scala-compiler-2.10.4.jar
>
> [error] See zinc -help for information about locating necessary files
>
>
>
> I have to run this as root; otherwise the build does not progress. Any
> help is appreciated.
>
>
>
>
>
> -bash-3.2#  ./make-distribution.sh --name "hadoop2-without-hive" --tgz
> "-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided"
>
> +++ dirname ./make-distribution.sh
>
> ++ cd .
>
> ++ pwd
>
> + SPARK_HOME=/usr/lib/spark
>
> + DISTDIR=/usr/lib/spark/dist
>
> + SPARK_TACHYON=false
>
> + TACHYON_VERSION=0.7.1
>
> + TACHYON_TGZ=tachyon-0.7.1-bin.tar.gz
>
> + TACHYON_URL=
> https://github.com/amplab/tachyon/releases/download/v0.7.1/tachyon-0.7.1-bin.tar.gz
>
> + MAKE_TGZ=false
>
> + NAME=none
>
> + MVN=/usr/lib/spark/build/mvn
>
> + ((  4  ))
>
> + case $1 in
>
> + NAME=hadoop2-without-hive
>
> + shift
>
> + shift
>
> + ((  2  ))
>
> + case $1 in
>
> + MAKE_TGZ=true
>
> + shift
>
> + ((  1  ))
>
> + case $1 in
>
> + break
>
> + '[' -z /usr/java/latest ']'
>
> + '[' -z /usr/java/latest ']'
>
> ++ command -v git
>
> + '[' ']'
>
> ++ command -v /usr/lib/spark/build/mvn
>
> + '[' '!' /usr/lib/spark/build/mvn ']'
>
> ++ /usr/lib/spark/build/mvn help:evaluate -Dexpression=project.version
> -Pyarn,hadoop-provided,hadoop-2.6,parquet-provided
>
> ++ grep -v INFO
>
> ++ tail -n 1
>
> + VERSION=1.5.2
>
> ++ /usr/lib/spark/build/mvn help:evaluate
> -Dexpression=scala.binary.version
> -Pyarn,hadoop-provided,hadoop-2.6,parquet-provided
>
> ++ grep -v INFO
>
> ++ tail -n 1
>
> + SCALA_VERSION=2.10
>
> ++ /usr/lib/spark/build/mvn help:evaluate -Dexpression=hadoop.version
> -Pyarn,hadoop-provided,hadoop-2.6,parquet-provided
>
> ++ grep -v INFO
>
> ++ tail -n 1
>
> + SPARK_HADOOP_VERSION=2.6.0
>
> ++ /usr/lib/spark/build/mvn help:evaluate
> -Dexpression=project.activeProfiles -pl sql/hive
> -Pyarn,hadoop-provided,hadoop-2.6,parquet-provided
>
> ++ grep -v INFO
>
> ++ fgrep --count '<id>hive</id>'
>
> ++ echo -n
>
> + SPARK_HIVE=0
>
> + '[' hadoop2-without-hive == none ']'
>
> + echo 'Spark version is 1.5.2'
>
> Spark version is 1.5.2
>
> + '[' true == true ']'
>
> + echo 'Making spark-1.5.2-bin-hadoop2-without-hive.tgz'
>
> Making spark-1.5.2-bin-hadoop2-without-hive.tgz
>
> + '[' false == true ']'
>
> + echo 'Tachyon Disabled'
>
> Tachyon Disabled
>
> + cd /usr/lib/spark
>
> + export 'MAVEN_OPTS=-Xmx2g -XX:MaxPermSize=512M
> -XX:ReservedCodeCacheSize=512m'
>
> + MAVEN_OPTS='-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m'
>
> + BUILD_COMMAND=("$MVN" clean package -DskipTests $@)
>
> + echo -e '\nBuilding with...'
>
>
>
> Building with...
>
> + echo -e '$ /usr/lib/spark/build/mvn' clean package -DskipTests
> '-Pyarn,hadoop-provided,hadoop-2.6,parquet-provided\n'
>
> $ /usr/lib/spark/build/mvn clean package -DskipTests
> -Pyarn,hadoop-provided,hadoop-2.6,parquet-provided
>
>
>
> + /usr/lib/spark/build/mvn clean package -DskipTests
> -Pyarn,hadoop-provided,hadoop-2.6,parquet-provided
>
> Using `mvn` from path: /usr/lib/spark/build/apache-maven-3.3.3/bin/mvn
>
> [INFO] Scanning for projects...
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO] Reactor Build Order:
>
> [INFO]
>
> [INFO] Spark Project Parent POM
>
> [INFO] Spark Project Launcher
>
> [INFO] Spark Project Networking
>
> [INFO] Spark Project Shuffle Streaming Service
>
> [INFO] Spark Project Unsafe
>
> [INFO] Spark Project Core
>
> [INFO] Spark Project Bagel
>
> [INFO] Spark Project GraphX
>
> [INFO] Spark Project Streaming
>
> [INFO] Spark Project Catalyst
>
> [INFO] Spark Project SQL
>
> [INFO] Spark Project ML Library
>
> [INFO] Spark Project Tools
>
> [INFO] Spark Project Hive
>
> [INFO] Spark Project REPL
>
> [INFO] Spark Project YARN
>
> [INFO] Spark Project Assembly
>
> [INFO] Spark Project External Twitter
>
> [INFO] Spark Project External Flume Sink
>
> [INFO] Spark Project External Flume
>
> [INFO] Spark Project External Flume Assembly
>
> [INFO] Spark Project External MQTT
>
> [INFO] Spark Project External MQTT Assembly
>
> [INFO] Spark Project External ZeroMQ
>
> [INFO] Spark Project External Kafka
>
> [INFO] Spark Project Examples
>
> [INFO] Spark Project External Kafka Assembly
>
> [INFO] Spark Project YARN Shuffle Service
>
> [INFO]
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO] Building Spark Project Parent POM 1.5.2
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO]
>
> [INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @
> spark-parent_2.10 ---
>
> [INFO] Deleting /usr/lib/spark/target
>
> [INFO]
>
> [INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @
> spark-parent_2.10 ---
>
> [INFO]
>
> [INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @
> spark-parent_2.10 ---
>
> [INFO] Add Source directory: /usr/lib/spark/src/main/scala
>
> [INFO] Add Test Source directory: /usr/lib/spark/src/test/scala
>
> [INFO]
>
> [INFO] --- maven-remote-resources-plugin:1.5:process (default) @
> spark-parent_2.10 ---
>
> [INFO]
>
> [INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @
> spark-parent_2.10 ---
>
> [INFO] No sources to compile
>
> [INFO]
>
> [INFO] --- maven-antrun-plugin:1.8:run (create-tmp-dir) @
> spark-parent_2.10 ---
>
> [INFO] Executing tasks
>
>
>
> main:
>
>     [mkdir] Created dir: /usr/lib/spark/target/tmp
>
> [INFO] Executed tasks
>
> [INFO]
>
> [INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first)
> @ spark-parent_2.10 ---
>
> [INFO] No sources to compile
>
> [INFO]
>
> [INFO] --- maven-dependency-plugin:2.10:build-classpath (default) @
> spark-parent_2.10 ---
>
> [INFO]
>
> [INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-parent_2.10 ---
>
> [INFO] Tests are skipped.
>
> [INFO]
>
> [INFO] --- maven-jar-plugin:2.6:test-jar (prepare-test-jar) @
> spark-parent_2.10 ---
>
> [INFO] Building jar:
> /usr/lib/spark/target/spark-parent_2.10-1.5.2-tests.jar
>
> [INFO]
>
> [INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @
> spark-parent_2.10 ---
>
> [INFO]
>
> [INFO] --- maven-shade-plugin:2.4.1:shade (default) @ spark-parent_2.10 ---
>
> [INFO] Including org.spark-project.spark:unused:jar:1.0.0 in the shaded
> jar.
>
> [INFO] Replacing original artifact with shaded artifact.
>
> [INFO]
>
> [INFO] --- maven-source-plugin:2.4:jar-no-fork (create-source-jar) @
> spark-parent_2.10 ---
>
> [INFO]
>
> [INFO] --- maven-source-plugin:2.4:test-jar-no-fork (create-source-jar) @
> spark-parent_2.10 ---
>
> [INFO]
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO] Building Spark Project Launcher 1.5.2
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO]
>
> [INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @
> spark-launcher_2.10 ---
>
> [INFO] Deleting /usr/lib/spark/launcher/target
>
> [INFO]
>
> [INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @
> spark-launcher_2.10 ---
>
> [INFO]
>
> [INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @
> spark-launcher_2.10 ---
>
> [INFO] Add Source directory: /usr/lib/spark/launcher/src/main/scala
>
> [INFO] Add Test Source directory: /usr/lib/spark/launcher/src/test/scala
>
> [INFO]
>
> [INFO] --- maven-remote-resources-plugin:1.5:process (default) @
> spark-launcher_2.10 ---
>
> [INFO]
>
> [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @
> spark-launcher_2.10 ---
>
> [INFO] Using 'UTF-8' encoding to copy filtered resources.
>
> [INFO] skip non existing resourceDirectory
> /usr/lib/spark/launcher/src/main/resources
>
> [INFO] Copying 3 resources
>
> [INFO]
>
> [INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @
> spark-launcher_2.10 ---
>
> [INFO] Using zinc server for incremental compilation
>
> [error] Required file not found: scala-compiler-2.10.4.jar
>
> [error] See zinc -help for information about locating necessary files
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO] Reactor Summary:
>
> [INFO]
>
> [INFO] Spark Project Parent POM ........................... SUCCESS [
> 2.816 s]
>
> [INFO] Spark Project Launcher ............................. FAILURE [
> 2.885 s]
>
> [INFO] Spark Project Networking ........................... SKIPPED
>
> [INFO] Spark Project Shuffle Streaming Service ............ SKIPPED
>
> [INFO] Spark Project Unsafe ............................... SKIPPED
>
> [INFO] Spark Project Core ................................. SKIPPED
>
> [INFO] Spark Project Bagel ................................ SKIPPED
>
> [INFO] Spark Project GraphX ............................... SKIPPED
>
> [INFO] Spark Project Streaming ............................ SKIPPED
>
> [INFO] Spark Project Catalyst ............................. SKIPPED
>
> [INFO] Spark Project SQL .................................. SKIPPED
>
> [INFO] Spark Project ML Library ........................... SKIPPED
>
> [INFO] Spark Project Tools ................................ SKIPPED
>
> [INFO] Spark Project Hive ................................. SKIPPED
>
> [INFO] Spark Project REPL ................................. SKIPPED
>
> [INFO] Spark Project YARN ................................. SKIPPED
>
> [INFO] Spark Project Assembly ............................. SKIPPED
>
> [INFO] Spark Project External Twitter ..................... SKIPPED
>
> [INFO] Spark Project External Flume Sink .................. SKIPPED
>
> [INFO] Spark Project External Flume ....................... SKIPPED
>
> [INFO] Spark Project External Flume Assembly .............. SKIPPED
>
> [INFO] Spark Project External MQTT ........................ SKIPPED
>
> [INFO] Spark Project External MQTT Assembly ............... SKIPPED
>
> [INFO] Spark Project External ZeroMQ ...................... SKIPPED
>
> [INFO] Spark Project External Kafka ....................... SKIPPED
>
> [INFO] Spark Project Examples ............................. SKIPPED
>
> [INFO] Spark Project External Kafka Assembly .............. SKIPPED
>
> [INFO] Spark Project YARN Shuffle Service ................. SKIPPED
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO] BUILD FAILURE
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO] Total time: 6.643 s
>
> [INFO] Finished at: 2015-11-25T18:20:09+00:00
>
> [INFO] Final Memory: 66M/896M
>
> [INFO]
> ------------------------------------------------------------------------
>
> [ERROR] Failed to execute goal
> net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first)
> on project spark-launcher_2.10: Execution scala-compile-first of goal
> net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed
> -> [Help 1]
>
> [ERROR]
>
> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e switch.
>
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>
> [ERROR]
>
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
>
> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
>
> [ERROR]
>
> [ERROR] After correcting the problems, you can resume the build with the
> command
>
> [ERROR]   mvn <goals> -rf :spark-launcher_2.10
>
>
>
>
>
>
>
>
>
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
