Hi, Chester,

Please check your pom.xml. Your java.version and maven.version might not
match your build environment.

Or pass -Denforcer.skip=true on the command line to skip the check.
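
For reference, the version checks live in spark-parent's pom under the
maven-enforcer-plugin; the configuration typically looks like the fragment
below (the exact minimum versions are illustrative, so check your checkout):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>enforce-versions</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <!-- build fails fast if the local toolchain is older than these -->
          <requireMavenVersion>
            <version>3.3.3</version>
          </requireMavenVersion>
          <requireJavaVersion>
            <version>1.7</version>
          </requireJavaVersion>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```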

Good luck,

Xiao Li

2015-10-08 10:35 GMT-07:00 Chester Chen <ches...@alpinenow.com>:

> Question regarding branch-1.5  build.
>
> Noticed that the Spark project no longer publishes the spark-assembly jar.
> We have to build it ourselves (until we find a way to not depend on the
> assembly jar).
>
>
> I checked out the v1.5.1 release tag and used sbt to build it,
> and I get the following error:
>
> build/sbt -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive
> -Phive-thriftserver -DskipTests clean package assembly
>
>
> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
> [warn] ::          UNRESOLVED DEPENDENCIES         ::
> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
> [warn] :: org.apache.spark#spark-network-common_2.10;1.5.1: configuration
> not public in org.apache.spark#spark-network-common_2.10;1.5.1: 'test'. It
> was required from org.apache.spark#spark-network-shuffle_2.10;1.5.1 test
> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
> [warn]
> [warn] Note: Unresolved dependencies path:
> [warn] org.apache.spark:spark-network-common_2.10:1.5.1
> ((com.typesafe.sbt.pom.MavenHelper) MavenHelper.scala#L76)
> [warn]  +- org.apache.spark:spark-network-shuffle_2.10:1.5.1
> [info] Packaging
> /Users/chester/projects/alpine/apache/spark/launcher/target/scala-2.10/spark-launcher_2.10-1.5.1.jar
> ...
> [info] Done packaging.
> [warn] four warnings found
> [warn] Note: Some input files use unchecked or unsafe operations.
> [warn] Note: Recompile with -Xlint:unchecked for details.
> [warn] No main class detected
> [info] Packaging
> /Users/chester/projects/alpine/apache/spark/external/flume-sink/target/scala-2.10/spark-streaming-flume-sink_2.10-1.5.1.jar
> ...
> [info] Done packaging.
> sbt.ResolveException: unresolved dependency:
> org.apache.spark#spark-network-common_2.10;1.5.1: configuration not public
> in org.apache.spark#spark-network-common_2.10;1.5.1: 'test'. It was
> required from org.apache.spark#spark-network-shuffle_2.10;1.5.1 test
>
>
> Somehow network-shuffle can't find the test jar it needs (not sure why the
> test configuration is still required, even though -DskipTests is specified).
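
One cheap thing to try for this kind of "configuration not public" failure:
it is sometimes stale sbt/ivy resolution metadata rather than a genuinely
missing artifact, so clearing the cached Spark modules before rebuilding is a
reasonable first experiment (paths below assume the default ivy layout):

```shell
# Assumption: the failure is cached-metadata staleness, not a real missing
# artifact. Drop the cached org.apache.spark modules so sbt re-resolves them
# from the freshly built local modules on the next run.
rm -rf "$HOME/.ivy2/cache/org.apache.spark"
rm -rf "$HOME/.ivy2/local/org.apache.spark"
echo "cleared cached org.apache.spark artifacts"
```

Then re-run the same build/sbt command.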
>
> I also tried the maven command (without the assembly goal), and the build
> failed as well:
>
> mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver
> -DskipTests clean package
>
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-enforcer-plugin:1.4:enforce
> (enforce-versions) on project spark-parent_2.10: Some Enforcer rules have
> failed. Look above for specific messages explaining why the rule failed. ->
> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
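
For what enforce-versions is actually doing here: it compares the local
Maven and Java versions against minimums declared in spark-parent's pom. A
rough sketch of that comparison is below (both version strings are
assumptions for illustration; check your pom and the output of
`mvn -version` for the real values):

```shell
# Minimal sketch of the kind of check the enforce-versions execution performs:
# compare a detected version against a required minimum using sort -V.
required="3.3.3"   # assumed requireMavenVersion minimum from spark-parent's pom
detected="3.3.9"   # stand-in for what `mvn -version` would report locally
lowest=$(printf '%s\n%s\n' "$required" "$detected" | sort -V | head -n1)
if [ "$lowest" = "$required" ]; then
  echo "maven $detected satisfies required >= $required"
else
  echo "maven $detected is older than required $required" >&2
fi
```

If the detected version sorts below the required one, the enforcer rule
fails with exactly the "Some Enforcer rules have failed" message above.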
>
>
>
> I checked out branch-1.5 and replaced "1.5.2-SNAPSHOT" with "1.5.1", and
> build/sbt still fails (same error as above for sbt).
>
> But if I keep the version string as "1.5.2-SNAPSHOT", build/sbt works
> fine.
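
A sketch of the hand replacement described above, done with sed against a
throwaway pom (the file contents and paths are illustrative only; the
versions-maven-plugin's versions:set goal is the more robust way to retarget
a multi-module build's version):

```shell
# Hypothetical demo: swap the SNAPSHOT version for the release version in
# every module pom under a working directory, as was done by hand above.
workdir="$(mktemp -d)"
cat > "$workdir/pom.xml" <<'EOF'
<project>
  <version>1.5.2-SNAPSHOT</version>
</project>
EOF
# -i.bak keeps a backup file and works with both GNU and BSD sed
find "$workdir" -name pom.xml -exec sed -i.bak 's/1\.5\.2-SNAPSHOT/1.5.1/g' {} +
grep '<version>' "$workdir/pom.xml"
```

Note that, as observed above, the sbt build may still resolve the old
version from cached metadata after such an edit, which is consistent with a
stale-cache explanation.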
>
>
> Any ideas?
>
> Chester
>
