Hi everybody,

I tried to build Spark v1.4.1-rc4 with Scala 2.11:
../apache-maven-3.3.3/bin/mvn -Dscala-2.11 -DskipTests clean install

Before running this, I deleted:
../.m2/repository/org/apache/spark
../.m2/repository/org/spark-project

My changes to the code:
I just changed line 174 of org.apache.spark.executor.Executor$TaskRunner
to:
logInfo(s"test Executor is trying to kill $taskName (TID $taskId)")

Everything builds without errors, but I ran into an issue.

When I look into the spark-core_2.10 jar, I can see the changed string
in Executor$TaskRunner$$anonfun$kill$1.class. But when I look into the
spark-core_2.11 jar, the corresponding string is unchanged. It seems as if
that jar was downloaded from a Maven repository instead of being built from
my sources.
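
For reference, here is a minimal, self-contained sketch (hypothetical class
names, not the real Spark sources) of why the string ends up in a
...$$anonfun$kill$1.class file at all: logInfo takes its message by name, so
under Scala 2.10/2.11 the interpolated string is compiled into a separate
anonymous function class:

object AnonfunSketch {
  // By-name message parameter, like logInfo in Spark's Logging trait.
  def logInfo(msg: => String): Unit = println(msg)

  class TaskRunner(taskName: String, taskId: Long) {
    def kill(): Unit = {
      // With Scala 2.10/2.11 the by-name argument below is compiled into an
      // anonymous function class (here roughly
      // AnonfunSketch$TaskRunner$$anonfun$kill$1), which is why the changed
      // string shows up in that .class file.
      logInfo(s"test Executor is trying to kill $taskName (TID $taskId)")
    }
  }

  def main(args: Array[String]): Unit =
    new TaskRunner("task 0.0 in stage 0.0", 0L).kill()
}

So finding the new string in that class file of the 2.10 jar is exactly what
I would expect from a local build; only the 2.11 jar looks stale.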

Do you know what I did wrong?

I also tried to run "mvn -Dscala-2.11 -DskipTests clean install" on the
current master and got the following error:

[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-enforcer-plugin:1.4:enforce
(enforce-versions) on project spark-parent_2.10: Some Enforcer rules have
failed. Look above for specific messages explaining why the rule failed. ->
[Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions,
please read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

Thank you for your help.

Best regards,
Felix
