[GitHub] spark pull request #14637: [SPARK-16967] move mesos to module
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/14637

---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastruct...@apache.org or file a JIRA ticket with INFRA.

---
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark pull request #14637: [SPARK-16967] move mesos to module
Github user mgummelt commented on a diff in the pull request:
https://github.com/apache/spark/pull/14637#discussion_r76292137

--- Diff: dev/create-release/release-build.sh ---
@@ -186,12 +186,13 @@ if [[ "$1" == "package" ]]; then
   # We increment the Zinc port each time to avoid OOM's and other craziness if multiple builds
   # share the same Zinc server.
-  make_binary_release "hadoop2.3" "-Psparkr -Phadoop-2.3 -Phive -Phive-thriftserver -Pyarn" "3033" &
-  make_binary_release "hadoop2.4" "-Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn" "3034" &
-  make_binary_release "hadoop2.6" "-Psparkr -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn" "3035" &
-  make_binary_release "hadoop2.7" "-Psparkr -Phadoop-2.7 -Phive -Phive-thriftserver -Pyarn" "3036" &
-  make_binary_release "hadoop2.4-without-hive" "-Psparkr -Phadoop-2.4 -Pyarn" "3037" &
-  make_binary_release "without-hadoop" "-Psparkr -Phadoop-provided -Pyarn" "3038" &
+  FLAGS="-Psparkr -Phadoop-2.3 -Phive -Phive-thriftserver -Pyarn -Pmesos"
+  make_binary_release "hadoop2.3" "$FLAGS" "3033" &
+  make_binary_release "hadoop2.4" "$FLAGS" "3034" &
--- End diff --

ah, yea. fixing...
[GitHub] spark pull request #14637: [SPARK-16967] move mesos to module
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/14637#discussion_r76291436

--- Diff: dev/create-release/release-build.sh ---
@@ -186,12 +186,13 @@ if [[ "$1" == "package" ]]; then
   # We increment the Zinc port each time to avoid OOM's and other craziness if multiple builds
   # share the same Zinc server.
-  make_binary_release "hadoop2.3" "-Psparkr -Phadoop-2.3 -Phive -Phive-thriftserver -Pyarn" "3033" &
-  make_binary_release "hadoop2.4" "-Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn" "3034" &
-  make_binary_release "hadoop2.6" "-Psparkr -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn" "3035" &
-  make_binary_release "hadoop2.7" "-Psparkr -Phadoop-2.7 -Phive -Phive-thriftserver -Pyarn" "3036" &
-  make_binary_release "hadoop2.4-without-hive" "-Psparkr -Phadoop-2.4 -Pyarn" "3037" &
-  make_binary_release "without-hadoop" "-Psparkr -Phadoop-provided -Pyarn" "3038" &
+  FLAGS="-Psparkr -Phadoop-2.3 -Phive -Phive-thriftserver -Pyarn -Pmesos"
+  make_binary_release "hadoop2.3" "$FLAGS" "3033" &
+  make_binary_release "hadoop2.4" "$FLAGS" "3034" &
--- End diff --

This is wrong now: "FLAGS" enables "-Phadoop-2.3" when here it should be "-Phadoop-2.4" (and matching versions in the lines below).
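The fix vanzin is asking for can be sketched as follows. This is a hedged illustration, not the patch that was merged: make_binary_release is stubbed out to just print its arguments (the real function in release-build.sh launches a full Maven build), and the variable name BASE_FLAGS is an assumption. The point is only that the shared profiles stay in one variable while the Hadoop profile is passed per release.

```shell
#!/usr/bin/env bash
# Sketch: keep the shared profiles in one variable, but pass the Hadoop
# profile per release so e.g. the "hadoop2.4" package is not built with
# -Phadoop-2.3. Stubbed for illustration only.

BASE_FLAGS="-Psparkr -Phive -Phive-thriftserver -Pyarn -Pmesos"

make_binary_release() {
  # Stub: the real function runs a Maven build; here we just echo the args.
  echo "release=$1 flags=$2 zinc_port=$3"
}

make_binary_release "hadoop2.3" "-Phadoop-2.3 $BASE_FLAGS" "3033"
make_binary_release "hadoop2.4" "-Phadoop-2.4 $BASE_FLAGS" "3034"
make_binary_release "hadoop2.6" "-Phadoop-2.6 $BASE_FLAGS" "3035"
make_binary_release "hadoop2.7" "-Phadoop-2.7 $BASE_FLAGS" "3036"
make_binary_release "hadoop2.4-without-hive" "-Phadoop-2.4 -Psparkr -Pyarn -Pmesos" "3037"
make_binary_release "without-hadoop" "-Phadoop-provided -Psparkr -Pyarn -Pmesos" "3038"
```

Each release now gets its own Hadoop profile while still picking up -Pmesos from the shared flags.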
[GitHub] spark pull request #14637: [SPARK-16967] move mesos to module
Github user mgummelt commented on a diff in the pull request:
https://github.com/apache/spark/pull/14637#discussion_r75953420

--- Diff: mesos/pom.xml ---
@@ -0,0 +1,109 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+  <modelVersion>4.0.0</modelVersion>
+  <parent>
+    <groupId>org.apache.spark</groupId>
+    <artifactId>spark-parent_2.11</artifactId>
+    <version>2.1.0-SNAPSHOT</version>
+    <relativePath>../pom.xml</relativePath>
+  </parent>
+
+  <artifactId>spark-mesos_2.11</artifactId>
+  <packaging>jar</packaging>
+  <name>Spark Project Mesos</name>
+  <properties>
+    <sbt.project.name>mesos</sbt.project.name>
+    <mesos.version>1.0.0</mesos.version>
+    <mesos.classifier>shaded-protobuf</mesos.classifier>
+  </properties>
+
+  <dependencies>
+    <dependency>
+      <groupId>org.apache.spark</groupId>
+      <artifactId>spark-core_${scala.binary.version}</artifactId>
+      <version>${project.version}</version>
+    </dependency>
+    <dependency>
+      <groupId>org.apache.spark</groupId>
+      <artifactId>spark-core_${scala.binary.version}</artifactId>
+      <version>${project.version}</version>
+      <type>test-jar</type>
+      <scope>test</scope>
+    </dependency>
+    <dependency>
+      <groupId>org.apache.mesos</groupId>
+      <artifactId>mesos</artifactId>
+      <version>${mesos.version}</version>
+      <classifier>${mesos.classifier}</classifier>
+      <exclusions>
+        <exclusion>
+          <groupId>com.google.protobuf</groupId>
+          <artifactId>protobuf-java</artifactId>
+        </exclusion>
+      </exclusions>
+    </dependency>
[rest of the 109-line new-file diff truncated in the quoted email]
--- End diff --

Ah, I didn't realize the intransitivity of provided dependencies applied to inheritance as well.
[GitHub] spark pull request #14637: [SPARK-16967] move mesos to module
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/14637#discussion_r75939624

--- Diff: mesos/pom.xml ---
@@ -0,0 +1,109 @@ [quotes the same new mesos/pom.xml diff as above]
--- End diff --

(Note to self: YARN - and this new mesos module - should only need to declare the Guava dependency explicitly, since they don't use Jetty directly.)
[GitHub] spark pull request #14637: [SPARK-16967] move mesos to module
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/14637#discussion_r75930641

--- Diff: mesos/pom.xml ---
@@ -0,0 +1,109 @@ [quotes the same new mesos/pom.xml diff as above]
--- End diff --

They're shaded because people want to use different versions of guava and jetty. Spark doesn't need to promote them to compile scope because the relocated guava / jetty classes are packaged with Spark itself.

I see what the problem you're running into is now. It has nothing to do with compile scope; you just need to explicitly list the dependencies because the mesos code references them directly. So you actually just need to fix the comment, since you're not promoting anything to compile scope by explicitly adding the dependencies (they still inherit the scope from the parent pom, which is "provided", and is actually what you want here).
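vanzin's distinction (declaring a dependency vs. promoting its scope) comes down to Maven inheritance: a child module can name a dependency with no version or scope of its own, and both are filled in from the parent's dependencyManagement. A hedged sketch of what such a declaration in mesos/pom.xml would look like; the exact entries in Spark's parent pom are assumed here, not quoted from it:

```xml
<!-- Sketch: declaring guava in the mesos module so its code can compile
     against it. No <version> or <scope> is written here; both are
     inherited from spark-parent's <dependencyManagement>, where the
     scope is "provided". Nothing is promoted to compile scope. -->
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
</dependency>
```

With "provided" inherited, the class is available at compile time but is not bundled by the module, which is exactly the behavior vanzin describes.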
[GitHub] spark pull request #14637: [SPARK-16967] move mesos to module
Github user mgummelt commented on a diff in the pull request:
https://github.com/apache/spark/pull/14637#discussion_r75929868

--- Diff: mesos/pom.xml ---
@@ -0,0 +1,109 @@ [quotes the same new mesos/pom.xml diff as above]
--- End diff --

YARN does it: https://github.com/apache/spark/blob/master/yarn/pom.xml#L79, which is what I was using as a starting point.

I initially did try to remove them, but got a ClassNotFoundException in WebUI.class when trying to import org.eclipse.jetty. The parent project seems to rename org.eclipse.jetty to org.spark_project.jetty: https://github.com/apache/spark/blob/master/pom.xml#L2273

It's documented that the shaded deps should be put into compile scope in the module: https://github.com/apache/spark/blob/master/pom.xml#L315

Though I still don't understand a) why they're shaded in the first place, and b) how those classes are loaded when Spark is built without a module that promotes those dependencies to compile scope.
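The renaming mgummelt observed is a maven-shade-plugin relocation in the parent pom. A hedged sketch of what such a configuration looks like; the pattern names match what the thread describes, but the real plugin block in Spark's pom.xml may differ in detail:

```xml
<!-- Sketch: the shade plugin rewrites Jetty's package names in the
     assembled jar, so the original org.eclipse.jetty classes are absent
     at runtime unless a module ships them itself. That is why removing
     the explicit dependency produced a ClassNotFoundException. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <relocation>
        <pattern>org.eclipse.jetty</pattern>
        <shadedPattern>org.spark_project.jetty</shadedPattern>
        <includes>
          <include>org.eclipse.jetty.**</include>
        </includes>
      </relocation>
    </relocations>
  </configuration>
</plugin>
```

Relocation both renames the classes and rewrites bytecode references to them, which is what lets Spark bundle one Jetty while applications use another.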
[GitHub] spark pull request #14637: [SPARK-16967] move mesos to module
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/14637#discussion_r75927255

--- Diff: mesos/pom.xml ---
@@ -0,0 +1,109 @@ [quotes the same new mesos/pom.xml diff as above]
--- End diff --

You shouldn't need to touch the shaded dependencies (see how no other module does that). If you're running into problems, you're probably missing something else, so post the error so we can take a look.
[GitHub] spark pull request #14637: [SPARK-16967] move mesos to module
Github user mgummelt commented on a diff in the pull request:
https://github.com/apache/spark/pull/14637#discussion_r75926093

--- Diff: mesos/pom.xml ---
@@ -0,0 +1,109 @@ [quotes the same new mesos/pom.xml diff as above]
--- End diff --

I just now removed mockito, because it was already a dependency of the parent project. I'm pretty sure all of the shaded deps need to be listed. I don't know why they're shaded in the first place, but since they are, they must be listed here to avoid classloading errors at runtime.