Github user markhamstra commented on a diff in the pull request:

    https://github.com/apache/spark/pull/996#discussion_r15743162
  
    --- Diff: assembly/pom.xml ---
    @@ -26,7 +26,7 @@
       </parent>
     
       <groupId>org.apache.spark</groupId>
    -  <artifactId>spark-assembly_2.10</artifactId>
    +  <artifactId>spark-assembly_${scala.binary.version}</artifactId>
    --- End diff --
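
    (For readers following along: `scala.binary.version` is a Maven property 
defined in Spark's parent pom, so the parameterized artifactId above resolves 
to `spark-assembly_2.10` under the default build. A rough sketch of the 
relevant property -- the exact default value depends on the branch:)

    ```xml
    <!-- Sketch of the property the diff relies on; defined in the
         parent pom and overridable per cross-build profile. -->
    <properties>
      <scala.binary.version>2.10</scala.binary.version>
    </properties>
    ```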
    
    I won't claim to be a maven-archetype-plugin expert, or to have figured out 
everything we would need to do to start using archetyped builds (e.g. I haven't 
worked out sub-project builds with archetypes yet), but the basics of archetypes 
are simple to use and get us a significant part of the way to cross-building 
Spark.
    
    To see how the rudiments work, you should be able to quickly and easily do 
the following:
    
    1) Clone the spark repo
    2) cd to spark/core
    3) mvn archetype:create-from-project
    4) cd into the new target/generated-sources/archetype
    5) mvn install
    6) make a tmp dir someplace and cd into it
    7) mvn archetype:generate -DarchetypeCatalog=local 
-DgroupId=org.apache.spark -DartifactId=spark-core_2.11 -Dversion=2.0.0-SNAPSHOT
    8) select '1', the only archetype that you now have installed locally
    9) observe that you now have a spark-core tree ready to build 
spark-core_2.11-2.0.0-SNAPSHOT (except for the spark-parent reference -- as I 
said, I haven't worked out sub-projects yet)

