[ 
https://issues.apache.org/jira/browse/PIO-30?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15930044#comment-15930044
 ] 

ASF GitHub Bot commented on PIO-30:
-----------------------------------

Github user dszeto commented on the issue:

    https://github.com/apache/incubator-predictionio/pull/364
  
    @shimamoto I realized the original build system was quite inflexible, so I 
went ahead and did this in the last couple of days: 
https://travis-ci.org/apache/incubator-predictionio/builds/212014483. It's very 
close to working with tests across different dependencies. My apologies for not 
syncing up with you.
    
    The build system on that branch basically allows you to do this:
    ```
    ./make-distribution.sh \
      -Dscala.version=2.10.6 \
      -Dspark.version=2.1.0 \
      -Dhadoop.version=2.7.3 \
      -Delasticsearch.version=5.2.2
    ```
    The script does not cross build, but `crossScalaVersions` is already 
defined in `build.sbt`, so it is possible to cross build and cross publish 
artifacts. `make-distribution.sh` could also be extended to produce more than 
one tarball.
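
    For illustration, a minimal sketch of what such cross-build settings in 
`build.sbt` might look like (the version numbers and the Spark dependency 
selection are assumptions based on the flags above, not the project's actual 
build definition):
    ```scala
    // Hypothetical sketch: build against two Scala binary versions.
    crossScalaVersions := Seq("2.10.6", "2.11.8")

    // Pick a Spark version per Scala binary version (versions assumed).
    libraryDependencies ++= {
      CrossVersion.partialVersion(scalaVersion.value) match {
        case Some((2, 10)) =>
          Seq("org.apache.spark" %% "spark-core" % "1.6.3" % "provided")
        case _ =>
          Seq("org.apache.spark" %% "spark-core" % "2.1.0" % "provided")
      }
    }
    ```
    With settings like these, `sbt +package` or `sbt +publish` would run the 
task once per entry in `crossScalaVersions`.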
    
    Also, until `docker-compose` becomes available on Apache Jenkins, I am 
using a multi-build there just to make sure things compile across versions: 
https://builds.apache.org/job/incubator-predictionio-multibuild/
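
    Once `docker-compose` is available, the same matrix could be expressed 
roughly like this hypothetical sketch (service names, image, and version 
values are all assumptions, not the actual setup):
    ```yaml
    # Hypothetical docker-compose.yml: one service per dependency combination.
    # Assumes the image has sbt available; swap in an sbt-equipped image.
    version: "2"
    services:
      build-scala210-spark16:
        image: openjdk:8
        volumes:
          - .:/pio
        working_dir: /pio
        command: ./make-distribution.sh -Dscala.version=2.10.6 -Dspark.version=1.6.3
      build-scala211-spark21:
        image: openjdk:8
        volumes:
          - .:/pio
        working_dir: /pio
        command: ./make-distribution.sh -Dscala.version=2.11.8 -Dspark.version=2.1.0
    ```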
    
    Could you take a look at this branch and see how it looks to you? It would 
be great to converge our changes.


> Cross build for different versions of scala and spark
> -----------------------------------------------------
>
>                 Key: PIO-30
>                 URL: https://issues.apache.org/jira/browse/PIO-30
>             Project: PredictionIO
>          Issue Type: Improvement
>            Reporter: Marcin ZiemiƄski
>            Assignee: Chan
>             Fix For: 0.11.0
>
>
> The present version of Scala is 2.10 and Spark is 1.4, which is quite old. 
> With Spark 2.0.0 come many performance improvements and features that people 
> will definitely want to add to their templates. I am also aware that the 
> past cannot be ignored, and simply dropping 1.x might not be an option for 
> other users.
> I propose setting up a cross-build in sbt: one build for Scala 2.10 with 
> Spark 1.6, and a separate one for Scala 2.11 with Spark 2.0. Most of the 
> files, including the API, will be the same across versions. The problematic 
> ones will be split between additional source directories: 
> src/main/scala-2.10/ and src/main/scala-2.11/. The dockerized tests should 
> also take the two versions into consideration.
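
The version-specific source directory layout described above is a standard sbt
idiom; a sketch of the wiring (not the project's actual build definition) would
look like:
```scala
// Hypothetical sketch: add src/main/scala-2.10 or src/main/scala-2.11
// to the compile sources, depending on the Scala binary version in use.
unmanagedSourceDirectories in Compile +=
  (sourceDirectory in Compile).value / ("scala-" + scalaBinaryVersion.value)
```
Shared code stays in src/main/scala/, while only the files that differ between
Scala/Spark versions live in the version-specific directories.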



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
