[ 
https://issues.apache.org/jira/browse/BIGTOP-1755?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14366448#comment-14366448
 ] 

Jonathan Kelly commented on BIGTOP-1755:
----------------------------------------

Bigtop doesn't really care about the fact that Spark ensures API compatibility 
across 1.x releases.  It's more that simply upgrading the Spark version in 
Bigtop could potentially cause the RPM/DEB builds to fail (due to, say, build 
output files being in different places, or requiring different Maven build 
options, etc.) or could cause the installed packages not to work properly 
(e.g., the new version might add some feature or file that do-component-build 
or install_spark.sh doesn't take into account, so it doesn't make its way into 
the RPM/DEB).  Both of these situations could certainly occur across 1.x 
releases, but I did verify that this is not the case for 1.2.1->1.3.0: it 
still works without any changes other than upgrading the version in bigtop.mk.
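
For reference, the kind of one-line bump described above would look roughly 
like this in bigtop.mk (a sketch only -- the exact variable names around the 
SPARK_BASE_VERSION setting are assumptions based on Bigtop's usual component 
definitions, not copied from the patch):

```makefile
# Hypothetical excerpt from bigtop.mk -- only the version value changes
SPARK_NAME=spark
SPARK_PKG_NAME=spark-core
SPARK_BASE_VERSION=1.3.0        # was 1.2.1 before the upgrade
SPARK_PKG_VERSION=$(SPARK_BASE_VERSION)
SPARK_RELEASE_VERSION=1
```

No other changes to do-component-build or install_spark.sh were needed for 
this particular upgrade, per the verification above.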

> Upgrade to Spark 1.3.0
> ----------------------
>
>                 Key: BIGTOP-1755
>                 URL: https://issues.apache.org/jira/browse/BIGTOP-1755
>             Project: Bigtop
>          Issue Type: Task
>          Components: spark
>    Affects Versions: 0.8.0
>            Reporter: Jonathan Kelly
>            Assignee: Jonathan Kelly
>            Priority: Critical
>             Fix For: 0.9.0
>
>         Attachments: BIGTOP-1755.patch
>
>
> Spark 1.3.0 was released today (see 
> http://spark.apache.org/news/spark-1-3-0-released.html), so I figured that I 
> might as well upgrade it in Bigtop as soon as possible.  I have already 
> tested that Spark 1.3.0 works fine with just the trivial changes to the Spark 
> version number in bigtop.mk.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
