[jira] [Commented] (SPARK-2071) Package private classes that are deleted from an older version of Spark trigger errors

2014-06-10 Thread Patrick Wendell (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14026195#comment-14026195
 ] 

Patrick Wendell commented on SPARK-2071:


I just meant that we could introduce a new project in the build that exists 
only for the purpose of downloading the older artifacts. So the script could 
do `sbt/sbt oldDeps/retrieveManaged`.
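
A minimal sketch of what such a download-only subproject could look like in an sbt build. The project name `oldDeps`, the directory, and the pinned version are illustrative assumptions, not the actual Spark build definition:

```scala
// Hypothetical subproject whose only job is to make sbt resolve the
// previous release's artifacts and copy them somewhere the MIMA script
// can find them. Names and versions here are assumptions.
lazy val oldDeps = project.in(file("old-deps"))
  .settings(
    // Depend on the prior release so its jars get resolved.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0",
    // Have sbt copy all resolved jars into lib_managed/.
    retrieveManaged := true
  )
```

Note that `retrieveManaged` is a setting rather than a task; with it enabled, running `sbt/sbt oldDeps/update` would populate `lib_managed/` with the old release's jars.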

> Package private classes that are deleted from an older version of Spark 
> trigger errors
> --
>
> Key: SPARK-2071
> URL: https://issues.apache.org/jira/browse/SPARK-2071
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Reporter: Patrick Wendell
>Assignee: Prashant Sharma
> Fix For: 1.1.0
>
>
> We should figure out how to fix this. One idea is to run the MIMA exclude 
> generator with sbt itself (rather than ./spark-class) so it can run against 
> the older versions of Spark and make sure to exclude classes that are marked 
> as package private in that version as well.



--
This message was sent by Atlassian JIRA
(v6.2#6252)


[jira] [Commented] (SPARK-2071) Package private classes that are deleted from an older version of Spark trigger errors

2014-06-10 Thread Prashant Sharma (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14026190#comment-14026190
 ] 

Prashant Sharma commented on SPARK-2071:


Unfortunately, lib_managed does not contain these jars unless they are 
declared as project dependencies. By "manually" I meant that our script would 
pull the right jar from Maven, the way we do it for RAT.
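
A sketch of what a RAT-style fetch could look like: build the Maven Central URL for a coordinate and download the jar only if it is not already cached. The repository URL, coordinates, and destination directory are illustrative assumptions, not Spark's actual script:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of pulling a release jar straight from Maven
# Central, in the spirit of how the RAT jar is fetched. All paths and
# coordinates below are assumptions for illustration.

maven_jar_url() {
  local group_path="${1//.//}"   # groupId dots -> path slashes
  local artifact="$2"
  local version="$3"
  echo "https://repo1.maven.org/maven2/${group_path}/${artifact}/${version}/${artifact}-${version}.jar"
}

fetch_jar() {
  local url dest
  url="$(maven_jar_url "$1" "$2" "$3")"
  dest="lib/$2-$3.jar"
  mkdir -p lib
  # Only download when the jar is not already cached locally.
  [ -f "$dest" ] || curl -sSfL -o "$dest" "$url"
}

# Example (illustrative coordinates):
#   fetch_jar org.apache.spark spark-core_2.10 1.0.0
```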



[jira] [Commented] (SPARK-2071) Package private classes that are deleted from an older version of Spark trigger errors

2014-06-09 Thread Patrick Wendell (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14026045#comment-14026045
 ] 

Patrick Wendell commented on SPARK-2071:


Yes, we could use sbt to retrieve them and place them in lib_managed or 
something similar.



[jira] [Commented] (SPARK-2071) Package private classes that are deleted from an older version of Spark trigger errors

2014-06-09 Thread Prashant Sharma (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-2071?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14025239#comment-14025239
 ] 

Prashant Sharma commented on SPARK-2071:


Or manually place the jar of the older version on the ./spark-class classpath 
before invoking GenerateMimaIgnore.
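
A small sketch of the classpath-prepending step. The jar path and the invocation line are illustrative assumptions, not the committed script:

```shell
#!/usr/bin/env bash
# Hypothetical helper: put the older release's jar ahead of everything
# else on the classpath handed to ./spark-class. Paths are assumptions.

prepend_to_classpath() {
  # Prepend jar $1 to classpath $2, handling the empty-classpath case.
  local jar="$1" cp="$2"
  if [ -z "$cp" ]; then
    echo "$jar"
  else
    echo "$jar:$cp"
  fi
}

# Usage (illustrative):
#   CLASSPATH="$(prepend_to_classpath old-deps/spark-core-1.0.0.jar "$CLASSPATH")"
#   ./spark-class org.apache.spark.tools.GenerateMimaIgnore
```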
