[jira] [Issue Comment Deleted] (SPARK-33090) Upgrade Google Guava

2020-10-12 Thread Stephen Coy (Jira)


 [ https://issues.apache.org/jira/browse/SPARK-33090?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Stephen Coy updated SPARK-33090:

Comment: was deleted

(was: Created PR)

> Upgrade Google Guava
> 
>
> Key: SPARK-33090
> URL: https://issues.apache.org/jira/browse/SPARK-33090
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 3.0.1
>Reporter: Stephen Coy
>Priority: Major
>
> Hadoop versions newer than 3.2.0 (such as 3.2.1 and 3.3.0) have started using 
> features from newer versions of Google Guava.
> This leads to NoSuchMethodError exceptions, etc., in Spark builds that specify 
> newer versions of Hadoop. I believe this is due to the use of new methods in 
> com.google.common.base.Preconditions.
> The above versions of Hadoop use guava-27.0-jre, whereas Spark is currently 
> glued to guava-14.0.1.
> I have been running a Spark cluster with the version bumped to guava-29.0-jre 
> without issue.
> Partly due to the way Spark is built, this change is a little more 
> complicated than just changing the version, because newer versions of Guava 
> have a new dependency on com.google.guava:failureaccess:1.0.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-33090) Upgrade Google Guava

2020-10-12 Thread Stephen Coy (Jira)


[ https://issues.apache.org/jira/browse/SPARK-33090?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17212826#comment-17212826 ]

Stephen Coy commented on SPARK-33090:
-------------------------------------

Created PR



[jira] [Commented] (SPARK-33090) Upgrade Google Guava

2020-10-07 Thread Stephen Coy (Jira)


[ https://issues.apache.org/jira/browse/SPARK-33090?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17209968#comment-17209968 ]

Stephen Coy commented on SPARK-33090:
-------------------------------------

I can create a PR for this if you like...

 




[jira] [Created] (SPARK-33090) Upgrade Google Guava

2020-10-07 Thread Stephen Coy (Jira)
Stephen Coy created SPARK-33090:
---

 Summary: Upgrade Google Guava
 Key: SPARK-33090
 URL: https://issues.apache.org/jira/browse/SPARK-33090
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 3.0.1
Reporter: Stephen Coy


Hadoop versions newer than 3.2.0 (such as 3.2.1 and 3.3.0) have started using 
features from newer versions of Google Guava.

This leads to NoSuchMethodError exceptions, etc., in Spark builds that specify newer 
versions of Hadoop. I believe this is due to the use of new methods in 
com.google.common.base.Preconditions.

The above versions of Hadoop use guava-27.0-jre, whereas Spark is currently 
glued to guava-14.0.1.

I have been running a Spark cluster with the version bumped to guava-29.0-jre 
without issue.

Partly due to the way Spark is built, this change is a little more complicated 
than just changing the version, because newer versions of Guava have a new 
dependency on com.google.guava:failureaccess:1.0.
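For illustration, a version bump along these lines might look like the following pom fragment. This is a hypothetical sketch, not the actual patch: the `guava.version` property name and the exact placement in Spark's parent pom are assumptions, and the failureaccess coordinates are taken from the description above.

```xml
<!-- Hypothetical sketch: raise the managed Guava version and add the
     failureaccess artifact that newer Guava releases depend on. -->
<properties>
  <guava.version>29.0-jre</guava.version>
</properties>

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>${guava.version}</version>
    </dependency>
    <!-- New dependency introduced by newer Guava versions -->
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>failureaccess</artifactId>
      <version>1.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Because Spark shades and relocates Guava in some modules, the real change would likely also need to account for shading rules, which is presumably part of what makes it "more complicated than just changing the version".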

 

 


