[ https://issues.apache.org/jira/browse/SPARK-16725?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15394572#comment-15394572 ]

Min Wei commented on SPARK-16725:
---------------------------------

First, I am not blocked; given the OSS nature of the project I can patch whatever 
I need to make it work. I am purely curious how Spark plans to manage dependency 
versioning as a more general platform. 

As it stands, Spark appears to be "leaking" its Guava dependency. A quick web 
search shows that quite a bit of energy has already been spent on this: 
   https://groups.google.com/a/lists.datastax.com/forum/#!topic/spark-connector-user/HnTsWJkI5jo
   https://issues.apache.org/jira/browse/ZEPPELIN-620

My suggestion is that the Spark platform needs to provide dependency guidelines 
that pieces built on top of it, such as the spark-cassandra connector, can 
follow. Otherwise it will be painful for developers and users higher up the 
stack to consume the whole stack. 

>Spark has to ship a Guava jar because Hadoop needs it 
I don't understand this. I assume Hadoop is a dependency of Spark. Does Spark 
shade its own Guava v14 so that it cannot clash with the v11 that Hadoop pulls 
in? (A sketch of that kind of relocation is just below.)
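
To make sure we are talking about the same thing: by "shading" I mean 
package-level relocation, e.g. via maven-shade-plugin. A minimal, purely 
illustrative sketch follows; the org.spark.shaded.guava target package is made 
up for this example and is not Spark's actual relocation target.

  <!-- Illustrative sketch only: relocation-based shading with maven-shade-plugin.
       The shadedPattern below is a made-up package, not Spark's real relocation target. -->
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.4.3</version>
    <executions>
      <execution>
        <phase>package</phase>
        <goals>
          <goal>shade</goal>
        </goals>
        <configuration>
          <relocations>
            <relocation>
              <!-- Rewrite Guava's packages inside the shaded jar so the bundled
                   v14 classes cannot collide with Hadoop's unshaded v11. -->
              <pattern>com.google.common</pattern>
              <shadedPattern>org.spark.shaded.guava</shadedPattern>
            </relocation>
          </relocations>
        </configuration>
      </execution>
    </executions>
  </plugin>

If that is roughly what happens, my question is really about the guidance for 
everything layered on top.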

>Changing from 14 to 16 will fix your use case, but what about someone who wants 
>a different version? 
That is fine, as long as the version keeps moving forward rather than backwards. 
Of course, in this case Guava itself could have done a better job of backwards 
compatibility. 

>"shade your custom dependencies" works for everyone, 
Won't this cause code/jar bloat and pain for everyone?
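
To spell out the concern, this is roughly what every project built on top of 
Spark would have to add to its own build under that suggestion. It is only a 
sketch: the Guava version is taken from the patch below, and 
com.myapp.shaded.guava is a placeholder package.

  <!-- Sketch of the per-application cost of "shade your own dependencies".
       The version and the relocation target are examples, not recommendations. -->
  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>19.0</version>
  </dependency>

  <!-- ...plus the same maven-shade-plugin setup as in the sketch above, with: -->
  <relocation>
    <pattern>com.google.common</pattern>
    <shadedPattern>com.myapp.shaded.guava</shadedPattern>
  </relocation>

Every application jar then ships its own relocated copy of Guava, which is 
exactly the bloat and duplication I am asking about.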


> Migrate Guava to 16+?
> ---------------------
>
>                 Key: SPARK-16725
>                 URL: https://issues.apache.org/jira/browse/SPARK-16725
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 2.0.1
>            Reporter: Min Wei
>            Priority: Minor
>   Original Estimate: 12h
>  Remaining Estimate: 12h
>
> Currently Spark depends on an old version of Guava, version 14, while the 
> Spark-Cassandra connector asserts on Guava version 16 and above. 
> It would be great to update the Guava dependency to version 16+:
> diff --git a/core/src/main/scala/org/apache/spark/SecurityManager.scala b/core/src/main/scala/org/apache/spark/SecurityManager.scala
> index f72c7de..abddafe 100644
> --- a/core/src/main/scala/org/apache/spark/SecurityManager.scala
> +++ b/core/src/main/scala/org/apache/spark/SecurityManager.scala
> @@ -23,7 +23,7 @@ import java.security.{KeyStore, SecureRandom}
>  import java.security.cert.X509Certificate
>  import javax.net.ssl._
>  
> -import com.google.common.hash.HashCodes
> +import com.google.common.hash.HashCode
>  import com.google.common.io.Files
>  import org.apache.hadoop.io.Text
>  
> @@ -432,7 +432,7 @@ private[spark] class SecurityManager(sparkConf: SparkConf)
>          val secret = new Array[Byte](length)
>          rnd.nextBytes(secret)
>  
> -        val cookie = HashCodes.fromBytes(secret).toString()
> +        val cookie = HashCode.fromBytes(secret).toString()
>          SparkHadoopUtil.get.addSecretKeyToUserCredentials(SECRET_LOOKUP_KEY, cookie)
>          cookie
>        } else {
> diff --git a/core/src/main/scala/org/apache/spark/SparkEnv.scala b/core/src/main/scala/org/apache/spark/SparkEnv.scala
> index af50a6d..02545ae 100644
> --- a/core/src/main/scala/org/apache/spark/SparkEnv.scala
> +++ b/core/src/main/scala/org/apache/spark/SparkEnv.scala
> @@ -72,7 +72,7 @@ class SparkEnv (
>  
>    // A general, soft-reference map for metadata needed during HadoopRDD split computation
>    // (e.g., HadoopFileRDD uses this to cache JobConfs and InputFormats).
> -  private[spark] val hadoopJobMetadata = new MapMaker().softValues().makeMap[String, Any]()
> +  private[spark] val hadoopJobMetadata = new MapMaker().weakValues().makeMap[String, Any]()
>  
>    private[spark] var driverTmpDir: Option[String] = None
>  
> diff --git a/pom.xml b/pom.xml
> index d064cb5..7c3e036 100644
> --- a/pom.xml
> +++ b/pom.xml
> @@ -368,8 +368,7 @@
>        <dependency>
>          <groupId>com.google.guava</groupId>
>          <artifactId>guava</artifactId>
> -        <version>14.0.1</version>
> -        <scope>provided</scope>
> +        <version>19.0</version>
>        </dependency>
>        <!-- End of shaded deps -->
>        <dependency>