[ https://issues.apache.org/jira/browse/SPARK-30272?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17069933#comment-17069933 ]

Jorge Machado edited comment on SPARK-30272 at 3/28/20, 4:25 PM:
-----------------------------------------------------------------

Hey Sean,

This still seems to cause problems. For example:
{code:java}
$ ./bin/run-example SparkPi 100
20/03/28 17:21:13 WARN Utils: Your hostname, Jorges-MacBook-Pro.local resolves to a loopback address: 127.0.0.1; using 192.168.1.2 instead (on interface en0)
20/03/28 17:21:13 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/03/28 17:21:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/03/28 17:21:14 INFO SparkContext: Running Spark version 3.1.0-SNAPSHOT
20/03/28 17:21:14 INFO ResourceUtils: ==============================================================
20/03/28 17:21:14 INFO ResourceUtils: No custom resources configured for spark.driver.
20/03/28 17:21:14 INFO ResourceUtils: ==============================================================
20/03/28 17:21:14 INFO SparkContext: Submitted application: Spark Pi
20/03/28 17:21:14 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
20/03/28 17:21:14 INFO ResourceProfile: Limiting resource is cpu
20/03/28 17:21:14 INFO ResourceProfileManager: Added ResourceProfile id: 0
20/03/28 17:21:14 INFO SecurityManager: Changing view acls to: jorge
20/03/28 17:21:14 INFO SecurityManager: Changing modify acls to: jorge
20/03/28 17:21:14 INFO SecurityManager: Changing view acls groups to:
20/03/28 17:21:14 INFO SecurityManager: Changing modify acls groups to:
20/03/28 17:21:14 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jorge); groups with view permissions: Set(); users  with modify permissions: Set(jorge); groups with modify permissions: Set()
20/03/28 17:21:14 INFO Utils: Successfully started service 'sparkDriver' on port 58192.
20/03/28 17:21:14 INFO SparkEnv: Registering MapOutputTracker
20/03/28 17:21:14 INFO SparkEnv: Registering BlockManagerMaster
20/03/28 17:21:14 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/03/28 17:21:14 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/03/28 17:21:14 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
20/03/28 17:21:14 INFO DiskBlockManager: Created local directory at /private/var/folders/0h/5b7dw9p11l58hyk0_s0d3cnh0000gn/T/blockmgr-d9e88815-075e-4c9b-9cc8-21c72e97c869
20/03/28 17:21:14 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
20/03/28 17:21:14 INFO SparkEnv: Registering OutputCommitCoordinator
20/03/28 17:21:15 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/03/28 17:21:15 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.2:4040
20/03/28 17:21:15 INFO SparkContext: Added JAR file:///Users/jorge/Downloads/spark/dist/examples/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar at spark://192.168.1.2:58192/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar with timestamp 1585412475166
20/03/28 17:21:15 INFO SparkContext: Added JAR file:///Users/jorge/Downloads/spark/dist/examples/jars/scopt_2.12-3.7.1.jar at spark://192.168.1.2:58192/jars/scopt_2.12-3.7.1.jar with timestamp 1585412475166
20/03/28 17:21:15 INFO Executor: Starting executor ID driver on host 192.168.1.2
20/03/28 17:21:15 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 58193.
20/03/28 17:21:15 INFO NettyBlockTransferService: Server created on 192.168.1.2:58193
20/03/28 17:21:15 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
Exception in thread "main" java.lang.NoClassDefFoundError: org/sparkproject/guava/util/concurrent/internal/InternalFutureFailureAccess
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.sparkproject.guava.cache.LocalCache$LoadingValueReference.<init>(LocalCache.java:3472)
    at org.sparkproject.guava.cache.LocalCache$LoadingValueReference.<init>(LocalCache.java:3476)
    at org.sparkproject.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2134)
    at org.sparkproject.guava.cache.LocalCache$Segment.get(LocalCache.java:2045)
    at org.sparkproject.guava.cache.LocalCache.get(LocalCache.java:3951)
    at org.sparkproject.guava.cache.LocalCache.getOrLoad(LocalCache.java:3974)
    at org.sparkproject.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
    at org.apache.spark.storage.BlockManagerId$.getCachedBlockManagerId(BlockManagerId.scala:146)
    at org.apache.spark.storage.BlockManagerId$.apply(BlockManagerId.scala:127)
    at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:457)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:564)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2570)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$5(SparkSession.scala:936)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:927)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:30)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:932)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1011)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1020)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.sparkproject.guava.util.concurrent.internal.InternalFutureFailureAccess
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 65 more
20/03/28 17:21:15 INFO DiskBlockManager: Shutdown hook called
{code}
I still see a lot of references to Guava 14 on master; is this normal? Sorry for the question...
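For context on the error above (my reading, not something stated in the ticket): since Guava 27.0, AbstractFuture extends InternalFutureFailureAccess, which ships in the separate com.google.guava:failureaccess artifact, so a build that shades and relocates Guava but leaves that artifact out of the shade fails at class-definition time exactly like this. A minimal sketch for checking whether a given jar actually bundles the relocated class (the demo jar below is synthetic; in a real check you would point it at a jar from the dist):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class FindShadedClass {
    // The class the stack trace above failed to load, written as a jar-entry path.
    static final String NEEDED =
        "org/sparkproject/guava/util/concurrent/internal/InternalFutureFailureAccess.class";

    // True when the jar archive contains the given class-file entry.
    static boolean containsClass(String jarPath, String entry) throws Exception {
        try (ZipFile zip = new ZipFile(jarPath)) {
            return zip.getEntry(entry) != null;
        }
    }

    public static void main(String[] args) throws Exception {
        // For a real check, pass a jar from the distribution (e.g. under $SPARK_HOME/jars).
        // The demo builds a throwaway jar that, of course, lacks the class.
        File demo = File.createTempFile("demo", ".jar");
        demo.deleteOnExit();
        try (ZipOutputStream out = new ZipOutputStream(new FileOutputStream(demo))) {
            out.putNextEntry(new ZipEntry("META-INF/MANIFEST.MF"));
            out.closeEntry();
        }
        System.out.println(containsClass(demo.getPath(), NEEDED) ? "present" : "missing");
        // prints "missing" for the synthetic jar
    }
}
```

If the entry is missing from every jar in the distribution, the failureaccess classes were never pulled into the shade, which would produce exactly the NoClassDefFoundError shown in the log.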



was (Author: jomach):
Hey Sean,

This still seems to cause problems. For example:
{code:java}
java.lang.NoClassDefFoundError: com/google/common/util/concurrent/internal/InternalFutureFailureAccess
java.lang.NoClassDefFoundError: com/google/common/util/concurrent/internal/InternalFutureFailureAccess
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:757)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:757)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:757)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
    at com.google.common.cache.LocalCache$LoadingValueReference.<init>(LocalCache.java:3472)
    at com.google.common.cache.LocalCache$LoadingValueReference.<init>(LocalCache.java:3476)
    at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2134)
    at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2045)
    at com.google.common.cache.LocalCache.get(LocalCache.java:3951)
    at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
    at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
    at org.apache.hadoop.security.Groups.getGroups(Groups.java:228)
    at org.apache.hadoop.security.UserGroupInformation.getGroups(UserGroupInformation.java:1588)
    at org.apache.hadoop.security.UserGroupInformation.getPrimaryGroupName(UserGroupInformation.java:1453)
    at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.<init>(AzureBlobFileSystemStore.java:147)
    at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.initialize(AzureBlobFileSystem.java:104)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3303)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3352)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3320)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:479)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:365)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.setInputPaths(FileInputFormat.java:522)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.setInputPaths(FileInputFormat.java:491)
    at org.apache.spark.SparkContext.$anonfun$newAPIHadoopFile$2(SparkContext.scala:1219)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.SparkContext.withScope(SparkContext.scala:757)
    at org.apache.spark.SparkContext.newAPIHadoopFile(SparkContext.scala:1207)
    at org.apache.spark.api.java.JavaSparkContext.newAPIHadoopFile(JavaSparkContext.scala:484)
{code}
I still see a lot of references to Guava 14 on master; is this normal? Sorry for the question...


> Remove usage of Guava that breaks in Guava 27
> ---------------------------------------------
>
>                 Key: SPARK-30272
>                 URL: https://issues.apache.org/jira/browse/SPARK-30272
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core, SQL
>    Affects Versions: 3.0.0
>            Reporter: Sean R. Owen
>            Assignee: Sean R. Owen
>            Priority: Major
>             Fix For: 3.0.0
>
>
> Background:
> https://issues.apache.org/jira/browse/SPARK-29250
> https://github.com/apache/spark/pull/25932
> Hadoop 3.2.1 will update Guava from 11 to 27. A number of methods changed 
> between those releases (typically just renames), which means one set of code 
> can't work with both, while we want to support both Hadoop 2.x and 3.x. 
> Among them:
> - Objects.toStringHelper was moved to MoreObjects; we can just use the 
> Commons Lang3 equivalent
> - Objects.hashCode etc were renamed; use java.util.Objects equivalents
> - MoreExecutors.sameThreadExecutor() became directExecutor(); for same-thread 
> execution we can use a dummy implementation of ExecutorService / Executor
> - TypeToken.isAssignableFrom became isSupertypeOf; work around with reflection
> There is probably more to the Guava issue than just this change, but it will 
> make Spark itself work with more versions and reduce our exposure to Guava 
> along the way anyway.
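The renames described in the quoted issue can be illustrated with JDK-only code. This is a hedged sketch (the class and helper names are mine, not Spark's actual code): java.util.Objects stands in for Guava's com.google.common.base.Objects, and a trivial same-thread Executor stands in for MoreExecutors.sameThreadExecutor()/directExecutor().

```java
import java.util.Objects;
import java.util.concurrent.Executor;

public class GuavaFreeEquivalents {
    // Guava's Objects.hashCode(a, b, ...) -> java.util.Objects.hash(a, b, ...)
    static int hashOf(Object a, Object b) {
        return Objects.hash(a, b);
    }

    // MoreExecutors.sameThreadExecutor() / directExecutor() -> an Executor
    // that simply runs each submitted task on the calling thread.
    static final Executor SAME_THREAD = Runnable::run;

    public static void main(String[] args) {
        System.out.println("hash: " + hashOf("spark", "guava"));
        SAME_THREAD.execute(() ->
            System.out.println("ran on " + Thread.currentThread().getName()));
    }
}
```

As the description notes, Objects.toStringHelper has a direct analogue in Commons Lang3's ToStringBuilder, while the TypeToken rename has no JDK equivalent and needs the reflection workaround.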



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
