[ https://issues.apache.org/jira/browse/SPARK-35691?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Kousuke Saruta resolved SPARK-35691.
------------------------------------
    Fix Version/s: 3.2.0
         Assignee: Kevin Su
       Resolution: Fixed

Issue resolved in https://github.com/apache/spark/pull/32845.

> addFile/addJar/addDirectory should put CanonicalFile
> ----------------------------------------------------
>
>                 Key: SPARK-35691
>                 URL: https://issues.apache.org/jira/browse/SPARK-35691
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.2.0
>            Reporter: Kevin Su
>            Assignee: Kevin Su
>            Priority: Minor
>             Fix For: 3.2.0
>
> I ran into the error below:
>
> {code:java}
> 21/06/07 00:06:57 ERROR SparkContext: Failed to add file:/home/runner/work/spark/spark/./core/target/scala-2.12/spark-core_2.12-3.2.0-SNAPSHOT-tests.jar to Spark environment
> java.lang.IllegalArgumentException: requirement failed: File spark-core_2.12-3.2.0-SNAPSHOT-tests.jar was already registered with a different path (old path = /home/runner/work/spark/spark/core/target/scala-2.12/spark-core_2.12-3.2.0-SNAPSHOT-tests.jar, new path = /home/runner/work/spark/spark/./core/target/scala-2.12/spark-core_2.12-3.2.0-SNAPSHOT-tests.jar)
> {code}
> However, /home/runner/work/spark/spark/./core/target/scala-2.12/spark-core_2.12-3.2.0-SNAPSHOT-tests.jar and /home/runner/work/spark/spark/core/target/scala-2.12/spark-core_2.12-3.2.0-SNAPSHOT-tests.jar refer to the same file.
> I think we should put the canonical File in the ConcurrentHashMap.
> [https://github.com/apache/spark/blob/9f010a8eb20502292b3bca42d17ce1dc357343b1/core/src/main/scala/org/apache/spark/rpc/netty/NettyStreamManager.scala#L68-L89]

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
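The idea behind the fix can be illustrated with a minimal, self-contained sketch (this is a hypothetical `FileRegistry` for illustration, not the actual NettyStreamManager or PR code): canonicalizing the `File` before it is put in the `ConcurrentHashMap` resolves `"."`, `".."`, and redundant separators, so two equivalent spellings of the same path no longer trip the "already registered with a different path" check.

```java
import java.io.File;
import java.io.IOException;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of the proposed behavior: key the registry on the
// file name, but store the *canonical* File so that equivalent paths
// (e.g. ".../spark/./core/x.jar" vs ".../spark/core/x.jar") compare equal.
public class FileRegistry {
    private final ConcurrentHashMap<String, File> files = new ConcurrentHashMap<>();

    public String addFile(File file) throws IOException {
        File canonical = file.getCanonicalFile(); // resolves ".", "..", symlinks
        File existing = files.putIfAbsent(canonical.getName(), canonical);
        if (existing != null && !existing.equals(canonical)) {
            throw new IllegalArgumentException(
                "File " + canonical.getName() + " was already registered with a "
                + "different path (old path = " + existing
                + ", new path = " + canonical + ")");
        }
        return "/files/" + canonical.getName();
    }

    public static void main(String[] args) throws IOException {
        FileRegistry registry = new FileRegistry();
        File dir = new File(System.getProperty("java.io.tmpdir"));
        // The same file referred to via two different (but equivalent) paths:
        File plain = new File(dir, "demo.jar");
        File dotted = new File(new File(dir, "."), "demo.jar");
        System.out.println(registry.addFile(plain));
        System.out.println(registry.addFile(dotted)); // no error: both canonicalize alike
    }
}
```

Without the `getCanonicalFile()` call, the second `addFile` above would fail exactly like the log in the ticket, because `.../tmp/./demo.jar` and `.../tmp/demo.jar` are distinct as raw path strings.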