sarutak commented on a change in pull request #32845:
URL: https://github.com/apache/spark/pull/32845#discussion_r649860579



##########
File path: core/src/test/scala/org/apache/spark/SparkContextSuite.scala
##########
@@ -1285,6 +1285,30 @@ class SparkContextSuite extends SparkFunSuite with LocalSparkContext with Eventu
       }
     }
   }
+
+  test("SPARK-35691: addFile/addJar/addDirectory should put CanonicalFile") {
+    withTempDir { dir =>
+      try {
+        val sep = File.separator
+        val tmpDir = Utils.createTempDir(dir.getAbsolutePath + sep + "." + sep + ".")
+        val tmpJar = File.createTempFile("test", ".jar", tmpDir)
+        val tmpFile = File.createTempFile("test", ".txt", tmpDir)
+
+        sc = new SparkContext(new SparkConf().setAppName("test").setMaster("local"))

Review comment:
       I don't think you need to modify `RpcEnvSuite`.
   How about `setMaster("local-cluster[...]")`?
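To give the suggestion above some context: Spark's `local-cluster` master URL takes the form `local-cluster[numWorkers,coresPerWorker,memoryPerWorkerMB]`, which launches a real multi-process cluster locally (unlike plain `local`, which runs everything in one JVM). A minimal plain-Java sketch of that URL shape, with an illustrative value in place of the elided `[...]`:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MasterUrlDemo {
    public static void main(String[] args) {
        // Shape of Spark's local-cluster master URL:
        //   local-cluster[numWorkers,coresPerWorker,memoryPerWorkerMB]
        Pattern p = Pattern.compile("local-cluster\\[(\\d+),(\\d+),(\\d+)\\]");

        // "local-cluster[2,1,1024]" is an illustrative example, not the
        // value used in the PR (the comment above elides it as [...]).
        Matcher m = p.matcher("local-cluster[2,1,1024]");
        if (m.matches()) {
            System.out.println("workers=" + m.group(1)
                + " coresPerWorker=" + m.group(2)
                + " memoryPerWorkerMB=" + m.group(3));
        }
    }
}
```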

##########
File path: core/src/test/scala/org/apache/spark/SparkContextSuite.scala
##########
@@ -1285,6 +1285,30 @@ class SparkContextSuite extends SparkFunSuite with LocalSparkContext with Eventu
       }
     }
   }
+
+  test("SPARK-35691: addFile/addJar/addDirectory should put CanonicalFile") {
+    withTempDir { dir =>
+      try {
+        val sep = File.separator
+        val tmpDir = Utils.createTempDir(dir.getAbsolutePath + sep + "." + sep + ".")

Review comment:
       > Also, it's better to assert the original path is not canonical.
   
   You added some assertions in response to this comment, right?
   If so, that's not what I meant.
   
   I meant it's better to check whether `tmpDir` here is canonical.
   In fact, the path `Utils.createTempDir` returns is already canonical.
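To illustrate the reviewer's point about canonicalization (a plain-JVM sketch, not Spark code): a path containing `.` segments, like the `dir.getAbsolutePath + sep + "." + sep + "."` construction in the test, is not canonical as written, and `File.getCanonicalFile` is what resolves those segments. The base directory below is hypothetical.

```java
import java.io.File;
import java.io.IOException;

public class CanonicalDemo {
    public static void main(String[] args) throws IOException {
        String sep = File.separator;
        // Stand-in base directory; the PR's test uses a withTempDir directory.
        File base = new File(System.getProperty("java.io.tmpdir"));

        // Mirror the test's path construction: append "/./." to the base path.
        File dotted = new File(base.getAbsolutePath() + sep + "." + sep + ".");

        // The raw path still contains the "." segments, so it is not canonical...
        System.out.println("raw:       " + dotted.getPath());
        // ...while getCanonicalPath resolves "." segments (and any symlinks).
        System.out.println("canonical: " + dotted.getCanonicalPath());
        // prints false: the dotted spelling differs from the canonical one
        System.out.println("already canonical? "
            + dotted.getPath().equals(dotted.getCanonicalPath()));
    }
}
```

This is why asserting "the original path is not canonical" only makes sense on the hand-built dotted string, not on what `Utils.createTempDir` returns.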

##########
File path: core/src/test/scala/org/apache/spark/SparkContextSuite.scala
##########
@@ -1285,6 +1285,30 @@ class SparkContextSuite extends SparkFunSuite with LocalSparkContext with Eventu
       }
     }
   }
+
+  test("SPARK-35691: addFile/addJar/addDirectory should put CanonicalFile") {
+    withTempDir { dir =>
+      try {
+        val sep = File.separator
+        val tmpDir = Utils.createTempDir(dir.getAbsolutePath + sep + "." + sep + ".")

Review comment:
       Yeah.

##########
File path: core/src/test/scala/org/apache/spark/SparkContextSuite.scala
##########
@@ -1285,6 +1285,30 @@ class SparkContextSuite extends SparkFunSuite with LocalSparkContext with Eventu
       }
     }
   }
+
+  test("SPARK-35691: addFile/addJar/addDirectory should put CanonicalFile") {
+    withTempDir { dir =>
+      try {
+        val sep = File.separator
+        val tmpDir = Utils.createTempDir(dir.getAbsolutePath + sep + "." + sep + ".")
+        val tmpJar = File.createTempFile("test", ".jar", tmpDir)
+        val tmpFile = File.createTempFile("test", ".txt", tmpDir)
+
+        sc = new SparkContext(new SparkConf().setAppName("test").setMaster("local"))

Review comment:
       Now that we use `setMaster("local-cluster[...]")`, we don't need to add new assertions to `RpcEnvSuite.scala`.
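As background on why SPARK-35691 wants `addFile`/`addJar`/`addDirectory` to register canonical files (a hedged plain-JVM sketch, not Spark's actual bookkeeping): if added resources were tracked by their raw path strings, the same file reached via a dotted path would register as two distinct entries, while keying by canonical path collapses the duplicate.

```java
import java.io.File;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

public class DedupDemo {
    public static void main(String[] args) throws IOException {
        // Two spellings of the same location; the directory is illustrative.
        File plain = new File(System.getProperty("java.io.tmpdir"));
        File dotted = new File(plain.getAbsolutePath() + File.separator + ".");

        // Keyed by raw path: the same directory is counted twice.
        Map<String, Boolean> byRawPath = new HashMap<>();
        byRawPath.put(plain.getPath(), true);
        byRawPath.put(dotted.getPath(), true);
        System.out.println("raw keys:       " + byRawPath.size());   // 2

        // Keyed by canonical path: the duplicate collapses to one entry.
        Map<String, Boolean> byCanonical = new HashMap<>();
        byCanonical.put(plain.getCanonicalPath(), true);
        byCanonical.put(dotted.getCanonicalPath(), true);
        System.out.println("canonical keys: " + byCanonical.size()); // 1
    }
}
```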




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
