[ https://issues.apache.org/jira/browse/SPARK-36761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17426163#comment-17426163 ]
Steve Loughran commented on SPARK-36761:
----------------------------------------

Something in the code has got the default cluster FS via {{FileSystem.get(Configuration)}}, but is then invoking a method on it with a path using the s3a scheme: {{FileSystem.exists(s3a://d14/import/rajtestsp33/dfs_read_write_test)}}.

> spark-examples_2.12-3.0.2.jar DFSReadWriteTest S3A Implementation
> -----------------------------------------------------------------
>
>                 Key: SPARK-36761
>                 URL: https://issues.apache.org/jira/browse/SPARK-36761
>             Project: Spark
>          Issue Type: Bug
>          Components: Examples
>    Affects Versions: 3.0.2
>            Reporter: Raj
>            Priority: Major
>
> Dear Team,
> I am using Spark 3 to test writing to s3a storage. As part of the test I am
> invoking DFSReadWriteTest from the spark-examples_2.12-3.0.2.jar file,
> passing the arguments as below:
>
> spark-submit --verbose \
>   --driver-java-options "-Dlog4j.configuration=file:/home/myid/log4j.properties" \
>   --conf "spark.executor.extraJavaOptions='-Dlog4j.configuration=file:/home/myid/log4j.properties'" \
>   --driver-class-path "/usr/hdp/3.1.5.0-152/hadoop/ceph-rgw-sts-auth-6.jar,/opt/spark3/jars/hadoop-aws-3.1.1.3.1.5.0-152.jar" \
>   --class org.apache.spark.examples.DFSReadWriteTest \
>   --deploy-mode client --executor-memory 1G --num-executors 3 \
>   --conf "spark.hadoop.fs.s3a.refreshTokenFile='/home/myid/keycloaktoken/tokenfile'" \
>   /opt/spark3/examples/jars/spark-examples_2.12-3.0.2.jar \
>   "/home/myid/sparkreadtest.txt" "s3a://d14/import/rajtestsp33"
>
> The program fails with the message "Wrong FS" (it appears the file-system
> comparison in checkPath fails):
>
> Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS:
> s3a://d14/import/rajtestsp33/dfs_read_write_test, expected: hdfs://pphdp
>   at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:730)
>   at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:234)
>   at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1577)
>   at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1574)
>   at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>   at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1589)
>   at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1683)
>   at org.apache.spark.examples.DFSReadWriteTest$.main(DFSReadWriteTest.scala:115)
>   at org.apache.spark.examples.DFSReadWriteTest.main(DFSReadWriteTest.scala)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
>   at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
>   at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
>   at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
>   at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
>   at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
>   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
>   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> The same program ran successfully on Spark 2 with HDP 3.1.5.0-152 and the
> spark-examples_2.11-2.3.2.3.1.5.0-152.jar file.
>
> Any inputs appreciated.
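The failure described above can be reproduced in miniature without Hadoop at all. The sketch below is plain Java; `checkPath` here is a hypothetical stand-in for the scheme check in Hadoop's `FileSystem.checkPath`, not the actual implementation. It shows why a handle to the default cluster FS (hdfs://pphdp) rejects a path with an explicit s3a scheme:

```java
import java.net.URI;

public class WrongFsDemo {
    // Illustrative stand-in for FileSystem.checkPath: a filesystem instance
    // only accepts paths whose scheme matches its own URI. A scheme-less
    // path resolves against the filesystem's URI, so only an explicit,
    // different scheme triggers the failure.
    static void checkPath(URI fsUri, URI path) {
        String pathScheme = path.getScheme();
        if (pathScheme != null && !pathScheme.equals(fsUri.getScheme())) {
            throw new IllegalArgumentException(
                "Wrong FS: " + path + ", expected: " + fsUri);
        }
    }

    public static void main(String[] args) {
        // FileSystem.get(conf) returns the default cluster FS, which in
        // the report is hdfs://pphdp, so the s3a:// path is rejected:
        URI defaultFs = URI.create("hdfs://pphdp");
        URI s3aPath =
            URI.create("s3a://d14/import/rajtestsp33/dfs_read_write_test");
        try {
            checkPath(defaultFs, s3aPath);
        } catch (IllegalArgumentException e) {
            // Same message as in the stack trace above
            System.out.println(e.getMessage());
        }
    }
}
```

If this diagnosis is right, the usual fix is for the example to derive the filesystem from the path itself, e.g. `new Path(uri).getFileSystem(conf)` or `FileSystem.get(uri, conf)`, rather than `FileSystem.get(conf)`, which always returns whatever fs.defaultFS points at.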
>
> Thanks,
> Raj

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org