[ https://issues.apache.org/jira/browse/SPARK-24448?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16497548#comment-16497548 ]
Saisai Shao commented on SPARK-24448:
-------------------------------------

Does this only happen in standalone cluster mode? Have you tried client mode?

> File not found at the address SparkFiles.get returns on standalone cluster
> --------------------------------------------------------------------------
>
>                 Key: SPARK-24448
>                 URL: https://issues.apache.org/jira/browse/SPARK-24448
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.1
>            Reporter: Pritpal Singh
>            Priority: Major
>
> I want to upload a file to all worker nodes in a standalone cluster and
> retrieve the file's location. Here is my code:
>
> val tempKeyStoreLoc = System.getProperty("java.io.tmpdir") + "/keystore.jks"
> val file = new File(tempKeyStoreLoc)
> sparkContext.addFile(file.getAbsolutePath)
> val keyLoc = SparkFiles.get("keystore.jks")
>
> SparkFiles.get returns a location where keystore.jks does not exist. I
> submit the job in cluster mode. In fact, the location SparkFiles.get returns
> does not exist on any of the worker nodes (including the driver node).
> I observed that Spark does load keystore.jks onto the worker nodes at
> <SPARK_HOME>/work/<app_id>/<partition_id>/keystore.jks. The <partition_id>
> changes from one worker node to another.
> My requirement is to upload a file to all nodes of a cluster and retrieve its
> location. I expect the location to be the same across all worker nodes.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
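A minimal sketch of the usual workaround, not from the issue itself: because each executor materializes an added file under its own work directory, `SparkFiles.get` is expected to be called inside a task running on an executor, not on the driver after a cluster-mode submit. The keystore path and the 4-partition RDD below are illustrative assumptions; this requires a running Spark application and is not runnable standalone.

```scala
// Hypothetical sketch: resolve an added file inside executor tasks.
// Assumes an already-constructed SparkContext (sc) and that
// /tmp/keystore.jks exists on the machine running the driver.
import org.apache.spark.{SparkContext, SparkFiles}

def keystorePathsOnExecutors(sc: SparkContext): Array[String] = {
  // Ship the file to every node that runs tasks for this application.
  sc.addFile("/tmp/keystore.jks")

  // SparkFiles.get runs inside each task, so every executor resolves
  // the name against its own local copy of the file. The returned
  // absolute paths may differ from node to node.
  sc.parallelize(1 to 4, 4)
    .map(_ => SparkFiles.get("keystore.jks"))
    .collect()
}
```

The key point is where `SparkFiles.get` executes: on the driver in client mode it points at the driver's local copy, while inside a `map` closure it points at the executor's copy, which is why a single path common to all nodes is not generally available.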