Hi,

Thanks for the question.

1) The core-site.xml holds the fs.defaultFS parameter, which names the default filesystem:
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://<hostname>:8020</value>
</property>

If the value of spark.eventLog.dir has no scheme, it is resolved against this
default filesystem. So depending on where you intend to write the event log,
you can point it at either HDFS (hdfs://...) or the local filesystem
(file://...).
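For example, in spark-defaults.conf (the paths below are illustrative, not required locations):

```
# Event logging must be enabled for spark.eventLog.dir to matter.
spark.eventLog.enabled  true

# A scheme-less path resolves against fs.defaultFS; being explicit avoids surprises:
spark.eventLog.dir      hdfs:///user/spark/applicationHistory

# Or write to the local filesystem instead:
# spark.eventLog.dir    file:///tmp/spark-events
```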

As far as I know (feel free to correct me if I am incorrect),
spark.eventLog.dir accepts only a single location, so you can write to either
HDFS or the local filesystem, but not both at once.
A script can copy the logs to the local filesystem afterwards if you need
them in both places.
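A rough sketch of such a copy script, assuming the hdfs CLI is on the PATH and using example paths (adjust both to your cluster):

```shell
#!/bin/sh
# Example paths; substitute your own event log dir and local target.
SRC=hdfs:///user/spark/applicationHistory
DST=/tmp/spark-events

mkdir -p "$DST"

# Pull event logs down from HDFS; requires the hdfs CLI.
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfs -get -f "$SRC"/* "$DST"/
else
  echo "hdfs CLI not found; skipping copy" >&2
fi
```

You could run this from cron, or after each job finishes, depending on how fresh the local copy needs to be.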

One caveat: make sure the permissions are sound, since the user who submits
the job may not be in the right group or may lack permission to write to the
local filesystem.
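For the local side, one way to sidestep that is to pre-create the directory with permissive, sticky-bit permissions (the path here is just an example):

```shell
#!/bin/sh
# Example local event log directory; pick a path that suits your hosts.
LOGDIR=/tmp/spark-events

mkdir -p "$LOGDIR"
# 1777: any submitting user can write, but only file owners can delete
# (same scheme /tmp itself uses).
chmod 1777 "$LOGDIR"
```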

Hope that helps.




-----
Neelesh S. Salian
Cloudera
