[ https://issues.apache.org/jira/browse/SPARK-19045?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15791511#comment-15791511 ]

Assaf Mendelson commented on SPARK-19045:
-----------------------------------------


I agree 100% that standalone is not local; however, standalone on a single 
node should not produce this warning, since it makes perfect sense to use the 
local file system in that case.
As for figuring out whether a path is an NFS mount, there are several ways to 
check, for example (see the sketch below): 
https://unix.stackexchange.com/questions/72223/check-if-folder-is-a-mounted-remote-filesystem
Of course, no single method is universal, and none covers every distributed 
file system, so maybe we should simply change the wording to something like: 
"Make sure the directory is on a distributed file system visible to all 
nodes."
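
On Linux, one of the approaches from that link boils down to checking the 
filesystem type of the mount that backs the path. A rough sketch of that 
heuristic (my own illustration, Linux-only; the set of remote filesystem 
type names is an assumption, not exhaustive):

{code:scala}
import scala.io.Source

// Heuristic: find the filesystem type of the mount point backing a path by
// scanning /proc/mounts (fields: device, mount point, fs type, options, ...).
// Remote types such as "nfs", "nfs4" or "cifs" indicate a network filesystem.
def mountType(path: String): Option[String] = {
  val mounts = Source.fromFile("/proc/mounts").getLines().toSeq
    .map(_.split("\\s+"))
    .collect { case Array(_, mountPoint, fsType, _*) => (mountPoint, fsType) }
  // Naive longest-prefix match of the mount point against the path.
  mounts.filter { case (mp, _) => path.startsWith(mp) }
        .sortBy { case (mp, _) => -mp.length }
        .headOption
        .map { case (_, fsType) => fsType }
}

val remoteTypes = Set("nfs", "nfs4", "cifs", "smbfs", "glusterfs")
val isRemote = mountType("/mydir").exists(remoteTypes.contains)
{code}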

> irrelevant warning when creating a checkpoint dir
> -------------------------------------------------
>
>                 Key: SPARK-19045
>                 URL: https://issues.apache.org/jira/browse/SPARK-19045
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.0
>            Reporter: Assaf Mendelson
>            Priority: Trivial
>
> When I do sc.setCheckpointDir("/mydir"), I get a warning:
> 17/01/01 06:26:42 WARN SparkContext: Spark is not running in local mode, 
> therefore the checkpoint directory must not be on the local filesystem. 
> Directory '/mydir' appears to be on the local filesystem.
> This occurs even though I use a single computer with a local filesystem 
> (using Spark standalone). Reading the code, it seems the same warning would 
> occur if I used a cluster with an NFS share.
> Checkpointing should work on a local directory as long as there is a single 
> node (even when using a resource manager rather than local mode), and it 
> should work in a cluster with a locally mounted NFS share.
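
For context on why the warning fires even on NFS: when Spark is not running 
in local mode, setCheckpointDir only inspects the URI scheme of the path, so 
a plain path like /mydir is indistinguishable from a truly local directory. 
A simplified restatement of that check (my paraphrase, not the exact Spark 
source):

{code:scala}
import java.net.URI

// A directory "appears to be on the local filesystem" when its URI has no
// scheme or the "file" scheme. Only the URI is inspected, so an NFS mount
// at /mydir looks exactly like a local directory to this check.
def appearsLocal(directory: String): Boolean = {
  val scheme = new URI(directory).getScheme
  scheme == null || scheme == "file"
}

appearsLocal("/mydir")                    // true  -> warning is logged
appearsLocal("hdfs://namenode:8020/ckpt") // false -> no warning
{code}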


