[ https://issues.apache.org/jira/browse/SPARK-11074?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14954025#comment-14954025 ]
Marcelo Vanzin commented on SPARK-11074:
----------------------------------------

bq. What do you mean by the "right" permissions here?

That's the thing: there is no "right" permission. It all depends on the environment, which is why Spark apps should not be trying to create this directory. For example, the default permissions (your PR) allow only the user running that app to write event logs. A saner default would be {{1777}}, allowing everybody to write but only the owner of a log file to delete it. Someone might want to secure things further and allow only a certain group to read and write that directory (so, {{1770}}). Or they may want to get fancy and use ACLs.

bq. I've had many folks complain to me about this issue, so I think it's worth fixing.

That sounds like an environment configuration problem to me, not a Spark issue.

> Attempt to automatically create event log directory if it doesn't already
> exist
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-11074
>                 URL: https://issues.apache.org/jira/browse/SPARK-11074
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.5.1
>            Reporter: Kay Ousterhout
>            Assignee: Kay Ousterhout
>            Priority: Minor
>
> Right now, if the directory where event logs are to be placed
> (spark.eventLog.dir) doesn't already exist, the SparkContext fails to
> start. We should try to automatically create this directory.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
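The permission modes discussed above can be illustrated with a minimal sketch. This is not Spark's implementation; the helper name and path are hypothetical, and it simply shows the difference between a {{1777}} (sticky, world-writable, /tmp-style) and a {{1770}} (group-restricted) event-log directory:

```python
# Sketch only, not Spark code: create an event-log directory and force a
# chosen mode onto it. The helper name and path are hypothetical.
import os
import stat
import tempfile

def make_event_log_dir(path, mode=0o1777):
    """Create `path` with exactly `mode`, bypassing the process umask.

    0o1777: sticky bit + world-writable, so any user can write a log
            but only a file's owner can delete it (like /tmp).
    0o1770: sticky bit + group-only, restricting reads/writes to one group.
    """
    os.makedirs(path, exist_ok=True)
    # mkdir's mode argument is filtered by the umask; chmod is not,
    # so apply the final permissions explicitly.
    os.chmod(path, mode)
    return stat.S_IMODE(os.stat(path).st_mode)

shared = os.path.join(tempfile.mkdtemp(), "spark-events")
print(oct(make_event_log_dir(shared, 0o1777)))  # -> 0o1777
```

The same call with {{mode=0o1770}} yields the group-restricted variant; real deployments would instead set this up once by an administrator, which is the point of the comment.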