[ https://issues.apache.org/jira/browse/SPARK-9636?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14655310#comment-14655310 ]
Philipp Angerer commented on SPARK-9636:
----------------------------------------

everything is more obvious than picking a location relative to the binary ;) and the location is reported anyway, since the {{start-master.sh}} script outputs:

{{starting org.apache.spark.deploy.master.Master, logging to /home/<user>/.cache/spark-logs/spark-<user>-org.apache.spark.deploy.master.Master-1-<hostname>.out}}

about write permissions: mind that i suggest testing the candidate directories sequentially until one is found that can be written to. that’s IMHO a more sensible default than failing, and having to {{grep -i 'log' $SPARK_HOME/sbin/*.sh}} to discover that an environment variable exists, and then retrying with that variable set.

> Treat $SPARK_HOME as write-only
> -------------------------------
>
>                 Key: SPARK-9636
>                 URL: https://issues.apache.org/jira/browse/SPARK-9636
>             Project: Spark
>          Issue Type: Improvement
>          Components: Input/Output
>    Affects Versions: 1.4.1
>        Environment: Linux
>           Reporter: Philipp Angerer
>           Priority: Minor
>             Labels: easyfix
>
> when starting spark scripts as a user, and spark is installed in a directory the user has no write permissions on, many things work fine, except for the logs (e.g. for {{start-master.sh}})
> logs are per default written to {{$SPARK_LOG_DIR}} or (if unset) to {{$SPARK_HOME/logs}}.
> if installed in this way, it should, instead of throwing an error, write logs to {{/var/log/spark/}}. that’s easy to fix by simply testing a few log dirs in sequence for writability before trying to use one. i suggest using {{$SPARK_LOG_DIR}} (if set) → {{/var/log/spark/}} → {{~/.cache/spark-logs/}} → {{$SPARK_HOME/logs/}}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
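The sequential writability test proposed in the issue could be sketched in shell roughly as follows. This is a hypothetical helper, not the actual code in {{spark-daemon.sh}}; the function name {{pick_log_dir}} is made up for illustration, and the candidate order follows the suggestion in the issue description:

```shell
#!/usr/bin/env sh
# Sketch of the proposed fallback: try each candidate log directory in
# order and use the first one that exists (or can be created) and is
# writable. An empty entry (e.g. unset $SPARK_LOG_DIR) is skipped.
pick_log_dir() {
  for dir in "$SPARK_LOG_DIR" /var/log/spark "$HOME/.cache/spark-logs" "$SPARK_HOME/logs"; do
    [ -n "$dir" ] || continue              # skip unset candidates
    mkdir -p "$dir" 2>/dev/null || continue  # can't create it: try the next one
    if [ -w "$dir" ]; then
      printf '%s\n' "$dir"
      return 0
    fi
  done
  return 1                                 # no writable directory found
}

if SPARK_LOG_DIR="$(pick_log_dir)"; then
  echo "logging to $SPARK_LOG_DIR"
else
  echo "error: no writable log directory found" >&2
fi
```

With this approach a user without write access to {{$SPARK_HOME}} silently falls through to a per-user cache directory instead of hitting a permission error.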