This is an automated email from the ASF dual-hosted git repository.

baunsgaard pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/systemds.git
commit 49e03eaae554085dde607a186092e5f68e7d84df
Author: baunsgaard <[email protected]>
AuthorDate: Fri Dec 9 11:57:48 2022 +0100

    [SYSTEMDS-3476] Spark with default settings

    There was a bug in the /bin/systemds script: the default Spark execution
    variables were not set up properly.

    1. The executor memory was set to 64 instead of 64g.
    2. The log4j variable was overwritten and used incorrectly.

    Both bugs are fixed, but our /bin/systemds script could still use some
    more cleanup.

    Closes #1748
---
 bin/systemds | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/bin/systemds b/bin/systemds
index 0855807754..745d92ddea 100755
--- a/bin/systemds
+++ b/bin/systemds
@@ -107,6 +107,8 @@ else
   LOG4JPROP2=$(find "$LOG4JPROP")
   if [ -z "${LOG4JPROP2}" ]; then
     LOG4JPROP=""
+  elif [ -z "${SYSTEMDS_DISTRIBUTED_OPTS}" ]; then
+    LOG4JPROP=$LOG4JPROP
   else
     LOG4JPROP="-Dlog4j.configuration=file:$LOG4JPROP2"
   fi
@@ -126,7 +128,7 @@ else
       --files $LOG4JPROP \
       --conf spark.network.timeout=512s \
       --num-executors 4 \
-      --executor-memory 64 \
+      --executor-memory 64g \
       --executor-cores 16 "
   fi
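
For context, here is a minimal sketch of how the patched logic in bin/systemds
fits together, reconstructed only from the hunks above. The default assignment
of LOG4JPROP and the if-guard around SYSTEMDS_DISTRIBUTED_OPTS are assumptions
for illustration, not lines taken from the actual script:

    # Assumption: LOG4JPROP points at a log4j properties file and
    # SYSTEMDS_DISTRIBUTED_OPTS may already be set by the caller.
    LOG4JPROP="${LOG4JPROP:-conf/log4j.properties}"

    LOG4JPROP2=$(find "$LOG4JPROP")
    if [ -z "${LOG4JPROP2}" ]; then
      # No properties file found: pass nothing to Spark.
      LOG4JPROP=""
    elif [ -z "${SYSTEMDS_DISTRIBUTED_OPTS}" ]; then
      # The defaults below are used: keep the plain file path so it can be
      # handed to spark-submit via --files.
      LOG4JPROP=$LOG4JPROP
    else
      # Caller supplied its own distributed options: pass log4j to the JVM
      # as a system property instead.
      LOG4JPROP="-Dlog4j.configuration=file:$LOG4JPROP2"
    fi

    if [ -z "${SYSTEMDS_DISTRIBUTED_OPTS}" ]; then
      # Default spark-submit options; the executor memory now carries an
      # explicit unit suffix (64g), since a bare 64 is not read as gigabytes.
      SYSTEMDS_DISTRIBUTED_OPTS="\
        --files $LOG4JPROP \
        --conf spark.network.timeout=512s \
        --num-executors 4 \
        --executor-memory 64g \
        --executor-cores 16 "
    fi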
