[ https://issues.apache.org/jira/browse/SPARK-12263?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15058429#comment-15058429 ]

Neelesh Srinivas Salian commented on SPARK-12263:
-------------------------------------------------

I would like to work on this.

Please let me know if anyone has already started working on it; otherwise I can go ahead and submit a PR.
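
For context, my current reading of the behaviour (an assumption on my part, not a quote of the actual Spark code) is that a unit-less value such as {{1024}} is parsed as bytes, which rounds down to 0 MB and trips the check in {{WorkerArguments.checkWorkerMemory}}. A minimal sketch of that parsing behaviour:

{code}
// Illustrative sketch only, NOT the actual Spark implementation.
// Assumption: a memory string without a unit is interpreted as bytes,
// which would explain why SPARK_WORKER_MEMORY=1024 ends up as 0 MB.
def memoryStringToMb(str: String): Int = {
  val lower = str.toLowerCase
  if (lower.endsWith("g")) lower.dropRight(1).toInt * 1024
  else if (lower.endsWith("m")) lower.dropRight(1).toInt
  else if (lower.endsWith("k")) (lower.dropRight(1).toLong / 1024).toInt
  else (lower.toLong / 1024 / 1024).toInt // no unit: value read as bytes
}

memoryStringToMb("1024") // 0 MB -> "Memory can't be 0, missing a M or G ..."
memoryStringToMb("1g")   // 1024 MB -> accepted
{code}

If that reading is right, the documentation fix would be to state that {{SPARK_WORKER_MEMORY}} requires a size unit (e.g. {{1g}} or {{1024m}}), and possibly to make the error message mention the variable itself.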

Thank you.
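
As a side note for anyone hitting this before the docs are updated: the workaround is simply to include a unit, for example in {{conf/spark-env.sh}} (example values only):

{code}
# conf/spark-env.sh (example values only)
SPARK_WORKER_CORES=5
# Use a size unit; a bare number such as 1024 is what triggers the error below.
SPARK_WORKER_MEMORY=1g
{code}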

> IllegalStateException: Memory can't be 0 for SPARK_WORKER_MEMORY without unit
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-12263
>                 URL: https://issues.apache.org/jira/browse/SPARK-12263
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation
>            Reporter: Jacek Laskowski
>            Priority: Trivial
>              Labels: starter
>
> When starting a worker with the following command - note
> {{SPARK_WORKER_MEMORY=1024}} - it fails, saying that the memory was 0 even
> though it was set to 1024 (without a size unit).
> {code}
> ➜  spark git:(master) ✗ SPARK_WORKER_MEMORY=1024 SPARK_WORKER_CORES=5 ./sbin/start-slave.sh spark://localhost:7077
> starting org.apache.spark.deploy.worker.Worker, logging to /Users/jacek/dev/oss/spark/logs/spark-jacek-org.apache.spark.deploy.worker.Worker-1-japila.local.out
> failed to launch org.apache.spark.deploy.worker.Worker:
>   INFO ShutdownHookManager: Shutdown hook called
>   INFO ShutdownHookManager: Deleting directory /private/var/folders/0w/kb0d3rqn4zb9fcc91pxhgn8w0000gn/T/spark-f4e5f222-e938-46b2-a189-241453cf1f50
> full log in /Users/jacek/dev/oss/spark/logs/spark-jacek-org.apache.spark.deploy.worker.Worker-1-japila.local.out
> {code}
> The full stack trace is as follows:
> {code}
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel).
> INFO Worker: Registered signal handlers for [TERM, HUP, INT]
> Exception in thread "main" java.lang.IllegalStateException: Memory can't be 0, missing a M or G on the end of the memory specification?
>         at org.apache.spark.deploy.worker.WorkerArguments.checkWorkerMemory(WorkerArguments.scala:179)
>         at org.apache.spark.deploy.worker.WorkerArguments.<init>(WorkerArguments.scala:64)
>         at org.apache.spark.deploy.worker.Worker$.main(Worker.scala:691)
>         at org.apache.spark.deploy.worker.Worker.main(Worker.scala)
> INFO ShutdownHookManager: Shutdown hook called
> INFO ShutdownHookManager: Deleting directory /private/var/folders/0w/kb0d3rqn4zb9fcc91pxhgn8w0000gn/T/spark-f4e5f222-e938-46b2-a189-241453cf1f50
> {code}
> The following command starts a Spark standalone worker successfully:
> {code}
> SPARK_WORKER_MEMORY=1g SPARK_WORKER_CORES=5 ./sbin/start-slave.sh spark://localhost:7077
> {code}
> The master reports:
> {code}
> INFO Master: Registering worker 192.168.1.6:63884 with 5 cores, 1024.0 MB RAM
> {code}


