GitHub user berngp opened a pull request:

    https://github.com/apache/spark/pull/116

    [SPARK-1186] : Enrich the Spark Shell to support additional arguments.

    Enrich the Spark Shell functionality to support the following options.
    
    ```
    Usage: spark-shell [OPTIONS]
    
    OPTIONS:
    
    basic:
    
        -h  --help              : print this help information.
        -c  --executor-cores    : the maximum number of cores to be used by the Spark shell.
        -em --executor-memory   : num[m|g], the memory used by each executor of the Spark shell.
        -dm --drivermem --driver-memory : num[m|g], the memory used by the Spark shell and driver.
    
    soon to be deprecated:
    
        --cores         : please use -c/--executor-cores
    
    other options:
    
        -mip --master-ip     : Spark Master IP/host address.
        -mp  --master-port   : num, Spark Master port.
        -m   --master        : full string that describes the Spark Master.
        -ld  --local-dir     : absolute path to a local directory that will be used for "scratch" space in Spark.
        -dh  --driver-host   : hostname or IP address for the driver to listen on.
        -dp  --driver-port   : num, port for the driver to listen on.
        -uip --ui-port       : num, port for your application's dashboard, which shows memory and workload data.
        --parallelism        : num, default number of tasks to use across the cluster for distributed shuffle operations when not set by the user.
        --locality-wait      : num, number of milliseconds to wait to launch a data-local task before giving up.
        --schedule-fair      : flag, enables FAIR scheduling between jobs submitted to the same SparkContext.
        --max-failures       : num, number of individual task failures before giving up on the job.
        --log-conf           : flag, log the supplied SparkConf as INFO at the start of the Spark context.
    
    e.g.
        spark-shell -m local -ld /tmp -dh 127.0.0.1 -dp 4001 -uip 4010 --parallelism 10 --locality-wait 500 --schedule-fair --max-failures 100
    ```
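    
    For comparison, here is a minimal sketch of how the example invocation above could be expressed with plain system properties (assuming each new flag maps to the standard `spark.*` property named in the comments; the exact mapping is not spelled out in this PR):
    
    ```
    # Assumed pre-PR equivalent of the example invocation, using the
    # Spark 0.9-style MASTER and SPARK_JAVA_OPTS environment variables;
    # the flag-to-property mapping below is an assumption.
    export MASTER=local                                 # -m   --master
    export SPARK_JAVA_OPTS="-Dspark.local.dir=/tmp \
      -Dspark.driver.host=127.0.0.1 \
      -Dspark.driver.port=4001 \
      -Dspark.ui.port=4010 \
      -Dspark.default.parallelism=10 \
      -Dspark.locality.wait=500 \
      -Dspark.scheduler.mode=FAIR \
      -Dspark.task.maxFailures=100"
    ./bin/spark-shell
    ```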
    
    **Note**: this commit reflects the changes applied to _master_ based on [5d98cfc1].
    
    [ticket: SPARK-1186] : Enrich the Spark Shell to support additional arguments.
                            https://spark-project.atlassian.net/browse/SPARK-1186
    
    Author      : bernardo.gomezpal...@gmail.com

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/berngp/spark feature/enrich-spark-shell

Alternatively, you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/116.patch
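
For example, the patch can be applied to a local clone of apache/spark with git am (a sketch; GitHub's .patch output is in mailbox format, which git am accepts):

    $ curl -L https://github.com/apache/spark/pull/116.patch | git am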

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #116
    
----
commit 32ad3484ad7a7cc6a0e903f80b6dd3579550e1f9
Author: Bernardo Gomez Palacio <bernardo.gomezpala...@gmail.com>
Date:   2014-03-10T17:41:30Z

    [SPARK-1186] : Enrich the Spark Shell to support additional arguments.
    

----

