GitHub user berngp opened a pull request:

    https://github.com/apache/spark/pull/84

    [SPARK-1186] : Enrich the Spark Shell to support additional arguments.

    Enrich the Spark Shell functionality to support the following options.
    
    ```
    Usage: spark-shell [OPTIONS]
    
    OPTIONS:
    
    basic:
    
        -h  --help              : print this help information.
        -c  --executor-cores    : the maximum number of cores to be used by the Spark shell.
        -em --executor-memory   : num[m|g], the memory used by each executor of the Spark shell.
        -dm --driver-memory     : num[m|g], the memory used by the Spark shell and driver.
    
    soon to be deprecated:
    
        --cores         : please use -c/--executor-cores
        --drivermem     : please use -dm/--driver-memory
    
    other options:
    
        -mip --master-ip     : Spark Master IP/host address.
        -mp  --master-port   : num, Spark Master port.
        -m   --master        : full string that describes the Spark Master.
        -ld  --local-dir     : absolute path to a local directory that will be used for "scratch" space in Spark.
        -dh  --driver-host   : hostname or IP address for the driver to listen on.
        -dp  --driver-port   : num, port for the driver to listen on.
        -uip --ui-port       : num, port for your application's dashboard, which shows memory and workload data.
        --parallelism        : num, default number of tasks to use across the cluster for distributed shuffle operations when not set by the user.
        --locality-wait      : num, number of milliseconds to wait to launch a data-local task before giving up.
        --schedule-fair      : flag, enables FAIR scheduling between jobs submitted to the same SparkContext.
        --max-failures       : num, number of individual task failures before giving up on the job.
        --log-conf           : flag, log the supplied SparkConf as INFO at the start of the SparkContext.
    
    e.g.
        spark-shell -m 127.0.0.1 -ld /tmp -dh 127.0.0.1 -dp 4001 -uip 4010 --parallelism 10 --locality-wait 500 --schedule-fair --max-failures 100
    ```
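
    Though the patch itself is not inlined in this message, the options above correspond closely to standard Spark configuration properties. As a rough illustration (not the actual implementation, whose wiring may differ), the sketch below shows how each flag could be applied to a `SparkConf` when the shell builds its `SparkContext`:

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical mapping of the shell flags to standard SparkConf keys;
    // the values mirror the example invocation above.
    val conf = new SparkConf()
      .setMaster("spark://127.0.0.1:7077")    // -m   / --master
      .set("spark.cores.max", "4")            // -c   / --executor-cores
      .set("spark.executor.memory", "512m")   // -em  / --executor-memory
      .set("spark.local.dir", "/tmp")         // -ld  / --local-dir
      .set("spark.driver.host", "127.0.0.1")  // -dh  / --driver-host
      .set("spark.driver.port", "4001")       // -dp  / --driver-port
      .set("spark.ui.port", "4010")           // -uip / --ui-port
      .set("spark.default.parallelism", "10") // --parallelism
      .set("spark.locality.wait", "500")      // --locality-wait
      .set("spark.scheduler.mode", "FAIR")    // --schedule-fair
      .set("spark.task.maxFailures", "100")   // --max-failures
      .set("spark.logConf", "true")           // --log-conf
      .setAppName("Spark shell")

    val sc = new SparkContext(conf)
    ```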
    
    [ticket: SPARK-1186] : Enrich the Spark Shell to support additional arguments.
                           https://spark-project.atlassian.net/browse/SPARK-1186
    
    Author      : bernardo.gomezpal...@gmail.com
    Reviewer    : ?
    Testing     : ?

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/berngp/spark feature/enrich-spark-shell

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/84.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #84
    
----
commit c7ac8ebe0740d9ea7347253b251c5b6b90706b2f
Author: Bernardo Gomez Palacio <bernardo.gomezpala...@gmail.com>
Date:   2014-03-05T23:37:30Z

    [SPARK-1186] : Enrich the Spark Shell to support additional arguments.
    
----

