Github user berngp commented on the pull request:
https://github.com/apache/spark/pull/116#issuecomment-37579075
@aarondav thank you, applied the changes. I took the liberty of capitalizing
references to Spark, Spark Context, and Spark Shell; concepts such as ip,
address, hostname are
Github user berngp commented on a diff in the pull request:
https://github.com/apache/spark/pull/116#discussion_r10579188
--- Diff: bin/spark-shell ---
@@ -30,69 +30,367 @@ esac
# Enter posix mode for bash
set -o posix
-CORE_PATTERN="^[0-9]+$"
-M
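For context, the removed `CORE_PATTERN` line above is a plain bash regex guard for numeric arguments. A minimal standalone sketch of how such a check behaves (the `validate_cores` function name is illustrative, not from the PR):

```shell
# Sketch of the numeric-argument guard that the removed CORE_PATTERN line
# implemented; validate_cores is an illustrative name, not the PR's code.
CORE_PATTERN="^[0-9]+$"

validate_cores() {
  if [[ "$1" =~ $CORE_PATTERN ]]; then
    echo "valid"
  else
    echo "invalid"
  fi
}
```

With `[[ ... =~ ... ]]` the unquoted pattern variable is treated as an extended regular expression, so `4` matches while `four` or an empty argument does not.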
Github user berngp commented on the pull request:
https://github.com/apache/spark/pull/116#issuecomment-37328186
@aarondav thanks. Regarding colors, my understanding is that `tput` solves
the compatibility issues across terminal types.
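A minimal sketch of the `tput` approach mentioned here: `tput` looks up escape sequences in terminfo for the current `$TERM`, which is what makes it portable across terminals. The variable names and message are illustrative:

```shell
# Hedged sketch: query color escapes via terminfo instead of hard-coding
# ANSI codes; fall back to empty strings when the terminal has no colors.
if tput setaf 1 >/dev/null 2>&1; then
  RED="$(tput setaf 1)"
  BOLD="$(tput bold)"
  RESET="$(tput sgr0)"
else
  RED="" BOLD="" RESET=""
fi
printf '%serror:%s option was not recognized\n' "${RED}${BOLD}" "${RESET}"
```

When output is redirected or `$TERM` is unset, the guard fails and the variables stay empty, so no raw escape codes leak into logs.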
---
If your project is set up for it, you can reply to
Github user berngp commented on the pull request:
https://github.com/apache/spark/pull/116#issuecomment-37216046
[Pull Request 84](https://github.com/apache/spark/pull/84) was the original
reference for this request. To keep things clean I did a rebase on apache/master and
squashed the commits.
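The rebase-and-squash cleanup described here can be replayed in a throwaway repository. The commands below are an illustrative reconstruction under assumed commit contents and a placeholder identity, not the exact commands used:

```shell
# Replay of a rebase-and-squash cleanup in a scratch repo (illustrative).
set -e
tmp="$(mktemp -d)"
cd "$tmp"
git init -q
git config user.email "dev@example.com"   # placeholder identity
git config user.name "dev"
echo base > file; git add file; git commit -qm "base"    # stands in for apache/master
base="$(git rev-parse --abbrev-ref HEAD)"
git checkout -qb feature/enrich-spark-shell
echo one >> file; git commit -qam "wip: parse options"
echo two >> file; git commit -qam "wip: colors"
# Squash the feature commits into a single commit on top of the base branch.
git reset -q --soft "$base"
git commit -qm "[SPARK-1186] Enrich the Spark Shell to support additional arguments"
```

`git reset --soft <base>` keeps the combined changes staged, so a single commit replaces the series; the end state matches an interactive rebase where every follow-up commit is marked `squash`.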
GitHub user berngp opened a pull request:
https://github.com/apache/spark/pull/116
[SPARK-1186] : Enrich the Spark Shell to support additional arguments.
Enrich the Spark Shell functionality to support the following options.
```
Usage: spark-shell [OPTIONS]
```
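As a rough illustration of the kind of option handling the description refers to, here is a self-contained parsing loop; the flag names and messages are assumptions for the sketch, not the PR's exact interface:

```shell
# Illustrative sketch of a spark-shell-style option loop (flags are assumed,
# not taken from the PR).
parse_args() {
  CORES=""
  while [ $# -gt 0 ]; do
    case "$1" in
      -c|--cores)
        shift
        CORES="$1"
        ;;
      -h|--help)
        echo "Usage: spark-shell [OPTIONS]"
        return 0
        ;;
      *)
        echo "Unknown option: $1" >&2
        return 1
        ;;
    esac
    shift
  done
}
```

Usage: `parse_args --cores 8` leaves `CORES=8`; an unrecognized flag prints an error and returns non-zero.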
Github user berngp commented on the pull request:
https://github.com/apache/spark/pull/84#issuecomment-37212763
Just cleaned my feature/enrich-spark-shell branch to reflect the rebase and
squash on the latest apache/master, but apparently I can't edit the ranges of a
pull request.
Github user berngp closed the pull request at:
https://github.com/apache/spark/pull/84
GitHub user berngp reopened a pull request:
https://github.com/apache/spark/pull/84
[SPARK-1186] : Enrich the Spark Shell to support additional arguments.
Enrich the Spark Shell functionality to support the following options.
```
Usage: spark-shell [OPTIONS]
```
Github user berngp closed the pull request at:
https://github.com/apache/spark/pull/84
Github user berngp commented on the pull request:
https://github.com/apache/spark/pull/84#issuecomment-37055758
@pwendell, @aarondav, @sryza: a couple of questions.
1. Based on [SPARK-929], would it make sense to also include
--spark-daemon-memory as an optional argument?
2. Should I
GitHub user berngp opened a pull request:
https://github.com/apache/spark/pull/84
[SPARK-1186] : Enrich the Spark Shell to support additional arguments.
Enrich the Spark Shell functionality to support the following options.
```
Usage: spark-shell [OPTIONS]
```