[GitHub] spark pull request: spark-submit with accept multiple properties-f...

2014-12-09 Thread lvsoft
Github user lvsoft commented on the pull request: https://github.com/apache/spark/pull/3490#issuecomment-66404194 Sorry for the late reply. I'll explain the use cases for multiple properties files. Currently I'm working on a benchmark utility for Spark. It'll be natural

[GitHub] spark pull request: spark-submit with accept multiple properties-f...

2014-12-09 Thread lvsoft
Github user lvsoft commented on the pull request: https://github.com/apache/spark/pull/3490#issuecomment-66405387 Well, those are called separate property files, not *common* properties. It'll be hard to adjust common properties and easy to make mistakes. Delete tmp files

[GitHub] spark pull request: spark-submit with accept multiple properties-f...

2014-12-09 Thread lvsoft
Github user lvsoft commented on the pull request: https://github.com/apache/spark/pull/3490#issuecomment-66414770 Well, I don't understand what the complexity of this PR is. I've reviewed SPARK-3779, which is marked as related, and didn't find anything related to this patch

[GitHub] spark pull request: spark-submit with accept multiple properties-f...

2014-11-26 Thread lvsoft
GitHub user lvsoft opened a pull request: https://github.com/apache/spark/pull/3490 spark-submit with accept multiple properties-files and merge the values Currently ```spark-submit``` accepts only one properties-file, and uses ```spark-defaults.conf``` if unspecified. A more
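The merge semantics the PR proposes (several properties files, later ones overriding earlier ones) can be sketched as follows. This is a hypothetical illustration, not the PR's actual code; the function names are invented, and the parsing is a simplification of Spark's real properties-file handling:

```python
# Hypothetical sketch: merge several Spark-style properties files,
# with later files taking precedence on conflicting keys.
def load_properties(path):
    """Parse a simple properties file (key=value or key value), skipping comments."""
    props = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            # Allow either '=' or whitespace as the key/value separator.
            if "=" in line:
                key, _, value = line.partition("=")
            else:
                key, _, value = line.partition(" ")
            props[key.strip()] = value.strip()
    return props

def merge_properties(paths):
    merged = {}
    for path in paths:  # later files win on conflicting keys
        merged.update(load_properties(path))
    return merged
```

With this, a shared base file (e.g. common cluster settings) can be combined with a per-job override file, which is the use case the thread discusses.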

[GitHub] spark pull request: [SPARK-4475] change localhost to 127.0.0.1...

2014-11-24 Thread lvsoft
GitHub user lvsoft opened a pull request: https://github.com/apache/spark/pull/3425 [SPARK-4475] change localhost to 127.0.0.1 if localhost can't be resolved This will fix [SPARK-4475]. Simply changing localhost to the equivalent 127.0.0.1 will solve the issue. You can merge
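The fallback the PR describes can be sketched like this. The function name is hypothetical and this is only an illustration of the idea (prefer the name `localhost`, fall back to the loopback address when name resolution fails), not the patch itself:

```python
# Hypothetical sketch: use "localhost" when it resolves, otherwise fall
# back to the equivalent loopback IP 127.0.0.1 (the fix in SPARK-4475).
import socket

def loopback_host():
    try:
        socket.gethostbyname("localhost")  # raises socket.gaierror if unresolvable
        return "localhost"
    except socket.gaierror:
        return "127.0.0.1"
```

On a misconfigured host (e.g. a broken /etc/hosts), `gethostbyname("localhost")` raises, and the literal IP keeps loopback connections working.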

[GitHub] spark pull request: [SPARK-2313] PySpark pass port rather than std...

2014-11-24 Thread lvsoft
Github user lvsoft commented on the pull request: https://github.com/apache/spark/pull/3424#issuecomment-64302794 I think this is a better solution. However, passing the port back via a socket will affect py4j too. Currently, stdin is the only supported method in py4j to pass back

[GitHub] spark pull request: [SPARK-4475] change localhost to 127.0.0.1...

2014-11-24 Thread lvsoft
Github user lvsoft commented on the pull request: https://github.com/apache/spark/pull/3425#issuecomment-64308603 I did a doctest in aggregation.py to confirm this fix is OK if ```localhost``` cannot be resolved. However, I'm not fully confident that Spark will work well overall

[GitHub] spark pull request: [SPARK-2313] PySpark pass port rather than std...

2014-11-23 Thread lvsoft
GitHub user lvsoft opened a pull request: https://github.com/apache/spark/pull/3424 [SPARK-2313] PySpark pass port rather than stdin This patch will fix [SPARK-2313]. It picks an available free port number and passes it to Py4j.Gateway for binding via the command line
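The approach the patch describes can be sketched as follows. This is a minimal illustration under assumed names (`find_free_port`, `launch_child`), not the actual PySpark/Py4J code: bind to port 0 so the OS picks an unused port, then hand that port to the child process as a command-line argument instead of reading it back over stdin:

```python
# Hypothetical sketch: pick a free port, then pass it to a child process
# on the command line (the technique SPARK-2313 proposes for Py4J's Gateway).
import socket
import subprocess
import sys

def find_free_port():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))  # port 0: let the OS choose an unused port
        return s.getsockname()[1]

def launch_child(port):
    # In the real patch the port would go to the Py4J GatewayServer for binding;
    # here a placeholder child process just echoes the argument it received.
    return subprocess.run(
        [sys.executable, "-c", "import sys; print(sys.argv[1])", str(port)],
        capture_output=True, text=True,
    )
```

One caveat with this pattern: the port is released before the child rebinds it, so another process could grab it in between; the thread's socket-based alternative avoids that race by letting the bound side report the port it actually got.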