Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/14959
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r82649182
--- Diff: python/pyspark/conf.py ---
@@ -101,13 +109,25 @@ def __init__(self, loadDefaults=True, _jvm=None,
_jconf=None):
self._jconf =
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r82649091
--- Diff: python/pyspark/context.py ---
@@ -121,7 +121,15 @@ def __init__(self, master=None, appName=None,
sparkHome=None, pyFiles=None,
def _do
Github user zjffdu commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r82512997
--- Diff: python/pyspark/conf.py ---
@@ -101,13 +101,25 @@ def __init__(self, loadDefaults=True, _jvm=None,
_jconf=None):
self._jconf = _jcon
Github user zjffdu commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r82512981
--- Diff: python/pyspark/conf.py ---
@@ -118,28 +130,28 @@ def setIfMissing(self, key, value):
def setMaster(self, value):
"""Set m
Github user zjffdu commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r82512714
--- Diff: python/pyspark/conf.py ---
@@ -149,35 +161,53 @@ def setAll(self, pairs):
:param pairs: list of key-value pairs to set
"""
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r80599035
--- Diff: python/pyspark/conf.py ---
@@ -101,13 +101,25 @@ def __init__(self, loadDefaults=True, _jvm=None,
_jconf=None):
self._jconf =
Github user zjffdu commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r80596041
--- Diff: python/pyspark/conf.py ---
@@ -101,13 +101,25 @@ def __init__(self, loadDefaults=True, _jvm=None,
_jconf=None):
self._jconf = _jcon
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r80567211
--- Diff: python/pyspark/conf.py ---
@@ -118,28 +130,28 @@ def setIfMissing(self, key, value):
def setMaster(self, value):
"""
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r80566426
--- Diff: python/pyspark/java_gateway.py ---
@@ -41,7 +41,7 @@ def can_convert_list(self, obj):
ListConverter.can_convert = can_convert_list
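The `java_gateway.py` hunk above reassigns `ListConverter.can_convert`, i.e. it monkey-patches a method on a py4j class at runtime. The general pattern can be sketched without py4j installed (`Converter` below is a hypothetical stand-in, and the tightened check is illustrative, not the actual PySpark logic):

```python
class Converter:
    """Stand-in for a third-party class whose behavior we want to adjust."""

    def can_convert(self, obj):
        return isinstance(obj, list)

def can_convert_list(self, obj):
    # Illustrative replacement: accept only plain lists, not subclasses
    return type(obj) is list

# The patch: rebind the method on the class, affecting all instances
Converter.can_convert = can_convert_list

c = Converter()
assert c.can_convert([1, 2]) is True
assert c.can_convert((1, 2)) is False
```

Because the method is replaced on the class itself, the patch takes effect for every instance created before or after the assignment, which is why such patches are typically applied once at import time.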
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r80565092
--- Diff: python/pyspark/conf.py ---
@@ -149,35 +161,53 @@ def setAll(self, pairs):
:param pairs: list of key-value pairs to set
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r80564836
--- Diff: python/pyspark/conf.py ---
@@ -149,35 +161,53 @@ def setAll(self, pairs):
:param pairs: list of key-value pairs to set
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r80561545
--- Diff: python/pyspark/conf.py ---
@@ -101,13 +101,25 @@ def __init__(self, loadDefaults=True, _jvm=None,
_jconf=None):
self._jconf =
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r80561001
--- Diff: python/pyspark/conf.py ---
@@ -101,13 +101,25 @@ def __init__(self, loadDefaults=True, _jvm=None,
_jconf=None):
self._jconf =
Github user zjffdu closed the pull request at:
https://github.com/apache/spark/pull/14959
---
GitHub user zjffdu reopened a pull request:
https://github.com/apache/spark/pull/14959
[SPARK-17387][PYSPARK] Creating SparkContext() from python without
spark-submit ignores user conf
## What changes were proposed in this pull request?
The root cause that we would ignore S
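The description above is truncated mid-sentence; per the PR title, the symptom is that a `SparkContext` created directly from Python (without going through `spark-submit`) launches the JVM gateway before the user's `SparkConf` entries are forwarded, so those entries are silently dropped. A toy sketch of that ordering hazard, with every name hypothetical rather than PySpark's actual API:

```python
def launch_gateway(user_conf=None):
    """Toy launcher: builds the command line for a hypothetical JVM gateway.

    If user_conf is not threaded through to this call, the user's settings
    never reach the JVM process -- the hazard the PR title describes.
    """
    cmd = ["java", "-cp", "spark-jars/*", "org.example.Gateway"]
    for key, value in (user_conf or {}).items():
        cmd += ["--conf", f"{key}={value}"]  # forward user conf explicitly
    return cmd

# Without forwarding, the user's conf never reaches the JVM:
assert "--conf" not in launch_gateway()

# With forwarding, user values survive into the launch command:
cmd = launch_gateway({"spark.driver.memory": "4g"})
assert "spark.driver.memory=4g" in cmd
```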
Github user zjffdu commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r79539542
--- Diff: python/pyspark/java_gateway.py ---
@@ -50,13 +50,18 @@ def launch_gateway():
# proper classpath and settings from spark-env.sh
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r79461443
--- Diff: python/pyspark/java_gateway.py ---
@@ -50,13 +50,18 @@ def launch_gateway():
# proper classpath and settings from spark-env.sh
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r78453870
--- Diff: python/pyspark/java_gateway.py ---
@@ -51,13 +51,16 @@ def launch_gateway():
on_windows = platform.system() == "Windows"
sc
Github user holdenk commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r78105215
--- Diff: python/pyspark/java_gateway.py ---
@@ -51,13 +51,16 @@ def launch_gateway():
on_windows = platform.system() == "Windows"
s
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r78053223
--- Diff: python/pyspark/conf.py ---
@@ -101,13 +101,31 @@ def __init__(self, loadDefaults=True, _jvm=None,
_jconf=None):
self._jconf =
Github user zjffdu commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r77936098
--- Diff: python/pyspark/conf.py ---
@@ -101,13 +101,31 @@ def __init__(self, loadDefaults=True, _jvm=None,
_jconf=None):
self._jconf = _jcon
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/14959#discussion_r77697522
--- Diff: python/pyspark/conf.py ---
@@ -101,13 +101,31 @@ def __init__(self, loadDefaults=True, _jvm=None,
_jconf=None):
self._jconf =
GitHub user zjffdu opened a pull request:
https://github.com/apache/spark/pull/14959
[SPARK-17387][PYSPARK] Creating SparkContext() from python without
spark-submit ignores user conf
## What changes were proposed in this pull request?
The cause that now we would ignore Spar