Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148791016
--- Diff: python/pyspark/sql/session.py ---
@@ -512,9 +557,7 @@ def createDataFrame(self, data, schema=None, samplingRatio=None, verifySchema=Tr
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148784896
--- Diff: python/pyspark/sql/session.py ---
@@ -512,9 +557,7 @@ def createDataFrame(self, data, schema=None, samplingRatio=None, verifySchema=Tr
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148782931
--- Diff: python/pyspark/sql/session.py ---
@@ -512,9 +557,7 @@ def createDataFrame(self, data, schema=None, samplingRatio=None, verifySchema=Tr
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148779399
--- Diff: python/pyspark/sql/session.py ---
@@ -512,9 +557,7 @@ def createDataFrame(self, data, schema=None, samplingRatio=None, verifySchema=Tr
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148778328
--- Diff: python/pyspark/sql/session.py ---
@@ -416,6 +417,50 @@ def _createFromLocal(self, data, schema):
data = [schema.toInternal(row) for
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148773536
--- Diff: python/pyspark/sql/session.py ---
@@ -416,6 +417,50 @@ def _createFromLocal(self, data, schema):
data = [schema.toInternal(row)
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148771775
--- Diff: python/pyspark/sql/tests.py ---
@@ -2592,6 +2592,16 @@ def test_create_dataframe_from_array_of_long(self):
df =
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148770218
--- Diff: python/pyspark/sql/session.py ---
@@ -416,6 +417,50 @@ def _createFromLocal(self, data, schema):
data = [schema.toInternal(row)
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148763906
--- Diff: python/pyspark/sql/session.py ---
@@ -416,6 +417,50 @@ def _createFromLocal(self, data, schema):
data = [schema.toInternal(row)
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148763235
--- Diff: python/pyspark/sql/session.py ---
@@ -416,6 +417,50 @@ def _createFromLocal(self, data, schema):
data = [schema.toInternal(row)
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148724335
--- Diff: python/pyspark/sql/session.py ---
@@ -416,6 +417,50 @@ def _createFromLocal(self, data, schema):
data = [schema.toInternal(row) for
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148707442
--- Diff: python/pyspark/sql/session.py ---
@@ -512,9 +512,39 @@ def createDataFrame(self, data, schema=None, samplingRatio=None, verifySchema=Tr
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148707441
--- Diff: python/pyspark/sql/session.py ---
@@ -512,9 +512,39 @@ def createDataFrame(self, data, schema=None, samplingRatio=None, verifySchema=Tr
Github user BryanCutler commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148709143
--- Diff: python/pyspark/sql/session.py ---
@@ -416,6 +417,50 @@ def _createFromLocal(self, data, schema):
data = [schema.toInternal(row)
Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148708752
--- Diff: python/pyspark/sql/session.py ---
@@ -416,6 +417,50 @@ def _createFromLocal(self, data, schema):
data = [schema.toInternal(row) for
Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148709362
--- Diff: python/pyspark/sql/session.py ---
@@ -416,6 +417,50 @@ def _createFromLocal(self, data, schema):
data = [schema.toInternal(row) for
Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148709757
--- Diff: python/pyspark/sql/tests.py ---
@@ -2592,6 +2592,16 @@ def test_create_dataframe_from_array_of_long(self):
df =
Github user viirya commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148696316
--- Diff: python/pyspark/sql/session.py ---
@@ -512,9 +512,39 @@ def createDataFrame(self, data, schema=None, samplingRatio=None, verifySchema=Tr
Github user viirya commented on a diff in the pull request:
https://github.com/apache/spark/pull/19646#discussion_r148695963
--- Diff: python/pyspark/sql/session.py ---
@@ -512,9 +512,39 @@ def createDataFrame(self, data, schema=None, samplingRatio=None, verifySchema=Tr
GitHub user BryanCutler opened a pull request:
https://github.com/apache/spark/pull/19646
[SPARK-22147][PYTHON] Fix for createDataFrame from pandas.DataFrame with timestamp
## What changes were proposed in this pull request?
Currently, a pandas.DataFrame that contains a
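The description is cut off in this archive, but the kind of input under discussion can be sketched with pandas alone (illustrative column names; no SparkSession is started here, and the exact failing behavior is described in the PR itself):

```python
import pandas as pd

# Illustrative input of the kind the PR discusses: a pandas.DataFrame
# with a timestamp column. pandas stores such columns as datetime64[ns],
# which createDataFrame has to convert to Spark's TimestampType.
pdf = pd.DataFrame({
    "id": [1, 2],
    "ts": pd.to_datetime(["2017-11-01 10:00:00", "2017-11-01 11:30:00"]),
})

print(pdf["ts"].dtype)  # datetime64[ns]

# With a running SparkSession `spark`, the call being fixed would be:
#     df = spark.createDataFrame(pdf)
```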