ueshin commented on a change in pull request #23534: [SPARK-26610][PYTHON] Fix inconsistency between toJSON Method in Python and Scala.
URL: https://github.com/apache/spark/pull/23534#discussion_r247411271
 
 

 ##########
 File path: python/pyspark/sql/dataframe.py
 ##########
 @@ -109,15 +109,18 @@ def stat(self):
     @ignore_unicode_prefix
     @since(1.3)
     def toJSON(self, use_unicode=True):
-        """Converts a :class:`DataFrame` into a :class:`RDD` of string.
+        """Converts a :class:`DataFrame` into a :class:`DataFrame` of JSON 
string.
 
-        Each row is turned into a JSON document as one element in the returned RDD.
+        Each row is turned into a JSON document as one element in the returned DataFrame.
 
         >>> df.toJSON().first()
-        u'{"age":2,"name":"Alice"}'
+        Row(value=u'{"age":2,"name":"Alice"}')
         """
-        rdd = self._jdf.toJSON()
-        return RDD(rdd.toJavaRDD(), self._sc, UTF8Deserializer(use_unicode))
+        jdf = self._jdf.toJSON()
+        if self.sql_ctx._conf.pysparkDataFrameToJSONShouldReturnDataFrame():
+            return DataFrame(jdf, self.sql_ctx)
+        else:
+            return RDD(jdf.toJavaRDD(), self._sc, UTF8Deserializer(use_unicode))
 
 Review comment:
   Good point, but I feel it's more natural to return a DataFrame, because a DataFrame in Python is the counterpart of DataFrame/Dataset in Scala/Java, whereas returning an RDD seems odd to me.
   I understand what you mean, and that is actually one of the reasons I added a config to restore the previous behavior. (A short sketch of how the two return types differ for callers is below.)
   cc @gatorsmile @cloud-fan
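
   For illustration only, a minimal sketch (not part of the PR) of what a caller sees under each behavior; the session setup and sample data here are assumptions for the example:

       from pyspark.sql import SparkSession

       spark = SparkSession.builder.getOrCreate()
       df = spark.createDataFrame([(2, "Alice")], ["age", "name"])

       result = df.toJSON()

       # Proposed behavior: a DataFrame with a single string column, so the
       # result composes with DataFrame APIs (select, filter, write, ...).
       #   result.first()  ->  Row(value=u'{"age":2,"name":"Alice"}')

       # Legacy behavior (restored via the config in the diff): an RDD of JSON
       # strings, which composes with RDD APIs (map, collect, saveAsTextFile, ...).
       #   result.first()  ->  u'{"age":2,"name":"Alice"}'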
