[jira] [Assigned] (SPARK-9964) PySpark DataFrameReader accept RDD of String for JSON

2015-08-25 Thread Apache Spark (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-9964?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-9964:
---

Assignee: Apache Spark

 PySpark DataFrameReader accept RDD of String for JSON
 -

 Key: SPARK-9964
 URL: https://issues.apache.org/jira/browse/SPARK-9964
 Project: Spark
  Issue Type: New Feature
  Components: PySpark, SQL
Reporter: Joseph K. Bradley
Assignee: Apache Spark
Priority: Minor

 It would be nice (but not necessary) for the PySpark DataFrameReader to 
 accept an RDD of Strings (like the Scala version does) for JSON, rather than 
 only taking a path.
 If this JIRA is accepted, it should probably be duplicated to cover the other 
 input types (not just JSON).
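
 A minimal sketch (not the current API) of what the requested PySpark usage could look like, mirroring the existing Scala DataFrameReader.json(jsonRDD: RDD[String]) overload. The SparkContext/SQLContext setup is only illustrative, and the RDD-accepting call is the hypothetical feature this issue asks for:

{code:python}
# Sketch of the requested API (assumption): read.json accepting an RDD of
# JSON strings, mirroring Scala's DataFrameReader.json(jsonRDD: RDD[String]).
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="json-rdd-sketch")
sqlContext = SQLContext(sc)

# An RDD of JSON documents, one record per string.
json_rdd = sc.parallelize([
    '{"name": "alice", "age": 30}',
    '{"name": "bob", "age": 25}',
])

# Current behaviour: the PySpark reader only takes a path.
# df = sqlContext.read.json("/path/to/data.json")

# Requested behaviour (hypothetical overload, not in the current API):
df = sqlContext.read.json(json_rdd)
df.printSchema()
df.show()
{code}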



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-9964) PySpark DataFrameReader accept RDD of String for JSON

2015-08-25 Thread Apache Spark (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-9964?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-9964:
---

Assignee: (was: Apache Spark)

 PySpark DataFrameReader accept RDD of String for JSON
 -

 Key: SPARK-9964
 URL: https://issues.apache.org/jira/browse/SPARK-9964
 Project: Spark
  Issue Type: New Feature
  Components: PySpark, SQL
Reporter: Joseph K. Bradley
Priority: Minor

 It would be nice (but not necessary) for the PySpark DataFrameReader to 
 accept an RDD of Strings (like the Scala version does) for JSON, rather than 
 only taking a path.
 If this JIRA is accepted, it should probably be duplicated to cover the other 
 input types (not just JSON).


