[ https://issues.apache.org/jira/browse/SPARK-6495?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14379243#comment-14379243 ]

Chaozhong Yang edited comment on SPARK-6495 at 3/25/15 6:31 AM:
----------------------------------------------------------------

Thanks! Maybe what you are pointing to is the resolved issue
https://issues.apache.org/jira/browse/SPARK-3851. Reading data from Parquet
files with different but compatible schemas is supported as of Spark 1.3.0:

https://spark.apache.org/docs/latest/sql-programming-guide.html#schema-merging
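
For reference, a minimal sketch of schema merging with the Spark 1.3 API,
adapted from that guide (paths and column names are illustrative):

{code:scala}
// sc is an existing SparkContext; sqlContext is an org.apache.spark.sql.SQLContext
import sqlContext.implicits._

// Write two Parquet files with different but compatible schemas
// into partition directories of the same table.
val df1 = sc.makeRDD(1 to 5).map(i => (i, i * 2)).toDF("single", "double")
df1.saveAsParquetFile("data/test_table/key=1")

val df2 = sc.makeRDD(6 to 10).map(i => (i, i * 3)).toDF("single", "triple")
df2.saveAsParquetFile("data/test_table/key=2")

// Reading the table merges the two schemas: single, double, triple, key.
val df3 = sqlContext.parquetFile("data/test_table")
df3.printSchema()
{code}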



> DataFrame#insertInto method should support insert rows with sub-columns
> -----------------------------------------------------------------------
>
>                 Key: SPARK-6495
>                 URL: https://issues.apache.org/jira/browse/SPARK-6495
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Chaozhong Yang
>
> The original table's schema is like this:
>  |-- a: string (nullable = true)
>  |-- b: string (nullable = true)
>  |-- c: string (nullable = true)
>  |-- d: string (nullable = true)
> If we want to insert one row (which can be transformed into a DataFrame) with
> this schema:
>  |-- a: string (nullable = true)
>  |-- b: string (nullable = true)
>  |-- c: string (nullable = true)
> Of course, that operation will fail. In many cases, people need to insert new
> rows whose columns are a subset of the original table's columns. If we can
> support this, Spark SQL's insertion will be more valuable to users.
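
Until that is supported, a possible workaround is to pad the missing columns
with nulls before inserting. A minimal Scala sketch against the Spark 1.3
DataFrame API, assuming a DataFrame df with columns a, b, c and a target table
target_table with columns a, b, c, d (all names hypothetical):

{code:scala}
import org.apache.spark.sql.functions.lit

// Hypothetical: df has columns a, b, c; the target table also has column d.
// Pad the missing column with nulls and match the table's column order
// before calling insertInto.
val padded = df
  .withColumn("d", lit(null).cast("string"))
  .select("a", "b", "c", "d")

padded.insertInto("target_table")
{code}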


