[jira] [Commented] (SPARK-36879) Support Parquet v2 data page encodings for the vectorized path

2022-01-20 Thread Apache Spark (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-36879?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17479602#comment-17479602
 ] 

Apache Spark commented on SPARK-36879:
--

User 'parthchandra' has created a pull request for this issue:
https://github.com/apache/spark/pull/35262

> Support Parquet v2 data page encodings for the vectorized path
> --
>
> Key: SPARK-36879
> URL: https://issues.apache.org/jira/browse/SPARK-36879
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 3.3.0
>Reporter: Chao Sun
>Assignee: Parth Chandra
>Priority: Major
> Fix For: 3.3.0
>
>
> Currently Spark only supports Parquet v1 encodings (i.e., 
> PLAIN/DICTIONARY/RLE) in the vectorized path, and throws an exception otherwise:
> {code}
> java.lang.UnsupportedOperationException: Unsupported encoding: DELTA_BYTE_ARRAY
> {code}
> It would be good to support the v2 encodings too, including DELTA_BINARY_PACKED, 
> DELTA_LENGTH_BYTE_ARRAY, and DELTA_BYTE_ARRAY, as well as BYTE_STREAM_SPLIT, as 
> listed in https://github.com/apache/parquet-format/blob/master/Encodings.md
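For context on what one of these encodings does, here is a minimal Python sketch of the core idea behind DELTA_BINARY_PACKED as described in the parquet-format Encodings.md spec. This is not Spark's or parquet-mr's implementation: it is simplified to a single block, and the adjusted deltas are kept as plain integers rather than actually bit-packed to the minimum bit width.

```python
def delta_binary_pack(values):
    """Sketch of one DELTA_BINARY_PACKED block: store the first value,
    the minimum delta, and each delta minus that minimum.

    On disk the adjusted deltas would be bit-packed using
    max(adjusted).bit_length() bits each; here they stay as ints.
    """
    if not values:
        raise ValueError("need at least one value")
    # Consecutive differences between neighboring values.
    deltas = [b - a for a, b in zip(values, values[1:])]
    min_delta = min(deltas, default=0)
    # Subtracting the minimum makes every stored delta non-negative,
    # which is what allows tight bit-packing of sorted/near-sorted data.
    adjusted = [d - min_delta for d in deltas]
    return values[0], min_delta, adjusted


def delta_binary_unpack(first, min_delta, adjusted):
    """Inverse of delta_binary_pack: rebuild the original values."""
    values = [first]
    for a in adjusted:
        values.append(values[-1] + min_delta + a)
    return values
```

Using the spec's own example sequence 7, 5, 3, 1, 2, 3, 4, 5: the deltas are -2, -2, -2, 1, 1, 1, 1, the minimum delta is -2, and the stored adjusted deltas become 0, 0, 0, 3, 3, 3, 3, which need only 2 bits each when bit-packed.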



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-36879) Support Parquet v2 data page encodings for the vectorized path

2022-01-14 Thread Apache Spark (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-36879?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17476343#comment-17476343
 ] 

Apache Spark commented on SPARK-36879:
--

User 'parthchandra' has created a pull request for this issue:
https://github.com/apache/spark/pull/35212




[jira] [Commented] (SPARK-36879) Support Parquet v2 data page encodings for the vectorized path

2021-11-02 Thread Apache Spark (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-36879?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17437628#comment-17437628
 ] 

Apache Spark commented on SPARK-36879:
--

User 'parthchandra' has created a pull request for this issue:
https://github.com/apache/spark/pull/34471
