On 28 March 2018 03:26, Dongjoon Hyun wrote:
> You may hit SPARK-23355 (convertMetastore should not ignore table properties).
>
> Since it's a known Spark issue for all Hive tables (Parquet/ORC), could you
> check that too?
>
> Bests,
> Dongjoon.
>
Hi,
I think you might be right, I can run
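The issue Dongjoon points to, SPARK-23355 (convertMetastore ignoring table properties), can be narrowed down by disabling the metastore conversion and re-running the write. A minimal sketch, assuming a spark-shell session; the table names are hypothetical, not from this thread:

```scala
// Sketch only: check whether the metastore conversion (SPARK-23355) is
// involved. With conversion enabled (the default), Spark's native ORC
// path may ignore Hive table properties such as orc.compress.
spark.conf.set("spark.sql.hive.convertMetastoreOrc", "false")

// "my_orc_table" and "my_source" are hypothetical names for illustration.
sql("INSERT OVERWRITE TABLE my_orc_table SELECT * FROM my_source")
```

If the error goes away with the conversion disabled, that points at SPARK-23355.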
On 2018/03/28 01:00:55, Dongjoon Hyun wrote:
Hi, Eric.
For me, Spark 2.3 works correctly, as in the following. Could you give us a
reproducible example?
```
scala> sql("set spark.sql.orc.impl=native")
res0: org.apache.spark.sql.DataFrame = [key: string, value: string]

scala> sql("set spark.sql.orc.compression.codec=zlib")
res1: org.apache.spark.sql.DataFrame = [key: string, value: string]

scala>
```
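For completeness, the same settings can be exercised with a small write/read round trip. This is a sketch in the same spirit as the session above, not part of the original mail; the output path is arbitrary:

```scala
// Sketch: write zlib-compressed ORC under the native implementation and
// read it back. The path /tmp/orc_zlib_test is an arbitrary example.
spark.conf.set("spark.sql.orc.impl", "native")
spark.conf.set("spark.sql.orc.compression.codec", "zlib")

spark.range(10).write.mode("overwrite").orc("/tmp/orc_zlib_test")
spark.read.orc("/tmp/orc_zlib_test").show()
```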
Hi, Eirik,
Yes, please open a JIRA.
Thanks,
Xiao
2018-03-23 8:03 GMT-07:00 Eirik Thorsnes:
Hi all,
I'm trying the new ORC native in Spark 2.3
(org.apache.spark.sql.execution.datasources.orc).
I've compiled Spark 2.3 from the git branch-2.3 as of March 20th.
I also get the same error with Spark 2.2 from Hortonworks HDP 2.6.4.
*NOTE*: the error only occurs with zlib compression, and