[ https://issues.apache.org/jira/browse/SPARK-3208?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michael Armbrust updated SPARK-3208:
------------------------------------

    Description:

There is a workaround, which is to set 'spark.sql.hive.convertMetastoreParquet=true'. However, it would still be good to figure out what is going on here.

Relatedly, the following also doesn't work (the results are corrupted):

{code}
sql("""CREATE TABLE test (a int, b string)
  ROW FORMAT SERDE 'parquet.hive.serde.ParquetHiveSerDe'
  STORED AS
    INPUTFORMAT 'parquet.hive.DeprecatedParquetInputFormat'
    OUTPUTFORMAT 'parquet.hive.DeprecatedParquetOutputFormat'""")

case class MyClass(a: Int, b: String)
val rows = sc.parallelize(Seq(MyClass(1, "x"), MyClass(2, "y")))
rows.insertInto("test")
sql("select * from test").collect()

a    b
1    eA==
2    eQ==
{code}

was:
There is a workaround, which is to set 'spark.sql.hive.convertMetastoreParquet=true'. However, it would still be good to figure out what is going on here.


> Hive Parquet SerDe returns null columns
> ---------------------------------------
>
>                 Key: SPARK-3208
>                 URL: https://issues.apache.org/jira/browse/SPARK-3208
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.1.0
>            Reporter: Michael Armbrust
>            Priority: Minor
>
> There is a workaround, which is to set
> 'spark.sql.hive.convertMetastoreParquet=true'. However, it would still be
> good to figure out what is going on here.
> Relatedly, the following also doesn't work (the results are corrupted):
> {code}
> sql("""CREATE TABLE test (a int, b string)
>   ROW FORMAT SERDE 'parquet.hive.serde.ParquetHiveSerDe'
>   STORED AS
>     INPUTFORMAT 'parquet.hive.DeprecatedParquetInputFormat'
>     OUTPUTFORMAT 'parquet.hive.DeprecatedParquetOutputFormat'""")
> case class MyClass(a: Int, b: String)
> val rows = sc.parallelize(Seq(MyClass(1, "x"), MyClass(2, "y")))
> rows.insertInto("test")
> sql("select * from test").collect()
>
> a    b
> 1    eA==
> 2    eQ==
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
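One side observation, not stated in the original report: the "corrupted" values are exactly the Base64 renderings of the expected strings' bytes ("x" → eA==, "y" → eQ==). That suggests the SerDe path is handing back the raw binary of each string, which is then displayed as Base64, rather than producing genuinely wrong data. A minimal sketch checking the decoding in a plain Scala REPL (no Spark required):

```scala
import java.util.Base64

// The values returned by the query, which look corrupted.
val corrupted = Seq("eA==", "eQ==")

// Decoding them as Base64 recovers the original single-character strings,
// consistent with the bytes surviving intact but being rendered as Base64.
val decoded = corrupted.map(s => new String(Base64.getDecoder.decode(s), "UTF-8"))

println(decoded)  // List(x, y)
```

If this reading is right, the bug is in how the Hive Parquet SerDe's output is interpreted as a string column, not in the stored data itself.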