[ https://issues.apache.org/jira/browse/SPARK-8093?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14575023#comment-14575023 ]

Yin Huai commented on SPARK-8093:
---------------------------------

When you get time, can you try it with master? We just bumped Parquet to 
1.7. I am wondering whether the logic of ParquetFileReader.readAllFootersInParallel 
has been changed to handle it or not.
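
For reference, a minimal spark-shell session along the lines of the report below (the file contents are illustrative; the attached t1.json is the authoritative input, but any record with an empty "integration" object should hit the same path):

{code}
// Hypothetical spark-shell (1.4) session; /tmp paths are illustrative.
// Assume /tmp/t1.json holds a record whose "integration" field is an
// empty object, e.g.:
//   {"name": "a", "integration": {}}
val jsonDF = sqlContext.read.json("/tmp/t1.json")

// The empty struct maps to an empty Parquet group, and reading the
// footer back fails with "Cannot build an empty group":
jsonDF.write.parquet("/tmp/t1.parquet")
{code}

If master with Parquet 1.7 still fails here, the empty-group restriction likely lives in parquet-mr itself rather than in Spark's footer-reading code.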

> Failure to save empty json object as parquet
> --------------------------------------------
>
>                 Key: SPARK-8093
>                 URL: https://issues.apache.org/jira/browse/SPARK-8093
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>            Reporter: Harish Butani
>         Attachments: t1.json
>
>
> This is similar to SPARK-3365. Sample json is attached. Code to reproduce
> {code}
> var jsonDF = read.json("/tmp/t1.json")
> jsonDF.write.parquet("/tmp/t1.parquet")
> {code}
> The 'integration' object is empty in the json.
> StackTrace:
> {code}
> ....
> Caused by: java.io.IOException: Could not read footer: 
> java.lang.IllegalStateException: Cannot build an empty group
>       at 
> parquet.hadoop.ParquetFileReader.readAllFootersInParallel(ParquetFileReader.java:238)
>       at 
> org.apache.spark.sql.parquet.ParquetRelation2$MetadataCache.refresh(newParquet.scala:369)
>       at 
> org.apache.spark.sql.parquet.ParquetRelation2.org$apache$spark$sql$parquet$ParquetRelation2$$metadataCache$lzycompute(newParquet.scala:154)
>       at 
> org.apache.spark.sql.parquet.ParquetRelation2.org$apache$spark$sql$parquet$ParquetRelation2$$metadataCache(newParquet.scala:152)
>       at 
> org.apache.spark.sql.parquet.ParquetRelation2.refresh(newParquet.scala:197)
>       at 
> org.apache.spark.sql.sources.InsertIntoHadoopFsRelation.insert(commands.scala:134)
>       ... 69 more
> Caused by: java.lang.IllegalStateException: Cannot build an empty group
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
