I am using the following dependency:

<dependency>
    <groupId>com.twitter</groupId>
    <artifactId>parquet-avro</artifactId>
    <version>1.6.0</version>
</dependency>
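
If upgrading to 1.7.x is the fix, I assume I would need the new Apache
coordinates, since the artifact moved from the com.twitter groupId to
org.apache.parquet as of 1.7.0 (and the package names change from parquet.*
to org.apache.parquet.*, so imports would need updating too), i.e. something
like:

<dependency>
    <groupId>org.apache.parquet</groupId>
    <artifactId>parquet-avro</artifactId>
    <version>1.7.0</version>
</dependency>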


On Mon, Nov 9, 2015 at 1:00 AM, Fengdong Yu <fengdo...@everstring.com>
wrote:

> Which Spark version are you using?
>
> It was fixed in Parquet 1.7.x, so Spark 1.5.x should work.
>
>
>
>
> > On Nov 9, 2015, at 3:43 PM, swetha <swethakasire...@gmail.com> wrote:
> >
> > Hi,
> >
> > I see an unwanted warning when I try to save a Parquet file to HDFS in
> > Spark. Please find the code and the warning message below. Any idea how
> > to avoid this warning?
> >
> > activeSessionsToBeSaved.saveAsNewAPIHadoopFile("test", classOf[Void],
> >     classOf[ActiveSession], classOf[ParquetOutputFormat[ActiveSession]],
> >     job.getConfiguration)
> >
> > Nov 8, 2015 11:35:39 PM WARNING: parquet.hadoop.ParquetOutputCommitter:
> > could not write summary file for active_sessions_current
> > parquet.io.ParquetEncodingException:
> > maprfs:/user/testId/active_sessions_current/part-r-00142.parquet invalid:
> > all the files must be contained in the root active_sessions_current
> >         at parquet.hadoop.ParquetFileWriter.mergeFooters(ParquetFileWriter.java:422)
> >         at parquet.hadoop.ParquetFileWriter.writeMetadataFile(ParquetFileWriter.java:398)
> >         at parquet.hadoop.ParquetOutputCommitter.commitJob(ParquetOutputCommitter.java:51)
> >         at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1056)
> >         at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:998)
> >
>
>
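
In the meantime, I am considering simply disabling the summary file as a
workaround, assuming the parquet.enable.summary-metadata flag is honored by
the 1.6.0 output format (a sketch, not tested):

// Disable the _metadata summary file so ParquetOutputCommitter never
// attempts the footer merge that produces the warning. The config key
// is assumed to be respected by Parquet 1.6.0.
job.getConfiguration.setBoolean("parquet.enable.summary-metadata", false)

activeSessionsToBeSaved.saveAsNewAPIHadoopFile("test", classOf[Void],
    classOf[ActiveSession], classOf[ParquetOutputFormat[ActiveSession]],
    job.getConfiguration)

Would that be a reasonable way to silence the warning until we upgrade?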
