Re: Parquet compression codecs not applied

2015-02-05 Thread Cheng Lian

Re: Parquet compression codecs not applied

2015-02-04 Thread Ayoub

Re: Parquet compression codecs not applied

2015-02-04 Thread sahanbull
it with different compression methods. Please let the mailing list know if this works for you. Best, Sahan

Re: Parquet compression codecs not applied

2015-01-10 Thread Ayoub Benali

Re: Parquet compression codecs not applied

2015-01-09 Thread Michael Armbrust
that with Impala on the same cluster, which applied the compression codecs correctly. Does anyone know what could be the problem? Thanks, Ayoub.

Parquet compression codecs not applied

2015-01-09 Thread Ayoub

Parquet compression codecs not applied

2015-01-08 Thread Ayoub Benali
Hello, I tried to save a table created via the hive context as a parquet file, but whatever compression codec (uncompressed, snappy, gzip or lzo) I set via setConf, like setConf("spark.sql.parquet.compression.codec", "gzip"), the size of the generated files is always the same, so it seems like
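For context, a minimal sketch of the kind of code Ayoub describes (not taken from the thread), assuming Spark 1.2-era APIs; the table name "events" and the output path are made up for illustration:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object ParquetCompressionExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("parquet-compression"))
        val hiveContext = new HiveContext(sc)

        // The codec is a SQLConf setting; documented values in this Spark era
        // are uncompressed, snappy, gzip and lzo.
        hiveContext.setConf("spark.sql.parquet.compression.codec", "gzip")

        // Query a Hive table and write the result out as Parquet files.
        // "events" and the output path are hypothetical.
        val events = hiveContext.sql("SELECT * FROM events")
        events.saveAsParquetFile("/tmp/events_parquet")
      }
    }

The thread's point is that even when the codec setting is changed this way, the files produced for the same data come out the same size, suggesting the setting is not being picked up by the write path.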

Parquet compression codecs not applied

2015-01-08 Thread Ayoub
against Hive 0.13, and I also tried that with Impala on the same cluster, which applied the compression codecs correctly. Does anyone know what could be the problem? Thanks, Ayoub.
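Not from the thread, but one way to check whether a codec was actually applied is to read the Parquet footer metadata instead of relying on file size alone. A rough sketch using parquet-hadoop's ParquetFileReader; package names assume parquet-mr 1.7+ (older releases use the "parquet." prefix instead of "org.apache.parquet"), and the part-file path is hypothetical:

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.Path
    import org.apache.parquet.hadoop.ParquetFileReader

    import scala.collection.JavaConverters._

    object InspectParquetCodec {
      def main(args: Array[String]): Unit = {
        // Read the footer of one generated part file and print the codec
        // recorded for each column chunk.
        val footer = ParquetFileReader.readFooter(
          new Configuration(), new Path("/tmp/events_parquet/part-r-1.parquet"))

        for {
          block  <- footer.getBlocks.asScala
          column <- block.getColumns.asScala
        } println(s"${column.getPath}: ${column.getCodec}")
      }
    }

If every column chunk reports UNCOMPRESSED while a different codec was requested, that confirms the setting was ignored for that write rather than the codec simply compressing poorly.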