Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22358#discussion_r216881788
  
    --- Diff: docs/sql-programming-guide.md ---
    @@ -965,6 +965,8 @@ Configuration of Parquet can be done using the `setConf` method on `SparkSession`
         `parquet.compression` is specified in the table-specific options/properties, the precedence would be
         `compression`, `parquet.compression`, `spark.sql.parquet.compression.codec`. Acceptable values include:
         none, uncompressed, snappy, gzip, lzo, brotli, lz4, zstd.
    +    Note that `zstd` requires `ZStandardCodec` to be installed before Hadoop 2.9.0, and `brotli` requires
    +    `BrotliCodec` to be installed.
    --- End diff --
    
    If the link is expected to be rather permanent, it's fine.
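    The precedence described in the hunk above can be illustrated with a minimal sketch. Everything here is illustrative and not part of the PR: it assumes a local `SparkSession` and a throwaway output path. The writer option `compression` wins over `parquet.compression`, which in turn wins over the session-level `spark.sql.parquet.compression.codec`.

    ```scala
    import org.apache.spark.sql.SparkSession

    // Sketch of Parquet compression codec precedence (assumed local setup).
    object ParquetCompressionDemo {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("parquet-compression-precedence")
          .master("local[*]")
          // Session-wide default; lowest precedence of the three settings.
          .config("spark.sql.parquet.compression.codec", "snappy")
          .getOrCreate()

        val df = spark.range(0, 1000).toDF("id")

        // Writer options override the session config:
        // `compression` > `parquet.compression` > `spark.sql.parquet.compression.codec`,
        // so the files written below end up gzip-compressed.
        df.write
          .option("parquet.compression", "snappy") // overridden by `compression`
          .option("compression", "gzip")           // highest precedence, wins
          .parquet("/tmp/parquet-compression-demo") // hypothetical path

        spark.stop()
      }
    }
    ```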

