Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22358#discussion_r216657064
  
    --- Diff: docs/sql-programming-guide.md ---
    @@ -965,6 +965,7 @@ Configuration of Parquet can be done using the 
`setConf` method on `SparkSession
         `parquet.compression` is specified in the table-specific 
options/properties, the precedence would be
         `compression`, `parquet.compression`, 
`spark.sql.parquet.compression.codec`. Acceptable values include:
         none, uncompressed, snappy, gzip, lzo, brotli, lz4, zstd.
    +    Note that `zstd` needs install `ZStandardCodec` before Hadoop 2.9.0, 
`brotli` needs install `brotliCodec`.
    --- End diff --
    
    `needs install` -> `needs to install`
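
    For context, a minimal sketch of the precedence the quoted docs describe (the app name, sample data, and output path below are illustrative, not from the PR): the session-wide `spark.sql.parquet.compression.codec` setting is the default, and the per-write `compression` option overrides it.

    ```scala
    import org.apache.spark.sql.SparkSession

    object ParquetCompressionExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("ParquetCompressionExample")
          .master("local[*]")
          .getOrCreate()

        // Session-wide default codec (lowest precedence of the three).
        spark.conf.set("spark.sql.parquet.compression.codec", "gzip")

        import spark.implicits._
        val df = Seq((1, "a"), (2, "b")).toDF("id", "value")

        // The per-write `compression` option takes precedence over both
        // `parquet.compression` and the session-level setting above.
        df.write
          .option("compression", "snappy")
          .mode("overwrite")
          .parquet("/tmp/parquet_compression_example")

        spark.stop()
      }
    }
    ```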


---
