[ https://issues.apache.org/jira/browse/SPARK-28898?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
ABHISHEK KUMAR GUPTA updated SPARK-28898:
-----------------------------------------
Description:
Currently, the only way for an end user to discover the full list of spark.sql.* configurations is to run SET -v. Unless they are familiar with the Spark source code, end users have no practical way to find and use these SQL configurations.

I feel the complete list of SQL configurations should be documented in the 3.0 user guide, just as it already is for [Spark Streaming|https://spark.apache.org/docs/latest/configuration.html#spark-streaming].

  was:
Currently, the only way for an end user to discover the full list of spark.sql.* configurations is to run SET -v. Unless they are familiar with the Spark source code, end users have no practical way to find and use these SQL configurations.

I feel the complete list of SQL configurations should be documented in the 3.0 user guide, just as it already is for [Spark-Strreaming|https://spark.apache.org/docs/latest/configuration.html#spark-streaming].


> SQL Configuration should be mentioned under Spark SQL in User Guide
> -------------------------------------------------------------------
>
>                 Key: SPARK-28898
>                 URL: https://issues.apache.org/jira/browse/SPARK-28898
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: ABHISHEK KUMAR GUPTA
>            Priority: Minor
>
> Currently, the only way for an end user to discover the full list of spark.sql.* configurations is to run SET -v. Unless
> they are familiar with the Spark source code, end users have no practical way to find and use these SQL configurations.
> I feel the complete list of SQL configurations should be documented in the 3.0 user guide, just as it already is for
> [Spark Streaming|https://spark.apache.org/docs/latest/configuration.html#spark-streaming]
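
For reference, the SET -v lookup mentioned in the description can also be issued programmatically. The snippet below is a minimal Scala sketch (not part of the original report); it assumes a local SparkSession and the key/value/meaning columns that SET -v returns.

    import org.apache.spark.sql.SparkSession

    // Assumed local session for illustration; in spark-shell a `spark` session already exists.
    val spark = SparkSession.builder()
      .appName("list-sql-confs")
      .master("local[*]")
      .getOrCreate()

    // SET -v returns one row per configuration, including its current value and description.
    val confs = spark.sql("SET -v")

    // Keep only the spark.sql.* entries this issue asks to have documented in the user guide.
    confs.filter(confs("key").startsWith("spark.sql."))
      .show(20, truncate = false)

Running the same SET -v statement from the spark-sql shell prints the same list, which is the discovery path the description refers to.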