Github user budde commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16944#discussion_r101460565
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
    @@ -296,6 +296,17 @@ object SQLConf {
           .longConf
           .createWithDefault(250 * 1024 * 1024)
     
    +  val HIVE_SCHEMA_INFERENCE_MODE = buildConf("spark.sql.hive.schemaInferenceMode")
    +    .doc("Configures the action to take when a case-sensitive schema cannot be read from a Hive " +
    +      "table's properties. Valid options include INFER_AND_SAVE (infer the case-sensitive " +
    +      "schema from the underlying data files and write it back to the table properties), " +
    +      "INFER_ONLY (infer the schema but don't attempt to write it to the table properties) and " +
    +      "NEVER_INFER (fallback to using the case-insensitive metastore schema instead of inferring).")
    +    .stringConf
    +    .transform(_.toUpperCase())
    +    .checkValues(Set("INFER_AND_SAVE", "INFER_ONLY", "NEVER_INFER"))
    +    .createWithDefault("INFER_AND_SAVE")
    --- End diff --
    
    This was proposed in #16797 but I'd like to open this for discussion.
    - ```INFER_ONLY``` would mimic the pre-2.1.0 behavior.
    - ```INFER_AND_SAVE``` would attempt to prevent future inferences, but the save may fail if the Hive client doesn't have write permissions on the metastore.
    - ```NEVER_INFER``` is the current behavior in 2.1.0, which breaks support for the tables affected by [SPARK-19611](https://issues.apache.org/jira/browse/SPARK-19611). Users may wish to enable this mode for tables that lack the table-properties schema but that they know are case-insensitive.

