lsm1 opened a new pull request, #7365:
URL: https://github.com/apache/paimon/pull/7365

   <!-- Please specify the module before the PR name: [core] ... or [flink] ... -->
   
   ### Purpose
   
   <!-- Linking this pull request to the issue -->
   
   
   <!-- What is the purpose of the change -->
   
   Fix false failures in Paimon's required Spark configuration check by making the check read the current SparkSession's configuration instead of a temporary/active SQLConf. Without this fix, Spark SQL operations (including creating a view and querying it after a session restart) incorrectly reported `spark.sql.extensions` as missing.
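
   The intent of the check can be sketched in plain Java, with no Spark dependency. This is a hypothetical illustration, not the actual paimon-spark implementation: the helper name `hasPaimonExtension` and the map-based "conf" are assumptions, while the extension class name `org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions` is the one Paimon's Spark quick start registers. The point is that the check must consult the session-level configuration, because a temporary SQLConf snapshot (such as one built during view resolution) may not carry the key at all:

   ```java
   import java.util.HashMap;
   import java.util.Map;

   public class RequiredConfCheck {
       static final String EXTENSIONS_KEY = "spark.sql.extensions";
       static final String PAIMON_EXTENSION =
               "org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions";

       /**
        * Returns true when the given (session-level) configuration registers
        * the Paimon session extension. spark.sql.extensions holds a
        * comma-separated list of extension class names.
        */
       static boolean hasPaimonExtension(Map<String, String> sessionConf) {
           String extensions = sessionConf.getOrDefault(EXTENSIONS_KEY, "");
           for (String ext : extensions.split(",")) {
               if (PAIMON_EXTENSION.equals(ext.trim())) {
                   return true;
               }
           }
           return false;
       }

       public static void main(String[] args) {
           // Session conf: the extension is registered, so the check passes.
           Map<String, String> sessionConf = new HashMap<>();
           sessionConf.put(EXTENSIONS_KEY, PAIMON_EXTENSION);
           System.out.println(hasPaimonExtension(sessionConf)); // true

           // A temporary conf snapshot without the key: checking against this
           // instead of the session conf is what produced the false failure.
           Map<String, String> temporaryConf = new HashMap<>();
           System.out.println(hasPaimonExtension(temporaryConf)); // false
       }
   }
   ```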
   
   
   
   ### Tests
   
   <!-- List UT and IT cases to verify this change -->
   
   - Spark SQL manual verification:
   ```sql
        create table paimon.default.p1 (
            k int,
            v string
        ) USING paimon
        tblproperties (
            'primary-key' = 'k'
        );
        insert into table paimon.default.p1 select 1, '2';
        create view v2 as select * from paimon.default.p1;
        -- restart the spark-sql session, then:
        select * from v2;
   ```
    - UT: added/updated cases in paimon-spark-ut covering the required conf check under a temporary SQLConf.
   
   ### API and Format
   
   <!-- Does this change affect API or storage format -->
   
   ### Documentation
   
   <!-- Does this change introduce a new feature -->
   
   ### Generative AI tooling
   
   <!--
   If generative AI tooling has been used in the process of authoring this patch, please include the
   phrase: 'Generated-by: ' followed by the name of the tool and its version.
   If no, write 'No'.
   Please refer to the [ASF Generative Tooling Guidance](https://www.apache.org/legal/generative-tooling.html) for details.
   -->
   

