matthewwillian opened a new issue, #7461:
URL: https://github.com/apache/iceberg/issues/7461

   ### Query engine
   
   Glue 3.0
   
   ### Question
   
   I have the following query
   ```
    results_df = spark.sql(f'''
   MERGE INTO {args["catalog"]}.{args["database"]}.{args["output_table"]} t
   USING (SELECT * FROM {NEW_EVENTS_DATA_VIEW}) s
   ON (s.client_id = t.client_id AND
       s.environment_type = t.environment_type AND
       s.customer_id_hash = t.customer_id_hash AND
       s.timestamp = t.timestamp AND
       s.customer_id = t.customer_id AND
       s.id = t.id)
   WHEN MATCHED THEN UPDATE SET *
   WHEN NOT MATCHED THEN INSERT *''')
   ```
   that is merging into a table with the following schema
   ```
   # Table schema:              
   # col_name   data_type       comment
   id   string  
   client_id    string  
   environment_type     string  
   customer_id  string  
   customer_id_hash     string  
   timestamp    timestamp       
   transaction_id       string  
   event_type   string  
   received_at  timestamp       
   properties   map<string, string>     
   topic        string  
   partition    int     
   offset       bigint  
   jws_data     string  
                
   # Partition spec:            
   # field_name field_transform column_name
   client_id    identity        client_id
   environment_type     identity        environment_type
   customer_id_hash     identity        customer_id_hash
   timestamp_day        day     timestamp
   ```
    When I run this query, the Spark query plan appears to show that predicate
    pushdown is not happening on the `timestamp_day` partition field. Is that the
    intended behavior?
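
   For reference, a common way to get partition pruning in a MERGE is to add a static bound on the partition source column (`timestamp`, which backs the `timestamp_day` day transform) to the `ON` clause, so the target scan can prune files regardless of whether the join keys are pushed down. The sketch below only builds the SQL string; all catalog, database, table, and view names are hypothetical placeholders, and the date bound is an example value:
   ```
   # Hypothetical names standing in for the real job arguments.
   args = {"catalog": "my_catalog", "database": "my_db", "output_table": "events"}
   NEW_EVENTS_DATA_VIEW = "new_events_view"

   # Same MERGE as above, plus a static lower bound on t.timestamp so the
   # day-transform partitions (timestamp_day) can be pruned up front.
   merge_sql = f'''
   MERGE INTO {args["catalog"]}.{args["database"]}.{args["output_table"]} t
   USING (SELECT * FROM {NEW_EVENTS_DATA_VIEW}) s
   ON (s.client_id = t.client_id AND
       s.environment_type = t.environment_type AND
       s.customer_id_hash = t.customer_id_hash AND
       s.timestamp = t.timestamp AND
       s.customer_id = t.customer_id AND
       s.id = t.id AND
       t.timestamp >= TIMESTAMP '2023-04-20 00:00:00')
   WHEN MATCHED THEN UPDATE SET *
   WHEN NOT MATCHED THEN INSERT *'''

   print(merge_sql)
   ```
   Whether this is necessary, or whether the join condition alone should already prune `timestamp_day`, is exactly the question here.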


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]