pmallex opened a new issue, #19249:
URL: https://github.com/apache/datafusion/issues/19249

   ### Describe the bug
   
   My understanding, which could be wrong, is that quoted field names should 
not be treated as placeholder variables when parsing SQL statements.
   
   For example:
   ```
   SELECT `@my_field` FROM my_table
   ```
   
   The SqlToRel planner currently treats such identifiers as variables because it [only checks](https://github.com/apache/datafusion/blob/6751f441f435efc401376c1d3d35397b1b39ab1f/datafusion/sql/src/expr/identifier.rs#L39) whether the name starts with `@`; it never checks whether the identifier was quoted. I think the fix is as simple as updating the condition from `id.value.starts_with('@')` to `id.value.starts_with('@') && id.quote_style.is_none()` (the quote style lives on the `Ident` itself, not on `id.value`), but I would like to hear some thoughts on this.
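
   For illustration, here is a minimal, self-contained sketch of that condition (not DataFusion's actual code; the `is_placeholder_variable` helper below is hypothetical and exists only to demonstrate the proposed check):
    ```
    use datafusion_sql::sqlparser::ast::Ident;

    /// Hypothetical helper mirroring the proposed condition: treat an identifier
    /// as a placeholder variable only when it starts with `@` AND was written
    /// without quotes.
    fn is_placeholder_variable(id: &Ident) -> bool {
        id.value.starts_with('@') && id.quote_style.is_none()
    }

    fn main() {
        // Unquoted `@var` is still treated as a variable...
        assert!(is_placeholder_variable(&Ident::new("@var")));
        // ...but a backtick-quoted `@my_field` should be resolved as a column.
        assert!(!is_placeholder_variable(&Ident::with_quote('`', "@my_field")));
    }
    ```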
   
   ### To Reproduce
   
   Execute the following test:
   ```
        #[test]
        fn test_quoted_field_names_with_at_sign() {
            use std::sync::Arc;
            use datafusion::arrow::datatypes::{DataType, Field, Schema};
            use datafusion::config::ConfigOptions;
            use datafusion::logical_expr::LogicalTableSource;
            use datafusion_sql::planner::{ContextProvider, SqlToRel};
            use datafusion_sql::sqlparser::dialect::GenericDialect;
            use datafusion_sql::sqlparser::parser::Parser;

            struct MyContextProvider {
                config_options: ConfigOptions,
            }

            impl ContextProvider for MyContextProvider {
                // Expose a single table whose only column is literally named `@my_field`.
                fn get_table_source(
                    &self,
                    _name: datafusion_sql::TableReference,
                ) -> datafusion_common::Result<Arc<dyn datafusion::logical_expr::TableSource>> {
                    let fields = vec![Field::new("@my_field", DataType::Utf8, true)];
                    let schema = Arc::new(Schema::new(fields));
                    Ok(Arc::new(LogicalTableSource::new(schema)))
                }
                fn get_function_meta(&self, _name: &str) -> Option<Arc<datafusion::logical_expr::ScalarUDF>> { None }
                fn get_aggregate_meta(&self, _name: &str) -> Option<Arc<datafusion::logical_expr::AggregateUDF>> { None }
                fn get_window_meta(&self, _name: &str) -> Option<Arc<datafusion::logical_expr::WindowUDF>> { None }
                fn get_variable_type(&self, _variable_names: &[String]) -> Option<datafusion::arrow::datatypes::DataType> { None }
                fn options(&self) -> &datafusion::config::ConfigOptions { &self.config_options }
                fn udf_names(&self) -> Vec<String> { Vec::new() }
                fn udaf_names(&self) -> Vec<String> { Vec::new() }
                fn udwf_names(&self) -> Vec<String> { Vec::new() }
            }

            // Parse the query with a backtick-quoted column name and plan it.
            let sql = r#"SELECT `@my_field` FROM my_table"#;
            let dialect = GenericDialect {};
            let abstract_syntax_tree = Parser::parse_sql(&dialect, sql).unwrap();
            let statement = &abstract_syntax_tree[0];
            let context_provider = MyContextProvider {
                config_options: ConfigOptions::default(),
            };
            let sql_to_rel = SqlToRel::new(&context_provider);
            let plan = sql_to_rel.sql_statement_to_plan(statement.clone());

            println!("{}", plan.unwrap());
        }
   ```
   
   Running this test will result in the following error:
   ```
    called `Result::unwrap()` on an `Err` value: Plan("variable [\"@my_field\"] has no type information")
   ```
   
   ### Expected behavior
   
   This should instead produce a logical plan in which `@my_field` is resolved as a column identifier:
   ```
   Projection: my_table.@my_field
     TableScan: my_table
   ```
   
   ### Additional context
   
   _No response_

