Alaeddine22 opened a new issue, #27329:
URL: https://github.com/apache/airflow/issues/27329

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### What happened
   
   Hello,
   I installed the stable version 2.4.1 with SQLAlchemy configured for an Oracle database.
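   
   For context, the metadata database connection is set roughly as below; this is the environment-variable override equivalent of `sql_alchemy_conn` in the `[database]` section of `airflow.cfg`, and the user, password, host, port, and service name are placeholders rather than my real values:
   
```python
# Rough sketch of how the metadata DB is pointed at Oracle via the
# cx_Oracle SQLAlchemy dialect. All connection details are placeholders.
import os

os.environ["AIRFLOW__DATABASE__SQL_ALCHEMY_CONN"] = (
    "oracle+cx_oracle://airflow_user:airflow_pass@oracle-host:1521/?service_name=ORCLPDB1"
)
```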
   
   When running `airflow standalone`, I get the following error:
   
   ```
   sqlalchemy.exc.DatabaseError: (cx_Oracle.DatabaseError) ORA-00972: identifier is too long
   [SQL: SELECT dag.dag_id AS dag_dag_id, dag.root_dag_id AS dag_root_dag_id, 
dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active 
AS dag_is_active, dag.last_parsed_time AS dag_last_parsed_time, 
dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, 
dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, 
dag.fileloc AS dag_fileloc, dag.processor_subdir AS dag_processor_subdir, 
dag.owners AS dag_owners, dag.description AS dag_description, dag.default_view 
AS dag_default_view, dag.schedule_interval AS dag_schedule_interval, 
dag.timetable_description AS dag_timetable_descriptio_1, dag.max_active_tasks 
AS dag_max_active_tasks, dag.max_active_runs AS dag_max_active_runs, 
dag.has_task_concurrency_limits AS dag_has_task_concurrency_2, 
dag.has_import_errors AS dag_has_import_errors, dag.next_dagrun AS 
dag_next_dagrun, dag.next_dagrun_data_interval_start AS 
dag_next_dagrun_data_int_3, dag.next_dagrun_data_interval_end AS dag_next_dagrun_data_int_4, dag.next_dagrun_create_after AS 
dag_next_dagrun_create_a_5, dag_tag_1.name AS dag_tag_1_name, dag_tag_1.dag_id 
AS dag_tag_1_dag_id, dag_schedule_dataset_ref_1.dataset_id AS 
dag_schedule_dataset_ref_6, dag_schedule_dataset_ref_1.dag_id AS 
dag_schedule_dataset_ref_7, dag_schedule_dataset_ref_1.created_at AS 
dag_schedule_dataset_ref_8, dag_schedule_dataset_ref_1.updated_at AS 
dag_schedule_dataset_ref_9, task_outlet_dataset_refe_2.dataset_id AS 
task_outlet_dataset_refe_a, task_outlet_dataset_refe_2.dag_id AS 
task_outlet_dataset_refe_b, task_outlet_dataset_refe_2.task_id AS 
task_outlet_dataset_refe_c, task_outlet_dataset_refe_2.created_at AS 
task_outlet_dataset_refe_d, task_outlet_dataset_refe_2.updated_at AS 
task_outlet_dataset_refe_e
   FROM dag LEFT OUTER JOIN dag_tag dag_tag_1 ON dag.dag_id = dag_tag_1.dag_id 
LEFT OUTER JOIN dag_schedule_dataset_reference dag_schedule_dataset_ref_1 ON 
dag.dag_id = dag_schedule_dataset_ref_1.dag_id LEFT OUTER JOIN 
task_outlet_dataset_reference task_outlet_dataset_refe_2 ON dag.dag_id = 
task_outlet_dataset_refe_2.dag_id
   WHERE dag.dag_id IN (:dag_id_1_1, :dag_id_1_2, :dag_id_1_3, :dag_id_1_4, 
:dag_id_1_5, :dag_id_1_6, :dag_id_1_7, :dag_id_1_8, :dag_id_1_9, :dag_id_1_10, 
:dag_id_1_11, :dag_id_1_12, :dag_id_1_13, :dag_id_1_14, :dag_id_1_15, 
:dag_id_1_16, :dag_id_1_17, :dag_id_1_18, :dag_id_1_19, :dag_id_1_20, 
:dag_id_1_21, :dag_id_1_22, :dag_id_1_23, :dag_id_1_24, :dag_id_1_25, 
:dag_id_1_26, :dag_id_1_27, :dag_id_1_28, :dag_id_1_29, :dag_id_1_30, 
:dag_id_1_31, :dag_id_1_32, :dag_id_1_33, :dag_id_1_34, :dag_id_1_35, 
:dag_id_1_36, :dag_id_1_37, :dag_id_1_38, :dag_id_1_39, :dag_id_1_40, 
:dag_id_1_41, :dag_id_1_42) FOR UPDATE OF ]
   [parameters: {'dag_id_1_1': 'latest_only', 'dag_id_1_2': 
'example_short_circuit_operator', 'dag_id_1_3': 
'example_branch_python_operator_decorator', 'dag_id_1_4': 
'example_weekday_branch_operator', 'dag_id_1_5': 
'example_short_circuit_decorator', 'dag_id_1_6': 
'example_external_task_marker_child', 'dag_id_1_7': 
'dataset_consumes_1_never_scheduled', 'dag_id_1_8': 
'example_branch_datetime_operator_3', 'dag_id_1_9': 
'example_trigger_controller_dag', 'dag_id_1_10': 
'example_subdag_operator.section-2', 'dag_id_1_11': 'example_bash_operator', 
'dag_id_1_12': 'dataset_consumes_1_and_2', 'dag_id_1_13': 'example_complex', 
'dag_id_1_14': 'dataset_produces_2', 'dag_id_1_15': 
'example_xcom_args_with_operators', 'dag_id_1_16': 'tutorial_taskflow_api', 
'dag_id_1_17': 'example_branch_datetime_operator_2', 'dag_id_1_18': 
'example_branch_datetime_operator', 'dag_id_1_19': 
'example_branch_dop_operator_v3', 'dag_id_1_20': 'latest_only_with_trigger', 
'dag_id_1_21': 'dataset_produces_1', 'dag_id_1_22':
  'example_subdag_operator.section-1', 'dag_id_1_23': 'example_skip_dag', 
'dag_id_1_24': 'tutorial_dag', 'dag_id_1_25': 'example_branch_operator', 
'dag_id_1_26': 'example_external_task_marker_parent', 'dag_id_1_27': 
'example_xcom_args', 'dag_id_1_28': 'example_trigger_target_dag', 
'dag_id_1_29': 'dataset_consumes_unknown_never_scheduled', 'dag_id_1_30': 
'dataset_consumes_1', 'dag_id_1_31': 'example_subdag_operator', 'dag_id_1_32': 
'example_nested_branch_dag', 'dag_id_1_33': 'example_dag_decorator', 
'dag_id_1_34': 'example_python_operator', 'dag_id_1_35': 'tutorial', 
'dag_id_1_36': 'example_sla_dag', 'dag_id_1_37': 
'example_task_group_decorator', 'dag_id_1_38': 
'example_passing_params_via_test_command', 'dag_id_1_39': 
'example_branch_labels', 'dag_id_1_40': 'example_task_group', 'dag_id_1_41': 
'example_time_delta_sensor_async', 'dag_id_1_42': 'example_xcom'}]
   (Background on this error at: https://sqlalche.me/e/14/4xp6)
   ```
   
   
   Is there a solution to this problem?
   Would you consider keeping table column names within the 30-character limit?
   
   
   Thank you in advance.
   
   ### What you think should happen instead
   
   The `dag` table has a column (`next_dagrun_data_interval_start`) whose name is longer than 30 characters, which exceeds the identifier limit on my Oracle instance and triggers the ORA-00972 error above.
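   
   As a rough check (a sketch that only assumes Airflow is importable; it inspects the SQLAlchemy ORM metadata and does not touch the database), the `dag` model's column names can be scanned for anything over 30 characters:
   
```python
# Sketch: list DagModel column names longer than Oracle's classic
# 30-character identifier limit. Reads SQLAlchemy metadata only;
# no database connection is made.
from airflow.models.dag import DagModel

LIMIT = 30
too_long = [c.name for c in DagModel.__table__.columns if len(c.name) > LIMIT]
print(too_long)  # expected to include 'next_dagrun_data_interval_start'
```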
   
   ### How to reproduce
   
   _No response_
   
   ### Operating System
   
   Ubuntu
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

