DipBaldha commented on issue #42136:
URL: https://github.com/apache/airflow/issues/42136#issuecomment-3836093043

   > Hi [@DipBaldha](https://github.com/DipBaldha), could you elaborate a bit further? I cannot find the `airflow.cfg` file because, as I understand it, it is generated automatically. There is only an `airflow-default.cfg`, if I am not mistaken. Thank you for taking the time to comment.
   > 
   > I quickly deployed the latest Airflow 3.1.6 using the same Helm chart and I still get this error:
   > 
   > `Log message source details sources=["Could not read served logs: Invalid URL 'http://:8793/log/dag_id=demo_DAG/run_id=manual__2026-02-02T15:38:52+00:00/task_id=meteo-data_fetching_in_airflow/attempt=1.log': No host supplied"]`
   > 
   > (screenshot: Airflow UI task log view showing the error above)
   > The part of my DAG code that fails is something like the following:
   > 
   > ```
   > from airflow import DAG
   > from airflow.operators.python import PythonOperator
   > from airflow.operators.bash import BashOperator
   > from airflow.providers.ssh.operators.ssh import SSHOperator
   > from airflow.providers.ssh.hooks.ssh import SSHHook
   > from datetime import datetime, timedelta
   > 
   > sshHook_py = SSHHook(ssh_conn_id="jupyterlab", cmd_timeout=3600)
   > sshHook_r = SSHHook(ssh_conn_id="rstudio", cmd_timeout=3600)
   > 
   > default_args = {
   >     'owner': 'airflow',
   >     'depends_on_past': False,
   >     'email_on_failure': False,
   >     'email_on_retry': False,
   >     'retries': 0,
   >     'retry_delay': timedelta(minutes=1)
   > }
   > 
   > dag = DAG(
   >     dag_id='demo_DAG',
   >     default_args=default_args,
   >     # schedule_interval="@daily", # DEPRECATED in airflow 3.0.2
   >     schedule="@daily",
   >     start_date=datetime(2022, 1, 1),
   >     catchup=False
   > )
   > 
   > t6 = BashOperator(
   >     task_id='meteo-data_fetching_in_airflow',
   >     bash_command='python /opt/airflow/dags/demo/code/meteo_data_fetching.py',
   >     dag=dag
   > )
   > 
   > t6 >> [t1, t3]
   > ...
   > ```
   
   Please try using this `airflow.cfg` file and let me know if it works for you:
   https://github.com/DipBaldha/airflow/blob/main/airflow.cfg
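
   For context on why an `airflow.cfg` change can help here: the "No host supplied" part of the error means the hostname recorded for the task instance was empty, so the served-logs URL came out as `http://:8793/...`. The settings usually involved are `hostname_callable` under `[core]` and `worker_log_server_port` under `[logging]`, though section placement can vary between Airflow versions. Below is a minimal diagnostic sketch, assuming you can open a Python shell inside a worker container with Airflow installed, to see what your deployment actually resolves:

   ```
   # Hedged diagnostic sketch: inspect the config values and hostname resolution
   # that feed the served-logs URL (http://<hostname>:<port>/log/...).
   # Section/key names may differ across Airflow versions; adjust if conf.get raises.
   import socket

   from airflow.configuration import conf

   # Callable Airflow uses to record each task instance's hostname.
   print("hostname_callable      :", conf.get("core", "hostname_callable"))

   # Port on which workers serve their local task logs to the webserver/API server.
   print("worker_log_server_port :", conf.get("logging", "worker_log_server_port", fallback="8793"))

   # What the default callable resolves to on this host; an empty string here
   # would explain the missing host in the log URL.
   print("socket.getfqdn()       :", socket.getfqdn())
   ```

   With the Helm chart, the same keys can also be set as environment variables using the `AIRFLOW__<SECTION>__<KEY>` convention, e.g. `AIRFLOW__CORE__HOSTNAME_CALLABLE` and `AIRFLOW__LOGGING__WORKER_LOG_SERVER_PORT`, instead of mounting a full config file.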

