pretendhigh opened a new issue #16905:
URL: https://github.com/apache/airflow/issues/16905


   
   
   **Apache Airflow version**:
   Airflow 2.0.2
   
   
   **Environment**:
   OS: Ubuntu 16.04
   Python 3.7.1
   MySQL 8.0
   
   **What happened**:
   
   With `statsd_on = True` in `airflow.cfg`, the scheduler exits shortly after startup with `sqlalchemy.exc.OperationalError: (MySQLdb._exceptions.OperationalError) (2006, 'MySQL server has gone away')`, and no metrics are available on port 8125. The full log is in the reproduction steps below.
   
   **What you expected to happen**:
   The scheduler should start with StatsD enabled, and metrics should be available on UDP port 8125.
   This may be a problem in the statsd library: it has not been updated since 2018, and https://pypi.org/project/statsd/ does not state that it is compatible with Python 3.7+.
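   As a quick way to separate Airflow from the transport, the UDP path can be exercised with nothing but the standard library. This is only an illustrative sketch: the `airflow.test` metric name is made up, and the listener binds an ephemeral port so it can run anywhere (bind 8125 instead to watch Airflow's real traffic).

```python
import socket

# StatsD is plain text over UDP. Bind a throwaway listener, send one
# counter line to it, and read it back.
listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", 0))  # use port 8125 to observe Airflow itself
listener.settimeout(2)
port = listener.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"airflow.test:1|c", ("127.0.0.1", port))

data, _ = listener.recvfrom(1024)
print(data.decode())  # airflow.test:1|c

listener.close()
sender.close()
```

   If this round-trip works but nothing arrives on 8125 while the scheduler runs, the problem is on Airflow's side rather than in the network path.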
   
   
   **How to reproduce it**:
   
   1. Install the statsd extra:
   ```
   pip install 'apache-airflow[statsd]'
   ```
   This installs statsd 3.3.0.
   
   2. Edit `airflow.cfg` to enable StatsD:
   ```
   [metrics]
   # StatsD (https://github.com/etsy/statsd) integration settings.
   # Enables sending metrics to StatsD.
   statsd_on = True
   statsd_host = localhost
   statsd_port = 8125
   statsd_prefix = airflow
   ```
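   If there is any doubt that the `[metrics]` block is well-formed, it can be sanity-checked with the standard library `configparser` (Airflow's config loader is configparser-based). The inline string below is just a stand-in for the real `$AIRFLOW_HOME/airflow.cfg`:

```python
from configparser import ConfigParser

# Inline stand-in for the real airflow.cfg; call cfg.read(path) on the
# actual file instead.
sample = """
[metrics]
statsd_on = True
statsd_host = localhost
statsd_port = 8125
statsd_prefix = airflow
"""

cfg = ConfigParser()
cfg.read_string(sample)

# getboolean()/getint() coerce the raw strings the same way a config
# loader would, so typos like "Ture" fail loudly here.
print(cfg.getboolean("metrics", "statsd_on"))  # True
print(cfg.get("metrics", "statsd_host"), cfg.getint("metrics", "statsd_port"))
```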
   3. Start the scheduler as a daemon:
   ```
   airflow scheduler -D --stdout /dev/null --stderr /dev/null -l airflow/logs/scheduler.log
   ```
   4. The scheduler fails. Here is the log:
   ```
   2021-07-09 11:55:28,530 INFO - Starting the scheduler
   2021-07-09 11:55:28,531 INFO - Processing each file at most -1 times
   2021-07-09 11:55:28,537 INFO - Launched DagFileProcessorManager with pid: 1923
   2021-07-09 11:55:28,538 INFO - Resetting orphaned tasks for active dag runs
   2021-07-09 11:55:28,542 INFO - Configured default timezone Timezone('Asia/Jakarta')
   2021-07-09 11:55:48,289 INFO - Exiting gracefully upon receiving signal 15
   2021-07-09 11:55:48,392 INFO - Sending Signals.SIGTERM to GPID 1923
   2021-07-09 11:55:48,407 INFO - Sending Signals.SIGTERM to GPID 1923
   2021-07-09 11:55:48,407 INFO - Exited execute loop
   2021-07-09 11:56:23,731 INFO - Starting the scheduler
   2021-07-09 11:56:23,731 INFO - Processing each file at most -1 times
   2021-07-09 11:56:23,738 INFO - Launched DagFileProcessorManager with pid: 5181
   2021-07-09 11:56:23,740 INFO - Resetting orphaned tasks for active dag runs
   2021-07-09 11:56:23,743 INFO - Configured default timezone Timezone('Asia/Jakarta')
   2021-07-09 11:56:23,758 ERROR - Exception when executing SchedulerJob._run_scheduler_loop
   Traceback (most recent call last):
     File "/home/hadoop/air2/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 771, in _commit_impl
       self.engine.dialect.do_commit(self.connection)
     File "/home/hadoop/air2/lib/python3.7/site-packages/sqlalchemy/dialects/mysql/base.py", line 2501, in do_commit
       dbapi_connection.commit()
   MySQLdb._exceptions.OperationalError: (2006, 'MySQL server has gone away')
   
   The above exception was the direct cause of the following exception:
   
   Traceback (most recent call last):
     File "/home/hadoop/air2/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1284, in _execute
       self._run_scheduler_loop()
     File "/home/hadoop/air2/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 1358, in _run_scheduler_loop
       self.adopt_or_reset_orphaned_tasks()
     File "/home/hadoop/air2/lib/python3.7/site-packages/airflow/utils/session.py", line 70, in wrapper
       return func(*args, session=session, **kwargs)
     File "/usr/local/python3.7/lib/python3.7/contextlib.py", line 119, in __exit__
       next(self.gen)
     File "/home/hadoop/air2/lib/python3.7/site-packages/airflow/utils/session.py", line 32, in create_session
       session.commit()
     File "/home/hadoop/air2/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1046, in commit
       self.transaction.commit()
     File "/home/hadoop/air2/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1046, in commit
       self.transaction.commit()
     File "/home/hadoop/air2/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 508, in commit
       t[1].commit()
     File "/home/hadoop/air2/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1762, in commit
       self._do_commit()
     File "/home/hadoop/air2/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1793, in _do_commit
       self.connection._commit_impl()
     File "/home/hadoop/air2/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 773, in _commit_impl
       self._handle_dbapi_exception(e, None, None, None, None)
     File "/home/hadoop/air2/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1511, in _handle_dbapi_exception
       sqlalchemy_exception, with_traceback=exc_info[2], from_=e
     File "/home/hadoop/air2/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
       raise exception
     File "/home/hadoop/air2/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 771, in _commit_impl
       self.engine.dialect.do_commit(self.connection)
     File "/home/hadoop/air2/lib/python3.7/site-packages/sqlalchemy/dialects/mysql/base.py", line 2501, in do_commit
       dbapi_connection.commit()
   sqlalchemy.exc.OperationalError: (MySQLdb._exceptions.OperationalError) (2006, 'MySQL server has gone away')
   (Background on this error at: http://sqlalche.me/e/13/e3q8)
   2021-07-09 11:56:24,773 INFO - Sending Signals.SIGTERM to GPID 5181
   2021-07-09 11:56:24,865 INFO - Process psutil.Process(pid=5181, status='terminated', exitcode=0, started='11:56:23') (5181) terminated with exit code 0
   2021-07-09 11:56:24,866 INFO - Exited execute loop
   ```
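   Independently of statsd, the failing call is a commit on a dead MySQL connection, so it may also be worth ruling out stale pooled connections. A common mitigation (a sketch of a general remedy, not a confirmed fix for this issue) is to recycle and pre-ping pooled connections in `airflow.cfg`, keeping the recycle interval below MySQL's `wait_timeout`:
   ```
   [core]
   # Re-open pooled DB connections after 30 minutes, before MySQL's
   # wait_timeout can close them server-side.
   sql_alchemy_pool_recycle = 1800
   # Issue a lightweight ping before each checkout to detect dead connections.
   sql_alchemy_pool_pre_ping = True
   ```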
   5. Set `statsd_on` back to `False`; the scheduler then starts normally.
   ```
   [metrics]
   # StatsD (https://github.com/etsy/statsd) integration settings.
   # Enables sending metrics to StatsD.
   statsd_on = False
   statsd_host = localhost
   statsd_port = 8125
   statsd_prefix = airflow
   ```
   
   **Anything else we need to know**:
   The same error also occurs with Ubuntu 20.04, Airflow 2.0.2, MySQL 8.0, and Python 3.8.
   
   
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
