The error logs in the thread below appeared in airflow-worker.err. After this error occurs, no other DAGs execute; the Airflow worker and scheduler must be restarted to resume runs.
Airflow version: 1.8.2
Executor: Celery

Regards,
Abhishek | Infoworks.io | M: +91-9035191078

On 9 November 2018 at 6:39:30 AM, Abhishek Sinha (abhis...@infoworks.io) wrote:

Hi,

I get the following error when a script inside a BashOperator emits unicode characters:

Traceback (most recent call last):
  File "/home/ec2-user/resources/python27/lib/python2.7/threading.py", line 810, in __bootstrap_inner
    self.run()
  File "/home/ec2-user/resources/python27/lib/python2.7/threading.py", line 763, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/home/ec2-user/resources/python27/lib/python2.7/site-packages/airflow/task_runner/base_task_runner.py", line 95, in _read_task_logs
    self.logger.info('Subtask: {}'.format(line.rstrip('\n')))
UnicodeEncodeError: 'ascii' codec can't encode character u'\xe9' in position 572: ordinal not in range(128)

I saw in the BashOperator code that there is a way to override output_encoding (default: UTF-8). However, in the base_task_runner.py code, I see that the encoding is hardcoded to UTF-8. Am I missing something?

Airflow version: 1.8.2

Regards,
Abhishek
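For anyone trying to reproduce this outside Airflow: the traceback above boils down to a formatted log message containing a non-ASCII character (u'\xe9', i.e. é) being encoded with Python's default 'ascii' codec somewhere in the logging path, while an explicit UTF-8 encode succeeds. The snippet below is a minimal standalone sketch of that failure mode, not Airflow code; the sample line and message are made up for illustration.

    # Hypothetical sketch of the encode failure, not Airflow's actual code path.
    # A subtask output line containing a non-ASCII character:
    line = u'r\xe9sum\xe9\n'

    # Mirrors the formatting done in base_task_runner._read_task_logs:
    message = u'Subtask: {}'.format(line.rstrip(u'\n'))

    # Encoding with the ASCII codec fails exactly like the traceback...
    try:
        message.encode('ascii')
    except UnicodeEncodeError as exc:
        print(exc)  # "'ascii' codec can't encode character ..."

    # ...while UTF-8 (the BashOperator output_encoding default) handles it fine:
    print(message.encode('utf-8'))

Under Python 2.7 (as in the report), the implicit default codec is ASCII, so any log stream or handler that is not explicitly UTF-8-aware can raise this error even when output_encoding is set on the operator.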