DreamyWen opened a new issue #14063:
URL: https://github.com/apache/airflow/issues/14063


   
   
   **Apache Airflow version**:
   1.10.14
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
   
   **Environment**:
   
   System Version: macOS 11.1 (20C69)
   Version: Darwin 20.2.0
   Python 3.8.5 
   
   - **Cloud provider or hardware configuration**:
   - **OS** (e.g. from /etc/os-release):
   - **Kernel** (e.g. `uname -a`):
   - **Install tools**:
   - **Others**:
   
   **What happened**:
```
[2021-02-04 13:55:00,605] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'runme_2', '2021-02-04T05:54:56.206452+00:00', '--local', '--pool', 'default_pool', '-sd', '/Users/saith/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/example_dags/example_bash_operator.py']
[2021-02-04 13:55:00,636] {scheduler_job.py:1384} ERROR - Exception when executing execute_helper
Traceback (most recent call last):
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1382, in _execute
    self._execute_helper()
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1453, in _execute_helper
    if not self._validate_and_run_task_instances(simple_dag_bag=simple_dag_bag):
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1515, in _validate_and_run_task_instances
    self.executor.heartbeat()
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/executors/base_executor.py", line 130, in heartbeat
    self.trigger_tasks(open_slots)
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/site-packages/airflow/executors/celery_executor.py", line 229, in trigger_tasks
    send_pool = Pool(processes=num_processes, initializer=reset_signals)
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/multiprocessing/context.py", line 119, in Pool
    return Pool(processes, initializer, initargs, maxtasksperchild,
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/multiprocessing/pool.py", line 212, in __init__
    self._repopulate_pool()
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/multiprocessing/pool.py", line 303, in _repopulate_pool
    return self._repopulate_pool_static(self._ctx, self.Process,
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/multiprocessing/pool.py", line 326, in _repopulate_pool_static
    w.start()
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/multiprocessing/process.py", line 121, in start
    self._popen = self._Popen(self)
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/multiprocessing/context.py", line 284, in _Popen
    return Popen(process_obj)
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/multiprocessing/popen_spawn_posix.py", line 32, in __init__
    super().__init__(process_obj)
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/multiprocessing/popen_spawn_posix.py", line 47, in _launch
    reduction.dump(process_obj, fp)
  File "/Users/saith/anaconda3/envs/airflow/lib/python3.8/multiprocessing/reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
AttributeError: Can't pickle local object 'CeleryExecutor.trigger_tasks.<locals>.reset_signals'
```
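The root cause is visible in the last frames: `reset_signals` is defined inside `CeleryExecutor.trigger_tasks`, and the `spawn` start method has to pickle the pool initializer, but local functions cannot be pickled. A minimal standalone repro of the same failure (the names here are illustrative stand-ins, not Airflow's actual code):

```python
import pickle


def trigger_tasks():
    # Stand-in for CeleryExecutor.trigger_tasks: the initializer is a closure,
    # so pickle can only refer to it by the qualified name
    # 'trigger_tasks.<locals>.reset_signals', which is not importable.
    def reset_signals():
        pass

    return reset_signals


try:
    pickle.dumps(trigger_tasks())
except (AttributeError, pickle.PicklingError) as exc:
    # On Python 3.8 this prints:
    # Can't pickle local object 'trigger_tasks.<locals>.reset_signals'
    print(exc)
```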
   
   
   
**What you expected to happen**:
The demo DAG should be triggered with no error.

I think the error is similar to this issue:
https://issues.apache.org/jira/browse/AIRFLOW-6529
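For comparison, a module-level initializer has an importable qualified name and pickles fine. This is a sketch of the kind of change that avoids this class of error (my illustration, not the actual Airflow patch; the exact signals restored are my assumption):

```python
import pickle
import signal


# Module-level stand-in for the closure: 'spawn' can pickle it because its
# qualified name ('reset_signals') is importable from the module.
def reset_signals():
    # Roughly what the closure in celery_executor.py appears to do:
    # restore the default handler in pool workers.
    signal.signal(signal.SIGINT, signal.SIG_DFL)


payload = pickle.dumps(reset_signals)
print(type(payload))  # <class 'bytes'>
```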
   
   **How to reproduce it**:
   
Airflow install
===
```
conda create -n airflow python=3.8
conda activate airflow
pip install 'apache-airflow==1.10.14'
pip install 'apache-airflow[redis]==1.10.14'
pip install 'apache-airflow[celery]==1.10.14'
pip install 'apache-airflow[mysql]==1.10.14'
```
   
Change `airflow.cfg`
===
```
sql_alchemy_conn = mysql://airflow:airflow@localhost:3306/airflow
broker_url = redis://127.0.0.1:6379/0
result_backend = db+mysql://airflow:airflow@localhost:3306/airflow
executor = CeleryExecutor
```
   
Boot MySQL and Redis
===
```
mysql.server start
redis-server
```

Create the database and start Airflow
===
```
CREATE DATABASE airflow CHARACTER SET utf8 COLLATE utf8_unicode_ci;
```
```
airflow initdb
airflow webserver
airflow worker
airflow scheduler
```
When I trigger the demo DAG, the error above occurs.

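My guess at why this only bites on macOS: Python 3.8 switched the default multiprocessing start method on macOS from `fork` to `spawn`. Under `fork` the worker inherits the initializer; under `spawn` it must be pickled, which is where the closure fails. The active default can be checked with:

```python
import multiprocessing
import sys

# macOS with Python 3.8+ defaults to 'spawn'; Linux defaults to 'fork'.
print(sys.platform, multiprocessing.get_start_method())
```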
   **Anything else we need to know**:
   
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

