[jira] [Updated] (AIRFLOW-1833) Airflow 'retries' parameter not being honored with CeleryExecutor
[ https://issues.apache.org/jira/browse/AIRFLOW-1833?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Connor Ameres updated AIRFLOW-1833:
-----------------------------------
    Affects Version/s: 1.8.1

> Airflow 'retries' parameter not being honored with CeleryExecutor
> -----------------------------------------------------------------
>
>                 Key: AIRFLOW-1833
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1833
>             Project: Apache Airflow
>          Issue Type: Bug
>    Affects Versions: 1.8.1
>            Reporter: Connor Ameres
>         Attachments: Screen Shot 2017-11-20 at 11.13.04 AM.png, Screen Shot 2017-11-20 at 11.13.27 AM.png
>
> I've noticed this for a few task_instances that end up being retried more times than the 'retries' parameter that is passed to the constructor of the DAG in the default_args dictionary. I've attached the task instance attributes & task attributes sections to highlight this.
> Note:
> - retries: 1
> - try_number: 6
> We're using docker containers w/ https://github.com/puckel/docker-airflow & the following versions of python and packages:
> - python 2.7.9
> - apache-airflow==1.8.1
> - celery==3.1.17

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
[jira] [Created] (AIRFLOW-1833) Airflow 'retries' parameter not being honored with CeleryExecutor
Connor Ameres created AIRFLOW-1833:
-----------------------------------

             Summary: Airflow 'retries' parameter not being honored with CeleryExecutor
                 Key: AIRFLOW-1833
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1833
             Project: Apache Airflow
          Issue Type: Bug
            Reporter: Connor Ameres
         Attachments: Screen Shot 2017-11-20 at 11.13.04 AM.png, Screen Shot 2017-11-20 at 11.13.27 AM.png

I've noticed this for a few task_instances that end up being retried more times than the 'retries' parameter that is passed to the constructor of the DAG in the default_args dictionary. I've attached the task instance attributes & task attributes sections to highlight this.

Note:
- retries: 1
- try_number: 6

We're using docker containers w/ https://github.com/puckel/docker-airflow & the following versions of python and packages:
- python 2.7.9
- apache-airflow==1.8.1
- celery==3.1.17

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
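The expectation in the report can be sketched without an Airflow installation. This is a minimal, hypothetical model (the function name `retry_allowed` is illustrative, not Airflow's API): with retries=1 a task should run at most twice, so a try_number of 6 would mean the limit is not being enforced.

```python
# Minimal sketch (not from the report): how 'retries' in default_args
# is expected to bound a task's attempts.
default_args = {"retries": 1}

def retry_allowed(try_number, retries):
    # A retry is allowed only while the attempt count is below the
    # first try plus the configured number of retries.
    return try_number < retries + 1

print(retry_allowed(1, default_args["retries"]))  # True: one retry remains
print(retry_allowed(6, default_args["retries"]))  # False: limit long exceeded
```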
[jira] [Commented] (AIRFLOW-1137) Problem installing [all] subpackages python3
[ https://issues.apache.org/jira/browse/AIRFLOW-1137?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15992121#comment-15992121 ]

Connor Ameres commented on AIRFLOW-1137:
----------------------------------------

I'm a little late on this one, but have you taken a look at the https://airflow.incubator.apache.org/installation.html#extra-packages documentation or the `extras_require` section of `setup.py` to determine what you need to install?

> Problem installing [all] subpackages python3
> --------------------------------------------
>
>                 Key: AIRFLOW-1137
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1137
>             Project: Apache Airflow
>          Issue Type: Bug
>            Reporter: Hamed
>            Priority: Minor
>
> I am installing all packages for airflow in python3 using:
> {noformat}pip3 install 'airflow[all]'{noformat}
> but it throws me the following error:
> {code:xml}
> Collecting cx_Oracle>=5.1.2 (from airflow[all])
>   Downloading cx_Oracle-5.3.tar.gz (129kB)
>     100% || 133kB 5.9MB/s
>   Complete output from command python setup.py egg_info:
>   Traceback (most recent call last):
>     File "", line 1, in
>     File "/private/tmp/pip-build-5re1trj4/cx-Oracle/setup.py", line 174, in
>       raise DistutilsSetupError("cannot locate an Oracle software " \
>   distutils.errors.DistutilsSetupError: cannot locate an Oracle software installation
> {code}
> I don't want to use the oracle subpackage, but it blocks the installation of the other packages.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
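The suggestion above, as a command sketch. The extras names here are examples, not a recommendation for any particular setup; the full list is defined under `extras_require` in Airflow's setup.py. Installing only the extras you use avoids building optional native clients such as cx_Oracle.

```shell
# Sketch (extras chosen for illustration): install a subset of extras
# instead of [all], so cx_Oracle is never downloaded or compiled.
pip3 install 'airflow[celery,postgres,s3]'
```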
[jira] [Assigned] (AIRFLOW-1160) Update SparkSubmitOperator parameters
[ https://issues.apache.org/jira/browse/AIRFLOW-1160?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Connor Ameres reassigned AIRFLOW-1160:
--------------------------------------

    Assignee: Connor Ameres

> Update SparkSubmitOperator parameters
> -------------------------------------
>
>                 Key: AIRFLOW-1160
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1160
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: contrib
>    Affects Versions: Airflow 1.8
>            Reporter: Xi Wang
>            Assignee: Connor Ameres
>             Fix For: Airflow 1.8
>
> The param executor_cores in spark_submit_hook.py is not compatible with spark-submit; it should be total-executor-cores instead.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
[jira] [Commented] (AIRFLOW-1160) Update SparkSubmitOperator parameters
[ https://issues.apache.org/jira/browse/AIRFLOW-1160?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15990088#comment-15990088 ]

Connor Ameres commented on AIRFLOW-1160:
----------------------------------------

Doesn't spark-submit support `--total-executor-cores` for standalone & mesos and `--executor-cores` for standalone & yarn (found this from spark-submit --help in 2.1.0)? I believe renaming would break support for YARN.

> Update SparkSubmitOperator parameters
> -------------------------------------
>
>                 Key: AIRFLOW-1160
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1160
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: contrib
>    Affects Versions: Airflow 1.8
>            Reporter: Xi Wang
>             Fix For: Airflow 1.8
>
> The param executor_cores in spark_submit_hook.py is not compatible with spark-submit; it should be total-executor-cores instead.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
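The distinction in the comment above can be sketched as two spark-submit invocations (flag names per spark-submit --help in Spark 2.1.0; the master URLs, core counts, and app.jar are placeholders):

```shell
# Standalone or Mesos: cap the total cores used across all executors.
spark-submit --master spark://host:7077 --total-executor-cores 8 app.jar

# Standalone or YARN: set the number of cores per executor.
spark-submit --master yarn --executor-cores 2 app.jar
```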
[jira] [Comment Edited] (AIRFLOW-1160) Update SparkSubmitOperator parameters
[ https://issues.apache.org/jira/browse/AIRFLOW-1160?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15990088#comment-15990088 ]

Connor Ameres edited comment on AIRFLOW-1160 at 4/30/17 1:36 AM:
-----------------------------------------------------------------

Doesn't spark-submit support `\--total-executor-cores` for standalone & mesos and `\--executor-cores` for standalone & yarn (found this from spark-submit --help in 2.1.0)? I believe renaming would break support for YARN.

was (Author: cameres):
Doesn't spark-submit support `--total-executor-cores` for standalone & mesos and `--executor-cores` for standalone & yarn (found this from spark-submit --help in 2.1.0)? I believe renaming would break support for YARN.

> Update SparkSubmitOperator parameters
> -------------------------------------
>
>                 Key: AIRFLOW-1160
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1160
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: contrib
>    Affects Versions: Airflow 1.8
>            Reporter: Xi Wang
>             Fix For: Airflow 1.8
>
> The param executor_cores in spark_submit_hook.py is not compatible with spark-submit; it should be total-executor-cores instead.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
[jira] [Commented] (AIRFLOW-602) Unit Test Cases Don't Run in Master Branch
[ https://issues.apache.org/jira/browse/AIRFLOW-602?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15971885#comment-15971885 ]

Connor Ameres commented on AIRFLOW-602:
---------------------------------------

I had a similar issue and believe I solved it by deleting the $AIRFLOW_HOME directory and then running `$ airflow initdb` again. I am now able to run the unit tests successfully.

> Unit Test Cases Don't Run in Master Branch
> ------------------------------------------
>
>                 Key: AIRFLOW-602
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-602
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: tests
>         Environment: Mac
>            Reporter: Siddharth
>
> Trying to run test cases in the master branch. I am trying to run the airflow unit tests on my Mac, but I get this error:
> ERROR: Failure: OperationalError ((sqlite3.OperationalError) no such table: task_instance [SQL: u'DELETE FROM task_instance WHERE task_instance.dag_id = ?'] [parameters: ('unit_tests',)])
> I basically checked out the master version of the repo and ran run_unit_tests.sh. Any idea what's going on? It looks like the table task_instance is not created while running the test case file.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
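The workaround described in the comment above, written out as commands. Note this is destructive: it wipes the local Airflow metadata database, logs, and config under $AIRFLOW_HOME, so only use it on a disposable development setup.

```shell
# Remove the stale local Airflow state, recreate the metadata DB,
# then re-run the test suite from the repo root.
rm -rf "$AIRFLOW_HOME"
airflow initdb
./run_unit_tests.sh
```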