[jira] [Commented] (AIRFLOW-1503) AssertionError: INTERNAL: No default project is specified

2018-02-12 Thread Kaxil Naik (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1503?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361850#comment-16361850
 ] 

Kaxil Naik commented on AIRFLOW-1503:
-

[~maximilianr] Have you created a service account? Once you have, you need to 
create a connection for Airflow to interact with the GCP service. Check out the 
examples here: https://github.com/alexvanboxel/airflow-gcp-examples .

Once you have created a connection, pass its name to the `bigquery_conn_id` 
parameter of the BigQuery operator.
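A minimal sketch of wiring the connection id through (the connection id 
`my_gcp_conn` is an assumption; use whatever id you created in the Airflow UI):

```python
# Hypothetical DAG snippet: point the BigQuery operator at the Airflow
# connection that holds the GCP service-account key. In a real DAG these
# kwargs would be splatted into the operator:
#   BigQueryOperator(dag=dag, **bigquery_task_kwargs)
bigquery_task_kwargs = {
    "task_id": "sql_bigquery",
    "bigquery_conn_id": "my_gcp_conn",  # name of the Airflow -> GCP connection
    "use_legacy_sql": False,
    "write_disposition": "WRITE_TRUNCATE",
    "destination_dataset_table": "temp.percentage",
}
```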

> AssertionError: INTERNAL: No default project is specified
> -
>
> Key: AIRFLOW-1503
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1503
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: gcp
>Affects Versions: Airflow 1.8
> Environment: Unix platform
>Reporter: chaitanya
>Priority: Minor
>  Labels: beginner
>
> Hi ,
> New to airflow. Tried to run BigQuery query and store the result in another 
> table. Getting the following error. 
> Please let me know where to set the default project. 
> Code: 
> sql_bigquery = BigQueryOperator(
>     task_id='sql_bigquery',
>     use_legacy_sql=False,
>     write_disposition='WRITE_TRUNCATE',
>     allow_large_results=True,
>     bql='''
>     #standardSQL
>     SELECT ID, Name, Group, Mark, RATIO_TO_REPORT(Mark)
>     OVER(PARTITION BY Group) AS percent FROM `tensile-site-168620.temp.marks`
>     ''',
>     destination_dataset_table='temp.percentage',
>     dag=dag
> )
> Error Message: 
> Traceback (most recent call last):
>   File "/usr/local/bin/airflow", line 28, in <module>
> args.func(args)
>   File "/usr/local/lib/python2.7/dist-packages/airflow/bin/cli.py", line 585, 
> in test
> ti.run(ignore_task_deps=True, ignore_ti_state=True, test_mode=True)
>   File "/usr/local/lib/python2.7/dist-packages/airflow/utils/db.py", line 53, 
> in wrapper
> result = func(*args, **kwargs)
>   File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 1374, 
> in run
> result = task_copy.execute(context=context)
>   File 
> "/usr/local/lib/python2.7/dist-packages/airflow/contrib/operators/bigquery_operator.py",
>  line 82, in execute
> self.allow_large_results, self.udf_config, self.use_legacy_sql)
>   File 
> "/usr/local/lib/python2.7/dist-packages/airflow/contrib/hooks/bigquery_hook.py",
>  line 228, in run_query
> default_project_id=self.project_id)
>   File 
> "/usr/local/lib/python2.7/dist-packages/airflow/contrib/hooks/bigquery_hook.py",
>  line 917, in _split_tablename
> assert default_project_id is not None, "INTERNAL: No default project is 
> specified"
> AssertionError: INTERNAL: No default project is specified



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (AIRFLOW-1618) Allow creating of Storage buckets through Google Cloud Storage Hook

2018-02-12 Thread Kaxil Naik (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1618?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik reassigned AIRFLOW-1618:
---

Assignee: Kaxil Naik

> Allow creating of Storage buckets through Google Cloud Storage Hook
> ---
>
> Key: AIRFLOW-1618
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1618
> Project: Apache Airflow
>  Issue Type: New Feature
>  Components: contrib, gcp, hooks
>Reporter: Daniel
>Assignee: Kaxil Naik
>Priority: Minor
>   Original Estimate: 96h
>  Remaining Estimate: 96h
>
> The current way that gcs_hook.py is written does not allow for the creation 
> of a new Storage bucket.
> It is possible to create a bucket, with config, through the storage API, and 
> this is what is being used in the hook. However, it does require a small 
> rethink of, or addition to, the current way the service is returned and used.
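A minimal sketch of what such an addition could look like, assuming the hook 
keeps returning the googleapiclient storage service from `get_conn()`. The 
method name and signature are hypothetical, not existing hook API; the 
`buckets().insert(...)` chain follows the public storage/v1 JSON API:

```python
def create_bucket(service, bucket_name, project_id):
    """Create a GCS bucket through the storage JSON API service object.

    ``service`` is assumed to be the googleapiclient storage service the
    hook already builds; ``bucket_name`` and ``project_id`` name the bucket
    to create and the project that owns it.
    """
    body = {"name": bucket_name}
    return service.buckets().insert(project=project_id, body=body).execute()
```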



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (AIRFLOW-1882) Add ignoreUnknownValues option to gcs_to_bq operator

2018-02-12 Thread Kaxil Naik (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1882?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kaxil Naik reassigned AIRFLOW-1882:
---

Assignee: Kaxil Naik

> Add ignoreUnknownValues option to gcs_to_bq operator
> 
>
> Key: AIRFLOW-1882
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1882
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: contrib, gcp
>Affects Versions: 1.8.2
>Reporter: Yannick Einsweiler
>Assignee: Kaxil Naik
>Priority: Major
>  Labels: gcp
>
> Would allow loading CSVs that have columns not defined in the schema, for 
> instance when lines end with a dummy/extra separator. BigQuery considers it 
> an extra column and won't load the file if the option is not passed. 
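`ignoreUnknownValues` is the real key in the BigQuery load-job configuration; 
the helper below is a hypothetical sketch of how the operator could thread the 
option through, not the actual operator code:

```python
def build_load_config(schema_fields, source_uris, ignore_unknown_values=False):
    # Sketch of the BigQuery load-job "configuration.load" section the
    # operator would build; only set ignoreUnknownValues when requested so
    # the default behaviour is unchanged.
    load_config = {
        "schema": {"fields": schema_fields},
        "sourceUris": source_uris,
    }
    if ignore_unknown_values:
        load_config["ignoreUnknownValues"] = True
    return load_config
```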



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Closed] (AIRFLOW-1977) @once, @daily, etc schedule_interval is failing

2018-02-12 Thread Fokko Driesprong (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1977?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong closed AIRFLOW-1977.
-
   Resolution: Fixed
Fix Version/s: 1.10.0

> @once, @daily, etc schedule_interval is failing
> ---
>
> Key: AIRFLOW-1977
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1977
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler, webserver, worker
>Affects Versions: 1.9.0
> Environment: Redhat 7, Redis and Postgres (RDS) in AWS 
>Reporter: Anant Mistry
>Assignee: Bolke de Bruin
>Priority: Major
> Fix For: 1.10.0
>
>
> This is an excerpt from the scheduler log while attempting to run the SubDag 
> example (example_subdag_operator)
> [2018-01-08 16:05:50,201] {jobs.py:1386} INFO - Processing 
> example_subdag_operator
> [2018-01-08 16:05:50,209] {jobs.py:379} ERROR - Got an exception! 
> Propagating...
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 371, in helper
> pickle_dags)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in 
> wrapper
> result = func(*args, **kwargs)
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 1792, in 
> process_file
> self._process_dags(dagbag, dags, ti_keys_to_schedule)
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 1388, in 
> _process_dags
> dag_run = self.create_dag_run(dag)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in 
> wrapper
> result = func(*args, **kwargs)
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 807, in 
> create_dag_run
> if next_start <= now:
> TypeError: can't compare datetime.datetime to NoneType
> [2018-01-08 16:05:50,214] {jobs.py:379} ERROR - Got an exception! 
> Propagating...
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 371, in helper
> pickle_dags)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in 
> wrapper
> result = func(*args, **kwargs)
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 1792, in 
> process_file
> self._process_dags(dagbag, dags, ti_keys_to_schedule)
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 1388, in 
> _process_dags
> dag_run = self.create_dag_run(dag)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in 
> wrapper
> result = func(*args, **kwargs)
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 807, in 
> create_dag_run
> if next_start <= now:
> TypeError: can't compare datetime.datetime to NoneType
> [2018-01-08 16:05:50,618] {jobs.py:1627} INFO - Heartbeating the process 
> manager
> Also, while this DAG is enabled in the UI, all other DAGs end up stuck in the 
> DAG Running state (although no tasks are scheduled or run)
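The failing comparison at jobs.py line 807 can be reproduced and guarded in 
isolation. This is an illustrative sketch, not scheduler code; 
`safe_to_schedule` is a name we made up:

```python
import datetime

def safe_to_schedule(next_start, now):
    # following_schedule() can yield None (e.g. @once after its single run);
    # comparing None with a datetime raises exactly the TypeError in the log
    # above, so check for None before comparing.
    if next_start is None:
        return False
    return next_start <= now
```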



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Assigned] (AIRFLOW-1977) @once, @daily, etc schedule_interval is failing

2018-02-12 Thread Fokko Driesprong (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-1977?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong reassigned AIRFLOW-1977:
-

Assignee: Bolke de Bruin

Should be fixed in https://github.com/apache/incubator-airflow/pull/2883

> @once, @daily, etc schedule_interval is failing
> ---
>
> Key: AIRFLOW-1977
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1977
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: scheduler, webserver, worker
>Affects Versions: 1.9.0
> Environment: Redhat 7, Redis and Postgres (RDS) in AWS 
>Reporter: Anant Mistry
>Assignee: Bolke de Bruin
>Priority: Major
>
> This is an excerpt from the scheduler log while attempting to run the SubDag 
> example (example_subdag_operator)
> [2018-01-08 16:05:50,201] {jobs.py:1386} INFO - Processing 
> example_subdag_operator
> [2018-01-08 16:05:50,209] {jobs.py:379} ERROR - Got an exception! 
> Propagating...
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 371, in helper
> pickle_dags)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in 
> wrapper
> result = func(*args, **kwargs)
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 1792, in 
> process_file
> self._process_dags(dagbag, dags, ti_keys_to_schedule)
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 1388, in 
> _process_dags
> dag_run = self.create_dag_run(dag)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in 
> wrapper
> result = func(*args, **kwargs)
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 807, in 
> create_dag_run
> if next_start <= now:
> TypeError: can't compare datetime.datetime to NoneType
> [2018-01-08 16:05:50,214] {jobs.py:379} ERROR - Got an exception! 
> Propagating...
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 371, in helper
> pickle_dags)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in 
> wrapper
> result = func(*args, **kwargs)
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 1792, in 
> process_file
> self._process_dags(dagbag, dags, ti_keys_to_schedule)
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 1388, in 
> _process_dags
> dag_run = self.create_dag_run(dag)
>   File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in 
> wrapper
> result = func(*args, **kwargs)
>   File "/usr/lib/python2.7/site-packages/airflow/jobs.py", line 807, in 
> create_dag_run
> if next_start <= now:
> TypeError: can't compare datetime.datetime to NoneType
> [2018-01-08 16:05:50,618] {jobs.py:1627} INFO - Heartbeating the process 
> manager
> Also, while this DAG is enabled in the UI, all other DAGs end up stuck in the 
> DAG Running state (although no tasks are scheduled or run)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2103) Authentication using password_auth backend prevents webserver from running

2018-02-12 Thread Mark Ward (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2103?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Ward updated AIRFLOW-2103:
---
Description: 
airflow webserver fails to run with config
{code:java}
[webserver]
authenticate = True
auth_backend = airflow.contrib.auth.backends.password_auth
{code}
and errors out with the following traceback
{code:java}
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 4, in <module>
    __import__('pkg_resources').run_script('apache-airflow==1.10.0.dev0+incubating', 'airflow')
  File "/usr/local/lib/python3.6/site-packages/pkg_resources/__init__.py", line 750, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/usr/local/lib/python3.6/site-packages/pkg_resources/__init__.py", line 1527, in run_script
    exec(code, namespace, namespace)
  File "/usr/local/lib/python3.6/site-packages/apache_airflow-1.10.0.dev0+incubating-py3.6.egg/EGG-INFO/scripts/airflow", line 27, in <module>
    args.func(args)
  File "/usr/local/lib/python3.6/site-packages/apache_airflow-1.10.0.dev0+incubating-py3.6.egg/airflow/bin/cli.py", line 696, in webserver
    app = cached_app(conf)
  File "/usr/local/lib/python3.6/site-packages/apache_airflow-1.10.0.dev0+incubating-py3.6.egg/airflow/www/app.py", line 176, in cached_app
    app = create_app(config, testing)
  File "/usr/local/lib/python3.6/site-packages/apache_airflow-1.10.0.dev0+incubating-py3.6.egg/airflow/www/app.py", line 63, in create_app
    from airflow.www import views
  File "/usr/local/lib/python3.6/site-packages/apache_airflow-1.10.0.dev0+incubating-py3.6.egg/airflow/www/views.py", line 97, in <module>
    login_required = airflow.login.login_required
AttributeError: module 'airflow.contrib.auth.backends.password_auth' has no attribute 'login_required'
{code}
Broke at [https://github.com/apache/incubator-airflow/pull/2730] with the 
changing of 
{code:java}
from flask_login import login_required, current_user, logout_user
{code}
to 
{code:java}
from flask_login import current_user
{code}
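One defensive direction the webserver side could take, sketched below. This is 
illustrative only, not the merged fix; `resolve_login_required` is a name we 
made up:

```python
def resolve_login_required(auth_backend):
    # The traceback shows views.py assuming every auth backend exports
    # login_required; falling back to a pass-through decorator when the
    # attribute is missing keeps the webserver importable.
    def _passthrough(view_func):
        return view_func
    return getattr(auth_backend, "login_required", _passthrough)
```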
 


> Authentication using password_auth backend prevents webserver from running
> --
>
> Key: AIRFLOW-2103
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2103
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Mark Ward
>Priority: Critical
>
> airflow webserver fails to run with config
> {code:java}
> [webserver]
> authenticate = True
> auth_backend = airflow.contrib.auth.backends.password_auth
> {code}
> and errors out with following traceback
> {code:java}
> Traceback (most recent call last):
>   File "/usr/local/bin/airflow", line 4, in 
>     
> __import__('pkg_resources').run_script('apache-airflow==1.10.0.dev0+incubating',
>  'airflow')
>   File "/usr/local/lib/python3.6/site-packages/pkg_resources/__init__.py", 
> line 750, in run_script
>     

[jira] [Created] (AIRFLOW-2103) Authentication using password_auth backend prevents webserver from running

2018-02-12 Thread Mark Ward (JIRA)
Mark Ward created AIRFLOW-2103:
--

 Summary: Authentication using password_auth backend prevents 
webserver from running
 Key: AIRFLOW-2103
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2103
 Project: Apache Airflow
  Issue Type: Bug
Reporter: Mark Ward


airflow webserver fails to run with config

 
{code:java}
[webserver]
authenticate = True
auth_backend = airflow.contrib.auth.backends.password_auth
{code}
and errors out with the following traceback
{code:java}
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 4, in <module>
    __import__('pkg_resources').run_script('apache-airflow==1.10.0.dev0+incubating', 'airflow')
  File "/usr/local/lib/python3.6/site-packages/pkg_resources/__init__.py", line 750, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/usr/local/lib/python3.6/site-packages/pkg_resources/__init__.py", line 1527, in run_script
    exec(code, namespace, namespace)
  File "/usr/local/lib/python3.6/site-packages/apache_airflow-1.10.0.dev0+incubating-py3.6.egg/EGG-INFO/scripts/airflow", line 27, in <module>
    args.func(args)
  File "/usr/local/lib/python3.6/site-packages/apache_airflow-1.10.0.dev0+incubating-py3.6.egg/airflow/bin/cli.py", line 696, in webserver
    app = cached_app(conf)
  File "/usr/local/lib/python3.6/site-packages/apache_airflow-1.10.0.dev0+incubating-py3.6.egg/airflow/www/app.py", line 176, in cached_app
    app = create_app(config, testing)
  File "/usr/local/lib/python3.6/site-packages/apache_airflow-1.10.0.dev0+incubating-py3.6.egg/airflow/www/app.py", line 63, in create_app
    from airflow.www import views
  File "/usr/local/lib/python3.6/site-packages/apache_airflow-1.10.0.dev0+incubating-py3.6.egg/airflow/www/views.py", line 97, in <module>
    login_required = airflow.login.login_required
AttributeError: module 'airflow.contrib.auth.backends.password_auth' has no attribute 'login_required'
{code}
Broke at [https://github.com/apache/incubator-airflow/pull/2730] with the 
changing of 

 
{code:java}
from flask_login import login_required, current_user, logout_user
{code}
to 

 

 
{code:java}
from flask_login import current_user
{code}


--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-2102) Add custom_args to Sendgrid personalizations

2018-02-12 Thread Marcin Szymanski (JIRA)
Marcin Szymanski created AIRFLOW-2102:
-

 Summary: Add custom_args to Sendgrid personalizations
 Key: AIRFLOW-2102
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2102
 Project: Apache Airflow
  Issue Type: New Feature
  Components: contrib
Reporter: Marcin Szymanski
Assignee: Marcin Szymanski


Add support for {{custom_args}} in personalizations

[https://sendgrid.com/docs/Classroom/Send/v3_Mail_Send/personalizations.html]

{{custom_args}} should be passed in {{kwargs}}, as other backends don't 
support them.
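`custom_args` is the documented per-message key/value metadata field in a 
SendGrid v3 personalization. A hedged sketch of building such an entry (the 
helper name is ours, not SendGrid or Airflow API):

```python
def build_personalization(to_email, custom_args=None):
    # Sketch of a SendGrid v3 "personalizations" entry; custom_args carries
    # arbitrary string key/value pairs attached to the message.
    personalization = {"to": [{"email": to_email}]}
    if custom_args:
        personalization["custom_args"] = custom_args
    return personalization
```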



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2099) Task details cannot be shown when PythonOperator calls partial function / class instance with __call__

2018-02-12 Thread Matthew Revell (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2099?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361249#comment-16361249
 ] 

Matthew Revell commented on AIRFLOW-2099:
-

PR created from my branch 
[https://github.com/apache/incubator-airflow/pull/3032]

Branch rebased on master, no conflicts, and all tests passing

> Task details cannot be shown when PythonOperator calls partial function / 
> class instance with __call__
> --
>
> Key: AIRFLOW-2099
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2099
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: ui
>Affects Versions: Airflow 1.8
>Reporter: Matthew Revell
>Assignee: Matthew Revell
>Priority: Minor
>
> There are several scenarios where the inspect.getsource() method fails with:
> {{<... object at 0x...> is not a module, class, method, function, traceback, 
> frame, or code object}}
> One such scenario is described in 
> [AIRFLOW-1027|https://issues.apache.org/jira/browse/AIRFLOW-1027] where a 
> partial function is used. Another is when an instance of a class which 
> implements __call__() is used.
> Example:
> {code:python}
> class MyClass(object):
>     def __init__(self):
>         pass
>
>     def __call__(self):
>         pass
>
> my_class = MyClass()
>
> dag_task = PythonOperator(
>     task_id='dag_task',
>     dag=dag,
>     python_callable=my_class,
> )
> {code}
> There exists a PR for AIRFLOW-1027; however, that fix does not address this 
> other scenario, and does not guard against any other edge cases which may 
> result in this error in future.
> A better solution would be to catch known scenarios with workarounds, and 
> default to reporting that the source is unavailable for unknown cases. This 
> would at least display the Task Instance details in every case.
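The fallback described above can be sketched as a small wrapper. This is 
illustrative, not the PR's code; `safe_getsource` is a hypothetical name:

```python
import inspect

def safe_getsource(obj):
    # Catch the known failure modes of inspect.getsource (functools.partial
    # objects, class instances with __call__, built-ins, missing source
    # files) and report the source as unavailable rather than breaking the
    # Task Instance details page.
    try:
        return inspect.getsource(obj)
    except (TypeError, OSError):
        return "Source code unavailable for %r" % (obj,)
```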



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2101) pip install apache-airflow does not install minimum packages for tutorial

2018-02-12 Thread Ash Berlin-Taylor (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2101?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361198#comment-16361198
 ] 

Ash Berlin-Taylor commented on AIRFLOW-2101:


Log uploaded as [^DMNx1dG4.txt]  in case the pastebin goes away.

Fernet is not a _hard_ requirement, so it shouldn't be mandatory. I think from 
the log that it might have continued anyway, as there is extra output 
afterwards (I'm not certain though).

It's not great behaviour to see stack traces, though, so we should fix that.

> pip install apache-airflow does not install minimum packages for tutorial
> -
>
> Key: AIRFLOW-2101
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2101
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.9.0
>Reporter: Paymahn Moghadasian
>Priority: Minor
> Attachments: DMNx1dG4.txt
>
>
> What's expected:
> running `pip install apache-airflow` should install the minimum requirements 
> for running `airflow initdb`
> What happens:
> `airflow initdb` errors out because Fernet cannot be imported.
> Solution:
> run `rm -rf $AIRFLOW_HOME && pip install "apache-airflow[crypto]" && airflow 
> initdb`
> Logs of my output can be seen at https://pastebin.com/DMNx1dG4



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (AIRFLOW-2101) pip install apache-airflow does not install minimum packages for tutorial

2018-02-12 Thread Ash Berlin-Taylor (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2101?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ash Berlin-Taylor updated AIRFLOW-2101:
---
Attachment: DMNx1dG4.txt

> pip install apache-airflow does not install minimum packages for tutorial
> -
>
> Key: AIRFLOW-2101
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2101
> Project: Apache Airflow
>  Issue Type: Bug
>Affects Versions: 1.9.0
>Reporter: Paymahn Moghadasian
>Priority: Minor
> Attachments: DMNx1dG4.txt
>
>
> What's expected:
> running `pip install apache-airflow` should install the minimum requirements 
> for running `airflow initdb`
> What happens:
> `airflow initdb` errors out because Fernet cannot be imported.
> Solution:
> run `rm -rf $AIRFLOW_HOME && pip install "apache-airflow[crypto]" && airflow 
> initdb`
> Logs of my output can be seen at https://pastebin.com/DMNx1dG4



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (AIRFLOW-2101) pip install apache-airflow does not install minimum packages for tutorial

2018-02-12 Thread Paymahn Moghadasian (JIRA)
Paymahn Moghadasian created AIRFLOW-2101:


 Summary: pip install apache-airflow does not install minimum 
packages for tutorial
 Key: AIRFLOW-2101
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2101
 Project: Apache Airflow
  Issue Type: Bug
Affects Versions: 1.9.0
Reporter: Paymahn Moghadasian


What's expected:

running `pip install apache-airflow` should install the minimum requirements 
for running `airflow initdb`

What happens:

`airflow initdb` errors out because Fernet cannot be imported.

Solution:

run `rm -rf $AIRFLOW_HOME && pip install "apache-airflow[crypto]" && airflow 
initdb`

Logs of my output can be seen at https://pastebin.com/DMNx1dG4
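One way Airflow could degrade gracefully instead of crashing, sketched below. 
This is illustrative, not the actual Airflow code; `get_fernet` is a 
hypothetical helper:

```python
def get_fernet():
    # Defensive import: cryptography ships only with the "crypto" extra, so
    # its absence should make initdb warn and proceed, not print a traceback.
    try:
        from cryptography.fernet import Fernet
    except ImportError:
        return None  # caller can warn and store values unencrypted
    return Fernet
```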



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


incubator-airflow git commit: [AIRFLOW-XXX] Add PMC to list of companies using Airflow

2018-02-12 Thread fokko
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 6e634ddff -> 76596744e


[AIRFLOW-XXX] Add PMC to list of companies using Airflow

Closes #3034 from andrewm4894/patch-1


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/76596744
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/76596744
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/76596744

Branch: refs/heads/master
Commit: 76596744e6200382fcdd2e5d32862235719d2cac
Parents: 6e634dd
Author: Andrew Maguire 
Authored: Mon Feb 12 17:28:05 2018 +0100
Committer: Fokko Driesprong 
Committed: Mon Feb 12 17:28:09 2018 +0100

--
 README.md | 1 +
 1 file changed, 1 insertion(+)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/76596744/README.md
--
diff --git a/README.md b/README.md
index 9175f41..5d6863d 100644
--- a/README.md
+++ b/README.md
@@ -179,6 +179,7 @@ Currently **officially** using Airflow:
 1. [Pernod-Ricard](https://www.pernod-ricard.com/) 
[[@romain-nio](https://github.com/romain-nio) 
 1. [Playbuzz](https://www.playbuzz.com/) 
[[@clintonboys](https://github.com/clintonboys) & 
[@dbn](https://github.com/dbn)]
 1. [Plaid](https://www.plaid.com/) [[@plaid](https://github.com/plaid), 
[@AustinBGibbons](https://github.com/AustinBGibbons) & 
[@jeeyoungk](https://github.com/jeeyoungk)]
+1. [PMC](https://pmc.com/) [[@andrewm4894](https://github.com/andrewm4894)]
 1. [Postmates](http://www.postmates.com) 
[[@syeoryn](https://github.com/syeoryn)]
 1. [Pronto Tools](http://www.prontotools.io/) 
[[@zkan](https://github.com/zkan) & [@mesodiar](https://github.com/mesodiar)]
 1. [Qubole](https://qubole.com) [[@msumit](https://github.com/msumit)]



[jira] [Commented] (AIRFLOW-2100) Links to the official docs are all broken

2018-02-12 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361012#comment-16361012
 ] 

ASF subversion and git services commented on AIRFLOW-2100:
--

Commit 6e634ddff1b0f53394cdf676abf5f03b3795d271 in incubator-airflow's branch 
refs/heads/master from [~kaxilnaik]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=6e634dd ]

[AIRFLOW-2100] Fix Broken Documentation Links

- Replaced `Pythonhosted` documentation links with
`Apache Docs`

Closes #3033 from kaxil/AIRFLOW-2100


> Links to the official docs are all broken
> -
>
> Key: AIRFLOW-2100
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2100
> Project: Apache Airflow
>  Issue Type: Task
>  Components: docs, Documentation
>Affects Versions: 1.9.0, 2.0.0
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Major
> Fix For: 2.0.0
>
>
> Links to Pythonhosted are all broken. We can do one of the following to 
> resolve it:
> # Change all the links to readthedocs 
> ([latest|http://airflow.readthedocs.io/en/latest/] / 
> [stable|http://airflow.readthedocs.io/en/v1-9-stable/])?
> # Can someone who has access to PyPI create the docs?
> # Replace the links to point to http://airflow.incubator.apache.org/



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (AIRFLOW-2100) Links to the official docs are all broken

2018-02-12 Thread Fokko Driesprong (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2100?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Fokko Driesprong resolved AIRFLOW-2100.
---
   Resolution: Fixed
Fix Version/s: (was: 1.9.0)

Issue resolved by pull request #3033
[https://github.com/apache/incubator-airflow/pull/3033]

> Links to the official docs are all broken
> -
>
> Key: AIRFLOW-2100
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2100
> Project: Apache Airflow
>  Issue Type: Task
>  Components: docs, Documentation
>Affects Versions: 1.9.0, 2.0.0
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Major
> Fix For: 2.0.0
>
>
> Links to Pythonhosted are all broken. We can do one of the following to 
> resolve it:
> # Change all the links to readthedocs 
> ([latest|http://airflow.readthedocs.io/en/latest/] / 
> [stable|http://airflow.readthedocs.io/en/v1-9-stable/])?
> # Can someone who has access to PyPI create the docs?
> # Replace the links to point to http://airflow.incubator.apache.org/



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (AIRFLOW-2100) Links to the official docs are all broken

2018-02-12 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361013#comment-16361013
 ] 

ASF subversion and git services commented on AIRFLOW-2100:
--

Commit 6e634ddff1b0f53394cdf676abf5f03b3795d271 in incubator-airflow's branch 
refs/heads/master from [~kaxilnaik]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=6e634dd ]

[AIRFLOW-2100] Fix Broken Documentation Links

- Replaced `Pythonhosted` documentation links with
`Apache Docs`

Closes #3033 from kaxil/AIRFLOW-2100


> Links to the official docs are all broken
> -
>
> Key: AIRFLOW-2100
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2100
> Project: Apache Airflow
>  Issue Type: Task
>  Components: docs, Documentation
>Affects Versions: 1.9.0, 2.0.0
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Major
> Fix For: 2.0.0
>
>
> Links to Pythonhosted are all broken. We can do one of the following to 
> resolve it:
> # Change all the links to readthedocs 
> ([latest|http://airflow.readthedocs.io/en/latest/] / 
> [stable|http://airflow.readthedocs.io/en/v1-9-stable/])?
> # Can someone who has access to PyPI create the docs?
> # Replace the links to point to http://airflow.incubator.apache.org/



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


incubator-airflow git commit: [AIRFLOW-2100] Fix Broken Documentation Links

2018-02-12 Thread fokko
Repository: incubator-airflow
Updated Branches:
  refs/heads/master a289497c8 -> 6e634ddff


[AIRFLOW-2100] Fix Broken Documentation Links

- Replaced `Pythonhosted` documentation links with
`Apache Docs`

Closes #3033 from kaxil/AIRFLOW-2100


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/6e634ddf
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/6e634ddf
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/6e634ddf

Branch: refs/heads/master
Commit: 6e634ddff1b0f53394cdf676abf5f03b3795d271
Parents: a289497
Author: Kaxil Naik 
Authored: Mon Feb 12 17:26:55 2018 +0100
Committer: Fokko Driesprong 
Committed: Mon Feb 12 17:26:55 2018 +0100

--
 README.md | 8 +---
 1 file changed, 5 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/6e634ddf/README.md
--
diff --git a/README.md b/README.md
index 6528ecc..9175f41 100644
--- a/README.md
+++ b/README.md
@@ -3,7 +3,7 @@
 [![PyPI 
version](https://badge.fury.io/py/apache-airflow.svg)](https://badge.fury.io/py/apache-airflow)
 [![Build 
Status](https://travis-ci.org/apache/incubator-airflow.svg?branch=master)](https://travis-ci.org/apache/incubator-airflow)
 [![Coverage 
Status](https://img.shields.io/codecov/c/github/apache/incubator-airflow/master.svg)](https://codecov.io/github/apache/incubator-airflow?branch=master)
-[![Documentation](https://img.shields.io/badge/docs-pythonhosted-blue.svg)](http://pythonhosted.org/airflow/)
+[![Documentation 
Status](https://readthedocs.org/projects/airflow/badge/?version=latest)](https://airflow.readthedocs.io/en/latest/?badge=latest)
 [![Join the chat at 
https://gitter.im/apache/incubator-airflow](https://badges.gitter.im/apache/incubator-airflow.svg)](https://gitter.im/apache/incubator-airflow?utm_source=badge_medium=badge_campaign=pr-badge_content=badge)
 
 _NOTE: The transition from 1.8.0 (or before) to 1.8.1 (or after) requires 
uninstalling Airflow before installing the new version. The package name was 
changed from `airflow` to `apache-airflow` as of version 1.8.1._
@@ -22,7 +22,9 @@ makes it easy to visualize pipelines running in production,
 monitor progress, and troubleshoot issues when needed.
 
 ## Getting started
-Please visit the Airflow Platform documentation for help with [installing 
Airflow](http://pythonhosted.org/airflow/installation.html), getting a [quick 
start](http://pythonhosted.org/airflow/start.html), or a more complete 
[tutorial](http://pythonhosted.org/airflow/tutorial.html).
+Please visit the Airflow Platform documentation (latest **stable** release) 
for help with [installing 
Airflow](https://airflow.incubator.apache.org/installation.html), getting a 
[quick start](https://airflow.incubator.apache.org/start.html), or a more 
complete [tutorial](https://airflow.incubator.apache.org/tutorial.html).
+
+Documentation of GitHub master (latest development branch): [ReadTheDocs 
Documentation](https://airflow.readthedocs.io/en/latest/)
 
 For further information, please visit the [Airflow 
Wiki](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Home).
 
@@ -219,7 +221,7 @@ Currently **officially** using Airflow:
 ## Links
 
 
-* [Documentation](http://airflow.incubator.apache.org/)
+* [Documentation](https://airflow.incubator.apache.org/)
 * [Chat](https://gitter.im/apache/incubator-airflow)
 * [Apache Airflow Incubation 
Status](http://incubator.apache.org/projects/airflow.html)
 * [More](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Links)



[jira] [Comment Edited] (AIRFLOW-1979) Redis celery backend not work on 1.9.0 (configuration is ignored)

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1979?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360664#comment-16360664
 ] 

Yuliya Volkova edited comment on AIRFLOW-1979 at 2/12/18 12:42 PM:
---

[~redtree1112], hello, is the version you used from the GitHub master branch, or 
from an official release on PyPI?

It seems the configuration variables are not visible to Airflow when the airflow 
commands start.


was (Author: xnuinside):
[~redtree1112], hello, is the version you used from the GitHub master branch, or 
from an official release on PyPI?

It seems the configuration variables are not visible to the airflow worker.

> Redis celery backend not work on 1.9.0 (configuration is ignored)
> -
>
> Key: AIRFLOW-1979
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1979
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, worker
>Affects Versions: 1.9.0
>Reporter: Norio Akagi
>Priority: Major
>
> Worker tries to connect to RabbitMQ based on the default setting and shows an 
> error as below:
> {noformat}
> [2018-01-09 16:45:42,778] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python2.7/lib2to3/Grammar.txt
> [2018-01-09 16:45:42,802] {driver.py:120} INFO - Generating grammar tables 
> from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
> [2018-01-09 16:45:43,051] {configuration.py:206} WARNING - section/key 
> [celery/celery_ssl_active] not found in config
> [2018-01-09 16:45:43,051] {default_celery.py:41} WARNING - Celery Executor 
> will run without SSL
> [2018-01-09 16:45:43,052] {__init__.py:45} INFO - Using executor 
> CeleryExecutor
> [2018-01-09 16:45:43,140: WARNING/MainProcess] 
> /usr/local/lib/python2.7/dist-packages/celery/apps/worker.py:161: 
> CDeprecationWarning:
> Starting from version 3.2 Celery will refuse to accept pickle by default.
> The pickle serializer is a security concern as it may give attackers
> the ability to execute any command.  It's important to secure
> your broker from unauthorized access when using pickle, so we think
> that enabling pickle should require a deliberate action and not be
> the default choice.
> If you depend on pickle then you should set a setting to disable this
> warning and to be sure that everything will continue working
> when you upgrade to Celery 3.2::
> CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']
> You must only enable the serializers that you will actually use.
>   warnings.warn(CDeprecationWarning(W_PICKLE_DEPRECATED))
> [2018-01-09 16:45:43,240: ERROR/MainProcess] consumer: Cannot connect to 
> amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
> Trying again in 2.00 seconds...
> {noformat}
> I deploy Airflow on kubernetes so each component (web, scheduler, worker, and 
> flower) is containerized and distributed among nodes. I set 
> {{AIRFLOW__CELERY__CELERY_RESULT_BACKEND}}
>  and {{AIRFLOW__CELERY__BROKER_URL}} in environment variables and it can be 
> seen when I run {{printenv}} in a container, but it looks completely ignored.
> Moving these values to {{airflow.cfg}} doesn't work either.
> It worked just perfectly 1.8 and suddenly stopped working when I upgraded 
> Airflow to 1.9.
> Do you have any idea what may cause this configuration issue?





[jira] [Comment Edited] (AIRFLOW-1979) Redis celery backend not work on 1.9.0 (configuration is ignored)

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1979?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360664#comment-16360664
 ] 

Yuliya Volkova edited comment on AIRFLOW-1979 at 2/12/18 12:40 PM:
---

[~redtree1112], hello, is the version you used from the GitHub master branch, or 
from an official release on PyPI?

It seems the configuration variables are not visible to the airflow worker.


was (Author: xnuinside):
[~redtree1112], hello, did you see 
[https://github.com/apache/incubator-airflow/blob/master/UPDATING.md#celery-config]
 ? 

{{AIRFLOW__CELERY__CELERY_RESULT_BACKEND}} does not work in 1.9.0; the option was 
renamed from {{celery_result_backend}} to {{result_backend}}, so the environment 
variable is now {{AIRFLOW__CELERY__RESULT_BACKEND}}.






[jira] [Commented] (AIRFLOW-1650) Celery custom config is broken

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1650?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360667#comment-16360667
 ] 

Yuliya Volkova commented on AIRFLOW-1650:
-

Didn't [https://github.com/apache/incubator-airflow/pull/2639] fix this? 

> Celery custom config is broken
> --
>
> Key: AIRFLOW-1650
> URL: https://issues.apache.org/jira/browse/AIRFLOW-1650
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: celery, configuration
>Reporter: Bolke de Bruin
>Priority: Major
> Fix For: 1.10.0
>
>
> Celery custom config loading is broken, as it just loads a string instead of 
> loading a config.





[jira] [Commented] (AIRFLOW-1979) Redis celery backend not work on 1.9.0 (configuration is ignored)

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-1979?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360664#comment-16360664
 ] 

Yuliya Volkova commented on AIRFLOW-1979:
-

[~redtree1112], hello, did you see 
[https://github.com/apache/incubator-airflow/blob/master/UPDATING.md#celery-config]
 ? 

{{AIRFLOW__CELERY__CELERY_RESULT_BACKEND}} does not work in 1.9.0; the option was 
renamed from {{celery_result_backend}} to {{result_backend}}, so the environment 
variable is now {{AIRFLOW__CELERY__RESULT_BACKEND}}.
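If the option was indeed renamed to {{result_backend}} as described in UPDATING.md, 
the matching environment-variable override would look like the sketch below. This 
is a hedged example, not verbatim from the issue: the Redis host and database 
numbers are made-up placeholders.

```shell
# Airflow builds environment overrides as AIRFLOW__<SECTION>__<KEY>, so a
# renamed [celery] key "result_backend" would be overridden with the variable
# below, not the old AIRFLOW__CELERY__CELERY_RESULT_BACKEND.
export AIRFLOW__CELERY__RESULT_BACKEND="redis://my-redis:6379/0"
export AIRFLOW__CELERY__BROKER_URL="redis://my-redis:6379/1"
# Then restart the worker so it picks the values up:
# airflow worker
```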






[jira] [Comment Edited] (AIRFLOW-2072) Calling task from different DAG

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2072?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360412#comment-16360412
 ] 

Yuliya Volkova edited comment on AIRFLOW-2072 at 2/12/18 8:15 AM:
--

[~AnilKumar2], hello!

What result do you want from this? Do you just want to reuse code? If so, you can 
do it as with any Python code: move the task's logic into a separate Python 
function in the DAG .py file and import it.

Or do you want the task to get some results from its parent DAG?

If it is the second case, please describe it in more detail; Airflow has things 
like XCom if you need to share results.

Anyway, if you have more questions, join Gitter - 
[https://gitter.im/apache/incubator-airflow] - where you can get answers much 
faster.

 


was (Author: xnuinside):
[~AnilKumar2], hello!

What result do you want from this? Do you just want to reuse code? If so, you can 
do it as with any Python code: move the task's logic into a separate Python 
function in the DAG .py file and import it from anywhere.

Or do you want the task to get some results from its parent DAG?

If it is the second case, please describe it in more detail; Airflow has things 
like XCom if you need to share results.

Anyway, if you have more questions, join Gitter - 
[https://gitter.im/apache/incubator-airflow] - where you can get answers much 
faster.

 

> Calling task from different DAG
> ---
>
> Key: AIRFLOW-2072
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2072
> Project: Apache Airflow
>  Issue Type: Task
>  Components: Dataflow
>Affects Versions: 1.9.0
>Reporter: Anil Kumar
>Priority: Major
>
> Hello Again,
> I am new to Airflow and have started a POC for orchestrating ETL operations 
> using Airflow. When creating a new DAG, do we have any option to call an 
> existing task from another DAG in the new one? For example, I want to call 
> DAG_DataPrep.CopyEBCIDIC in the new DAG.





[jira] [Comment Edited] (AIRFLOW-2072) Calling task from different DAG

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2072?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360412#comment-16360412
 ] 

Yuliya Volkova edited comment on AIRFLOW-2072 at 2/12/18 8:14 AM:
--

[~AnilKumar2], hello!

What result do you want from this? Do you just want to reuse code? If so, you can 
do it as with any Python code: move the task's logic into a separate Python 
function in the DAG .py file and import it from anywhere.

Or do you want the task to get some results from its parent DAG?

If it is the second case, please describe it in more detail; Airflow has things 
like XCom if you need to share results.

Anyway, if you have more questions, join Gitter - 
[https://gitter.im/apache/incubator-airflow] - where you can get answers much 
faster.

 


was (Author: xnuinside):
[~AnilKumar2], hello!

What result do you want from this? Do you just want to reuse code? If so, you can 
do it as with any Python code: move the task's logic into a separate Python 
function in the DAG .py file and import it from anywhere.

Or do you want the task to get some results from its parent DAG?

If it is the second case, please describe it in more detail; Airflow has things 
like XCom if you need to share results.

Anyway, if you have more questions, join Gitter - 
https://gitter.im/apache/incubator-airflow - where you can get answers much 
faster.

 






[jira] [Commented] (AIRFLOW-2072) Calling task from different DAG

2018-02-12 Thread Yuliya Volkova (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-2072?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16360412#comment-16360412
 ] 

Yuliya Volkova commented on AIRFLOW-2072:
-

[~AnilKumar2], hello!

What result do you want from this? Do you just want to reuse code? If so, you can 
do it as with any Python code: move the task's logic into a separate Python 
function in the DAG .py file and import it from anywhere.

Or do you want the task to get some results from its parent DAG?

If it is the second case, please describe it in more detail; Airflow has things 
like XCom if you need to share results.

Anyway, if you have more questions, join Gitter - 
https://gitter.im/apache/incubator-airflow - where you can get answers much 
faster.
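The code-reuse suggestion above can be sketched like this. It is a minimal, 
hypothetical example: the module, function, and task names are invented, not taken 
from the issue, and the operator wiring is commented out so the sketch runs on its 
own.

```python
# common_tasks.py -- a hypothetical shared module holding task logic.
def copy_ebcdic(**context):
    """Task logic defined once, importable from any DAG file."""
    # The real copy logic would go here; we return a marker for the sketch.
    return "copied"

# In each DAG file you would import the shared callable and wrap it in an
# operator, e.g. (Airflow 1.9 import path, commented out here):
# from airflow.operators.python_operator import PythonOperator
# copy_task = PythonOperator(
#     task_id='copy_ebcdic',
#     python_callable=copy_ebcdic,
#     dag=dag,
# )

print(copy_ebcdic())  # -> copied
```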

 



