Enable Travis CI Auto Cancellation?

2018-05-30 Thread Craig Rodrigues
Can someone who has administrator access to Travis CI enable Auto-Cancellation for branch and pull request builds? See: https://blog.travis-ci.com/2017-09-21-default-auto-cancellation -- Craig

Re: KubernetesPodOperator: Invalid arguments were passed to BaseOperator

2018-05-30 Thread Craig Rodrigues
I have submitted a patch: https://github.com/apache/incubator-airflow/pull/3442 -- Craig On 2018/05/30 19:45:23, Craig Rodrigues wrote: > Oh, OK, I just saw this in example_kubernetes_operator.py: > > try: > # Kubernetes is optional, so not available in vanilla Airflow > # pip
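
For reference, the guard being quoted follows roughly this shape (a hedged reconstruction; the exact imports and log messages in example_kubernetes_operator.py may differ):

    import logging

    log = logging.getLogger(__name__)

    try:
        # Kubernetes is optional, so not available in vanilla Airflow
        # pip install apache-airflow[kubernetes] to pull in the extra dependencies
        from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
    except ImportError as e:
        log.warning("Could not import KubernetesPodOperator: %s", e)
        log.warning("Install Kubernetes dependencies with: pip install apache-airflow[kubernetes]")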

EmailSensorOperator

2018-05-30 Thread Andy Cooper
All, I was recently answering a Stack Overflow question that involved a workflow depending on an email being received. My knee-jerk reaction was to reply that the user could leverage the EmailSensorOperator to accomplish this task. I was typing out the answer when I decided to link to the actual

Re: KubernetesPodOperator: Invalid arguments were passed to BaseOperator

2018-05-30 Thread Chris Palmer
In the example it imports BaseOperator as KubernetesPodOperator when the kubernetes modules can't be found. On Wed, May 30, 2018 at 3:34 PM, Craig Rodrigues wrote: > For this
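
That fallback would look roughly like this (a sketch, not the verbatim example code); it explains why the kubernetes-specific keyword arguments end up going straight to BaseOperator and trigger the warning in the subject line:

    try:
        from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
    except ImportError:
        # Hypothetical fallback when the kubernetes extras are missing: the example
        # DAG still parses, but kwargs like name/namespace/image now land on
        # BaseOperator, which warns "Invalid arguments were passed to BaseOperator".
        from airflow.models import BaseOperator as KubernetesPodOperator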

Re: KubernetesPodOperator: Invalid arguments were passed to BaseOperator

2018-05-30 Thread Craig Rodrigues
For this particular DeprecationWarning, this problem is not caused by dependencies on kubernetes stuff. On this line: https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/operators/kubernetes_pod_operator.py#L137 The __init__() method for KubernetesPodOperator is calling the
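
For context, the mechanism behind the warning looks roughly like this (a self-contained paraphrase of the 1.x behaviour, not the verbatim Airflow source): BaseOperator still accepts *args/**kwargs but warns when anything unexpected lands there, so an operator __init__ that forwards its own **kwargs up the chain surfaces the warning to every user of that operator.

    import warnings

    class BaseOperatorSketch(object):
        # Stand-in for airflow.models.BaseOperator (1.x behaviour, paraphrased).
        def __init__(self, task_id, *args, **kwargs):
            self.task_id = task_id
            if args or kwargs:
                warnings.warn(
                    'Invalid arguments were passed to {c} (task_id: {t}). '
                    '*args: {a}, **kwargs: {k}'.format(
                        c=self.__class__.__name__, t=task_id, a=args, k=kwargs),
                    category=PendingDeprecationWarning)

    class PodOperatorSketch(BaseOperatorSketch):
        # Forwarding **kwargs to super() means any kwarg the subclass does not
        # consume itself (or any typo) ends up triggering the warning above.
        def __init__(self, image, *args, **kwargs):
            super(PodOperatorSketch, self).__init__(*args, **kwargs)
            self.image = image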

Re: KubernetesPodOperator: Invalid arguments were passed to BaseOperator

2018-05-30 Thread Driesprong, Fokko
Hi Taylor, Thanks, I was thinking about something similar to what you're suggesting, but I'm not confident that the sys.exit() won't kill the whole Airflow process. For example, if you run airflow initdb, the examples are also initialised, and if you don't have kubernetes installed, it will hit
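
The worry makes sense because sys.exit() simply raises SystemExit, which will take down whichever process is importing the example module unless that caller guards against it; a minimal illustration:

    import sys

    # sys.exit() raises SystemExit. If an example DAG called it at import time,
    # the process importing it (e.g. `airflow initdb` loading example DAGs)
    # would terminate unless it explicitly caught SystemExit, as below.
    try:
        sys.exit("kubernetes extras not installed")
    except SystemExit as exc:
        print("caught SystemExit:", exc)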

Re: KubernetesPodOperator: Invalid arguments were passed to BaseOperator

2018-05-30 Thread Taylor Edmiston
I used requests instead of kube as an example, but what do you think about doing something like this? I'm happy to put this into a PR if it would solve the pain point today. import logging try: import requests except ModuleNotFoundError: import sys logging.warning('kube not
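
The snippet above is cut off in the archive; a rough completion of the idea (illustrative only, keeping requests as the stand-in import and a hypothetical log message). Note that ModuleNotFoundError only exists on Python 3.6+, so ImportError may be the safer thing to catch:

    import logging

    try:
        import requests  # stand-in for the kubernetes imports
    except ImportError:  # ModuleNotFoundError is Python 3.6+ only
        import sys
        logging.warning('kube not installed, skipping this example DAG')  # hypothetical message
        sys.exit(0)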

Re: Disable Processing of DAG file

2018-05-30 Thread ramandumcs
Thanks Maxime, we have 100(s) of DAGs with schedule set to @once, and new DAGs keep coming into the system. The scheduler processes each and every DAG inside the local DAG folder. Each DAG file takes around 400 milliseconds to process, and we have set max_threads to 8 (as we have an 8-core machine). i.e
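
To make the scale concrete (illustrative numbers only; the original message is cut off before its own arithmetic), one full pass over the DAG folder takes roughly n_files * per_file_time / max_threads:

    # Back-of-the-envelope for one scheduler pass over the DAG folder
    n_files = 800        # hypothetical number of DAG files
    per_file_s = 0.4     # ~400 ms per file, as reported
    max_threads = 8
    print(n_files * per_file_s / max_threads)  # -> 40.0 seconds per pass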

Re: conn_id breaking change; once more with feeling

2018-05-30 Thread Ash Berlin-Taylor
I was involved in the Github discussion about the rename to aws_conn_id, and it prompted me to write http://mail-archives.apache.org/mod_mbox/airflow-dev/201801.mbox/%3cCABYbY7dPS8X6Z4mgbahevQwF5BnYYHXezFo=avoLBNxPzp5=b...@mail.gmail.com%3e

Re: KubernetesPodOperator: Invalid arguments were passed to BaseOperator

2018-05-30 Thread Driesprong, Fokko
Hi Craig, This is something that needs to be fixed. I agree with you that this is very dirty. Your installation doesn't include the kubernetes extras, so the KubernetesPodOperator is ignored. We need to figure out how to handle example DAGs that are not compatible with the vanilla installation,

Re: Disable Processing of DAG file

2018-05-30 Thread Maxime Beauchemin
The TLDR of how the processor works is: while True: * sets up a multiprocessing queue with N processes (say 32) * main process looks for the list of all .py files in DAGS_FOLDER * fills the queue with all the .py files * each one of the 32 subprocesses opens a file and interprets it (it's insulated from the
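
A hedged sketch of the loop being described (names simplified, using a plain multiprocessing.Pool rather than the real queue-based DAG file processor):

    import glob
    import multiprocessing
    import os
    import time

    def process_file(path):
        # Each subprocess opens and interprets one .py file, insulated from the others.
        pass

    def scheduler_loop(dags_folder, n_procs=32):
        pool = multiprocessing.Pool(n_procs)
        while True:
            # Main process lists every .py file in DAGS_FOLDER and hands them to the workers.
            py_files = glob.glob(os.path.join(dags_folder, '**', '*.py'), recursive=True)
            pool.map(process_file, py_files)
            time.sleep(1)  # then start the next pass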