[jira] [Created] (AIRFLOW-357) how should I use the right owner task in airflow?

2016-07-22 Thread wei.he (JIRA)
wei.he created AIRFLOW-357:
--

 Summary: how should I use the right owner task in airflow?
 Key: AIRFLOW-357
 URL: https://issues.apache.org/jira/browse/AIRFLOW-357
 Project: Apache Airflow
  Issue Type: Bug
Affects Versions: Airflow 1.7.1
Reporter: wei.he


I don't understand the "owner" field in Airflow. The comment on owner says "the
owner of the task, using the unix username is recommended". I wrote the
following code:

default_args = {
    'owner': 'max',
    'depends_on_past': False,
    'start_date': datetime(2016, 7, 14),
    'email': ['m...@test.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG('dmp-annalect', default_args=default_args,
          schedule_interval='30 0 * * *')

pigjob_basedata_impclk = """
{local_dir}/src/basedata/basedata.sh > {local_dir}/log/basedata/run_log &
""".format(local_dir=WORKSPACE)

task1_pigjob_basedata = BashOperator(
    task_id='task1_pigjob_basedata_impclk', owner='max',
    bash_command=pigjob_basedata_impclk,
    dag=dag)

I ran the command "airflow test dagid taskid 2016-07-20", but got an error:

... {bash_operator.py:77} INFO - put: Permission denied: user=airflow,

I thought my job would run as the "max" user, but apparently the test ran as
the 'airflow' user.

How can I run my task as the 'max' user?

Thanks




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Updated] (AIRFLOW-357) how should I use the right owner task in airflow?

2016-07-22 Thread wei.he (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-357?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

wei.he updated AIRFLOW-357:
---


[jira] [Updated] (AIRFLOW-357) how should I use the right owner task in airflow?

2016-07-22 Thread wei.he (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-357?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

wei.he updated AIRFLOW-357:
---


[jira] [Commented] (AIRFLOW-357) how should I use the right owner task in airflow?

2016-07-22 Thread wei.he (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-357?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15389182#comment-15389182
 ] 

wei.he commented on AIRFLOW-357:


I figured out this issue: because I had set AIRFLOW_HOME to /home/airflow/,
only the airflow user could access that directory.
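That failure mode, and one possible fix, can be sketched with a throwaway
directory standing in for /home/airflow (the chmod modes are illustrative;
whether opening the directory to other users is acceptable depends on your
site):

```shell
# Reproduce: a directory only its owner may enter, like a default home dir.
demo=$(mktemp -d)
mkdir "$demo/airflow_home"
chmod 700 "$demo/airflow_home"   # other users (e.g. 'max') get Permission denied

# Fix: grant other users read and traverse rights.
chmod o+rx "$demo/airflow_home"
ls -ld "$demo/airflow_home" | cut -c1-10

rm -rf "$demo"
```

Relocating AIRFLOW_HOME to a shared path is the other obvious route.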







incubator-airflow git commit: [AIRFLOW-348] Fix code style warnings

2016-07-22 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/master b6e609824 -> 189e6b887


[AIRFLOW-348] Fix code style warnings

Closes #1672 from skudriashev/airflow-348


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/189e6b88
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/189e6b88
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/189e6b88

Branch: refs/heads/master
Commit: 189e6b88742ace8c46e72d59d7662284e34b7a2e
Parents: b6e6098
Author: Stanislav Kudriashev 
Authored: Fri Jul 22 13:03:45 2016 +0200
Committer: Bolke de Bruin 
Committed: Fri Jul 22 13:04:03 2016 +0200

--
 airflow/configuration.py|  7 ++---
 airflow/contrib/hooks/qubole_hook.py|  1 -
 airflow/example_dags/example_python_operator.py | 11 +++
 airflow/hooks/__init__.py   |  1 -
 airflow/hooks/dbapi_hook.py | 15 +-
 airflow/hooks/hdfs_hook.py  | 13 -
 airflow/hooks/jdbc_hook.py  |  8 ++
 airflow/hooks/webhdfs_hook.py   | 16 +--
 airflow/macros/__init__.py  |  1 -
 airflow/models.py   |  4 +--
 airflow/operators/__init__.py   |  1 -
 airflow/operators/dagrun_operator.py|  1 +
 airflow/settings.py |  5 +++-
 airflow/utils/file.py   | 12 
 airflow/utils/helpers.py|  2 ++
 airflow/www/views.py| 30 +---
 16 files changed, 58 insertions(+), 70 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/189e6b88/airflow/configuration.py
--
diff --git a/airflow/configuration.py b/airflow/configuration.py
index e03b713..5a380ae 100644
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -661,10 +661,9 @@ def mkdir_p(path):
 else:
 raise AirflowConfigException('Had trouble creating a directory')
 
-"""
-Setting AIRFLOW_HOME and AIRFLOW_CONFIG from environment variables, using
-"~/airflow" and "~/airflow/airflow.cfg" respectively as defaults.
-"""
+
+# Setting AIRFLOW_HOME and AIRFLOW_CONFIG from environment variables, using
+# "~/airflow" and "~/airflow/airflow.cfg" respectively as defaults.
 
 if 'AIRFLOW_HOME' not in os.environ:
 AIRFLOW_HOME = expand_env_var('~/airflow')

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/189e6b88/airflow/contrib/hooks/qubole_hook.py
--
diff --git a/airflow/contrib/hooks/qubole_hook.py 
b/airflow/contrib/hooks/qubole_hook.py
index 57d00b5..694b12f 100755
--- a/airflow/contrib/hooks/qubole_hook.py
+++ b/airflow/contrib/hooks/qubole_hook.py
@@ -84,7 +84,6 @@ class QuboleHook(BaseHook):
 if self.cmd.status != 'done':
raise AirflowException('Command Id: {0} failed with Status: {1}'.format(self.cmd.id, self.cmd.status))
 
-
 def kill(self, ti):
 """
 Kill (cancel) a Qubole commmand

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/189e6b88/airflow/example_dags/example_python_operator.py
--
diff --git a/airflow/example_dags/example_python_operator.py 
b/airflow/example_dags/example_python_operator.py
index 6c0b93f..c5d7193 100644
--- a/airflow/example_dags/example_python_operator.py
+++ b/airflow/example_dags/example_python_operator.py
@@ -34,7 +34,7 @@ dag = DAG(
 
 
 def my_sleeping_function(random_base):
-'''This is a function that will run within the DAG execution'''
+"""This is a function that will run within the DAG execution"""
 time.sleep(random_base)
 
 
@@ -49,15 +49,12 @@ run_this = PythonOperator(
 python_callable=print_context,
 dag=dag)
 
+# Generate 10 sleeping tasks, sleeping from 0 to 9 seconds respectively
 for i in range(10):
-'''
-Generating 10 sleeping task, sleeping from 0 to 9 seconds
-respectively
-'''
 task = PythonOperator(
-task_id='sleep_for_'+str(i),
+task_id='sleep_for_' + str(i),
 python_callable=my_sleeping_function,
-op_kwargs={'random_base': float(i)/10},
+op_kwargs={'random_base': float(i) / 10},
 dag=dag)
 
 task.set_upstream(run_this)

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/189e6b88/airflow/hooks/__init__.py
--
diff --git a/airflow/hooks/__init__.py b/airflow/hooks/__init__.py
index 4c1891d..883018d 100644
--- a/airflow/hooks/__init__.py
+++ b/airflow/hoo

[jira] [Resolved] (AIRFLOW-348) Fix code style warnings

2016-07-22 Thread Bolke de Bruin (JIRA)

 [ 
https://issues.apache.org/jira/browse/AIRFLOW-348?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bolke de Bruin resolved AIRFLOW-348.

Resolution: Fixed

> Fix code style warnings
> ---
>
> Key: AIRFLOW-348
> URL: https://issues.apache.org/jira/browse/AIRFLOW-348
> Project: Apache Airflow
>  Issue Type: Improvement
>Affects Versions: Airflow 1.7.1.2
>Reporter: Stanislav Kudriashev
>Assignee: Stanislav Kudriashev
>Priority: Minor
>






[jira] [Commented] (AIRFLOW-348) Fix code style warnings

2016-07-22 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-348?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15389324#comment-15389324
 ] 

ASF subversion and git services commented on AIRFLOW-348:
-

Commit 189e6b88742ace8c46e72d59d7662284e34b7a2e in incubator-airflow's branch 
refs/heads/master from [~skudriashev]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=189e6b8 ]

[AIRFLOW-348] Fix code style warnings

Closes #1672 from skudriashev/airflow-348


> Fix code style warnings
> ---
>
> Key: AIRFLOW-348
> URL: https://issues.apache.org/jira/browse/AIRFLOW-348
> Project: Apache Airflow
>  Issue Type: Improvement
>Affects Versions: Airflow 1.7.1.2
>Reporter: Stanislav Kudriashev
>Assignee: Stanislav Kudriashev
>Priority: Minor
>






incubator-airflow git commit: AIRFLOW-261 Add bcc and cc fields to EmailOperator

2016-07-22 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 189e6b887 -> 7628a8656


AIRFLOW-261 Add bcc and cc fields to EmailOperator

Closes #1670 from ajayyadava/261


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/7628a865
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/7628a865
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/7628a865

Branch: refs/heads/master
Commit: 7628a8656c37110ef9ea33c5142b52305ebe9440
Parents: 189e6b8
Author: Ajay Yadava 
Authored: Fri Jul 22 13:08:52 2016 +0200
Committer: Bolke de Bruin 
Committed: Fri Jul 22 13:08:56 2016 +0200

--
 airflow/operators/email_operator.py | 10 +++-
 airflow/utils/email.py  | 39 +++-
 tests/core.py   | 20 +++-
 3 files changed, 56 insertions(+), 13 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/7628a865/airflow/operators/email_operator.py
--
diff --git a/airflow/operators/email_operator.py 
b/airflow/operators/email_operator.py
index 91a8d05..76cc56b 100644
--- a/airflow/operators/email_operator.py
+++ b/airflow/operators/email_operator.py
@@ -30,6 +30,10 @@ class EmailOperator(BaseOperator):
 :type html_content: string
 :param files: file names to attach in email
 :type files: list
+:param cc: list of recipients to be added in CC field
+:type cc: list or string (comma or semicolon delimited)
+:param bcc: list of recipients to be added in BCC field
+:type bcc: list or string (comma or semicolon delimited)
 """
 
 template_fields = ('subject', 'html_content')
@@ -43,12 +47,16 @@ class EmailOperator(BaseOperator):
 subject,
 html_content,
 files=None,
+cc=None,
+bcc=None,
 *args, **kwargs):
 super(EmailOperator, self).__init__(*args, **kwargs)
 self.to = to
 self.subject = subject
 self.html_content = html_content
 self.files = files or []
+self.cc = cc
+self.bcc = bcc
 
 def execute(self, context):
-send_email(self.to, self.subject, self.html_content, files=self.files)
+send_email(self.to, self.subject, self.html_content, files=self.files, cc=self.cc, bcc=self.bcc)

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/7628a865/airflow/utils/email.py
--
diff --git a/airflow/utils/email.py b/airflow/utils/email.py
index c19bb89..6fe8662 100644
--- a/airflow/utils/email.py
+++ b/airflow/utils/email.py
@@ -33,17 +33,17 @@ from email.utils import formatdate
 from airflow import configuration
 
 
-def send_email(to, subject, html_content, files=None, dryrun=False):
+def send_email(to, subject, html_content, files=None, dryrun=False, cc=None, bcc=None):
 """
 Send email using backend specified in EMAIL_BACKEND.
 """
 path, attr = configuration.get('email', 'EMAIL_BACKEND').rsplit('.', 1)
 module = importlib.import_module(path)
 backend = getattr(module, attr)
-return backend(to, subject, html_content, files=files, dryrun=dryrun)
+return backend(to, subject, html_content, files=files, dryrun=dryrun, cc=cc, bcc=bcc)
 
 
-def send_email_smtp(to, subject, html_content, files=None, dryrun=False):
+def send_email_smtp(to, subject, html_content, files=None, dryrun=False, cc=None, bcc=None):
 """
 Send an email with html content
 
@@ -51,18 +51,23 @@ def send_email_smtp(to, subject, html_content, files=None, dryrun=False):
 """
 SMTP_MAIL_FROM = configuration.get('smtp', 'SMTP_MAIL_FROM')
 
-if isinstance(to, basestring):
-if ',' in to:
-to = to.split(',')
-elif ';' in to:
-to = to.split(';')
-else:
-to = [to]
+to = get_email_address_list(to)
 
 msg = MIMEMultipart('alternative')
 msg['Subject'] = subject
 msg['From'] = SMTP_MAIL_FROM
 msg['To'] = ", ".join(to)
+recipients = to
+if cc:
+cc = get_email_address_list(cc)
+msg['CC'] = ", ".join(cc)
+recipients = recipients + cc
+
+if bcc:
+# don't add bcc in header
+bcc = get_email_address_list(bcc)
+recipients = recipients + bcc
+
 msg['Date'] = formatdate(localtime=True)
 mime_text = MIMEText(html_content, 'html')
 msg.attach(mime_text)
@@ -76,7 +81,7 @@ def send_email_smtp(to, subject, html_content, files=None, dryrun=False):
 Name=basename
 ))
 
-send_MIME_email(SMTP_MAIL_FROM, to, msg, dryrun)
+send_MIME_email(SMTP_MAIL_FROM, recipients, msg, dryrun)
 
 
 def send_M

[jira] [Commented] (AIRFLOW-261) Add more fields for EmailOperator

2016-07-22 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-261?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15389330#comment-15389330
 ] 

ASF subversion and git services commented on AIRFLOW-261:
-

Commit 7628a8656c37110ef9ea33c5142b52305ebe9440 in incubator-airflow's branch 
refs/heads/master from [~ajayyadava]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=7628a86 ]

AIRFLOW-261 Add bcc and cc fields to EmailOperator

Closes #1670 from ajayyadava/261


> Add more fields for EmailOperator
> -
>
> Key: AIRFLOW-261
> URL: https://issues.apache.org/jira/browse/AIRFLOW-261
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Ajay Yadava
>Assignee: Ajay Yadava
>
> It would be nice to be able to specify cc and bcc recipients for the
> EmailOperator.


