[jira] [Commented] (AIRFLOW-3819) k8s executor - Allow the configuration of a global default for pod resource request/limits

2020-03-27 Thread Lihan Li (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17068394#comment-17068394
 ] 

Lihan Li commented on AIRFLOW-3819:
---

[~aizhamal] Just hit by this issue again: one size does not fit all. 
Some pipelines require larger resources, while it is inefficient to set the 
default to the large resource request. 

Can we make this configurable on a per-task basis?
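
For context, a minimal sketch of the two levels involved: the per-task override 
that executor_config already supports, and the global default this ticket asks 
for (the global default is hypothetical; it does not exist in 1.10.2):

{code:python}
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def print_stuff():
    print("heavy work here")


dag = DAG("resource_example", start_date=datetime(2020, 1, 1),
          schedule_interval=None)

# Per-task override -- this already works with the KubernetesExecutor.
heavy_task = PythonOperator(
    task_id="heavy_task",
    python_callable=print_stuff,
    dag=dag,
    executor_config={
        "KubernetesExecutor": {
            "request_memory": "4Gi",  # only this pipeline needs the big pod
            "limit_memory": "4Gi",
            "request_cpu": "2",
            "limit_cpu": "2",
        }
    },
)
# Tasks without an executor_config would fall back to the proposed
# (hypothetical) global default, e.g. 128Mi / 100m.
{code}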

> k8s executor - Allow the configuration of a global default for pod resource 
> request/limits 
> ---
>
> Key: AIRFLOW-3819
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3819
> Project: Apache Airflow
>  Issue Type: Improvement
>  Components: executors
>Affects Versions: 1.10.2
>Reporter: afusr
>Priority: Minor
>  Labels: kubernetes
>
> Currently the Kubernetes executor allows you to specify pod resource 
> requests and limits (CPU and memory). For example:
> {noformat}
> # Limit resources on this operator/task with node affinity & tolerations
> three_task = PythonOperator(
>     task_id="three_task", python_callable=print_stuff, dag=dag,
>     executor_config={
>         "KubernetesExecutor": {"request_memory": "128Mi",
>                                "limit_memory": "128Mi",
>                                "tolerations": tolerations,
>                                "affinity": affinity}}
> )
> {noformat}
> These values are used by Kubernetes when making scheduling and scaling 
> decisions. It would be nice to be able to specify a global default for these 
> values, to ensure that each pod Airflow creates has a value specified for 
> these properties. There is still the requirement to override these values on 
> a DAG-by-DAG basis. 
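
One way the proposal could behave -- a minimal sketch, with hypothetical names, 
of merging a global default with the per-task executor_config override:

{code:python}
# Hypothetical global default; in the proposal this would come from
# airflow.cfg. Nothing like it exists in 1.10.2.
GLOBAL_DEFAULT_RESOURCES = {
    "request_memory": "128Mi",
    "limit_memory": "128Mi",
    "request_cpu": "100m",
    "limit_cpu": "200m",
}


def effective_resources(executor_config=None):
    """Start from the global default; per-task settings win where given."""
    overrides = (executor_config or {}).get("KubernetesExecutor", {})
    merged = dict(GLOBAL_DEFAULT_RESOURCES)
    merged.update({k: v for k, v in overrides.items() if k in merged})
    return merged


# A task that only overrides memory keeps the default cpu values:
print(effective_resources({"KubernetesExecutor": {"request_memory": "1Gi"}}))
{code}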



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Comment Edited] (AIRFLOW-3819) k8s executor - Allow the configuration of a global default for pod resource request/limits

2019-12-30 Thread Lihan Li (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17005898#comment-17005898
 ] 

Lihan Li edited comment on AIRFLOW-3819 at 12/31/19 2:38 AM:
-

Also just hit by this issue.

I think this is better resolved within Airflow; it is a very small addition. 
Also, the Mesos executor already has task_cpu within Airflow.

[~aizhamal]


was (Author: lihan):
I think this is better resolved within Airflow; it is a very small addition. 
Also, the Mesos executor already has task_cpu within Airflow.
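
For reference, the precedent mentioned above: the Mesos executor already ships 
per-task resource defaults in airflow.cfg (the values below are illustrative):

{noformat}
[mesos]
# CPU and memory each Mesos task (i.e. each Airflow task instance) requests.
task_cpu = 1
task_memory = 256
{noformat}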




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-3819) k8s executor - Allow the configuration of a global default for pod resource request/limits

2019-12-30 Thread Lihan Li (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3819?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17005898#comment-17005898
 ] 

Lihan Li commented on AIRFLOW-3819:
---

I think this is better resolved within Airflow; it is a very small addition. 
Also, the Mesos executor already has task_cpu within Airflow.




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (AIRFLOW-3965) GoogleCloudStorageToBigQueryOperator has no "location" parameter

2019-09-04 Thread Lihan Li (Jira)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-3965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16922238#comment-16922238
 ] 

Lihan Li commented on AIRFLOW-3965:
---

[~jackjack10] I don't think it's a duplicate, because 
https://issues.apache.org/jira/browse/AIRFLOW-3601 concerns the BigQueryHook.

This ticket is about the operator: *GoogleCloudStorageToBigQueryOperator* needs 
to support the location parameter and pass it through to the BigQueryHook.
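
To illustrate why the pass-through matters -- a minimal sketch against the same 
google-api-python-client the hook uses (project, job id and region below are 
placeholders):

{code:python}
from googleapiclient.discovery import build

# Assumes application-default credentials are available.
service = build("bigquery", "v2")

# Without a location, jobs().get() looks in the US/EU multi-regions and
# returns HTTP 404 for a job that ran anywhere else:
service.jobs().get(projectId="my-project", jobId="job_abc").execute()

# Passing the job's location lets the status poll find it:
service.jobs().get(
    projectId="my-project",
    jobId="job_abc",
    location="australia-southeast1",
).execute()
{code}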




--
This message was sent by Atlassian Jira
(v8.3.2#803003)


[jira] [Created] (AIRFLOW-3965) GoogleCloudStorageToBigQueryOperator has no "location" parameter

2019-02-26 Thread Lihan Li (JIRA)
Lihan Li created AIRFLOW-3965:
-

 Summary: GoogleCloudStorageToBigQueryOperator has no "location" 
parameter
 Key: AIRFLOW-3965
 URL: https://issues.apache.org/jira/browse/AIRFLOW-3965
 Project: Apache Airflow
  Issue Type: Bug
  Components: operators
Affects Versions: 1.10.2
Reporter: Lihan Li
Assignee: Lihan Li


The *GoogleCloudStorageToBigQueryOperator* class does not accept a "location" 
parameter, so jobs in locations other than "US" or "EU" will fail.

To be more precise, the BigQuery job itself will still succeed; however, the 
task instance will be marked as failed because it cannot fetch the job status 
(HTTP 404). Setting the location parameter will fix this problem.

 

See the error traceback:

 
{code:java}
Traceback (most recent call last):
  File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1124, in run_with_configuration
    try:
  File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/googleapiclient/http.py", line 851, in execute
    raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 404 when requesting https://www.googleapis.com/bigquery/v2/projects//jobs/job_sf_CVX8Pa49m7u6YSaFetj62qxmM?alt=json returned "Not found: Job :job_sf_CVX8Pa49m7u6YSaFetj62qxmM">

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/lihanli/.virtualenvs/nbw/bin/airflow", line 32, in <module>
    args.func(args)
  File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/utils/cli.py", line 74, in wrapper
    return f(*args, **kwargs)
  File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/bin/cli.py", line 651, in test
    ti.run(ignore_task_deps=True, ignore_ti_state=True, test_mode=True)
  File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/utils/db.py", line 73, in wrapper
    return func(*args, **kwargs)
  File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/models.py", line 1750, in run
    session=session)
  File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/utils/db.py", line 69, in wrapper
    return func(*args, **kwargs)
  File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/models.py", line 1657, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/contrib/operators/gcs_to_bq.py", line 257, in execute
    ignore_unknown_values=self.ignore_unknown_values,
  File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1096, in run_load
    return self.run_with_configuration(configuration)
  File "/Users/lihanli/.virtualenvs/nbw/lib/python3.6/site-packages/airflow/contrib/hooks/bigquery_hook.py", line 1124, in run_with_configuration
    try:
Exception: ('BigQuery job status check failed. Final error was: %s', 404)
{code}
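
What the fix would look like from a DAG author's point of view -- a sketch of 
the proposed API (the location argument is the missing piece; bucket, table and 
region names are placeholders):

{code:python}
from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.gcs_to_bq import GoogleCloudStorageToBigQueryOperator

dag = DAG("gcs_to_bq_location", start_date=datetime(2019, 1, 1),
          schedule_interval=None)

load = GoogleCloudStorageToBigQueryOperator(
    task_id="gcs_to_bq",
    bucket="my-bucket",
    source_objects=["data/*.csv"],
    destination_project_dataset_table="my_dataset.my_table",
    autodetect=True,
    location="australia-southeast1",  # proposed: forwarded to BigQueryHook
    dag=dag,
)
{code}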
 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)