yohei1126 commented on a change in pull request #4324: [AIRFLOW-3327] Add support for location in BigQueryHook
URL: https://github.com/apache/incubator-airflow/pull/4324#discussion_r244113891
 
 

 ##########
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##########
 @@ -580,11 +587,18 @@ def run_query(self,
             by one or more columns. This is only available in combination with
             time_partitioning. The order of columns given determines the sort order.
         :type cluster_fields: list of str
+        :param location: The geographic location of the job. Required except for
+            US and EU. See details at
 
 Review comment:
   Hi, I tried to load data into a dataset in `asia-northeast1` using `GoogleCloudStorageToBigQueryOperator` and got the following error, as kaxil mentioned.
   ```
   [2018-12-26 21:35:47,464] {base_task_runner.py:107} INFO - Job 146: Subtask bq_load_data_into_dest_table_from_gcs [2018-12-26 21:35:47,464] {discovery.py:871} INFO - URL being requested: GET https://www.googleapis.com/bigquery/v2/projects/my-project/jobs/job_abc123?alt=json
   [2018-12-26 21:35:47,931] {models.py:1736} ERROR - ('BigQuery job status check failed. Final error was: %s', 404)
   Traceback (most recent call last):
     File "/usr/local/lib/airflow/airflow/contrib/hooks/bigquery_hook.py", line 981, in run_with_configuration
       jobId=self.running_job_id).execute()
     File "/usr/local/lib/python3.6/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
       return wrapped(*args, **kwargs)
     File "/usr/local/lib/python3.6/site-packages/googleapiclient/http.py", line 851, in execute
       raise HttpError(resp, content, uri=self.uri)
   googleapiclient.errors.HttpError: <HttpError 404 when requesting https://www.googleapis.com/bigquery/v2/projects/my-project/jobs/job_abc123?alt=json returned "Not found: Job my-project:job_abc123">

   During handling of the above exception, another exception occurred:

   Traceback (most recent call last):
     File "/usr/local/lib/airflow/airflow/models.py", line 1633, in _run_raw_task
       result = task_copy.execute(context=context)
     File "/usr/local/lib/airflow/airflow/contrib/operators/gcs_to_bq.py", line 237, in execute
       time_partitioning=self.time_partitioning)
     File "/usr/local/lib/airflow/airflow/contrib/hooks/bigquery_hook.py", line 951, in run_load
       return self.run_with_configuration(configuration)
     File "/usr/local/lib/airflow/airflow/contrib/hooks/bigquery_hook.py", line 1003, in run_with_configuration
       err.resp.status)
   Exception: ('BigQuery job status check failed. Final error was: %s', 404)
   ```
   https://issues.apache.org/jira/browse/AIRFLOW-3571
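
   For context, here is a minimal sketch of why the status poll 404s (not the hook code itself; it assumes application-default credentials, and the project/job ids are just the placeholders from the log above): the BigQuery `jobs.get` API accepts a `location` parameter, and without it the lookup only resolves jobs in the US and EU multi-regions.
   ```python
   # Minimal sketch, not run_with_configuration() itself. Assumes
   # application-default credentials; project/job ids are placeholders
   # taken from the log above.
   import google.auth
   from googleapiclient.discovery import build

   credentials, _ = google.auth.default()
   service = build('bigquery', 'v2', credentials=credentials,
                   cache_discovery=False)

   project_id = 'my-project'
   job_id = 'job_abc123'

   # Without location=..., the lookup only searches the US/EU multi-regions
   # and returns 404 "Not found: Job my-project:job_abc123" for a job that
   # ran in asia-northeast1, which is the error in the traceback above.
   # service.jobs().get(projectId=project_id, jobId=job_id).execute()

   # Passing the job's location lets the poll succeed; this is the value the
   # new `location` parameter needs to carry down into the status check.
   job = service.jobs().get(
       projectId=project_id,
       jobId=job_id,
       location='asia-northeast1',
   ).execute()
   print(job['status'].get('state'))
   ```
   So it looks like the status poll in `run_with_configuration` also needs the job's location, not only the job submission.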
   
