david30907d opened a new issue #19570:
URL: https://github.com/apache/airflow/issues/19570


   ### Apache Airflow Provider(s)
   
   google
   
   ### Versions of Apache Airflow Providers
   
   `apache-airflow-providers-google          4.0.0`
   
   ### Apache Airflow version
   
   2.1.2
   
   ### Operating System
   
   PRETTY_NAME="Debian GNU/Linux 10 (buster)" NAME="Debian GNU/Linux" VERSION_ID="10" VERSION="10 (buster)" VERSION_CODENAME=buster ID=debian HOME_URL="https://www.debian.org/" SUPPORT_URL="https://www.debian.org/support" BUG_REPORT_URL="https://bugs.debian.org/"
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   I'm using the official docker-compose setup.
   
   ### What happened
   
   ```bash
    [2021-11-12 09:24:21,409] {logging_mixin.py:104} WARNING - /home/***/.local/lib/python3.8/site-packages/***/providers/google/cloud/hooks/bigquery.py:141 DeprecationWarning: This method will be deprecated. Please use `BigQueryHook.get_client` method
    [2021-11-12 09:24:23,033] {logging_mixin.py:104} WARNING - /home/***/.local/lib/python3.8/site-packages/***/providers/google/cloud/hooks/bigquery.py:2195 DeprecationWarning: This method is deprecated. Please use `BigQueryHook.insert_job` method.
    [2021-11-12 09:24:23,042] {bigquery.py:1639} INFO - Inserting job ***_1636709063039439_14741bf3004db91ee4cbb5eb024ac5ba
    [2021-11-12 09:24:27,463] {taskinstance.py:1501} ERROR - Task failed with exception
    Traceback (most recent call last):
      File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1157, in _run_raw_task
        self._prepare_and_execute_task_with_callbacks(context, task)
      File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1331, in _prepare_and_execute_task_with_callbacks
        result = self._execute_task(context, task_copy)
      File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1361, in _execute_task
        result = task_copy.execute(context=context)
      File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 150, in execute
        return_value = self.execute_callable()
      File "/home/airflow/.local/lib/python3.8/site-packages/airflow/operators/python.py", line 161, in execute_callable
        return self.python_callable(*self.op_args, **self.op_kwargs)
      File "/opt/airflow/dags/dags/utils/others/subscription_related.py", line 112, in wrapper
        return func(*args, **kwargs)
      File "/opt/airflow/dags/dags/utils/extractors/platform_data_extractors/shopify_extractor.py", line 75, in wrapper
        return func(*args, **kwargs)
      File "/opt/airflow/dags/dags/utils/extractors/platform_data_extractors/shopify_extractor.py", line 1019, in add_abandoned
        abandoned_checkouts_of_this_page = _parse_this_page(response_json)
      File "/opt/airflow/dags/dags/utils/extractors/platform_data_extractors/shopify_extractor.py", line 980, in _parse_this_page
        persons_queried_by_checkout_id = db_hook.get_records(
      File "/home/airflow/.local/lib/python3.8/site-packages/airflow/hooks/dbapi.py", line 135, in get_records
        return cur.fetchall()
      File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/google/cloud/hooks/bigquery.py", line 2886, in fetchall
        one = self.fetchone()
      File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/google/cloud/hooks/bigquery.py", line 2811, in fetchone
        return self.next()
      File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/google/cloud/hooks/bigquery.py", line 2827, in next
        self.service.jobs()
      File "/home/airflow/.local/lib/python3.8/site-packages/googleapiclient/_helpers.py", line 134, in positional_wrapper
        return wrapped(*args, **kwargs)
      File "/home/airflow/.local/lib/python3.8/site-packages/googleapiclient/http.py", line 915, in execute
        raise HttpError(resp, content, uri=self.uri)
    googleapiclient.errors.HttpError: <HttpError 404 when requesting https://bigquery.googleapis.com/bigquery/v2/projects/tresl-co-001/queries/airflow_1636709063039439_14741bf3004db91ee4cbb5eb024ac5ba?alt=json returned "Not found: Job tresl-co-001:airflow_1636709063039439_14741bf3004db91ee4cbb5eb024ac5ba". Details: "Not found: Job tresl-co-001:airflow_1636709063039439_14741bf3004db91ee4cbb5eb024ac5ba">
   ```
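
   For context, the deprecation warnings above point at `BigQueryHook.get_client` / `BigQueryHook.insert_job` as the non-deprecated path. As a hedged workaround sketch (not a fix for the cursor bug itself; the query below is a placeholder and only the connection id comes from this report), the same statement can be run through the `google-cloud-bigquery` client returned by `get_client()`, which bypasses the DbApi cursor that raises the 404:
   
   ```python
    # Workaround sketch, assuming the standard provider hook and the connection
    # id from this report; "SELECT 1 AS one" is a placeholder query.
    from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook

    hook = BigQueryHook(gcp_conn_id="xxxx", use_legacy_sql=False)

    # get_client() returns a google.cloud.bigquery.Client; query().result()
    # waits for the job to finish and yields Row objects.
    client = hook.get_client()
    rows = [dict(row) for row in client.query("SELECT 1 AS one").result()]
    print(rows)
   ```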
   
   ### What you expected to happen
   
   `bq_hook.get_records(sql)` should return the rows of the query instead of failing with a 404 `Not found: Job ...` error.
   
   ### How to reproduce
   
   ```python
    # `BigQueryHook` is imported from a project-local module here; `sql` is any standard-SQL query string.
    from dags.utils.hooks.bigquery import BigQueryHook

    bq_hook = BigQueryHook(gcp_conn_id='xxxx', use_legacy_sql=False)
    bq_hook.get_records(sql)
   ```
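
   Presumably the same failure reproduces with the stock provider hook as well (unverified sketch; only the connection id is taken from this report):
   
   ```python
    # Assumed equivalent reproduction with the provider's own hook class;
    # the SQL below is a placeholder, any standard-SQL query should do.
    from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook

    bq_hook = BigQueryHook(gcp_conn_id="xxxx", use_legacy_sql=False)
    print(bq_hook.get_records("SELECT 1 AS one"))
   ```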
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

