potiuk commented on issue #25721:
URL: https://github.com/apache/airflow/issues/25721#issuecomment-1216603447
Tested the following PRs:
#25299 (`get_pandas_df` now requires `sql`; calling it with the old `hql` keyword raises a clear `TypeError`):
```
>>> from airflow.providers.apache.hive.hooks.hive import HiveServer2Hook
>>> hook = HiveServer2Hook()
>>> hook.get_pandas_df(hql='xxx')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: get_pandas_df() missing 1 required positional argument: 'sql'
```
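For reference, a minimal sketch of the call shape the new signature expects (the query string is a placeholder and needs a reachable Hive connection to actually run):

```python
from airflow.providers.apache.hive.hooks.hive import HiveServer2Hook

hook = HiveServer2Hook()
# 'sql' is the required argument name per the TypeError above;
# 'xxx' stands in for a real HiveQL query.
df = hook.get_pandas_df(sql='xxx')
```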
#25293 (with common-sql 1.0.0 the sensor rejects `PostgresHook`; after upgrading to 1.1.0rc4 it resolves the hook correctly):
```
root@a49eb20ef97a:/opt/airflow# airflow providers list
package_name                        | description                                                                  | version
====================================+==============================================================================+========
apache-airflow-providers-common-sql | Common SQL Provider https://en.wikipedia.org/wiki/SQL                        | 1.0.0
apache-airflow-providers-ftp        | File Transfer Protocol (FTP) https://tools.ietf.org/html/rfc114              | 3.0.0
apache-airflow-providers-http       | Hypertext Transfer Protocol (HTTP) https://www.w3.org/Protocols/             | 3.0.0
apache-airflow-providers-imap       | Internet Message Access Protocol (IMAP) https://tools.ietf.org/html/rfc3501  | 3.0.0
apache-airflow-providers-postgres   | PostgreSQL https://www.postgresql.org/                                       | 5.2.0
apache-airflow-providers-sqlite     | SQLite https://www.sqlite.org/                                               | 3.0.0
root@a49eb20ef97a:/opt/airflow# python
Python 3.7.13 (default, Aug 2 2022, 12:15:43)
[GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from airflow.sensors.sql import SqlSensor
>>> sensor = SqlSensor(conn_id='postgres_default', sql='SELECT * FROM DUAL', task_id='id')
>>> sensor._get_hook()
[2022-08-16 12:44:15,582] {base.py:68} INFO - Using connection ID 'postgres_default' for task execution.
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.7/site-packages/airflow/sensors/sql.py", line 84, in _get_hook
    f'The connection type is not supported by {self.__class__.__name__}. '
airflow.exceptions.AirflowException: The connection type is not supported by SqlSensor. The associated hook should be a subclass of `DbApiHook`. Got PostgresHook
>>> quit()
root@a49eb20ef97a:/opt/airflow# pip install apache-airflow-providers-common-sql==1.1.0rc4
Collecting apache-airflow-providers-common-sql==1.1.0rc4
  Downloading apache_airflow_providers_common_sql-1.1.0rc4-py3-none-any.whl (27 kB)
Requirement already satisfied: sqlparse>=0.4.2 in /usr/local/lib/python3.7/site-packages (from apache-airflow-providers-common-sql==1.1.0rc4) (0.4.2)
Installing collected packages: apache-airflow-providers-common-sql
  Attempting uninstall: apache-airflow-providers-common-sql
    Found existing installation: apache-airflow-providers-common-sql 1.0.0
    Uninstalling apache-airflow-providers-common-sql-1.0.0:
      Successfully uninstalled apache-airflow-providers-common-sql-1.0.0
Successfully installed apache-airflow-providers-common-sql-1.1.0rc4
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
root@a49eb20ef97a:/opt/airflow# python
Python 3.7.13 (default, Aug 2 2022, 12:15:43)
[GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from airflow.sensors.sql import SqlSensor
>>> sensor = SqlSensor(conn_id='postgres_default', sql='SELECT * FROM DUAL', task_id='id')
>>> sensor._get_hook()
[2022-08-16 12:45:24,882] {base.py:68} INFO - Using connection ID 'postgres_default' for task execution.
<airflow.providers.postgres.hooks.postgres.PostgresHook object at 0x7f3ddfa16750>
>>>
```
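The check that `_get_hook` performs can be reproduced directly; a minimal sketch, assuming only the class paths already shown in the logs above:

```python
from airflow.providers.common.sql.hooks.sql import DbApiHook
from airflow.providers.postgres.hooks.postgres import PostgresHook

# SqlSensor._get_hook rejects hooks that are not DbApiHook subclasses;
# with common-sql 1.1.0rc4 installed this prints True.
print(isinstance(PostgresHook(), DbApiHook))
```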
#25350 (the check operators import cleanly from the common-sql provider):
```
[GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from airflow.providers.common.sql.operators.sql import (
...     SQLCheckOperator,
...     SQLIntervalCheckOperator,
...     SQLValueCheckOperator,
... )
>>>
```
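A hypothetical minimal usage of one of the imported operators (task id, connection id, and query are illustrative placeholders, not part of the test):

```python
from airflow.providers.common.sql.operators.sql import SQLCheckOperator

# SQLCheckOperator fails the task if the first row of the result
# contains a falsy value; all values below are placeholders.
check = SQLCheckOperator(
    task_id="check_rows",
    conn_id="postgres_default",
    sql="SELECT COUNT(*) FROM my_table",
)
```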
#25713
common-sql 1.1.0rc4 (statement splitting now works; the only remaining failure is the expected missing `http_path`):
```
>>> from airflow.providers.databricks.hooks.databricks_sql import DatabricksSqlHook
>>> hook = DatabricksSqlHook()
>>> hook.run(sql='SELECT * FROM TABLE')
[2022-08-16 12:51:21,954] {base.py:68} INFO - Using connection ID 'databricks_default' for task execution.
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/databricks/hooks/databricks_sql.py", line 179, in run
    with closing(self.get_conn()) as conn:
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/databricks/hooks/databricks_sql.py", line 105, in get_conn
    "http_path should be provided either explicitly, "
airflow.exceptions.AirflowException: http_path should be provided either explicitly, or in extra parameter of Databricks connection, or sql_endpoint_name should be specified
>>> import logging
>>> hook.log.setLevel(logging.DEBUG)
>>> hook.run(sql='SELECT * FROM TABLE')
[2022-08-16 12:53:00,963] {databricks_sql.py:172} DEBUG - Executing following statements against Databricks DB: ['SELECT * FROM TABLE']
```
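The exception above names three ways to supply the endpoint; a sketch of the explicit one, with a placeholder path rather than a real endpoint:

```python
from airflow.providers.databricks.hooks.databricks_sql import DatabricksSqlHook

# http_path can be passed explicitly instead of via the connection's
# 'extra' field or sql_endpoint_name; the value below is a placeholder.
hook = DatabricksSqlHook(http_path="/sql/1.0/endpoints/<endpoint-id>")
```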
common-sql 1.1.0rc3 (for comparison: the same call hits the bug this PR fixes):
```
root@a49eb20ef97a:/opt/airflow# pip install apache-airflow-providers-common-sql==1.1.0rc3
Collecting apache-airflow-providers-common-sql==1.1.0rc3
  Downloading apache_airflow_providers_common_sql-1.1.0rc3-py3-none-any.whl (27 kB)
Requirement already satisfied: sqlparse>=0.4.2 in /usr/local/lib/python3.7/site-packages (from apache-airflow-providers-common-sql==1.1.0rc3) (0.4.2)
Installing collected packages: apache-airflow-providers-common-sql
  Attempting uninstall: apache-airflow-providers-common-sql
    Found existing installation: apache-airflow-providers-common-sql 1.0.0
    Uninstalling apache-airflow-providers-common-sql-1.0.0:
      Successfully uninstalled apache-airflow-providers-common-sql-1.0.0
Successfully installed apache-airflow-providers-common-sql-1.1.0rc3
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
root@a49eb20ef97a:/opt/airflow# python
Python 3.7.13 (default, Aug 2 2022, 12:15:43)
[GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from airflow.providers.databricks.hooks.databricks_sql import DatabricksSqlHook
>>> hook = DatabricksSqlHook()
>>> import logging
>>> hook.log.setLevel(logging.DEBUG)
>>> hook.run(sql='SELECT * FROM TABLE')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/databricks/hooks/databricks_sql.py", line 174, in run
    raise ValueError("List of SQL statements is empty")
ValueError: List of SQL statements is empty
```
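For context, statement splitting in the provider is built on sqlparse (the dependency visible in the pip output above); a rough sketch of the expected behavior only, since the exact rc3 code path is not reproduced here:

```python
import sqlparse

# Splitting a single statement should yield a one-element list; in
# 1.1.0rc3 the hook's internal splitting produced [] for this input,
# hence the ValueError above.
print([s for s in sqlparse.split("SELECT * FROM TABLE") if s])
# ['SELECT * FROM TABLE']
```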