kaxil commented on issue #9638:
URL: https://github.com/apache/airflow/issues/9638#issuecomment-653914798


   > I'll give an example of my use case: I'm using the Snowflake hook (which
is based on the DbApiHook). The hook itself prints the SQL queries it sends to
the logs. Snowflake has a SQL query that requires feeding AWS credentials.
These are injected into the SQL string by pulling them from the Airflow
metadata database connections and fired via the Snowflake hook. However, the
hook then prints those credentials to the Airflow logs.
   > 
   > Posting this on the Slack channel was answered with this:
   > 
   > ```
   > David Ohayon Jul 3rd at 12:04 PM
   > hey hey,
   > Is there any workaround to masking sensitive information from being 
printed to the logs while using hooks? (e.g. Snowflake Hook that is based on 
the dbapihook)
   > 
   > Kamil Breguła🐈  2 days ago
   > It's a bug. This should not happen. You can create a ticket so that we
can make changes to the code so that this data is not printed in the log.
   > 
   > David Ohayon  2 days ago
   > In the sense that these operators shouldn’t log credentials retrieved from
the Airflow connections, correct? So this behaviour is not expected and isn’t
something I need to tweak on my side but should be patched?
   > 
   > Kamil Breguła🐈  2 days ago
   > We should patch it in Airflow.
   > 
   > David Ohayon  2 days ago
   > Okay, will open a ticket on github
   > ```
   > 
   > I am not quite sure what the bug is; however, I am still wondering
whether there is a way to mask this kind of sensitive information.
   
   You will have to set up a custom logger and mask that information, as
there is no easy way of knowing what is a secret and what is not when using a
SQL statement.
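   
   For what it's worth, a minimal sketch of such a masking filter could look
like the following. The `CredentialMaskingFilter` name, the regex patterns,
and attaching it to the `airflow.task` logger are assumptions for this
example, not something Airflow provides out of the box:
   
   ```python
   import logging
   import re
   
   
   class CredentialMaskingFilter(logging.Filter):
       """Redact AWS-style credentials from log records before they are emitted."""
   
       # Snowflake COPY INTO statements can embed credentials like
       # CREDENTIALS = (AWS_KEY_ID='...' AWS_SECRET_KEY='...');
       # adjust these example patterns to whatever your SQL actually contains.
       PATTERNS = [
           re.compile(r"(AWS_KEY_ID\s*=\s*')[^']*(')", re.IGNORECASE),
           re.compile(r"(AWS_SECRET_KEY\s*=\s*')[^']*(')", re.IGNORECASE),
       ]
   
       def filter(self, record: logging.LogRecord) -> bool:
           message = record.getMessage()
           for pattern in self.PATTERNS:
               message = pattern.sub(r"\1***\2", message)
           record.msg = message
           record.args = ()
           return True  # keep the record, just with the secrets masked
   
   
   # Attach the filter to the task logger, for example from a custom logging
   # config module pointed to by the logging_config_class option in airflow.cfg.
   logging.getLogger("airflow.task").addFilter(CredentialMaskingFilter())
   ```
   
   Something along those lines at least keeps the rendered SQL out of the task
logs in clear text until there is a proper fix in Airflow itself.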


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

