bolkedebruin commented on PR #34729:
URL: https://github.com/apache/airflow/pull/34729#issuecomment-1776620355

   >     * If the plugin interface is extended, does that mean "newer" extended providers like AWS will work on older versions, e.g. Airflow 2.7.3? (See also the comment in CWiki: [Comment in Section "What defines this AIP as "done"?"](https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=263430565).) I don't know whether, if the provider API is changed, this is a backwards-breaking change for newer provider package versions.
   > 
   
   Like I said on the wiki, but rephrasing here: the provider API is tested for, so if a provider does not have the API there is no issue. This also goes vice versa: Airflow will not test for the API if core doesn't support it, but that does not prevent the provider from having the API. With the exception of `common.io`, none of the providers depends on the new provider functionality in core, so there was also no reason to bump the minimum required version of Airflow. Updates to the provider packages will therefore continue to work with Airflow > 2.5, as is the case now. The exception is, again, `common.io`, which relies on the new functionality.
   
   >     * I'd really love to see an example DAG in the standard/public examples. There is one in the test code that would be nice to promote to a "user visible" section (standard users do not search within pytest scope): `tests/system/providers/common/io/example_file_transfer_local_to_s3.py`. It should also show-case the XCom transfer of storage references. Would also be okay to have this in a follow-up PR (if not forgotten :-D)
   
   I can add it, but it is also included in the docs now.
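
   For completeness, a minimal sketch of what such a user-visible example could look like, including passing the storage reference over XCom. This is only an illustration: the bucket, connection id, local path and the `airflow.io.path` import location are assumptions, not taken from this PR's test code.

```python
# Rough sketch of an example DAG; names below are illustrative only and the
# import location of ObjectStoragePath is assumed to be airflow.io.path.
from __future__ import annotations

import pendulum

from airflow.decorators import dag, task
from airflow.io.path import ObjectStoragePath

SRC = ObjectStoragePath("file:///tmp/example_upload.txt")               # hypothetical local file
DST = ObjectStoragePath("s3://example-bucket/", conn_id="aws_default")  # hypothetical bucket/conn


@dag(schedule=None, start_date=pendulum.datetime(2023, 10, 1, tz="UTC"), catchup=False)
def example_file_transfer_local_to_s3():
    @task
    def upload(src: ObjectStoragePath, dst: ObjectStoragePath) -> ObjectStoragePath:
        # Copy the local file into object storage and return the resulting path;
        # the returned ObjectStoragePath travels to the downstream task via XCom.
        target = dst / src.name
        src.copy(target)
        return target

    @task
    def cleanup(path: ObjectStoragePath) -> None:
        # The storage reference arrives here deserialized from XCom.
        path.unlink()

    cleanup(upload(SRC, DST))


example_file_transfer_local_to_s3()
```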
   
   > 
   >     * I (still) miss some pytests for
   >       
   >       * Extensions in the AWS/Azure/Google provider packages - the added logic is there but there are no pytests. I would leave this to the provider package approvers.
   
   If I add tests here I think I would need to bump the minimum required 
Airflow version. I'll double check if that is the case.
   
   >       * In `airflow.io.store` the `path.py` does not have any tests. Though it is mostly an interface, it still carries _some_ logic for which a test would be valuable to ensure the code is covered.
   
   `path.py` is tested by means of `test_store`. It didn't really make sense to keep those tests separate.
   
   > 
   >     * Maybe I am a bit picky on names and the only one raising the discussion - fair if over-ruled by the majority :-) but I would still favor shortening the "new" API entry point from `airflow.io.storage.ObjectStorage` to `airflow.io.Storage`. It is shorter, and only two Python modules define the complete sub-package. But that is only my personal opinion.
   
   I will not rename to `Storage` and will keep the class name as is. I can consider adding it to the higher-level module and exporting it *also* from there, so you could do `from airflow.io import ObjectStoragePath`. Note that we got rid of this pattern for Operators in the past, so it might not be the right approach.
   
   Thanks again for your thoughts and review.

