Hi everyone,

As a follow-up to the SPIP to clean up Spark SQL logical plans <https://docs.google.com/document/d/1gYm5Ji2Mge3QBdOliFV5gSPTKlX4q1DCBXIkiyMv62A/edit?ts=5a987801#heading=h.m45webtwxf2d>, I've written up a proposal for the catalog APIs that Spark needs in order to implement reliable high-level operations like CTAS (create table as select). The proposal includes an extension to DataSourceV2 that provides table operations, and it also proposes a public API for creating and altering tables.
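To give a rough idea of the shape of such an API, here is a minimal sketch of a catalog interface for table operations. The names used here (TableCatalog, Table, TableChange, and the method signatures) are illustrative assumptions for this sketch, not the interface defined in the proposal doc, so please treat the doc as the source of truth.

    // Hypothetical sketch of a catalog API for table operations.
    // Names and signatures are assumptions for illustration only;
    // see the proposal doc for the actual interface.
    import org.apache.spark.sql.types.StructType

    // A table handle returned by the catalog; a DataSourceV2 source
    // could expose its tables through an interface like this.
    trait Table {
      def name: String
      def schema: StructType
      def properties: Map[String, String]
    }

    // Alterations expressed as data rather than SQL text, so they can be
    // validated and passed to any catalog implementation.
    sealed trait TableChange
    case class AddColumn(name: String, dataType: String) extends TableChange
    case class SetProperty(key: String, value: String) extends TableChange

    // The catalog itself: enough for Spark to build reliable CTAS on top
    // (create the table, write the data, drop the table if the write fails).
    trait TableCatalog {
      def loadTable(ident: String): Table
      def createTable(
          ident: String,
          schema: StructType,
          properties: Map[String, String]): Table
      def alterTable(ident: String, changes: Seq[TableChange]): Table
      def dropTable(ident: String): Boolean
    }

With an interface along these lines, high-level operations like CTAS can be implemented once in Spark and delegated to any catalog, rather than being reimplemented by each data source.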
The proposal is here: Spark Catalog APIs <https://docs.google.com/document/d/1zLFiA1VuaWeVxeTDXNg8bL6GP3BVoOZBkewFtEnjEoo/edit?usp=sharing>.

Comments and feedback are welcome! Feel free to comment on the doc or reply to this thread.

rb

--
Ryan Blue
Software Engineer
Netflix