Jackey,

The proposal to add a sql-api module was based on the need to have the SQL
API classes, like `Table`, available to Catalyst so we can have logical
plans and analyzer rules in that module. But nothing in Catalyst is public,
so it doesn't contain user-implemented APIs. There are 3 options to solve
that problem:

1. Add a new module, sql-api, that catalyst depends on and that holds the
APIs. But I ran into the problem I described above: the API classes
themselves need to depend on Catalyst classes, so catalyst and sql-api
would end up depending on each other (the sketch below makes this concrete).
2. Add the API to catalyst. The problem is adding publicly available API
classes to a previously non-public module.
3. Add the API to core. The problem here is that catalyst can't reference
classes defined in core, so it becomes more difficult to keep rules and
logical plans in catalyst, where I would expect them to be.
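
To make the circularity concrete, here is a rough sketch. Everything in
it is illustrative, not the actual proposal: `Table`, `TableRelation`,
`ResolveTables`, and the lookup function are hypothetical names, but the
shape of the dependencies is the point.

```scala
import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
import org.apache.spark.sql.catalyst.expressions.AttributeReference
import org.apache.spark.sql.catalyst.plans.logical.{LeafNode, LogicalPlan}
import org.apache.spark.sql.catalyst.rules.Rule
import org.apache.spark.sql.types.StructType

// A public API trait (hypothetical). Note that it already uses a catalyst
// type, StructType, so a sql-api module sitting below catalyst would need
// to depend on catalyst: that's the cycle in option #1.
trait Table {
  def name: String
  def schema: StructType
}

// A hypothetical logical plan node for a resolved table.
case class TableRelation(table: Table) extends LeafNode {
  override def output: Seq[AttributeReference] = table.schema.toAttributes
}

// A hypothetical analyzer rule. Because it references Table, whichever
// module defines Table must be on catalyst's compile classpath, or the
// rule has to move out of catalyst (the cost of option #3).
case class ResolveTables(lookup: String => Option[Table])
    extends Rule[LogicalPlan] {
  override def apply(plan: LogicalPlan): LogicalPlan = plan.resolveOperators {
    case u: UnresolvedRelation =>
      lookup(u.tableIdentifier.unquotedString).map(TableRelation).getOrElse(u)
  }
}
```

The API trait leans on a catalyst type and the analyzer rule leans on the
API trait, so the API can't live below catalyst without a cycle, and if it
lives above catalyst the rule has to move out.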

I'm not sure which option is the right one, but I no longer think that
option #1 is very promising.
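
For reference, this is how I picture the module dependency graph under
each option (arrows point at dependencies; sql/core -> catalyst is
today's layout):

```
option 1:  sql/core -> catalyst -> sql-api
           (but sql-api's classes need catalyst types: a cycle)
option 2:  sql/core -> catalyst   (API lives in catalyst, newly public)
option 3:  sql/core -> catalyst   (API lives in sql/core)
           (catalyst can't reference core, so rules using the API
           leave catalyst)
```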

On Fri, Nov 30, 2018 at 10:47 PM JackyLee <qcsd2...@163.com> wrote:

> Hi, Ryan Blue.
>
> I don't think it would be a good idea to add the sql-api module.
> I prefer adding sql-api to sql/core. SQL is just another representation
> of a Dataset, so there is no need to add a new module for this. Besides,
> it would be easier to add the SQL API in core.
>
> By the way, I don't think it's a good time to add a SQL API; we have not
> yet determined many details of the DataSource V2 API.

-- 
Ryan Blue
Software Engineer
Netflix
