[ 
https://issues.apache.org/jira/browse/SPARK-6200?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14366569#comment-14366569
 ] 

haiyang commented on SPARK-6200:
--------------------------------

Thank you for your comments.

The core idea of this implementation is to provide two interfaces, {{Dialect}} 
and {{DialectManager}}. Every dialect must implement the {{Dialect}} interface 
in order to plug in its own parsing. {{DialectManager}} manages all registered 
{{Dialect}}s, and there is a default implementation of this interface; we could 
even expose an API that lets users provide their own {{DialectManager}}. If you 
can't agree with this core idea, I will close this.
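
To make the core idea concrete, here is a minimal, self-contained sketch of the two interfaces; all names and signatures are illustrative only (not the actual patch), and {{LogicalPlan}} below is a stand-in for Spark's catalyst {{LogicalPlan}}:

```scala
// Stand-in for org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
case class LogicalPlan(description: String)

// Every dialect implements this interface to plug in its own parsing.
trait Dialect {
  def name: String
  def parse(sqlText: String): LogicalPlan
}

// Manages all registered Dialects; users could supply their own implementation.
trait DialectManager {
  def register(dialect: Dialect): Unit
  def dialect(name: String): Option[Dialect]
}

// Default implementation: a simple name -> Dialect registry.
class DefaultDialectManager extends DialectManager {
  private val dialects = scala.collection.mutable.Map.empty[String, Dialect]
  def register(d: Dialect): Unit = dialects(d.name) = d
  def dialect(name: String): Option[Dialect] = dialects.get(name)
}
```

The point of the registry shape is that adding a new dialect requires no change to the manager, only a {{register}} call.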

As for your other questions: it may look like I misunderstood the original 
design intent, but these are just problems with the {{DefaultDialectManager}} 
implementation. At a high level, in other words, it is just a question of 
which {{DialectManager}} implementation is used.

We can achieve the following goals with only small code changes:

1. Always parse Spark SQL's own DDL first:
    a. add a DDLParser field to {{DefaultDialectManager}}
    b. change its parse method to: {{ddlParser(sqlText, 
false).getOrElse(currentDialect.parse(sqlText))}}

2. Switch dialects through the existing {{SET spark.sql.dialect}} API: 
     drop the curDialect field in {{DefaultDialectManager}} and use 
{{sqlContext.conf.dialect}} to read and switch the dialect.

3. Drop the dialect commands to keep the code simpler.
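
The three goals above can be sketched together in a self-contained way. This is not the actual patch: {{Plan}}, {{DdlParser}}, and the conf map stand in for Spark's {{LogicalPlan}}, DDLParser, and {{SQLConf}}, and the dialect names are illustrative:

```scala
// Stand-in for catalyst's LogicalPlan.
case class Plan(text: String)

// Stand-in for Spark SQL's DDLParser: here it only "recognizes" CREATE
// statements, returning None for everything else.
object DdlParser {
  def parse(sql: String): Option[Plan] =
    if (sql.trim.toUpperCase.startsWith("CREATE")) Some(Plan("ddl: " + sql))
    else None
}

// Goal 2: no curDialect field -- the current dialect is read from the conf
// (modeled here as a plain map), so SET spark.sql.dialect switches it.
class DefaultDialectManager(conf: collection.Map[String, String]) {
  private val dialects: Map[String, String => Plan] = Map(
    "sql"    -> (s => Plan("sql: " + s)),
    "hiveql" -> (s => Plan("hiveql: " + s))
  )

  private def currentDialect: String => Plan =
    dialects(conf.getOrElse("spark.sql.dialect", "sql"))

  // Goal 1: Spark SQL's own DDL is always tried first; only on a miss does
  // the currently selected dialect parse the statement.
  def parse(sqlText: String): Plan =
    DdlParser.parse(sqlText).getOrElse(currentDialect(sqlText))
}
```

With this shape, goal 3 follows naturally: since switching is driven entirely by the conf, no separate dialect commands are needed.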

> Support dialect in SQL
> ----------------------
>
>                 Key: SPARK-6200
>                 URL: https://issues.apache.org/jira/browse/SPARK-6200
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: haiyang
>
> Created a new dialect manager,support dialect command and add new dialect use 
> sql statement etc.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
