Hi Yujia,

You might take inspiration from Coral: https://github.com/linkedin/coral. It
is based on Calcite but uses the Hive parser (which is compatible with Spark
SQL) to generate the SqlNode and RelNode trees. There is also a PR that uses
the native Spark parser: https://github.com/linkedin/coral/pull/339.
Merging it is a work in progress.

Thanks,
Walaa.


On Thu, May 30, 2024 at 9:07 AM Mihai Budiu <mbu...@gmail.com> wrote:

> The SQL language has several sublanguages: the query language, the data
> definition language, and the data manipulation language. The core of
> Calcite is mostly about the query language, but there are Calcite
> components that deal with the other languages as well (e.g., server, babel).
>
> Both these components also show how the Calcite parser can be customized.
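>
> For example, here is a rough sketch of plugging in the babel parser
> (assuming a recent Calcite version with the calcite-babel artifact on the
> classpath; calcite-server's SqlDdlParserImpl can be plugged in the same
> way):
>
>   import org.apache.calcite.sql.SqlNode;
>   import org.apache.calcite.sql.parser.SqlParser;
>   import org.apache.calcite.sql.parser.babel.SqlBabelParserImpl;
>
>   public class BabelParseSketch {
>     public static void main(String[] args) throws Exception {
>       // The babel parser accepts a wider range of dialect syntax than the
>       // core parser.
>       SqlParser.Config config = SqlParser.config()
>           .withParserFactory(SqlBabelParserImpl.FACTORY);
>       SqlParser parser = SqlParser.create("SELECT 1", config);
>       SqlNode node = parser.parseStmt();
>       System.out.println(node);
>     }
>   }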
>
> In our project we have also extended the parser. You can see, for example,
> our PR that makes a minimal change to the Calcite parser:
> https://github.com/feldera/feldera/pull/210; it could be a useful
> guideline for your needs. We use Maven in our build, so you can see how the
> build has to be structured. The config.fmpp file has some "metadata" about
> how the changes are integrated into the existing parser, while the *.ftl
> files contain the actual parser code, written in the JavaCC parser
> generator language.
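>
> To give an idea of the Java side: the production you add in the .ftl file
> typically just builds a small SqlCall subclass. A rough, made-up sketch
> (names and details are illustrative, not taken from our PR):
>
>   import java.util.Collections;
>   import java.util.List;
>   import org.apache.calcite.sql.SqlCall;
>   import org.apache.calcite.sql.SqlIdentifier;
>   import org.apache.calcite.sql.SqlKind;
>   import org.apache.calcite.sql.SqlNode;
>   import org.apache.calcite.sql.SqlOperator;
>   import org.apache.calcite.sql.SqlSpecialOperator;
>   import org.apache.calcite.sql.SqlWriter;
>   import org.apache.calcite.sql.parser.SqlParserPos;
>
>   /** AST node for a hypothetical "CREATE TEMPORARY TABLE name" statement. */
>   public class SqlCreateTempTable extends SqlCall {
>     private static final SqlOperator OPERATOR =
>         new SqlSpecialOperator("CREATE TEMPORARY TABLE", SqlKind.OTHER_DDL);
>
>     private final SqlIdentifier name;
>
>     public SqlCreateTempTable(SqlParserPos pos, SqlIdentifier name) {
>       super(pos);
>       this.name = name;
>     }
>
>     @Override public SqlOperator getOperator() {
>       return OPERATOR;
>     }
>
>     @Override public List<SqlNode> getOperandList() {
>       return Collections.<SqlNode>singletonList(name);
>     }
>
>     @Override public void unparse(SqlWriter writer, int leftPrec, int rightPrec) {
>       // Regenerate the original syntax, e.g. for unparsing back to SQL.
>       writer.keyword("CREATE");
>       writer.keyword("TEMPORARY");
>       writer.keyword("TABLE");
>       name.unparse(writer, leftPrec, rightPrec);
>     }
>   }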
>
> Mihai
> ________________________________
> From: 奚钰佳 <yu...@qq.com.INVALID>
> Sent: Wednesday, May 29, 2024 7:01 PM
> To: dev <dev@calcite.apache.org>
> Subject: How to PARSER the SPARK SQL
>
> Hello calcite team,
>
> I am Yujia, and I want to parse Spark SQL with Calcite, but some keywords
> are not supported by Calcite.
> Here is my question on Stack Overflow:
> https://stackoverflow.com/questions/78547328/how-can-i-parse-the-spark-sql-by-calcite-sqlparser-like-create-temporary-table
>
>
>
> I have two questions:
>
> 1. How can I parse Spark SQL with Calcite?
>
> 2. How can I extend the syntax on demand, e.g. CREATE TEMPORARY TABLE?
>
> For question 1: Is there any way to parse Spark SQL with a parser that
> supports it?
>
>
> For question 2:
> I tried to extend the syntax with the steps below, but I got a strange
> compile error from the copied Parser.jj file.
>
> Steps:
>
> 1. Copy the Parser.jj file from Calcite.
>
> 2. Add SqlNode SqlCreateTempTable() : ... in parserImpls.ftl.
>
> 3. Add a class CreateTempTable that extends SqlCall and defines
> getOperator, getOperandList, and unparse.
>
> 4. Add the related config in config.fmpp.
>
> 5. Run mvn generate-sources to generate the TestSqlParserImpl defined in
> step 4 (I plan to use it roughly as sketched after these steps).
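>
> Roughly, I then intend to use the generated parser like this (just a
> sketch; I assume the generated TestSqlParserImpl exposes a FACTORY field
> like Calcite's own parsers, and the import depends on the package set in
> config.fmpp):
>
>   import org.apache.calcite.sql.SqlNode;
>   import org.apache.calcite.sql.parser.SqlParser;
>
>   public class TempTableParseSketch {
>     public static void main(String[] args) throws Exception {
>       // TestSqlParserImpl is the parser class generated in step 5; the
>       // exact statement accepted depends on the production from step 2.
>       SqlParser.Config config = SqlParser.config()
>           .withParserFactory(TestSqlParserImpl.FACTORY);
>       SqlParser parser =
>           SqlParser.create("CREATE TEMPORARY TABLE t", config);
>       SqlNode node = parser.parseStmt();
>       System.out.println(node);
>     }
>   }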
>
> At step 5, it fails with the error below:
>
> FMPP processing session failed. Caused by:
> freemarker.core.InvalidReferenceException: The following has evaluated to
> null or missing:
> ==> default  [in template "Parser.jj" at line 1124, column 43]
> ----
> Tip: If the failing expression is known to legally refer to something
> that's sometimes null or missing, either specify a default value like
> myOptionalVar!myDefault, or use
> <#if myOptionalVar??>when-present<#else>when-missing</#if>.
> (These only cover the last step of the expression; to cover the whole
> expression, use parenthesis: (myOptionalVar.foo)!myDefault,
> (myOptionalVar.foo)??
> ----
> FTL stack trace ("~" means nesting-related):
> - Failed at: #if (parser.createStatementParserMeth...  [in template
> "Parser.jj" at line 1124, column 1]
>
> Are my steps right? And why does the same Parser.jj file produce an error
> when it is compiled?
>
>
>
>
> Thanks,
>
> Yujia
>
