[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-26 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r415475644



##
File path: 
flink-table/flink-table-planner-blink/src/main/java/org/apache/flink/table/planner/delegation/FlinkSqlParserImplFactory.java
##
@@ -0,0 +1,50 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.delegation;
+
+import org.apache.flink.sql.parser.hive.impl.FlinkHiveSqlParserImpl;
+import org.apache.flink.sql.parser.impl.FlinkSqlParserImpl;
+import org.apache.flink.sql.parser.validate.FlinkSqlConformance;
+
+import org.apache.calcite.sql.parser.SqlAbstractParserImpl;
+import org.apache.calcite.sql.parser.SqlParserImplFactory;
+import org.apache.calcite.sql.validate.SqlConformance;
+
+import java.io.Reader;
+
+/**
+ * A SqlParserImplFactory that creates the parser according to SqlConformance.
+ */
+public class FlinkSqlParserImplFactory implements SqlParserImplFactory {
+
+   private final SqlConformance conformance;
+
+   public FlinkSqlParserImplFactory(SqlConformance conformance) {
+   this.conformance = conformance;
+   }
+
+   @Override
+   public SqlAbstractParserImpl getParser(Reader stream) {
+   if (conformance == FlinkSqlConformance.HIVE) {
+   return FlinkHiveSqlParserImpl.FACTORY.getParser(stream);
+   } else {
+   return FlinkSqlParserImpl.FACTORY.getParser(stream);
+   }
+   }

Review comment:
   We do not need to implement `SqlParserImplFactory`; how about just a utility 
class named `FlinkSqlParserFactories` with a method that returns the 
factory for a given conformance, e.g. `FlinkSqlParserFactories#createFactory(SqlConformance)`?
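   A minimal, self-contained sketch of the suggested utility (the name `FlinkSqlParserFactories#createFactory` comes from this comment; `Conformance` and `ParserFactory` are stand-ins for the real Calcite/Flink types, which are not reproduced here):

```java
// Stand-ins for the real Calcite/Flink types, to keep the sketch self-contained.
enum Conformance { DEFAULT, HIVE }

interface ParserFactory {
    String parserName(); // stand-in for SqlParserImplFactory#getParser(Reader)
}

final class FlinkSqlParserFactories {

    private FlinkSqlParserFactories() {
        // utility class, no instances
    }

    // Returns the parser factory for the given conformance, instead of
    // wrapping both parsers behind a new SqlParserImplFactory implementation.
    static ParserFactory createFactory(Conformance conformance) {
        if (conformance == Conformance.HIVE) {
            return () -> "FlinkHiveSqlParserImpl.FACTORY";
        }
        return () -> "FlinkSqlParserImpl.FACTORY";
    }
}
```

   The difference to the code under review is that callers get the existing factory back directly, rather than a new `SqlParserImplFactory` object that dispatches on every `getParser` call.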





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-26 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r415474853



##
File path: flink-table/flink-table-planner-blink/pom.xml
##
@@ -100,6 +100,18 @@ under the License.


 
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-sql-parser-hive</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+

Review comment:
   I'm fine with that ~









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-26 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r415284543



##
File path: 
flink-table/flink-sql-parser-hive/src/test/java/org/apache/flink/sql/parser/hive/FlinkHiveSqlParserImplTest.java
##
@@ -0,0 +1,118 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.sql.parser.hive;
+
+import org.apache.flink.sql.parser.hive.impl.FlinkHiveSqlParserImpl;
+
+import org.apache.calcite.sql.parser.SqlParserImplFactory;
+import org.apache.calcite.sql.parser.SqlParserTest;
+import org.junit.Test;
+
+/**
+ * Tests for {@link FlinkHiveSqlParserImpl}.
+ */
+public class FlinkHiveSqlParserImplTest extends SqlParserTest {
+
+   @Override
+   protected SqlParserImplFactory parserImplFactory() {
+   return FlinkHiveSqlParserImpl.FACTORY;
+   }
+
+   // overrides test methods that we don't support
+   @Override
+   public void testDescribeStatement() {
+   }
+
+   @Override
+   public void testTableHintsInInsert() {
+   }
+
+   @Override
+   public void testDescribeSchema() {
+   }
+
+   @Test
+   public void testShowDatabases() {
+   sql("show databases").ok("SHOW DATABASES");
+   }
+
+   @Test
+   public void testUseDatabase() {
+   // use database
+   sql("use db1").ok("USE `DB1`");
+   }
+
+   @Test
+   public void testCreateDatabase() {
+   sql("create database db1")
+   .ok("CREATE DATABASE `DB1` WITH (\n" +
+   "  'is_generic' = 'false'\n" +
+   ")");
+		sql("create database db1 comment 'comment db1' location '/path/to/db1'")
+			.ok("CREATE DATABASE `DB1`\n" +
+				"COMMENT 'comment db1' WITH (\n" +
+				"  'is_generic' = 'false',\n" +
+				"  'database.location_uri' = '/path/to/db1'\n" +
+				")");
+		sql("create database db1 with dbproperties ('k1'='v1','k2'='v2')")
+			.ok("CREATE DATABASE `DB1` WITH (\n" +
+				"  'k1' = 'v1',\n" +
+				"  'k2' = 'v2',\n" +
+				"  'is_generic' = 'false'\n" +

Review comment:
   I mean the `dbproperties` keyword.









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-26 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r415275894



##
File path: flink-table/flink-sql-parser-hive/pom.xml
##
@@ -0,0 +1,217 @@
+<?xml version="1.0" encoding="UTF-8"?>
+
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+	<modelVersion>4.0.0</modelVersion>
+
+	<parent>
+		<artifactId>flink-table</artifactId>
+		<groupId>org.apache.flink</groupId>
+		<version>1.11-SNAPSHOT</version>
+	</parent>
+
+	<artifactId>flink-sql-parser-hive</artifactId>
+	<name>flink-sql-parser-hive</name>
+
+	<packaging>jar</packaging>
+
+   
+   
+   
+   
+
+	<dependencies>
+
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-sql-parser</artifactId>
+			<version>${project.version}</version>
+			<scope>provided</scope>
+		</dependency>

Review comment:
   The `provided` scope is useless once this becomes an indirect (transitive) dependency.









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-26 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r415275085



##
File path: flink-table/flink-table-planner-blink/pom.xml
##
@@ -100,6 +100,18 @@ under the License.


 
+		<dependency>
+			<groupId>org.apache.flink</groupId>
+			<artifactId>flink-sql-parser-hive</artifactId>
+			<version>${project.version}</version>
+		</dependency>
+

Review comment:
   Why do we append a new dependency instead of overriding `flink-sql-parser`?









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-26 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r415274910



##
File path: 
flink-table/flink-sql-parser-hive/src/test/java/org/apache/flink/sql/parser/hive/FlinkHiveSqlParserImplTest.java
##
@@ -0,0 +1,118 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.sql.parser.hive;
+
+import org.apache.flink.sql.parser.hive.impl.FlinkHiveSqlParserImpl;
+
+import org.apache.calcite.sql.parser.SqlParserImplFactory;
+import org.apache.calcite.sql.parser.SqlParserTest;
+import org.junit.Test;
+
+/**
+ * Tests for {@link FlinkHiveSqlParserImpl}.
+ */
+public class FlinkHiveSqlParserImplTest extends SqlParserTest {
+
+   @Override
+   protected SqlParserImplFactory parserImplFactory() {
+   return FlinkHiveSqlParserImpl.FACTORY;
+   }
+
+   // overrides test methods that we don't support
+   @Override
+   public void testDescribeStatement() {
+   }
+
+   @Override
+   public void testTableHintsInInsert() {
+   }
+
+   @Override
+   public void testDescribeSchema() {
+   }
+
+   @Test
+   public void testShowDatabases() {
+   sql("show databases").ok("SHOW DATABASES");
+   }
+
+   @Test
+   public void testUseDatabase() {
+   // use database
+   sql("use db1").ok("USE `DB1`");
+   }
+
+   @Test
+   public void testCreateDatabase() {
+   sql("create database db1")
+   .ok("CREATE DATABASE `DB1` WITH (\n" +
+   "  'is_generic' = 'false'\n" +
+   ")");
+		sql("create database db1 comment 'comment db1' location '/path/to/db1'")
+			.ok("CREATE DATABASE `DB1`\n" +
+				"COMMENT 'comment db1' WITH (\n" +
+				"  'is_generic' = 'false',\n" +
+				"  'database.location_uri' = '/path/to/db1'\n" +
+				")");
+		sql("create database db1 with dbproperties ('k1'='v1','k2'='v2')")
+			.ok("CREATE DATABASE `DB1` WITH (\n" +
+				"  'k1' = 'v1',\n" +
+				"  'k2' = 'v2',\n" +
+				"  'is_generic' = 'false'\n" +
+				")");
+	}
+
+	@Test
+	public void testAlterDatabase() {
+		sql("alter database db1 set dbproperties('k1'='v1')")
+			.ok("ALTER DATABASE `DB1` SET (\n" +
+				"  'k1' = 'v1',\n" +
+				"  'alter.database.op' = 'CHANGE_PROPS'\n" +

Review comment:
   The unparsed SQL is different.









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-26 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r415274850



##
File path: 
flink-table/flink-sql-parser-hive/src/test/java/org/apache/flink/sql/parser/hive/FlinkHiveSqlParserImplTest.java
##
@@ -0,0 +1,118 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.sql.parser.hive;
+
+import org.apache.flink.sql.parser.hive.impl.FlinkHiveSqlParserImpl;
+
+import org.apache.calcite.sql.parser.SqlParserImplFactory;
+import org.apache.calcite.sql.parser.SqlParserTest;
+import org.junit.Test;
+
+/**
+ * Tests for {@link FlinkHiveSqlParserImpl}.
+ */
+public class FlinkHiveSqlParserImplTest extends SqlParserTest {
+
+   @Override
+   protected SqlParserImplFactory parserImplFactory() {
+   return FlinkHiveSqlParserImpl.FACTORY;
+   }
+
+   // overrides test methods that we don't support
+   @Override
+   public void testDescribeStatement() {
+   }
+
+   @Override
+   public void testTableHintsInInsert() {
+   }
+
+   @Override
+   public void testDescribeSchema() {
+   }
+
+   @Test
+   public void testShowDatabases() {
+   sql("show databases").ok("SHOW DATABASES");
+   }
+
+   @Test
+   public void testUseDatabase() {
+   // use database
+   sql("use db1").ok("USE `DB1`");
+   }
+
+   @Test
+   public void testCreateDatabase() {
+   sql("create database db1")
+   .ok("CREATE DATABASE `DB1` WITH (\n" +
+   "  'is_generic' = 'false'\n" +
+   ")");
+		sql("create database db1 comment 'comment db1' location '/path/to/db1'")
+			.ok("CREATE DATABASE `DB1`\n" +
+				"COMMENT 'comment db1' WITH (\n" +
+				"  'is_generic' = 'false',\n" +
+				"  'database.location_uri' = '/path/to/db1'\n" +
+				")");
+		sql("create database db1 with dbproperties ('k1'='v1','k2'='v2')")
+			.ok("CREATE DATABASE `DB1` WITH (\n" +
+				"  'k1' = 'v1',\n" +
+				"  'k2' = 'v2',\n" +
+				"  'is_generic' = 'false'\n" +

Review comment:
   Why is the unparsed SQL different?









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-26 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r415247355



##
File path: 
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/HiveCatalog.java
##
@@ -1403,4 +1446,29 @@ public CatalogColumnStatistics getPartitionColumnStatistics(ObjectPath tablePath
	}
	}
 
+	private static boolean createObjectIsGeneric(Map<String, String> properties) {
+		// When creating an object, a Hive object needs to explicitly have the key is_generic = false

Review comment:
   I'm also fine with the names `isGenericForCreate` and `isGenericForGet`.









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-26 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r415247136



##
File path: 
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/HiveCatalog.java
##
@@ -247,7 +253,10 @@ public CatalogDatabase getDatabase(String databaseName) throws DatabaseNotExistException
 
	Map<String, String> properties = hiveDatabase.getParameters();
 
-	properties.put(HiveCatalogConfig.DATABASE_LOCATION_URI, hiveDatabase.getLocationUri());
+	boolean isGeneric = getObjectIsGeneric(properties);
+	if (!isGeneric) {
+		properties.put(SqlCreateHiveDatabase.DATABASE_LOCATION_URI, hiveDatabase.getLocationUri());
+	}

Review comment:
   On second thought, I think it is okay, because we pass the Hive-specific 
properties around through the `SqlNode`, which I think is the root cause.
   
   We can refactor when we really have a new planner.









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-24 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r414957216



##
File path: 
flink-table/flink-sql-parser/src/main/java/org/apache/flink/sql/parser/ddl/hive/HiveDDLUtils.java
##
@@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.sql.parser.ddl.hive;
+
+import org.apache.flink.sql.parser.ddl.SqlTableOption;
+import org.apache.flink.sql.parser.impl.ParseException;
+import org.apache.flink.table.catalog.config.CatalogConfig;
+
+import org.apache.calcite.sql.SqlLiteral;
+import org.apache.calcite.sql.SqlNode;
+import org.apache.calcite.sql.SqlNodeList;
+import org.apache.calcite.sql.parser.SqlParserPos;
+
+import java.util.Arrays;
+import java.util.HashSet;
+import java.util.Set;
+
+import static org.apache.flink.sql.parser.ddl.hive.SqlAlterHiveDatabase.ALTER_DATABASE_OP;
+import static org.apache.flink.sql.parser.ddl.hive.SqlCreateHiveDatabase.DATABASE_LOCATION_URI;
+
+/**
+ * Util methods for Hive DDL Sql nodes.
+ */
+public class HiveDDLUtils {
+
+	private static final Set<String> RESERVED_DB_PROPERTIES = new HashSet<>();
+

Review comment:
   As discussed offline, this is intentional by design: we previously designed 
the Flink database DDLs with extensible properties precisely to allow this kind 
of "hacky" extension and keep the planner code clean.
   
   In the long term, if we also want a Hive-dialect DML, we may need another 
planner, which is another big story.









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-24 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r414956968



##
File path: 
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/HiveCatalog.java
##
@@ -1403,4 +1446,29 @@ public CatalogColumnStatistics getPartitionColumnStatistics(ObjectPath tablePath
	}
	}
 
+	private static boolean createObjectIsGeneric(Map<String, String> properties) {
+		// When creating an object, a Hive object needs to explicitly have the key is_generic = false

Review comment:
   I think each method should take care of itself; if it only appends a 
property entry, how about naming it `appendIsGenericIfNecessary` or something like 
that?
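   A hedged sketch of what such a helper could look like (the method name `appendIsGenericIfNecessary` and the `is_generic` key come from this discussion; the class name and the copy-instead-of-mutate choice are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

class GenericFlag {
    static final String IS_GENERIC = "is_generic";

    // Appends is_generic=false only when the caller has not set the flag,
    // so the method does exactly what its name says and nothing more.
    static Map<String, String> appendIsGenericIfNecessary(Map<String, String> properties) {
        Map<String, String> result = new HashMap<>(properties);
        result.putIfAbsent(IS_GENERIC, "false");
        return result;
    }
}
```

   With a name like this, the caller keeps the decision of what "generic" means, and the helper is reduced to a single, self-describing side effect.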









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-24 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r414956660



##
File path: 
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/HiveCatalog.java
##
@@ -247,7 +253,10 @@ public CatalogDatabase getDatabase(String databaseName) throws DatabaseNotExistException
 
	Map<String, String> properties = hiveDatabase.getParameters();
 
-	properties.put(HiveCatalogConfig.DATABASE_LOCATION_URI, hiveDatabase.getLocationUri());
+	boolean isGeneric = getObjectIsGeneric(properties);
+	if (!isGeneric) {
+		properties.put(SqlCreateHiveDatabase.DATABASE_LOCATION_URI, hiveDatabase.getLocationUri());
+	}

Review comment:
   But generally, the catalog code should not see the `SqlNode`s, because 
the former belongs to the planner and the latter belongs to the parser.









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-24 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r414956231



##
File path: 
flink-table/flink-sql-parser-hive/src/main/java/org/apache/flink/sql/parser/hive/package-info.java
##
@@ -0,0 +1,45 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to you under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+/**
+ * Flink sql parser for hive dialect.
+ *
+ * <p>This module contains the DDLs and some custom DMLs for Apache Flink.
+ *
+ * <p>Most of the sql grammars belong to the sql standard or Flink's dialect. To support
+ * a new sql dialect, add a new sql conformance to
+ * {@link org.apache.flink.sql.parser.validate.FlinkSqlConformance},
+ * then use this sql conformance to make context-aware decisions in the parse block. See the usage of
+ * {@link org.apache.flink.sql.parser.validate.FlinkSqlConformance#HIVE} in {@code parserimpls.ftl}.
+ *
+ * <p>To use a specific sql dialect for the parser, configure the parser with the specific sql conformance
+ * with a code snippet like below:
+ * <pre>
+ *   SqlParser.create(source,
+ *     SqlParser.configBuilder()
+ *       .setParserFactory(parserImplFactory())
+ *       .setQuoting(Quoting.DOUBLE_QUOTE)
+ *       .setUnquotedCasing(Casing.TO_UPPER)
+ *       .setQuotedCasing(Casing.UNCHANGED)
+ *       .setConformance(conformance0) // the sql conformance you want to use.
+ *       .build());
+ * </pre>

Review comment:
   The comment is not correct now; we actually do not switch the parser dialect 
with the `setConformance` interface.










[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-24 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r414454031



##
File path: 
flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/HiveCatalog.java
##
@@ -1403,4 +1446,29 @@ public CatalogColumnStatistics getPartitionColumnStatistics(ObjectPath tablePath
	}
	}
 
+	private static boolean createObjectIsGeneric(Map<String, String> properties) {
+		// When creating an object, a Hive object needs to explicitly have the key is_generic = false

Review comment:
   The implementation does not really `create` the object; it only appends a 
property there.









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-23 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r414314520



##
File path: flink-table/flink-sql-parser/pom.xml
##
@@ -298,6 +348,23 @@ under the License.

${project.build.directory}/generated-sources/


+	<execution>
+		<phase>generate-sources</phase>
+		<id>javacc-hive</id>
+		<goals>
+			<goal>javacc</goal>
+		</goals>
+		<configuration>
+			<sourceDirectory>${project.build.directory}/generated-sources-hive/</sourceDirectory>

Review comment:
   The plugin setup is a mess; can we just create another module instead of putting 
the code in one? And why do we copy Flink's parser file? It seems there is no 
parser code block being reused.









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-23 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r414313942



##
File path: 
flink-table/flink-sql-parser/src/main/java/org/apache/flink/sql/parser/ddl/hive/HiveDDLUtils.java
##
@@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.sql.parser.ddl.hive;
+
+import org.apache.flink.sql.parser.ddl.SqlTableOption;
+import org.apache.flink.sql.parser.impl.ParseException;
+import org.apache.flink.table.catalog.config.CatalogConfig;
+
+import org.apache.calcite.sql.SqlLiteral;
+import org.apache.calcite.sql.SqlNode;
+import org.apache.calcite.sql.SqlNodeList;
+import org.apache.calcite.sql.parser.SqlParserPos;
+
+import java.util.Arrays;
+import java.util.HashSet;
+import java.util.Set;
+
+import static org.apache.flink.sql.parser.ddl.hive.SqlAlterHiveDatabase.ALTER_DATABASE_OP;
+import static org.apache.flink.sql.parser.ddl.hive.SqlCreateHiveDatabase.DATABASE_LOCATION_URI;
+
+/**
+ * Util methods for Hive DDL Sql nodes.
+ */
+public class HiveDDLUtils {
+
+	private static final Set<String> RESERVED_DB_PROPERTIES = new HashSet<>();
+

Review comment:
   It seems hacky that we pass these properties internally through the table 
options; I think these properties should be kept in each SqlNode rather than in 
the property list.





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-23 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r414308065



##
File path: flink-table/flink-sql-parser/src/main/codegen-hive/data/Parser.tdd
##
@@ -0,0 +1,547 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+{
+  # Generated parser implementation package and class name.
+  package: "org.apache.flink.sql.parser.impl",
+  class: "FlinkHiveSqlParserImpl",
+
+  # List of additional classes and packages to import.
+  # Example. "org.apache.calcite.sql.*", "java.util.List".
+  # Please keep the import classes in alphabetical order if new class is added.
+  imports: [
+"org.apache.flink.sql.parser.ddl.hive.HiveDDLUtils"
+"org.apache.flink.sql.parser.ddl.hive.SqlAlterHiveDatabaseLocation"
+"org.apache.flink.sql.parser.ddl.hive.SqlAlterHiveDatabaseOwner"
+"org.apache.flink.sql.parser.ddl.hive.SqlAlterHiveDatabaseProps"
+"org.apache.flink.sql.parser.ddl.hive.SqlCreateHiveDatabase"
+"org.apache.flink.sql.parser.ddl.SqlAlterDatabase"
+"org.apache.flink.sql.parser.ddl.SqlAlterTable"
+"org.apache.flink.sql.parser.ddl.SqlAlterTableProperties"
+"org.apache.flink.sql.parser.ddl.SqlAlterTableRename"
+"org.apache.flink.sql.parser.ddl.SqlCreateCatalog"
+"org.apache.flink.sql.parser.ddl.SqlCreateFunction"

Review comment:
   Remove the unused imports.









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-23 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r414305379



##
File path: flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/HiveCatalog.java
##
@@ -1403,4 +1446,29 @@ public CatalogColumnStatistics getPartitionColumnStatistics(ObjectPath tablePath
}
}
 
+   private static boolean createObjectIsGeneric(Map<String, String> properties) {
+           // When creating an object, a hive object needs to explicitly have a key is_generic = false
+           // otherwise, this is a generic object if 1) the key is missing 2) is_generic = true
+           // this is opposite to reading an object. See getObjectIsGeneric().
+           if (properties == null) {
+                   return true;
+           }
+           boolean isGeneric;
+           if (!properties.containsKey(CatalogConfig.IS_GENERIC)) {
+                   // must be a generic object
+                   isGeneric = true;
+                   properties.put(CatalogConfig.IS_GENERIC, String.valueOf(true));
+           } else {
+                   isGeneric = Boolean.parseBoolean(properties.get(CatalogConfig.IS_GENERIC));
+           }
+           return isGeneric;
+   }
+
+   private static boolean getObjectIsGeneric(Map<String, String> properties) {
+           // When retrieving an object, a generic object needs to explicitly have a key is_generic = true
Review comment:
   How about naming it `getObjectIsGenericDefaultFalse`?
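The symmetry behind the suggested names can be sketched as follows. This is a hedged reconstruction: the create-side body follows the diff quoted above, while the read-side body is a guess (the snippet is truncated in this archive), and the names come from the reviewer's proposal rather than the merged code:

```java
import java.util.Map;

public class IsGenericSketch {
    // Stands in for CatalogConfig.IS_GENERIC; the literal value is an assumption.
    static final String IS_GENERIC = "is_generic";

    // Create path: a missing key defaults to true (generic object),
    // and the default is written back into the property map.
    static boolean getObjectIsGenericDefaultTrue(Map<String, String> properties) {
        if (properties == null) {
            return true;
        }
        if (!properties.containsKey(IS_GENERIC)) {
            properties.put(IS_GENERIC, String.valueOf(true));
            return true;
        }
        return Boolean.parseBoolean(properties.get(IS_GENERIC));
    }

    // Read path: a missing key defaults to false (a plain Hive object).
    static boolean getObjectIsGenericDefaultFalse(Map<String, String> properties) {
        return properties != null
                && Boolean.parseBoolean(properties.getOrDefault(IS_GENERIC, "false"));
    }
}
```

Encoding the default in the method name, as the reviewer suggests, makes the opposite create/read semantics visible at every call site.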









[GitHub] [flink] danny0405 commented on a change in pull request #11791: [FLINK-17210][sql-parser][hive] Implement database DDLs for Hive dialect

2020-04-23 Thread GitBox


danny0405 commented on a change in pull request #11791:
URL: https://github.com/apache/flink/pull/11791#discussion_r414269563



##
File path: flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/HiveCatalog.java
##
@@ -247,7 +253,10 @@ public CatalogDatabase getDatabase(String databaseName) throws DatabaseNotExistE
 
 	Map<String, String> properties = hiveDatabase.getParameters();
 
-	properties.put(HiveCatalogConfig.DATABASE_LOCATION_URI, hiveDatabase.getLocationUri());
+	boolean isGeneric = getObjectIsGeneric(properties);
+	if (!isGeneric) {
+		properties.put(SqlCreateHiveDatabase.DATABASE_LOCATION_URI, hiveDatabase.getLocationUri());
+	}

Review comment:
   We usually put the property keys in the connector validator instead of 
the SqlNode, I think.
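A minimal sketch of the validator-style placement the reviewer suggests. The class name and key value here are hypothetical, purely to illustrate the design choice:

```java
// Hypothetical holder for connector property keys, kept out of the
// SqlNode classes as the reviewer suggests; name and value are illustrative.
public final class HiveCatalogValidator {
    public static final String DATABASE_LOCATION_URI = "database.location_uri";

    private HiveCatalogValidator() {
        // static constant holder, no instantiation
    }
}
```

The catalog code would then reference `HiveCatalogValidator.DATABASE_LOCATION_URI` rather than reaching into a parser class like `SqlCreateHiveDatabase`, keeping the parser and catalog modules decoupled.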

##
File path: flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/catalog/hive/HiveCatalog.java
##
@@ -1403,4 +1446,29 @@ public CatalogColumnStatistics getPartitionColumnStatistics(ObjectPath tablePath
}
}
 
+   private static boolean createObjectIsGeneric(Map<String, String> properties) {
+           // When creating an object, a hive object needs to explicitly have a key is_generic = false

Review comment:
   How about naming it `getObjectIsGenericDefaultTrue`?




