[GitHub] [flink] godfreyhe commented on a change in pull request #11892: [FLINK-17112][table] Support DESCRIBE statement in Flink SQL

2020-05-08 Thread GitBox


godfreyhe commented on a change in pull request #11892:
URL: https://github.com/apache/flink/pull/11892#discussion_r422050608



##
File path: flink-table/flink-table-planner/src/main/scala/org/apache/flink/table/api/internal/TableEnvImpl.scala
##
@@ -766,7 +768,29 @@ abstract class TableEnvImpl(
       case descOperation: DescribeTableOperation =>
         val result = catalogManager.getTable(descOperation.getSqlIdentifier)
         if (result.isPresent) {
-          buildShowResult(Array(result.get().getTable.getSchema.toString))
+          val schema = result.get.getTable.getSchema
+          val fieldToWatermark =
+            schema
+              .getWatermarkSpecs
+              .map(w => (w.getRowtimeAttribute, w.getWatermarkExpr)).toMap
+          val fieldToPrimaryKey = new JHashMap[String, String]()
+          if (schema.getPrimaryKey.isPresent) {
+            val columns = schema.getPrimaryKey.get.getColumns.asScala
+            columns.foreach(c => fieldToPrimaryKey.put(c, s"PRI(${columns.mkString(",")})"))
+          }
+          val data = Array.ofDim[String](schema.getFieldCount, 6)
+          schema.getTableColumns.asScala.zipWithIndex.foreach {
+            case (c, i) => {
+              val logicalType = c.getType.getLogicalType
+              data(i)(0) = c.getName
+              data(i)(1) = StringUtils.removeEnd(logicalType.toString, " NOT NULL")
+              data(i)(2) = if (logicalType.isNullable) "true" else "false"
+              data(i)(3) = fieldToPrimaryKey.getOrDefault(c.getName, "(NULL)")
+              data(i)(4) = c.getExpr.orElse("(NULL)")
+              data(i)(5) = fieldToWatermark.getOrDefault(c.getName, "(NULL)")
+            }
+          }
+          buildShowResult(Array("name", "type", "null", "key", "compute column", "watermark"), data)

Review comment:
   ditto
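
   For reference, a minimal self-contained Java sketch of the same row-building logic as the Scala hunk above, using only the `TableSchema` accessors that already appear in this diff; the wrapping class and method name are hypothetical:
   ```java
   import java.util.HashMap;
   import java.util.List;
   import java.util.Map;

   import org.apache.commons.lang3.StringUtils;
   import org.apache.flink.table.api.TableColumn;
   import org.apache.flink.table.api.TableSchema;
   import org.apache.flink.table.api.WatermarkSpec;
   import org.apache.flink.table.types.logical.LogicalType;

   public final class DescribeRows {

       /** Builds one row per column: name, type, null, key, compute column, watermark. */
       public static String[][] toRows(TableSchema schema) {
           // Map rowtime attribute -> watermark expression.
           Map<String, String> fieldToWatermark = new HashMap<>();
           for (WatermarkSpec w : schema.getWatermarkSpecs()) {
               fieldToWatermark.put(w.getRowtimeAttribute(), w.getWatermarkExpr());
           }

           // Map each primary-key column -> "PRI(col1,col2,...)".
           Map<String, String> fieldToPrimaryKey = new HashMap<>();
           schema.getPrimaryKey().ifPresent(pk -> {
               List<String> columns = pk.getColumns();
               columns.forEach(c ->
                   fieldToPrimaryKey.put(c, String.format("PRI(%s)", String.join(",", columns))));
           });

           List<TableColumn> tableColumns = schema.getTableColumns();
           String[][] data = new String[tableColumns.size()][6];
           for (int i = 0; i < tableColumns.size(); i++) {
               TableColumn c = tableColumns.get(i);
               LogicalType logicalType = c.getType().getLogicalType();
               data[i][0] = c.getName();
               data[i][1] = StringUtils.removeEnd(logicalType.toString(), " NOT NULL");
               data[i][2] = logicalType.isNullable() ? "true" : "false";
               data[i][3] = fieldToPrimaryKey.getOrDefault(c.getName(), "(NULL)");
               data[i][4] = c.getExpr().orElse("(NULL)");
               data[i][5] = fieldToWatermark.getOrDefault(c.getName(), "(NULL)");
           }
           return data;
       }
   }
   ```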

##
File path: flink-table/flink-table-api-java/src/main/java/org/apache/flink/table/api/internal/TableEnvironmentImpl.java
##
@@ -861,23 +865,32 @@ private TableResult executeOperation(Operation operation) {

            catalogManager.getTable(describeTableOperation.getSqlIdentifier());
        if (result.isPresent()) {
            TableSchema schema = result.get().getTable().getSchema();
-           String[][] rows = Stream.concat(
+           Map<String, String> fieldToWatermark =
+               schema.getWatermarkSpecs()
+                   .stream()
+                   .collect(Collectors.toMap(WatermarkSpec::getRowtimeAttribute, WatermarkSpec::getWatermarkExpr));
+
+           Map<String, String> fieldToPrimaryKey = new HashMap<>();
+           schema.getPrimaryKey().ifPresent((p) -> {
+               List<String> columns = p.getColumns();
+               columns.forEach((c) -> fieldToPrimaryKey.put(c, String.format("PRI(%s)", String.join(",", columns))));
+           });
+
+           String[][] rows =
                schema.getTableColumns()
                    .stream()
-                   .map((c) -> new String[]{
+                   .map((c) -> {
+                       LogicalType logicalType = c.getType().getLogicalType();
+                       return new String[]{
                            c.getName(),
-                           c.getType().getLogicalType().toString(),
-                           c.getExpr().orElse("(NULL)")}),
-               schema.getWatermarkSpecs()
-                   .stream()
-                   .map((w) -> new String[]{
-                       "WATERMARK",
-                       "(NULL)",
-                       w.getWatermarkExpr()
-                   })
-               ).toArray(String[][]::new);
-
-           return buildShowResult(new String[]{"name", "type", "expr"}, rows);
+                           StringUtils.removeEnd(logicalType.toString(), " NOT NULL"),
+                           logicalType.isNullable() ? "true" : "false",
+                           fieldToPrimaryKey.getOrDefault(c.getName(), "(NULL)"),

[GitHub] [flink] godfreyhe commented on a change in pull request #11892: [FLINK-17112][table] Support DESCRIBE statement in Flink SQL

2020-04-26 Thread GitBox


godfreyhe commented on a change in pull request #11892:
URL: https://github.com/apache/flink/pull/11892#discussion_r415486982



##
File path: flink-table/flink-table-planner/src/main/scala/org/apache/flink/table/api/internal/TableEnvImpl.scala
##
@@ -721,6 +721,15 @@ abstract class TableEnvImpl(
           dropViewOperation.isIfExists)
       }
       TableResultImpl.TABLE_RESULT_OK
+      case descOperation: DescribeTableOperation =>
+        val result = catalogManager.getTable(descOperation.getSqlIdentifier)
+        if (result.isPresent) {
+          buildShowResult(Array(result.get().getTable.getSchema.toString))

Review comment:
   I find that the SQL parser supports column comments ([see](https://github.com/apache/flink/blob/master/flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/FlinkSqlParserImplTest.java#L225)), but we do not store the comments in `TableSchema`, so we ignore the `comment` column here.









[GitHub] [flink] godfreyhe commented on a change in pull request #11892: [FLINK-17112][table] Support DESCRIBE statement in Flink SQL

2020-04-26 Thread GitBox


godfreyhe commented on a change in pull request #11892:
URL: https://github.com/apache/flink/pull/11892#discussion_r415473253



##
File path: flink-table/flink-table-planner/src/main/scala/org/apache/flink/table/api/internal/TableEnvImpl.scala
##
@@ -721,6 +721,15 @@ abstract class TableEnvImpl(
           dropViewOperation.isIfExists)
       }
       TableResultImpl.TABLE_RESULT_OK
+      case descOperation: DescribeTableOperation =>
+        val result = catalogManager.getTable(descOperation.getSqlIdentifier)
+        if (result.isPresent) {
+          buildShowResult(Array(result.get().getTable.getSchema.toString))

Review comment:
   I find the printed result (through `TableResult#print`) of `describe xx` looks a little strange:
   ```
   +--------------------------------+
   | result                         |
   +--------------------------------+
   | root
    |-- a: INT
    |-- b: S...                     |
   +--------------------------------+
   1 row in set
   ```

   Users have to parse this unstructured result if they want to use it programmatically through the `TableResult#collect` method.

   How about we return a structured tableau result instead, e.g.
   ```
   +------+--------+---------+
   | name |  type  | comment |
   +------+--------+---------+
   |  a   |  INT   |         |
   |  b   | STRING |         |
   +------+--------+---------+
   2 rows in set
   ```
   This is different from the describe result in the SQL client.

   Another thing we should consider is how to print `watermarkSpecs` and computed columns.
   How about we add a column named `expr` to represent `watermarkSpecs` and computed columns:
   ```
   +-----------+--------------+---------+------------+
   | name      | type         | comment | expr       |
   +-----------+--------------+---------+------------+
   | a         | INT          |         | (NULL)     |
   | b         | INT          |         | a + 1      |
   | c         | TIMESTAMP(3) |         | (NULL)     |
   | WATERMARK | (NULL)       |         | c AS now() |
   +-----------+--------------+---------+------------+
   ```
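
   With a structured tableau result like this, the rows could be consumed directly through `TableResult#collect` instead of being parsed out of a single string. A minimal usage sketch, assuming `DESCRIBE` returns one row per column with `name` and `type` as the first two fields; the table name `MyTable` and its DDL are hypothetical:
   ```java
   import org.apache.flink.table.api.EnvironmentSettings;
   import org.apache.flink.table.api.TableEnvironment;
   import org.apache.flink.table.api.TableResult;
   import org.apache.flink.types.Row;
   import org.apache.flink.util.CloseableIterator;

   public class DescribeCollectExample {
       public static void main(String[] args) throws Exception {
           TableEnvironment tEnv = TableEnvironment.create(
                   EnvironmentSettings.newInstance().inStreamingMode().build());

           // Hypothetical table mirroring the example above: a computed column and a watermark.
           tEnv.executeSql(
                   "CREATE TABLE MyTable ("
                           + "  a INT,"
                           + "  b AS a + 1,"
                           + "  c TIMESTAMP(3),"
                           + "  WATERMARK FOR c AS c - INTERVAL '5' SECOND"
                           + ") WITH ('connector' = 'datagen')");

           TableResult result = tEnv.executeSql("DESCRIBE MyTable");
           try (CloseableIterator<Row> it = result.collect()) {
               while (it.hasNext()) {
                   Row row = it.next();
                   // Each metadata column is a separate field of the Row, no string parsing needed.
                   System.out.println(row.getField(0) + " : " + row.getField(1));
               }
           }
       }
   }
   ```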
   

##
File path: 
flink-table/flink-table-planner/src/main/scala/org/apache/flink/table/api/internal/TableEnvImpl.scala
##
@@ -721,6 +721,15 @@ abstract class TableEnvImpl(
           dropViewOperation.isIfExists)
       }
       TableResultImpl.TABLE_RESULT_OK
+      case descOperation: DescribeTableOperation =>
+        val result = catalogManager.getTable(descOperation.getSqlIdentifier)
+        if (result.isPresent) {
+          buildShowResult(Array(result.get().getTable.getSchema.toString))

Review comment:
   cc @twalthr 




