Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16561#discussion_r96157834

    --- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/SQLViewSuite.scala ---
    @@ -680,21 +700,70 @@ class SQLViewSuite extends QueryTest with SQLTestUtils with TestHiveSingleton {
           }
         }
     
    -  test("correctly handle type casting between view output and child output") {
    +  test("resolve a view with custom column names") {
         withTable("testTable") {
    +      spark.range(1, 10).selectExpr("id", "id + 1 id1").write.saveAsTable("testTable")
           withView("testView") {
    -        spark.range(1, 10).toDF("id1").write.format("json").saveAsTable("testTable")
    -        sql("CREATE VIEW testView AS SELECT * FROM testTable")
    +        val testView = CatalogTable(
    +          identifier = TableIdentifier("testView"),
    +          tableType = CatalogTableType.VIEW,
    +          storage = CatalogStorageFormat.empty,
    +          schema = new StructType().add("x", "long").add("y", "long"),
    --- End diff --
    
    Let's think about how we will persist the view when we have custom column names. Ideally we will have a logical plan representing the view, a SQL statement of the view query, and a `Seq[String]` for the custom column names. There are two options:

    1. Call `plan.schema` to get the view schema, zip it with the custom column names to get the final schema, and save that. Then use `plan.schema.map(_.name)` to generate the `VIEW_QUERY_OUTPUT_COLUMN_NAME` entries in the table properties.

    2. Call `plan.schema` to get the view schema, and save it unchanged as the final schema. Then use the custom column names to generate the `VIEW_QUERY_OUTPUT_COLUMN_NAME` entries in the table properties.

    Personally I think option 2 is more natural. What do you think?
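    To make the two options concrete, here is a minimal, self-contained Scala sketch of the idea. It does not use the real Spark types: `Field` stands in for `StructField`, and the `view.query.out.col.` property-key shape is an assumption for illustration, not Spark's actual `VIEW_QUERY_OUTPUT_COLUMN_NAME` encoding.

    ```scala
    // Hypothetical sketch of the two persistence options. `Field` stands in
    // for Spark's StructField; the property-key prefix below is an assumption.
    case class Field(name: String, dataType: String)

    object ViewPersistenceSketch {
      val OutputColPrefix = "view.query.out.col." // assumed key shape

      // Option 1: persist the custom names in the schema; record the
      // original query output names in the table properties.
      def option1(querySchema: Seq[Field],
                  customNames: Seq[String]): (Seq[Field], Map[String, String]) = {
        require(querySchema.length == customNames.length, "column count mismatch")
        val finalSchema = querySchema.zip(customNames).map {
          case (f, n) => f.copy(name = n)
        }
        val props = querySchema.map(_.name).zipWithIndex.map {
          case (n, i) => (OutputColPrefix + i) -> n
        }.toMap
        (finalSchema, props)
      }

      // Option 2: persist the query schema unchanged; record the custom
      // column names in the table properties.
      def option2(querySchema: Seq[Field],
                  customNames: Seq[String]): (Seq[Field], Map[String, String]) = {
        require(querySchema.length == customNames.length, "column count mismatch")
        val props = customNames.zipWithIndex.map {
          case (n, i) => (OutputColPrefix + i) -> n
        }.toMap
        (querySchema, props)
      }
    }
    ```

    For the test case in the diff, the query schema would be roughly `Seq(Field("id", "long"), Field("id1", "long"))` with custom names `Seq("x", "y")`: option 1 saves a schema named `x`/`y` and puts `id`/`id1` in the properties, while option 2 does the reverse.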