Github user OopsOutOfMemory commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4127#discussion_r23276556

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/commands.scala ---

```diff
@@ -178,3 +180,34 @@ case class DescribeCommand(
     child.output.map(field => Row(field.name, field.dataType.toString, null))
   }
 }
+
+/**
+ * :: DeveloperApi ::
+ */
+@DeveloperApi
+case class DDLDescribeCommand(
+    dbName: Option[String],
+    tableName: String, isExtended: Boolean) extends RunnableCommand {
+
+  override def run(sqlContext: SQLContext) = {
+    val tblRelation = dbName match {
+      case Some(db) => UnresolvedRelation(Seq(db, tableName))
+      case None => UnresolvedRelation(Seq(tableName))
+    }
+    val logicalRelation = sqlContext.executePlan(tblRelation).analyzed
+    val rows = new ArrayBuffer[Row]()
+    rows ++= logicalRelation.schema.fields.map { field =>
+      Row(field.name, field.dataType.toSimpleString, null) }
+
+    /*
+     * TODO if future support partition table, add header below:
+     * # Partition Information
+     * # col_name data_type comment
```

--- End diff --

What I mean here is to display the normal columns' information first, and then `append` the partitioned columns at the bottom of the normal column descriptions. For example, given:

```
CREATE TABLE temp_shengli (
  viewTime int,
  userid bigint,
  page_url string,
  referrer_url string,
  ip string comment 'IP Address of the User'
)
comment 'This is the page view table'
PARTITIONED BY (date string, pos string)
```

describing it would print:

```
viewtime        int     None
userid          bigint  None
page_url        string  None
referrer_url    string  None
ip              string  IP Address of the User
date            string  None
pos             string  None

# Partition Information
# col_name      data_type       comment

date            string  None
pos             string  None
```
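The row ordering described above (normal columns, then partition columns appended, then the partition columns repeated under a `# Partition Information` header) could be sketched roughly as below. This is only an illustration: `Field` and `describeRows` are hypothetical names, not Spark's API, and the real implementation would build `Row` objects from the analyzed relation's schema instead.

```scala
// Minimal sketch of the proposed DESCRIBE row ordering.
// `Field` stands in for a resolved schema field (hypothetical type).
case class Field(name: String, dataType: String, comment: String)

// Emit (col_name, data_type, comment) triples: normal columns first,
// partition columns appended, then a header section repeating them.
def describeRows(cols: Seq[Field], partCols: Seq[Field]): Seq[(String, String, String)] = {
  val normal = cols.map(f => (f.name, f.dataType, f.comment))
  val parts  = partCols.map(f => (f.name, f.dataType, f.comment))
  val header = Seq(
    ("# Partition Information", "", ""),
    ("# col_name", "data_type", "comment"))
  normal ++ parts ++ header ++ parts
}
```

For a non-partitioned table, `partCols` would be empty; whether the header should still be emitted in that case is a design choice the TODO leaves open.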