Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16561#discussion_r95947477
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/view.scala ---
    @@ -29,40 +29,31 @@ import org.apache.spark.sql.catalyst.rules.Rule
     
     /**
      * Make sure that a view's child plan produces the view's output attributes. We wrap the child
    - * with a Project and add an alias for each output attribute. The attributes are resolved by
    - * name. This should be only done after the batch of Resolution, because the view attributes are
    - * not completely resolved during the batch of Resolution.
    + * with a Project and add an alias for each output attribute by mapping the child output by index,
    + * if the view output doesn't have the same number of columns with the child output, throw an
    + * AnalysisException.
    + * This should be only done after the batch of Resolution, because the view attributes are not
    + * completely resolved during the batch of Resolution.
      */
     case class AliasViewChild(conf: CatalystConf) extends Rule[LogicalPlan] {
       override def apply(plan: LogicalPlan): LogicalPlan = plan resolveOperators {
         case v @ View(_, output, child) if child.resolved =>
    -      val resolver = conf.resolver
    -      val newOutput = output.map { attr =>
    -        val originAttr = findAttributeByName(attr.name, child.output, resolver)
    -        // The dataType of the output attributes may be not the same with that of the view output,
    -        // so we should cast the attribute to the dataType of the view output attribute. If the
    -        // cast can't perform, will throw an AnalysisException.
    -        Alias(Cast(originAttr, attr.dataType), attr.name)(exprId = attr.exprId,
    -          qualifier = attr.qualifier, explicitMetadata = Some(attr.metadata))
    +      if (output.length != child.output.length) {
    +        throw new AnalysisException(
    +          s"The view output ${output.mkString("[", ",", "]")} doesn't have the same number of " +
    +            s"columns with the child output ${child.output.mkString("[", ",", "]")}")
    +      }
    +      val newOutput = output.zip(child.output).map {
    +        case (attr, originAttr) =>
    +          if (attr.dataType != originAttr.dataType) {
    --- End diff --
    
    ```
    hive> explain extended select * from testview;
    OK
    ABSTRACT SYNTAX TREE:
      
    TOK_QUERY
       TOK_FROM
          TOK_TABREF
             TOK_TABNAME
                testview
       TOK_INSERT
          TOK_DESTINATION
             TOK_DIR
                TOK_TMP_FILE
          TOK_SELECT
             TOK_SELEXPR
                TOK_ALLCOLREF
    
    
    STAGE DEPENDENCIES:
      Stage-0 is a root stage
    
    STAGE PLANS:
      Stage: Stage-0
        Fetch Operator
          limit: -1
          Processor Tree:
            TableScan
              alias: testtable
              Statistics: Num rows: 1 Data size: 10 Basic stats: COMPLETE Column stats: NONE
              GatherStats: false
              Select Operator
                expressions: a (type: bigint), b (type: tinyint)
                outputColumnNames: _col0, _col1
                Statistics: Num rows: 1 Data size: 10 Basic stats: COMPLETE Column stats: NONE
                ListSink
    ```
    
    **`expressions: a (type: bigint), b (type: tinyint)`**. I tried altering the columns in the underlying tables to different types, and the view's columns are always cast to the altered types.
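
    For context, a minimal Hive session that should reproduce the observation above. This is only a sketch: the `CREATE` statements and the original type of column `a` (`int`) are assumptions; only the table/view names (`testtable`, `testview`) and the final types (`bigint`, `tinyint`) come from the plan shown.
    
    ```
    -- Hypothetical setup; the starting type of column a is assumed.
    CREATE TABLE testtable (a INT, b TINYINT);
    CREATE VIEW testview AS SELECT * FROM testtable;
    
    -- Widen the underlying column type after the view has been created.
    ALTER TABLE testtable CHANGE COLUMN a a BIGINT;
    
    -- The Select Operator in the view's plan should now report the altered type
    -- for a (bigint), i.e. the view column is not pinned to its original type.
    EXPLAIN EXTENDED SELECT * FROM testview;
    ```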
    