Github user marmbrus commented on a diff in the pull request:

    https://github.com/apache/spark/pull/10184#discussion_r46991908
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala ---
    @@ -67,15 +67,21 @@ class Dataset[T] private[sql](
         tEncoder: Encoder[T]) extends Queryable with Serializable {
     
       /**
    -   * An unresolved version of the internal encoder for the type of this dataset.  This one is marked
    -   * implicit so that we can use it when constructing new [[Dataset]] objects that have the same
    -   * object type (that will be possibly resolved to a different schema).
    +   * An unresolved version of the internal encoder for the type of this [[Dataset]].  This one is
    +   * marked implicit so that we can use it when constructing new [[Dataset]] objects that have the
    +   * same object type (that will be possibly resolved to a different schema).
        */
       private[sql] implicit val unresolvedTEncoder: ExpressionEncoder[T] = encoderFor(tEncoder)
     
       /** The encoder for this [[Dataset]] that has been resolved to its output schema. */
       private[sql] val resolvedTEncoder: ExpressionEncoder[T] =
    -    unresolvedTEncoder.resolve(queryExecution.analyzed.output, OuterScopes.outerScopes)
    +    unresolvedTEncoder.resolve(logicalPlan.output, OuterScopes.outerScopes)
    +
    +  /**
    +   * The encoder where the expressions used to construct an object from an input row have been
    +   * bound to the ordinals of the given schema.
    --- End diff ---
    
    Nit: I'm going to change this to say `this [[Dataset]]'s output schema`
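
    For reference, a minimal sketch of how the three encoders in `Dataset.scala` would relate after this change. The first two vals are copied from the diff above; the `boundTEncoder` name and the `bind` call are assumptions based on the doc comment being discussed, not something the diff itself confirms:

        // Sketch only, inside Dataset[T]; names and signatures below the resolved
        // encoder are assumed rather than taken from the diff.

        // Unresolved: built directly from the user-supplied encoder, reusable when
        // constructing new Datasets with the same object type.
        private[sql] implicit val unresolvedTEncoder: ExpressionEncoder[T] = encoderFor(tEncoder)

        // Resolved: attribute references resolved against this Dataset's output schema.
        private[sql] val resolvedTEncoder: ExpressionEncoder[T] =
          unresolvedTEncoder.resolve(logicalPlan.output, OuterScopes.outerScopes)

        // Bound: construction expressions bound to the ordinals of this [[Dataset]]'s
        // output schema, per the wording suggested above. `boundTEncoder` is a
        // hypothetical name used here for illustration.
        private[sql] val boundTEncoder: ExpressionEncoder[T] =
          resolvedTEncoder.bind(logicalPlan.output)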

