Github user mgaido91 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22318#discussion_r214678100
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/AttributeMap.scala ---
    @@ -26,6 +26,8 @@ object AttributeMap {
       def apply[A](kvs: Seq[(Attribute, A)]): AttributeMap[A] = {
         new AttributeMap(kvs.map(kv => (kv._1.exprId, kv)).toMap)
       }
    +
    +  def empty[A](): AttributeMap[A] = new AttributeMap(Map.empty)
    --- End diff --
    
    I think it'd be better if we can avoid creating a new `Map` on every call. In Scala this is usually handled by creating a single empty object and returning it from `empty` (e.g. see the `Nil` implementation for `List`). The only problem is that this would require changing `AttributeMap[A]` to `AttributeMap[+A]`. I am not sure whether that may break binary compatibility, and therefore whether it is an acceptable change or not. If it is, I think it would be great to do this.

